Managing logs and computing tournament, season, or even "last three games" stats is a largely manual process. logs.tf shouldn't take this on, since its whole purpose is to be lightweight and fast. This utility makes it easy to grab those logs and interrogate them in R. I chose R partly because it's fastest for me to code in, but also because it natively provides the statistical framework to feed into other packages for things like machine learning and forecasting.
You can find the package on GitHub.
In terms of functionality, you can run something like `logs <- queryLogstf(tournament = "Insomnia52")` to grab all the logs from a tournament, then get a giant table of mean per-match stats with `tab <- aggregateStats(logs)`. A more complete use case can be found as a vignette or in compiled HTML form.
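A fuller session might look like the sketch below. The `queryLogstf` and `aggregateStats` calls mirror the ones above; the package name `logstf`, the install placeholder, and the `kills` column are illustrative assumptions, not guaranteed output:

```r
# devtools::install_github("<user>/<repo>")  # repo path omitted; see the GitHub link

library(logstf)  # assumed package name

# Grab every log comp.tf lists for the event
logs <- queryLogstf(tournament = "Insomnia52")

# Mean per-match stats for every player across those logs
tab <- aggregateStats(logs)

# From here it's ordinary R: sort, filter, plot, model
head(tab[order(-tab$kills), ])  # assumes a 'kills' column exists
```

Because the result is a plain data frame, anything in the R ecosystem can pick it up from here.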
A note on web-scraping worries
I use the logs.tf API to get the raw stats and parse those, rather than scraping the page itself. So yes, you can track the number of "donks" per game. I do use comp.tf to get the log IDs for events, but I cache the results in an archive so it isn't called again. If we share our archives, either by sending me the file or by submitting a pull request, the archive could eventually cover all pro TF2 events.
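The archive itself can be as simple as a named list serialized to disk. Here is a minimal sketch of the idea, not the package's actual implementation; `archive_path`, `get_event_log_ids`, and the fetcher argument are all hypothetical names:

```r
archive_path <- file.path(tempdir(), "comptf_archive.rds")

# Look up an event's log IDs, hitting comp.tf only on a cache miss
get_event_log_ids <- function(event, fetch_event_ids) {
  archive <- if (file.exists(archive_path)) readRDS(archive_path) else list()
  if (is.null(archive[[event]])) {
    archive[[event]] <- fetch_event_ids(event)  # the expensive comp.tf call
    saveRDS(archive, archive_path)              # persist for next time
  }
  archive[[event]]
}

# Stand-in fetcher so the sketch runs offline
fake_fetch <- function(event) c(100123L, 100124L)

ids1 <- get_event_log_ids("Insomnia52", fake_fetch)
ids2 <- get_event_log_ids("Insomnia52", function(e) stop("should not be called"))
identical(ids1, ids2)  # TRUE: the second lookup came from the archive
```

Sharing the `.rds` file (or diffing it in a pull request) is then all it takes to merge two people's archives.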
On a related note, UGC and ESEA leagues are not supported because their logs are not up on comp.tf. Creative queries against the logs.tf search API can still be made using the tools provided, but top-level querying will return an error. I encourage everyone to help populate these fields and to use the website, because it's a standardized and easy way to look back at the pro TF2 landscape.
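For those creative queries, the logs.tf search endpoint can also be hit directly, independent of this package's helpers. This sketch assumes the public `http://logs.tf/api/v1/log` endpoint with `title` and `limit` parameters, plus the `httr` and `jsonlite` packages:

```r
library(httr)
library(jsonlite)

# Search logs.tf for logs whose title mentions an event or team
resp <- GET("http://logs.tf/api/v1/log",
            query = list(title = "i52", limit = 5))
found <- fromJSON(content(resp, as = "text", encoding = "UTF-8"))

# 'found$logs' holds the matching log metadata (id, title, date, ...)
found$logs$id
```

Those IDs can then be fed back into the package's own log-fetching tools.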
Future
- ss support
- Team queries
- Automatically generate a "media notes" document for upcoming games.