Can you add the ability to paginate starting at a specific logid? E.g. something like
https://logs.tf/api/v1/log?before=some_logid. A naïve offset-based iteration will get duplicate logs. E.g. imagine there are 25 logs (ids 0-24) and I have a limit of 10. The first request gets logs 24-15. Then someone uploads a new log. I then ask for another 10 logs, starting at offset 10, which gets me logs 15-6. Notice how log 15 gets requested twice. If the import runs for a long time (and no corrective measures are taken), these duplicates can end up being a significant fraction of the requested logs.
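For reference, here is a minimal sketch of how an importer could use such a parameter. The `before` query parameter is the proposed addition and does not exist yet; `limit` and the response shape (a `logs` array whose entries have an `id`) are my reading of the current /api/v1/log output:

```python
import requests

API = "https://logs.tf/api/v1/log"

def fetch_all_logs(limit=1000):
    """Walk the log list from newest to oldest without duplicates."""
    before = None  # no cursor on the first request
    while True:
        params = {"limit": limit}
        if before is not None:
            # Proposed semantics: only return logs with id < before.
            params["before"] = before
        logs = requests.get(API, params=params).json()["logs"]
        if not logs:
            break
        yield from logs
        # The oldest id in this page becomes the cursor for the next page,
        # so logs uploaded mid-import can never shift the window.
        before = logs[-1]["id"]
```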
I would also like to verify a few things about the JSON. Is info.date the upload time, in UTC? And is rounds[].start_time in the local timezone? Do new uploads get their own logid even when they are updates to an existing logid, or do they keep the logid of the log they update? Does info.date change when a log is updated? Is there a "statute of limitations" after which a log can no longer be updated?
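For context, this is how my importer currently interprets those two fields; both conversions are exactly the assumptions I'm asking to confirm, not documented behaviour:

```python
from datetime import datetime, timezone

def parse_info_date(info_date: int) -> datetime:
    # Assumes info.date is a Unix timestamp of the upload time, in UTC.
    return datetime.fromtimestamp(info_date, tz=timezone.utc)

def parse_round_start(start_time: int) -> datetime:
    # Assumes rounds[].start_time is also a plain Unix epoch value; if it
    # is actually shifted to the server's local timezone, this is wrong.
    return datetime.fromtimestamp(start_time, tz=timezone.utc)
```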
And lastly, what in the world does players[].heal represent? On some logs it seems to represent healing done, and on others it seems to count self-healing. For example, in this log Kumis has 38634 heal, and in this log he has 0, even though he played scout/heavy both times. Complicating matters, for medics players[].heal mostly equals the sum of their healspread[][] entries, which implies that it is healing done.
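To illustrate the medic case, here is a rough sketch of the comparison I mean. The field layout (healspread keyed by healer steamid mapping to per-target amounts, players keyed by the same steamids with a heal field) is my reading of the v1 JSON, not a documented guarantee:

```python
import requests

def check_medic_heal(log_id: int) -> None:
    """Compare players[].heal against the summed healspread for each healer."""
    data = requests.get(f"https://logs.tf/api/v1/log/{log_id}").json()
    # Assumes healspread is {healer steamid: {target steamid: amount}}.
    for steamid, spread in data.get("healspread", {}).items():
        reported = data["players"][steamid]["heal"]
        summed = sum(spread.values())
        print(f"{steamid}: heal={reported}, sum(healspread)={summed}")
```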