mastercoms, quoting AimIsADick:
> Frametimes can vary in each frame (e.g. one frame could take 100 ms to render while another could take 400 ms, but 1 s / 2 = 500 ms).
That first frame you mentioned is 1 frame / 0.1 s = 10 FPS, and the second one you mentioned is 1 frame / 0.4 s = 2.5 FPS. See, I made the FPS vary each frame too.
AimIsADick:
Thing is, the frame rate would be extremely misleading here, since the variance is practically invisible with just the frame rate. The frame rate would only match up with the individual frame times if all of the frames rendered in the same amount of time, which isn't always the case. I feel like I'm just repeating myself here, though…
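To make the arithmetic in this exchange concrete, here's a minimal Python sketch using the hypothetical 100 ms and 400 ms frames from above; the frame-time list and the one-second counter are illustrative assumptions, not measurements from the thread.

```python
# Hypothetical frame times, in seconds, matching the example above.
frame_times = [0.1, 0.4, 0.1, 0.4]  # 100 ms, 400 ms, 100 ms, 400 ms

# The instantaneous frame rate of each individual frame is 1 / frame time.
for ft in frame_times:
    print(f"{ft * 1000:.0f} ms frame -> {1 / ft:.1f} FPS")
# 100 ms frame -> 10.0 FPS
# 400 ms frame -> 2.5 FPS (and so on)

# A plain FPS counter just counts frames over a roughly one-second window.
window = sum(frame_times)                        # 1.0 s here
print(f"{len(frame_times) / window:.1f} FPS")    # 4.0 FPS

# A perfectly smooth second of four 250 ms frames reports the same 4.0 FPS,
# so the counter alone can't show the 100 ms / 400 ms swings.
smooth = [0.25, 0.25, 0.25, 0.25]
print(f"{len(smooth) / sum(smooth):.1f} FPS")    # 4.0 FPS
```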
mastercoms, quoting AimIsADick:
> Why bother deriving the frame time from the frame rate when the frame time will vary anyway? I mean, frame rate is counted over a second, right?
This is like asking why the SI unit hertz exists when we have seconds. They're used for different calculations and comparisons.
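As a rough illustration of that frequency-versus-period distinction (the 60 Hz and 144 Hz figures below are arbitrary examples, not numbers from this discussion): a frame rate is a frequency in hertz (frames per second), a frame time is a period (time per frame), and for a steady rate each is simply the reciprocal of the other.

```python
# Frame rate is a frequency (frames per second, i.e. Hz); frame time is a
# period (time per frame). For a steady rate they are reciprocals.
def to_frametime_ms(fps: float) -> float:
    """Per-frame time budget, in milliseconds, for a given frame rate."""
    return 1000.0 / fps

def to_fps(frametime_ms: float) -> float:
    """Frame rate implied by a given per-frame duration in milliseconds."""
    return 1000.0 / frametime_ms

print(f"{to_frametime_ms(60):.2f} ms")           # 16.67 ms per frame at 60 FPS
print(f"{to_frametime_ms(144):.2f} ms")          # 6.94 ms per frame at 144 FPS
print(f"{to_fps(to_frametime_ms(60)):.1f} FPS")  # 60.0 FPS -- the conversion round-trips
```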
AimIsADick:
Can you elaborate on this point, actually? I don't understand what you mean.
Hang on a second. I thought we were arguing about the reliability of frame rate, but we're really arguing about whether frame time is the same metric as frame rate. I think I get your logic here (that since frame rate is a frequency, frame time can technically be derived from frame rate and thus be used to get the frame rate back), but I just don't agree that a derived metric is the same as its parent metric.
If this turns out to be basic stuff, then sorry about that. I just didn't learn about this beforehand.
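For what it's worth, here is a last small sketch of the derivation being debated, again with made-up numbers: dividing one second by a counted frame rate does give a frame time, but it is the average frame time for that second, and it only matches the individual frame times when they were all equal.

```python
# Made-up frame times (seconds) for two different one-second stretches.
stutter = [0.1, 0.4, 0.1, 0.4]      # uneven second
smooth  = [0.25, 0.25, 0.25, 0.25]  # even second

for label, times in (("stutter", stutter), ("smooth", smooth)):
    fps = len(times) / sum(times)   # what a per-second counter would report
    derived_ms = 1000.0 / fps       # frame time derived back from that rate
    actual_ms = [round(t * 1000) for t in times]
    print(f"{label}: {fps:.1f} FPS -> derived {derived_ms:.0f} ms, actual {actual_ms} ms")
# Both seconds derive to 250 ms, but only the smooth one actually ran at 250 ms per frame.
```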