zfn:
This command isn't in your config, but can you please explain what this does:
cl_pred_optimize
The default value is 2, but felix set it to 1 on his config, claiming better hit reg.
mastercoms:
No, prediction doesn't really affect hitreg, it just increases the smoothness and responsiveness of what info you send to the server (like weapon firing and movement).
cl_pred_optimize 1 reuses prediction info from the last time your client predicted if the server didn't update you yet about what's going on.
cl_pred_optimize 2 does 1 and also something similar to 1 if the server gives you an update and the prediction turns out to be true.
These optimizations will happen most often when your framerate is higher than your updaterate, but they can also happen when the server legitimately doesn't have anything to tell you about where you are, or that you were wrong on a prediction.
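Before getting to the Reddit part, here is how I currently picture the two values after reading your explanation and the official cvar description quoted further down. This is only my own rough sketch, not the actual engine code; every name in it (RunPredictionForFrame, the flags, the helper stubs) is made up for illustration.

```cpp
// Hedged sketch of how I read the cl_pred_optimize behaviour described above.
// This is NOT the real Source SDK code; all names below are illustrative.

#include <cstdio>

// Illustrative stand-ins for the real engine work.
static void ReuseLastPredictionResults()  { std::puts("reuse last prediction (no copy)"); }
static void KeepAcknowledgedPrediction()  { std::puts("prediction matched, skip repredict"); }
static void CopyNetworkDataAndRepredict() { std::puts("full copy + repredict"); }

// cl_pred_optimize: 0 = no optimization, 1 = skip copying, 2 = 1 + skip repredicting.
static void RunPredictionForFrame(int clPredOptimize,
                                  bool receivedNewUpdate,
                                  bool predictionMatchedServer)
{
    if (clPredOptimize >= 1 && !receivedNewUpdate)
    {
        // No fresh server update this client frame: reuse what we predicted last time.
        ReuseLastPredictionResults();
        return;
    }

    if (clPredOptimize >= 2 && receivedNewUpdate && predictionMatchedServer)
    {
        // Server update arrived and (closely enough) confirmed the prediction:
        // keep the already-predicted results instead of repredicting.
        KeepAcknowledgedPrediction();
        return;
    }

    // cl_pred_optimize 0, or an update that contradicted the prediction:
    // copy the new network data and re-run prediction for unacknowledged commands.
    CopyNetworkDataAndRepredict();
}

int main()
{
    RunPredictionForFrame(2, /*receivedNewUpdate=*/false, /*predictionMatchedServer=*/false);
    RunPredictionForFrame(2, /*receivedNewUpdate=*/true,  /*predictionMatchedServer=*/true);
    RunPredictionForFrame(1, /*receivedNewUpdate=*/true,  /*predictionMatchedServer=*/true);
    return 0;
}
```

If that branching is roughly right, then 1 only changes the "no new update" frames, while 2 additionally changes the frames where the update confirmed the prediction.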
I found a small snippet on Reddit about the cl_pred_optimize 1/2 debate that I thought was worth reposting here:
https://www.reddit.com/r/truetf2/comments/51h73b/how_much_does_interp_affect_projectiles_also/
Kairu927:
wareya's recommended settings:
...
..
cl_pred_optimize 1 // Slightly improve hitreg in certain situations at high framerates at expense of an extremely slight performance reduction (default is 2)
kk_64:
wait cl_pred_optimize 1, don't most people recommend cl_pred_optimize 2? I tried to find out exactly what this does but all I could find was:
Official cvar documentation: "Optimize for not copying data if didn't receive a network update (1), and also for not repredicting if there were no errors (2)."
Surely 2 is better since it doesn't bother repredicting when there are no errors?
Mao-C:
technically "no errors" refers to new data being close enough to old data that it can just reuse the old data for prediction; rather than actually being perfect.
i guess technically it would make prediction more accurate, but the game will only switch to prediction when it doesn't receive network frames, which means it's not gonna be accurate anyway. So making sure that the game adjusts for some 5 degree turn for a single frame, which won't be accurate in the first place, doesn't really help much.
Edit: actually i found this(https://github.com/ValveSoftware/source-sdk-2013/blob/master/mp/src/game/client/prediction.cpp) and it seems like it adjusts entities to their predicted locations if, after prediction, the actual location is close enough. So it might cause some desynchronization.
So, based on your explanation, cl_pred_optimize 1 recycles the last network data the client received from the server when the client isn't actually receiving anything new from the server.
kk_64:
That's actually pretty interesting, sounds like 1 is better if you can handle the extra work from repredicting since it would make things more accurate, if I understood everything correctly.
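To make Mao-C's "close enough" point a bit more concrete, this is how I imagine the check. Again, just my own sketch: the tolerance value and all names are invented, not taken from prediction.cpp.

```cpp
// Sketch of the "no errors" / "close enough" idea from Mao-C's comment.
// Illustrative only; the epsilon and names are NOT from the real prediction.cpp.

#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

static float Distance(const Vec3 &a, const Vec3 &b)
{
    const float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// "No errors" here means the freshly received server position is within some
// small tolerance of what the client already predicted, so with
// cl_pred_optimize 2 the client keeps its predicted result instead of
// repredicting (and may nudge the entity to the predicted spot).
static bool PredictionCloseEnough(const Vec3 &predicted, const Vec3 &fromServer,
                                  float tolerance = 0.03125f /* invented value */)
{
    return Distance(predicted, fromServer) <= tolerance;
}

int main()
{
    const Vec3 predicted  = {100.0f, 50.0f, 0.0f};
    const Vec3 fromServer = {100.01f, 50.0f, 0.0f};
    std::printf("close enough: %s\n",
                PredictionCloseEnough(predicted, fromServer) ? "yes" : "no");
    return 0;
}
```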
wareya's comment on tftv about cl_pred_optimize 2, from http://www.teamfortress.tv/25553/a-way-too-detailed-networking-config#6:
//This is fine but I want to note that lowering this can THEORETICALLY improve hitreg (by imperceptible amounts), particularly in obscure computing situations, at the expense of framerate.
(Q: What would happen with cl_pred_optimize 0? Would it fail to obtain network data in these "client-server offline moments" and misrepresent the game frames in each of those events, causing jittery, jerky gameplay? How would prediction be delayed or affected?)
I guess it would be less resource intensive and could free up some frames.
Reading the Reddit snippet together with your statement, I provisionally come to the following idea. Given this:
mastercoms:
These optimizations will happen most often when your framerate is higher than your updaterate, but they can also happen when the server legitimately doesn't have anything to tell you about where you are, or that you were wrong on a prediction.
and this:
amazoc:
cl_pred_optimize 1 // Slightly improve hitreg in certain situations at high framerates at expense of an extremely slight performance reduction (default is 2)
it seems that the optimizations that take place when the framerate is higher than the updaterate (66) are much more accurate representations of the current game state than the ones that occur when the client is not getting network packets from the server and is reusing recent packets (as per cl_pred_optimize 1), because the packets from frames in the first situation contain more legitimate positional info.
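Some rough numbers for that intuition, assuming a hypothetical 240 fps client against the usual 66 updates per second (my own back-of-the-envelope, not measurements):

```cpp
// Back-of-the-envelope: what fraction of client frames render without a fresh
// server update? (Hypothetical 240 fps; updaterate 66 as mentioned above.)

#include <cstdio>

int main()
{
    const double fps        = 240.0; // assumed client framerate, not from the thread
    const double updaterate = 66.0;  // server updates per second

    // At most one new update is consumed per client frame, so roughly
    // (fps - updaterate) frames each second have no new server data and hit
    // the "reuse last prediction" path (cl_pred_optimize >= 1).
    const double framesWithoutUpdate = fps - updaterate;
    const double fraction = framesWithoutUpdate / fps;

    std::printf("~%.0f of %.0f frames/sec (%.0f%%) arrive with no new update\n",
                framesWithoutUpdate, fps, fraction * 100.0);
    return 0;
}
```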
->
So now, cl_pred_optimize 2:
mastercoms:
cl_pred_optimize 2 does 1 and also something similar to 1 if the server gives you an update and the prediction turns out to be true.
Valve documentation: Optimize for not copying data if didn't receive a network update (1), and also for not repredicting if there were no errors (2).
So based on Mao-C (reddit):
What I understand is that "no errors" in cl_pred_optimize 2 means the old data reused as per cl_pred_optimize 1 is considered close enough to the new data to be accepted, so reprediction is skipped and the prediction itself turns out to be true, as pointed out, although it is still a slightly inaccurate representation; at least that's what I take from the Reddit conversation.
So what would happen if the last packet info is considered not error-free? In other words, how does repredicting work? Following the Reddit post, forcing a repredict, even though it is still FPS-expensive, would give more "perfect", position-wise, information than would be acquired with cl_pred_optimize 1.
But based on your cl_pred_optimize 2 statement, it does what cl_pred_optimize 1 does and then also kicks in when the server does send packets, whereas I thought cl_pred_optimize 2 was an extra condition on top of cl_pred_optimize 1 that only applied in the situation where the server does not send packets. Kinda confused.
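For what it's worth, this is how I currently picture "repredicting" while asking the above, so you can correct me if I have it backwards: the client keeps the commands the server hasn't acknowledged yet and, when a new update arrives, re-simulates them on top of the server state, unless cl_pred_optimize 2 decides the old result was already error-free. Purely my own sketch with invented names:

```cpp
// My mental model of "repredicting", sketched for the question above.
// Illustrative names only; this is not the real client prediction code.

#include <cstdio>
#include <vector>

struct UserCmd     { float forwardMove; float sideMove; };
struct PlayerState { float x = 0.0f; float y = 0.0f; };

// Stand-in for running one command's worth of movement simulation.
static void SimulateCommand(PlayerState &state, const UserCmd &cmd)
{
    state.x += cmd.forwardMove;
    state.y += cmd.sideMove;
}

// Repredicting = start from the state the server just acknowledged and
// re-run every not-yet-acknowledged command on top of it.
static PlayerState Repredict(PlayerState acknowledgedState,
                             const std::vector<UserCmd> &unacknowledgedCmds)
{
    for (const UserCmd &cmd : unacknowledgedCmds)
        SimulateCommand(acknowledgedState, cmd);
    return acknowledgedState;
}

int main()
{
    PlayerState fromServer{10.0f, 5.0f};                          // last acknowledged state
    std::vector<UserCmd> pending = {{1.0f, 0.0f}, {1.0f, 0.5f}};  // commands still in flight

    PlayerState predicted = Repredict(fromServer, pending);
    std::printf("repredicted position: (%.1f, %.1f)\n", predicted.x, predicted.y);
    return 0;
}
```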