latency guide
posted in Hardware
#1
0 Frags +

https://docs.google.com/document/d/1c2-lUJq74wuYK1WrA_bIvgb89dUN0sj8-hO3vqmrau4/edit#

Does anyone know if this is all placebo or is it useful?

#2
1 Frags +

Mostly placebo.
Disabling SMT on Intel and AMD isn't a bad idea for gaming and security, but I'm 100% sure it doesn't improve latency.
Also, only OC the Infinity Fabric if you want to improve latency between CCXs.

#3
3 Frags +

Disabling all but one CCX for AMD users (so potentially disabling 50-75% of your cores) is a terrible idea, since you can easily accomplish the same thing with process affinities, which can be automated with Process Lasso or a launch option.
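
For reference, a rough sketch of the affinity approach (assumes psutil is installed; hl2.exe as the process name and cores 0-3 as "one CCX" are just example values you'd adjust for your own setup):

    # pin_tf2.py -- restrict tf2 to one CCX instead of disabling cores in the BIOS
    import psutil

    ONE_CCX = [0, 1, 2, 3]  # example only: the logical cores of a single CCX on your CPU

    for proc in psutil.process_iter(["name"]):
        # look for the game's process by executable name (hl2.exe assumed here)
        if (proc.info["name"] or "").lower() == "hl2.exe":
            proc.cpu_affinity(ONE_CCX)  # pin the process to those cores
            print(f"pinned PID {proc.pid} to cores {ONE_CCX}")

Process Lasso just does the same thing persistently through a GUI.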

Some motherboards and laptops just have bad latency due to bad audio drivers, WiFi, or other devices, so none of this will make a difference there. Using LatencyMon, or the Windows ADK for a more in-depth analysis, to find the troublesome driver or device is a better approach than shooting in the dark.

Worrying about shaving off a couple of microseconds of input latency isn't worth it, and you'll get more benefit from upgrading your peripherals, especially since these tweaks significantly increase the risk of hurting system stability. Some of this stuff is already set to the best values, so you don't need to do anything. If you can, overclocking the CPU to its maximum stable speed, along with a clean, de-bloated Windows install, will get the best results.

#4
6 Frags +
denisbit13: Does anyone know if this is all placebo or is it useful?

100% placebo. Nanoseconds of latency mean absolutely NOTHING.

Makemake: Disabling SMT on Intel and AMD isn't a bad idea for gaming and security, but I'm 100% sure it doesn't improve latency. Also, only OC the Infinity Fabric if you want to improve latency between CCXs.

It doesn't do anything on AMD. AMD's SMT implementation is vastly superior to Intel's and doesn't have the same security holes.

#5
8 Frags +

Seems like a compiled list of other people's advice, not necessarily based on any deep insight on the author's part. Some of the advice is good, but some is based on a misunderstanding that's stuck around for a while.

e.g. Timer resolution is something you can set, and it does affect tf2, but it shouldn't improve tf2's fps at all -- afaik the only waits in tf2 are for the fps cap (and the same is true for most other games), and a finer resolution just means that when you hit your fps cap your frame times will be more precise (similar mean, smaller standard deviation than without changing the timer resolution). Which really doesn't matter: slightly less variation in your fps won't do anything.
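
To be concrete about what timer resolution actually does, here's a Windows-only sketch that measures how precise a 1 ms sleep is before and after requesting a 1 ms timer via winmm's timeBeginPeriod; depending on your Windows and Python versions the difference may be larger or smaller, so treat it as a demo rather than a benchmark.

    # Windows-only: timer resolution affects how precisely a capped game can wait,
    # not how fast it renders.
    import ctypes, time, statistics

    def sleep_jitter(n=200):
        # how long does time.sleep(1 ms) really take?
        samples = []
        for _ in range(n):
            t0 = time.perf_counter()
            time.sleep(0.001)
            samples.append((time.perf_counter() - t0) * 1000)  # in ms
        return statistics.mean(samples), statistics.stdev(samples)

    print("default resolution:", sleep_jitter())

    winmm = ctypes.WinDLL("winmm")
    winmm.timeBeginPeriod(1)      # request 1 ms timer resolution
    try:
        print("1 ms resolution:   ", sleep_jitter())
    finally:
        winmm.timeEndPeriod(1)    # always release the request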

The main advice is to reduce interrupt-to-DPC latency, which has a small impact. If your DPC latency was 1.0 microseconds and you reduced it to 0.5 microseconds, that's roughly equivalent to increasing fps from 60 to 66 (based on some quick testing; you'd have to check your own DPC count to calculate it for yourself). The guide mentions 0.4 being good but 0.3 being ideal, and that difference is about the jump from 60 to 61.25. All in all we're talking about relatively small gains.
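
If you want to redo that back-of-the-envelope math with your own numbers, the formula is simple; the DPC rate below is invented, take yours from LatencyMon:

    # rough "equivalent fps" for shaving some latency off every DPC
    def equivalent_fps(base_fps, dpc_per_second, saved_us_per_dpc):
        frame_time = 1.0 / base_fps                          # seconds per frame
        dpc_per_frame = dpc_per_second / base_fps            # DPCs handled during one frame
        saved = dpc_per_frame * saved_us_per_dpc * 1e-6      # CPU time saved per frame, seconds
        return 1.0 / (frame_time - saved)

    # example with a made-up DPC count of 50,000/s and 0.5 us saved per DPC
    print(equivalent_fps(60, 50_000, 0.5))   # ~61.5 fps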

I'm not recommending against the advice, but if you're going to great lengths for any fps increase you can find, you can get bigger gains by running tf2 under Linux, where most of this guide doesn't apply anyway.

CBT: 100% placebo. Nanoseconds of latency mean absolutely NOTHING.

The input latency in the guide is measured in microseconds, which (when you consider that there are usually a few thousand interrupts a second) translates to some amount of milliseconds every second. Not much, but it's not next-to-nothing like nanoseconds would be.

#6
11 Frags +

Honestly, if a guide recommends disabling half the cores you paid for, you should ask yourself whether that's advice you want to take, even before considering that we're talking about half a microsecond.

Always remember that USB mice are polling-based at 1000 Hz (unless you do weird shit). That means the input is only processed in 1 ms chunks, which means your average input lag from the mouse can never be lower than half a millisecond. The interrupt latency change is about 1/1000 of that. Do you really care that much?
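
If the 0.5 ms figure sounds arbitrary, a tiny simulation shows where it comes from: a click lands at a random point inside a 1 ms polling window and only gets read at the next poll.

    # average delay from 1000 Hz polling alone
    import random

    POLL_MS = 1.0        # 1 ms between polls at 1000 Hz
    N = 1_000_000

    total = 0.0
    for _ in range(N):
        t = random.uniform(0.0, POLL_MS)   # when the click happens inside a poll window
        total += POLL_MS - t               # it waits until the next poll
    print(total / N)     # ~0.5 ms on average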

#7
0 Frags +
CBT: It doesn't do anything on AMD. AMD's SMT implementation is vastly superior to Intel's and doesn't have the same security holes.

It is way better, but due to Intel's dominance the Windows scheduler is optimized for Intel's HT, so you may get ~5% better fps if you disable it on AMD.

#8
2 Frags +

It's not optimized for Intel. It took Microsoft years to get scheduling right for Intel HT/SMT, it took them years for AMD CMT, and to no one's surprise AMD SMT/CCX scheduling wasn't perfect from day one either.

#9
0 Frags +

Anyone else tried LatencyMon? Are my latency spikes normal?

https://i.imgur.com/fEXGoYL.jpg

#10
2 Frags +
sage: Anyone else tried LatencyMon? Are my latency spikes normal?

From what I've seen, with any Intel CPU you can lower those values to 0.8/0.3.
If you're running a Ryzen you should expect higher values; I think it depends on which gen you have. The guy who introduced me to this whole thing has a Gen 1 Ryzen and, after optimizing, gets 5.0/2.0. You can see in the guide itself that a Gen 2 Ryzen can get 2.5/0.8. As an owner of a Gen 3 Ryzen CPU myself, the lowest I could get is 2.1/0.7, but I know someone with the same CPU who managed 1.7/0.7 because he's running a better Windows version.

In short, those values are OK from the standpoint of someone who doesn't care about tiny improvements to latency. Though I'd say they're pretty bad unless it's a Gen 1 Ryzen.

#11
1 Frags +
sage: Are my latency spikes normal?

Yeah, if you look at what processes are actually causing the high latency interrupts, it's mostly networking / ports / graphics stuff that you'd expect to spike (especially when we're talking about <1ms spikes).

Interrupts should only ever really be a problem if they start distorting audio (what LatencyMon is intended to diagnose), or if you want to squeeze out that extra couple fps.
