I've been venturing into Ubuntu (Xubuntu) for the first time in a long while, and I decided to check whether TF2 runs any better than the last time I tried.
This small Linux-specific guide is complementary to the general [url=http://www.teamfortress.tv/25328/comanglias-config-fps-guide]performance guide provided by Comanglia[/url].
[b]First off[/b], forget about -dxlevel: DirectX is proprietary Microsoft software, and TF2 on Linux runs on OpenGL.
[b]Graphics cards[/b]
Make sure you have the latest drivers.
For AMD and NVIDIA you have proprietary drivers available, with instructions on their respective sites; they are generally faster than their open-source counterparts.
For Intel graphics, the official drivers are already open source and should be installed by default. In fact, Intel integrated graphics are likely to perform better on Linux than on Windows for TF2.
[b]Where is the Steam folder?[/b]
The Steam folder on Linux is hidden by default at [i]/home/user/.steam[/i]
You can show hidden files and folders by pressing Ctrl+H in your home folder.
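From a terminal you can check for the hidden folders directly instead of using the file manager; a quick sketch using the usual default locations (your install may differ):

```shell
#!/bin/sh
# Look for the hidden Steam directories under $HOME (common default paths).
found=""
for d in "$HOME/.steam" "$HOME/.local/share/Steam"; do
  if [ -d "$d" ]; then found="$d"; break; fi
done
echo "Steam folder: ${found:-not found}"
```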
[b]Desktop environments[/b]
[list]
[*] Don't use Unity, GNOME or KDE 4 for gaming. 3D desktop compositing will reduce performance and can add overhead (input / display lag). Only the latter can have it fully disabled without advanced tinkering, as far as I know.
[*] It's recommended to use Xfce or the [url=http://www.omgubuntu.co.uk/2012/11/how-to-login-to-steam-big-picture-mode-in-ubuntu]Steam Big Picture login[/url] mode, which use very light window managers.
[*] You might experience a weird borderless-window effect if you try a non-native resolution in Source games: your screen resolution won't actually change, and you'll get a stretched image while still using your desktop resolution.
[*] Installing SteamOS is not recommended, since its installer doesn't offer partitioning options and wipes your drive without asking.
[/list]
[b]Mouse sensitivity[/b]
[url=http://askubuntu.com/questions/794185/how-to-disable-mouse-acceleration-in-ubuntu-16-04]Disable mouse acceleration in Ubuntu 16[/url]
[b]Vsync[/b]
If you think you may have Vsync enabled, you can quickly disable it with the following command:[code]export vblank_mode=0[/code]
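If you'd rather not export this shell-wide, the same variable can be applied to just one game. A minimal sketch (note: vblank_mode is read by the open-source Mesa drivers; NVIDIA's proprietary driver uses its own __GL_SYNC_TO_VBLANK variable instead):

```shell
#!/bin/sh
# Disable vsync for this shell session only (Mesa drivers read vblank_mode).
export vblank_mode=0
# Per-game alternative: put this in TF2's Steam launch options instead:
#   vblank_mode=0 %command%
echo "vblank_mode=$vblank_mode"
```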
[i]Updated: 22/07/2017[/i]
in my experience, the -dxlevel flag actually does change visual settings
also you can disable compositing on kde at least, which improves fps. But it's still better to use xfce or something like that.
I used Arch Linux with GNOME 3 a while back and followed a method similar to this to disable lag: http://superuser.com/questions/407043/is-it-possible-to-run-graphical-applications-such-as-firefox-without-installing
I can't for the life of me remember how I did it though.
[quote=Tommy]in my experience, the -dxlevel flag actually does change visual settings
also you can disable compositing on kde at least, which improves fps. But it's still better to use xfce or something like that.[/quote]
I've tried using -dxlevel 81, but it didn't show any apparent difference from stock, and the console still showed "mat_dxlevel 90".
Are you sure you tried that on the Linux version, and not on the Windows version running under Wine?
Disabling CPU power saving might also help (setting the [url=https://wiki.archlinux.org/index.php/CPU_frequency_scaling#Scaling_governors]CPU frequency scaling governor[/url] to "performance" instead of the default). I use cpufreq-set from the [url=https://packages.debian.org/search?keywords=cpufrequtils]cpufrequtils[/url] package:
[code]for ((i=0;i<$(nproc);i++)); do cpufreq-set -c $i -r -g performance; done[/code]
I made a benchmark using the "ondemand" governor, which is the default for me:
[code]
2639 frames 21.300 seconds 123.90 fps ( 8.07 ms/f) 9.436 fps variability
2639 frames 21.197 seconds 124.50 fps ( 8.03 ms/f) 10.121 fps variability
2639 frames 21.456 seconds 122.99 fps ( 8.13 ms/f) 9.636 fps variability
[/code]
and using governor "performance":
[code]
2639 frames 20.154 seconds 130.94 fps ( 7.64 ms/f) 8.499 fps variability
2639 frames 20.087 seconds 131.38 fps ( 7.61 ms/f) 8.380 fps variability
2639 frames 20.187 seconds 130.73 fps ( 7.65 ms/f) 8.477 fps variability
[/code]
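Since cpufreq-set has to run as root, it can be worth previewing the per-core commands before executing anything. A small sketch that builds the same loop's commands for review instead of running them (core count hard-coded here for illustration; substitute $(nproc) on a real machine):

```shell
#!/bin/sh
# Build the per-core cpufreq-set invocations for review instead of running
# them directly (cpufreq-set needs root, so reviewing first is safer).
ncpu=4            # stand-in for $(nproc) on a real machine
cmds=""
i=0
while [ "$i" -lt "$ncpu" ]; do
  cmds="$cmds cpufreq-set -c $i -r -g performance;"
  i=$((i + 1))
done
echo "$cmds"
```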
[quote=sage78]Don't use Unity, Gnome or KDE4 for gaming. They have 3D desktop compositing stuff that will reduce performance and add overhead (input / display lag). Only the latter can have it fully disabled without advanced tinkering afaik.[/quote]
This is bullshit, every DE disables compositing for full screen applications.
I don't know if it's bullshit, but I remember running stock Unity, playing TF2 in fullscreen, and getting some crazy input lag / overhead.
Maybe they've changed it since then, but I wouldn't personally recommend a bloated DE just to see a few "cool" 3D effects anyway.
[quote=dark_overlord1337]Disabling CPU powersaving also might help (setting [url=https://wiki.archlinux.org/index.php/CPU_frequency_scaling#Scaling_governors]CPU frequency scaling governor[/url] to "performance" instead of default).
and using governor "performance":
[/quote]
Great tip! This worked on my laptop, making the average clock speed higher in "performance" mode, but it still varies (480 MHz to 2.16 GHz on an Intel N3050).
Do you know how to make it fixed?
[quote=sage78]Great tip! This worked on my laptop, making the average clock speed higher in "performance" mode, but it still varies (480 MHz to 2.16 GHz on an Intel N3050).
Do you know how to make it fixed?[/quote]
Arch Wiki says that new Intel CPUs (like N3050) use intel_pstate driver which uses dynamic frequency scaling, so it should be fine ([url=https://wiki.archlinux.org/index.php/CPU_frequency_scaling#Scaling_governors]source[/url]):
[quote]Note: The intel_pstate driver supports only the performance and powersave governors, but they both provide dynamic scaling. The performance governor should give better power saving functionality than the old ondemand governor.[/quote]
and also [url=https://www.phoronix.com/scan.php?page=news_item&px=MTM3NDQ]this[/url] article says that performance using intel_pstate shouldn't be an issue:
[quote]we are seeing very significant power/performance improvements with the 3.9/3.10rc code over using ondemand, and a [b]much smaller performance gap with the "performance" governor in terms of performance.[/b][/quote]
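To confirm which scaling driver a given machine actually uses, the kernel reports it through sysfs. A quick sketch that falls back gracefully where cpufreq sysfs isn't available (e.g. inside a VM):

```shell
#!/bin/sh
# Report the active CPU frequency scaling driver: intel_pstate on newer
# Intel CPUs, acpi-cpufreq or others elsewhere.
f=/sys/devices/system/cpu/cpu0/cpufreq/scaling_driver
if [ -r "$f" ]; then
  driver=$(cat "$f")
else
  driver="unknown (no cpufreq sysfs)"
fi
echo "scaling driver: $driver"
```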
Might be slightly off-topic, but has anyone figured out how to get the equivalent of 6/11 Windows sensitivity on Linux? It's the only thing stopping me from switching to Linux as a daily driver, because I don't want to relearn my sensitivity for games that don't support raw input.
[quote=Shanky][/quote]
Sensitivity is 1:1 (6/11) by default. I think what you're talking about is mouse acceleration. If you're using Xfce or KDE, you can disable it [url=https://i.imgur.com/Vu0YrKb.png]in the GUI[/url].
But if you're using GNOME 3, or not using a desktop environment at all, you can try disabling mouse acceleration with xinput (you need the xinput package installed), though it might not work on the newest version of X.org. First, you need to get your mouse name or id:
[code]
user@localhost~$ xinput --list
⎡ Virtual core pointer id=2 [master pointer (3)]
⎜ ↳ Virtual core XTEST pointer id=4 [slave pointer (2)]
⎜ ↳ YOUR MOUSE NAME HERE id=8 [slave pointer (2)]
...
[/code]
then you disable acceleration (the second command lets you adjust your "real" mouse sensitivity; "1" is the equivalent of 6/11):
[code]
xinput --set-prop "YOUR MOUSE NAME HERE" "Device Accel Profile" -1
xinput --set-prop "YOUR MOUSE NAME HERE" "Device Accel Constant Deceleration" 1.000000
[/code]
If that doesn't work, you can try the xset command, though it also might not work:
[code]xset m 0 0[/code]
Finally, you can disable mouse acceleration completely in the X.org config as described [url=https://wiki.archlinux.org/index.php/Mouse_acceleration#Disabling_mouse_acceleration]in this guide[/url].
P.S. Actually, my "gaming" mouse had acceleration disabled by default, but my cheap office mouse didn't.
P.P.S. Also there are guides for setting up mouse sensitivity if you want it:
https://bugs.freedesktop.org/attachment.cgi?id=17772
http://510x.se/notes/posts/Changing_mouse_acceleration_in_Debian_and_Linux_in_general/
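The xinput steps above can also be scripted so the device id is extracted automatically. A sketch that parses a captured sample of `xinput --list`-style output ("Example Gaming Mouse" is a placeholder name; feed in real `xinput --list` output on your system):

```shell
#!/bin/sh
# Parse an `xinput --list`-style listing to pull out a device id, so the
# --set-prop calls can run without hard-coding the id.
sample='Virtual core pointer                  id=2  [master pointer (3)]
  Example Gaming Mouse                  id=8  [slave pointer (2)]'
id=$(printf '%s\n' "$sample" | grep 'Example Gaming Mouse' \
     | sed 's/.*id=\([0-9]*\).*/\1/')
echo "device id: $id"
# On a real system, with the id extracted from `xinput --list`, run e.g.:
#   xinput --set-prop "$id" "Device Accel Profile" -1
```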
I just wouldn't recommend GNOME or KDE because they're crap lmao
Also for Xfce just use Compton with comptray. Ez for toggling it on and off before games
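An alternative to Compton/comptray on a stock Xfce session is toggling xfwm4's own compositor with xfconf-query (which ships with Xfce). A sketch that checks for the tool first so it's safe to run anywhere:

```shell
#!/bin/sh
# Toggle Xfce's built-in compositor off before a game (set true to restore).
if command -v xfconf-query >/dev/null 2>&1; then
  xfconf-query -c xfwm4 -p /general/use_compositing -s false
  state="disabled"
else
  state="xfconf-query not installed"
fi
echo "compositing: $state"
```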
You don't have to disable accel if you have raw input enabled. The only difference compared to Windows is that, because of a bug in SDL, in-game sensitivity is doubled in TF2 when raw input is enabled. So with the same DPI, if you use 2.6 on Windows you should use 1.3 on Linux.
Bug: https://github.com/ValveSoftware/Source-1-Games/issues/1834
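In other words, the conversion is just a halving. A trivial sketch of the arithmetic described above:

```shell
#!/bin/sh
# Convert a Windows TF2 sensitivity to its Linux raw-input equivalent by
# halving it, per the SDL doubling bug linked above.
win_sens=2.6
linux_sens=$(awk -v s="$win_sens" 'BEGIN { printf "%.2f", s / 2 }')
echo "Windows $win_sens -> Linux $linux_sens"
```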
Use Openbox, HUUGE help.
Just use raw input; works for me on Debian + Openbox
[quote=sage78][quote=Tommy]in my experience, the -dxlevel flag actually does change visual settings
also you can disable compositing on kde at least, which improves fps. But it's still better to use xfce or somethings like that.[/quote]
I've tried using -dxlevel 81, but it didn't show any apparent difference from stock, and the console still showed "mat_dxlevel 90".
Are you sure you tried that on the Linux version, and not on the Windows version running under Wine?[/quote]
Yes, I am sure. Maybe something has changed; I experienced that over a year ago, and I've noticed they have been changing some things regarding dx recently (the MM updates come to mind).
In my experience, Openbox/i3/other light window managers are better for gaming (for multiple reasons) than KDE/GNOME, and I have tried both.
KDE is great when you want to be sure the computer already has everything it needs (perfect for reviving an old computer for a family member).
The xset method for mouse accel works for me.
If anyone is testing different settings and wants a less tedious way to benchmark, [url=https://gist.githubusercontent.com/xiyzzc/3c524bd08ab6fd3eb388b3f8e89b321f/raw/556d2dc29b4ef68d50fa1e626957ee365d808567/tf2-benchmark.sh]here[/url]'s a bash script I made when I was testing what worked well on my machine. It runs a demo 3 times, tries to scrape the settings used along with the system specs, and then appends the results to a file after the game closes. It's not exactly pretty, but it works.
Results look like this
[spoiler]
2016-10-11 23:28:00
--- Results ---
2639 frames 18.360 seconds 143.73 fps ( 6.96 ms/f) 12.826 fps variability
2639 frames 17.233 seconds 153.13 fps ( 6.53 ms/f) 10.980 fps variability
2639 frames 17.049 seconds 154.79 fps ( 6.46 ms/f) 10.242 fps variability
--- Settings used ---
Linux 4.7.7-1-ck x86_64 GNU/Linux
CPU : Intel(R) Core(TM) i5-3570K CPU @ 3.40GHz (4 GHz)
RAM : 7874.86 MB
GPU : GeForce GTX 960/PCIe/SSE2
Driver : OpenGL ES 3.2 NVIDIA 370.28
Resolution : 1920x1080
Model Detail : Low
Texture Detail : Low
Shader Detail : Low
Water : Simple Reflections
Shadow Detail : Disabled
Antialiasing Mode : Disabled
Texture Filtering : 16x
Vertical Sync : Disabled
HDR : Disabled
Multicore Rendering : Queued Multi Thread
FPS Cap : Disabled
Glow Outline Effect : Disabled
Player HUD Model : Disabled
Weather Effects : Disabled
Sprays : Disabled
Eyes : Enabled
Flex : Enabled
Teeth : Enabled
Ragdolls : Disabled
Gibs : Disabled
comment: 960
2016-11-15 01:08:00
--- Results ---
2639 frames 14.515 seconds 181.81 fps ( 5.50 ms/f) 15.873 fps variability
2639 frames 14.368 seconds 183.68 fps ( 5.44 ms/f) 14.621 fps variability
2639 frames 14.411 seconds 183.12 fps ( 5.46 ms/f) 14.577 fps variability
--- Settings used ---
Linux 4.8.4-1-ck x86_64 GNU/Linux
CPU : Intel(R) Core(TM) i5-4690K CPU @ 3.50GHz (4.2 GHz)
RAM : 15982.1 MB
GPU : GeForce GTX 760/PCIe/SSE2
Driver : OpenGL ES 3.2 NVIDIA 370.28
Resolution : 1920x1080
Model Detail : Low
Texture Detail : Low
Shader Detail : Low
Water : Simple Reflections
Shadow Detail : Disabled
Antialiasing Mode : Disabled
Texture Filtering : Trilinear
Vertical Sync : Disabled
HDR : Disabled
Multicore Rendering : Queued Multi Thread
FPS Cap : Disabled
Glow Outline Effect : Disabled
Player HUD Model : Disabled
Weather Effects : Disabled
Sprays : Disabled
Eyes : Disabled
Flex : Disabled
Teeth : Disabled
Ragdolls : Disabled
Gibs : Disabled
comment: cpu
2016-12-06 17:45:36
--- Results ---
2639 frames 13.884 seconds 190.07 fps ( 5.26 ms/f) 13.100 fps variability
2639 frames 13.627 seconds 193.66 fps ( 5.16 ms/f) 10.425 fps variability
2639 frames 13.597 seconds 194.08 fps ( 5.15 ms/f) 10.556 fps variability
--- Settings used ---
Linux 4.8.12-1-ck-haswell x86_64 GNU/Linux
CPU : Intel(R) Core(TM) i5-4690K CPU @ 3.50GHz (4.2 GHz)
RAM : 15958.3 MB
GPU : GeForce GTX 760/PCIe/SSE2
Driver : OpenGL ES 3.2 NVIDIA 375.20
Resolution : 1920x1080
Model Detail : Low
Texture Detail : Low
Shader Detail : Low
Water : Simple Reflections
Shadow Detail : Disabled
Antialiasing Mode : Disabled
Texture Filtering : Trilinear
Vertical Sync : Disabled
HDR : Disabled
Multicore Rendering : Queued Multi Thread
FPS Cap : Disabled
Glow Outline Effect : Disabled
Player HUD Model : Disabled
Weather Effects : Disabled
Sprays : Disabled
Eyes : Disabled
Flex : Disabled
Teeth : Disabled
Ragdolls : Disabled
Gibs : Disabled
2016-12-06 17:49:57
--- Results ---
2639 frames 20.425 seconds 129.20 fps ( 7.74 ms/f) 7.336 fps variability
2639 frames 21.825 seconds 120.92 fps ( 8.27 ms/f) 6.757 fps variability
2639 frames 21.088 seconds 125.14 fps ( 7.99 ms/f) 6.619 fps variability
--- Settings used ---
Linux 4.8.12-1-ck-haswell x86_64 GNU/Linux
CPU : Intel(R) Core(TM) i5-4690K CPU @ 3.50GHz (4.2 GHz)
RAM : 15958.3 MB
GPU : GeForce GTX 760/PCIe/SSE2
Driver : OpenGL ES 3.2 NVIDIA 375.20
Resolution : 1920x1080
Model Detail : High
Texture Detail : Extra High
Shader Detail : High
Water : Reflect World
Shadow Detail : Medium
Antialiasing Mode : 16xQ CSAA
Texture Filtering : 16x
Vertical Sync : Disabled
HDR : Full HDR Enabled
Multicore Rendering : Default
FPS Cap : Disabled
Glow Outline Effect : Disabled
Player HUD Model : Disabled
Weather Effects : Disabled
Sprays : Disabled
Eyes : Enabled
Flex : Enabled
Teeth : Enabled
Ragdolls : Enabled
Gibs : Enabled (burning)
comment: cinema
[/spoiler]
Also [url=https://gist.githubusercontent.com/xiyzzc/361cf3bace9b7e94db16e1ac6a0082e3/raw/aa6c52d76f749106b1d9a1fa9a501ea8efbf6457/cpu-governor-auto.sh]here[/url]'s a script that changes the governor to performance if any of the listed processes are running, and sets it back to powersave when they're no longer running. However, cpupower needs to be exempted in the /etc/sudoers file so it can be elevated without a prompt. You can do that by adding something like this to the end of the file:
[code]your_username_here ALL = NOPASSWD: /usr/bin/cpupower
[/code]
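When comparing runs like the ones in the spoiler above, averaging the three fps figures of each block makes the comparison quicker. A small awk sketch over pasted timedemo result lines (the fps value is the fifth whitespace-separated field):

```shell
#!/bin/sh
# Average the fps column from a set of timedemo result lines.
results='2639 frames 20.154 seconds 130.94 fps ( 7.64 ms/f) 8.499 fps variability
2639 frames 20.087 seconds 131.38 fps ( 7.61 ms/f) 8.380 fps variability
2639 frames 20.187 seconds 130.73 fps ( 7.65 ms/f) 8.477 fps variability'
avg=$(printf '%s\n' "$results" | awk '{ sum += $5; n++ } END { printf "%.2f", sum / n }')
echo "average fps: $avg"
```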
One common problem I'd like to ask about concerns gamma: changing the brightness slider in the settings does not change the actual in-game brightness, and I have already set fullscreen. Is there a solution for it?
[quote=Fire]Isn't it generally a bad idea to allow sudo powers without a password?[/quote]
This only allows a single user to skip the prompt for one specific program.
[quote=newDiGiT]One common problem that I would like to ask regarding the gamma though. changing the brightness slider in settings does not change the actual brightness in game, and I have already set fullscreen. Is there a solution for it?[/quote]
I can change it when I first start the game, but if I change it after joining a server it doesn't do anything until I restart the game.
[quote=newDiGiT]One common problem that I would like to ask regarding the gamma though. changing the brightness slider in settings does not change the actual brightness in game, and I have already set fullscreen. Is there a solution for it?[/quote]
I am very sorry to necro this dead thread, but I have not seen any solutions elsewhere.
I recently found a fix for those facing this problem. Here is what you need to do:
1. Install xrandr
2. Go to your TF2 directory [code] ~/.local/share/Steam/steamapps/common/Team Fortress 2/ [/code] and edit the file hl2.sh, adding the line:
[code] xrandr --output output_name --gamma 1.6:1.6:1.6 [/code]
just above [code]STATUS=42[/code]
3. and then add
[code] xrandr --output output_name --gamma 1:1:1 [/code]
above [code] exit $STATUS [/code]
------------------------------------------------------------------------------------------------------------------------------------------
You can find your output_name by running:
[code]$ xrandr[/code]
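If you have several outputs listed, you can filter xrandr's listing down to just the connected ones. A small sketch (the sample output below is illustrative; your output names will differ):

```shell
# Print only the names of connected outputs.
# Sample xrandr output is inlined here since xrandr needs a running X session;
# on a live system you would run:  xrandr | awk '/ connected/ {print $1}'
sample='HDMI-1 connected primary 1920x1080+0+0 (normal) 509mm x 286mm
DP-1 disconnected (normal)'
printf '%s\n' "$sample" | awk '/ connected/ {print $1}'
# prints: HDMI-1
```

Note the leading space in `/ connected/` — it keeps "disconnected" outputs from matching.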
-------------------------------------------------------------------------------------------------------------------------------------------
Here is my file in case you need a reference:
[code]
#!/bin/bash

# figure out the absolute path to the script being run a bit
# non-obvious, the ${0%/*} pulls the path out of $0, cd's into the
# specified directory, then uses $PWD to figure out where that
# directory lives - and all this in a subshell, so we don't affect
# $PWD
GAMEROOT=$(cd "${0%/*}" && echo $PWD)

# determine platform
UNAME=`uname`
if [ "$UNAME" == "Darwin" ]; then
    # Workaround OS X El Capitan 10.11 System Integrity Protection (SIP) which does not allow
    # DYLD_INSERT_LIBRARIES to be set for system processes.
    if [ "$STEAM_DYLD_INSERT_LIBRARIES" != "" ] && [ "$DYLD_INSERT_LIBRARIES" == "" ]; then
        export DYLD_INSERT_LIBRARIES="$STEAM_DYLD_INSERT_LIBRARIES"
    fi
    # prepend our lib path to LD_LIBRARY_PATH
    export DYLD_LIBRARY_PATH="${GAMEROOT}"/bin:$DYLD_LIBRARY_PATH
elif [ "$UNAME" == "Linux" ]; then
    # prepend our lib path to LD_LIBRARY_PATH
    export LD_LIBRARY_PATH="${GAMEROOT}"/bin:$LD_LIBRARY_PATH
fi

if [ -z "$GAMEEXE" ]; then
    if [ "$UNAME" == "Darwin" ]; then
        GAMEEXE=hl2_osx
    elif [ "$UNAME" == "Linux" ]; then
        GAMEEXE=hl2_linux
    fi
fi

ulimit -n 2048

# enable nVidia threaded optimizations
export __GL_THREADED_OPTIMIZATIONS=1

# and launch the game
cd "$GAMEROOT"

# Enable path match if we are running with loose files
if [ -f pathmatch.inf ]; then
    export ENABLE_PATHMATCH=1
fi

# Do the following for strace:
# GAME_DEBUGGER="strace -f -o strace.log"
# Do the following for tcmalloc
# LD_PRELOAD=../src/thirdparty/gperftools-2.0/.libs/libtcmalloc_debug.so:$LD_PRELOAD

# gamma correction
xrandr --output HDMI-1 --gamma 1.6:1.6:1.6 # play with values if required

STATUS=42
while [ $STATUS -eq 42 ]; do
    if [ "${GAME_DEBUGGER}" == "gdb" ] || [ "${GAME_DEBUGGER}" == "cgdb" ]; then
        ARGSFILE=$(mktemp $USER.hl2.gdb.XXXX)
        echo b main > "$ARGSFILE"
        # Set the LD_PRELOAD varname in the debugger, and unset the global version. This makes it so that
        # gameoverlayrenderer.so and the other preload objects aren't loaded in our debugger's process.
        echo set env LD_PRELOAD=$LD_PRELOAD >> "$ARGSFILE"
        echo show env LD_PRELOAD >> "$ARGSFILE"
        unset LD_PRELOAD
        echo run $@ >> "$ARGSFILE"
        echo show args >> "$ARGSFILE"
        ${GAME_DEBUGGER} "${GAMEROOT}"/${GAMEEXE} -x "$ARGSFILE"
        rm "$ARGSFILE"
    else
        ${GAME_DEBUGGER} "${GAMEROOT}"/${GAMEEXE} "$@"
    fi
    STATUS=$?
done

# restore gamma on exit
xrandr --output HDMI-1 --gamma 1:1:1
exit $STATUS
[/code]