Account Details | |
---|---|
SteamID64 | 76561198028691174 |
SteamID3 | [U:1:68425446] |
SteamID32 | STEAM_0:0:34212723 |
Country | Ukraine |
Signed Up | June 2, 2015 |
Last Posted | May 19, 2024 at 5:38 PM |
Posts | 490 (0.1 per day) |
Game Settings | |
---|---|
In-game Sensitivity | 1.58 |
Windows Sensitivity | m_mousespeed 0 |
Raw Input | raw input |
DPI | 800 |
Resolution | 1920x1080 |
Refresh Rate | 165 |
Hardware Peripherals | |
---|---|
Mouse | Logitech G305 |
Keyboard | RK84, brown switches |
Mousepad | SteelSeries QcK+ |
Headphones | Beyerdynamic DT 770 Pro (250 ohm) + Focusrite Scarlett Solo 3rd gen |
Monitor | BenQ EX2710S (FPS mode, Sharpness 10, AMA 2, Blur Reduction on) |
crespiIT professional
codemonkey/dev/software engineer = pc build guru omegalul
yak404is no one talking about how fucking disgustingly dirty this toilet is
if you've ever lived in a dorm, this is a pristine bowl
aiera
and yet we milk the same phrases/current memes for months until they die out, only to repeat the cycle with the current thing
Wandumas much as most people on tftv like to give tf2center shit there is no way to look at this as anything other than a GIANT L to the entire comp scene
im not totally familiar with how it works in na, but tf2center was and is such a vital stepping stone for so many of the people who play the game now, across every single level (including myself)
nah, you will live, dare I say move on
while at the time it was instrumental for beginners to get a taste of 6s without committing to a team, and at a time most convenient to them, for the past 5 years it has been a shell of itself: basically the place for people to zone off at 3am getting high/drunk, or for those unable to find anything productive to do at their office job during the daytime, playing NA lobbies. so I'm definitely ruling this out as a great loss present day; UGC would be a bigger loss
let it rest
p.s. tf2stadium is still open for business
ps2 fuck masternoob
Wandumi'd tend to agree, but if you compare the amount of seasons won (when playing) papi is currently only a single season behind kaidus
https://i.imgur.com/e5gTTbO.png
You memoryholed season 9, 33 and cpg18 on behalf of kaidus
I'm gonna memoryhole all the dogshit spring&winter insomnias like i47 and i50
yeah well, reading the comments on the French deals website, it apparently ran out of stock yesterday(?), so eh. it's still on another website for 299 according to pcpartpicker, or 350 on Spanish amazon; every other amazon is 400+
https://fr.pcpartpicker.com/product/WWcG3C/lg-27gp850-b-270-2560x1440-165-hz-monitor-27gp850-b
https://www.fnac.com/Ecran-PC-Gaming-LG-UltraGear-27GP850-B-27-LED-QHD-Noir-mat/a16147192/
says ships from Dec 2nd
most of these were cheaper before black friday (30-100eur), make of that what you will
Iiyama GB2770QSU-B1(2021) - 290eur, G2770QSU-B1(2022) - 300eur, the only difference between the two is the stand and probably a slightly better panel on the newer revision idk
won't blow you out of the water, Okayge
Samsung Odyssey G5 S27AG500NU(2021) - 254eur on amazon.pl, seems like just a lower-binned, slower LG panel, no idea
B+
Acer Nitro VG270UPbmiipx(2018) - ~250eur
pros: slightly cheaper than the rest, went on sale for 200eur twice in the past month and a half
cons: shit panel from 2018, no color accuracy, terrible response times, only 144hz, one of the HDMI ports is 1.4
skip
Gigabyte G27Q(2020) - 290eur, 144hz native, 165 overclock, so expect pisspoor picture quality or overshoot at 165hz
Okayge
AOC Q27G2S(2020) - 290eur, neither fast nor color accurate
ASUS TUF VG27AQ(2019) - 350eur. the starting price for this is questionable: it's just a gaming model, the color gamut is 8bit, which is not ideal considering all of the aforementioned models, and it's only faster than the acer nitro, which is hot garbage
TUF VG27AQ1A(2020) - a newer revision that has 10bit but cuts brightness down from 320 nits on the 2019 rev to 250 on this one, making it dimmer than the acer nitro
neither is worth bothering with
lg is the one that is most worthwhile, then samsung/gigabyte
iiyama, if you get a good deal, is not the worst thing in the world
acer and asus are just not worth considering
I strongly urge you to read and watch some reviews on youtube of these remaining monitors before deciding, these people actually thoroughly test response times, motion clarity and color accuracy with 800eur calibrators
skip the casual reviewers that only show B-roll and quickly demonstrate the monitor like it's an ad read
rtings.com and hardware unboxed on yt is a good starting point, there are a few more credible reviewers
full disclosure, I've only looked at LG for an extended period of time earlier this year and it still seems to be the price to performance king this year
https://www.dealabs.com/bons-plans/ecran-de-pc-27-lg-ultragear-lg-27gp850-b-2442164
https://www.boulanger.com/ref/1163044
review
pros: good response time/motion clarity; 10bit panel (1bil colors vs 16.7mil on 8bit or 6bit+FRC, quick math below); decent feature set, be it connectivity or variable refresh rate;
cons: I think the red subpixels are a tiny bit slow like on most quantum dot panels, but I don't think it's a dealbreaker; it should perform better than your or any other VA panel in gaming regardless; the stand doesn't swivel left/right (horizontally) like on expensive benq's, so check that your keyboard won't get in its way on the desk
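for the curious, the color count math in rough python (just raw per-channel counts; how well a panel actually dithers 6bit+FRC up to "8bit" is a separate question):

```python
# total displayable colors for a given per-channel bit depth (3 subpixels: R, G, B)
def total_colors(bits_per_channel: int) -> int:
    return (2 ** bits_per_channel) ** 3

for bits in (6, 8, 10):
    print(f"{bits}-bit: {total_colors(bits):,} colors")
# 6-bit:        262,144 colors (FRC dithers this up to approximate 8-bit)
# 8-bit:     16,777,216 colors (~16.7 million)
# 10-bit: 1,073,741,824 colors (~1.07 billion)
```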
LeonhardBrolerFeel free to suggest a completely different model if you think it's worth it, notably even IPS ones if you think I would be better off having IPS instead of VA
yes
LeonhardBrolerso unless IPS has some good arguments for a change I would probably go for VA again
super layman's terms, and probably lacking elaboration/the full picture, but essentially VA has good contrast and blacks for a super cheap MSRP (thank you samsung, very cool) at the cost of motion clarity. VA is prone to black smearing and ghosting, and no overdrive setting will make it better, the undershoot is horrible
any questions? ask away, i cba
SetsulYes, the numbers I pulled out of my ass aren't gospel.
Thank you for your blunt honesty
SetsulAnd no, the frames aren't just scaled up. That would be DLSS.
Yes, I'm aware they aren't magically "scaled up" but rendered at a substantially higher resolution, using more bandwidth and CPU time. I was simply making an exaggerated oversimplification.
SetsulYou can disagree with how games work, it's not going to change the facts.
Neither will it change the fact that somehow CPU-heavy games still react to GPUs getting better.
SetsulNice rant about nVidia. What is your point?
That we finally get meaningful uplifts thanks to the generational improvements that can likely improve fps even in the CPU-heavy shit we play, but uh
SetsulCan you shut up about "generational improvements"? No one cares.
SetsulAre you arguing that anyone with a 120 Hz monitor should cap TF2 at 120 fps?
Do I? No. But mentioning the current upper bandwidth cap was a byproduct of me trying to tell you that we have neither the capable screens nor the interfaces to run 4K 480hz. When in fact we don't even want to, at least not yet. AKA "no one cares". Then why did you show 4K results in the first place, when 1080p is all the majority cares about in competitive shooters?
SetsulWhy are you so upset about high resolution benchmarks when you link them yourself because there's nothing else available?
I'm upset because you pull up with 4K results when, and especially in Source,
No one cares.
then pull some numbers out of your ass and claim
Are you going to see a noticeable difference between a 1080 and a 4090? No.
when the card is three times faster
SetsulYes, the difference between 1080p and 4K or low and high settings is more than 5fps on the same card. That is literally the opposite of what we're arguing about. Again, why bother linking all that? What is your point?
Frankly I don't find it to be the complete opposite of what we were arguing about: right off the bat you show me 4K High results, where it's obvious that the card, instead of pushing more frames at 1080p, has to render frames 4 times the size we are interested in. Then I find equally scuffed 4K results, and we see that the performance difference between the top cards at that resolution is tiny at best, if present at all. I try and make a point that
SetsulNo, we saw a 39 fps difference in a completely fucked up environment.
Trying to extrapolate from behaviour on a 7950X with 6000MHZ CL36 RAM to an 12900KS with unknown clockspeed, unknown RAM, on a completely different benchmark and comparing that with yet another 12900K setup is completely meaningless.
The 39fps under equal conditions indicates there is a difference in the first place. The 4090 managed an even greater difference, and at a lower resolution: 1440p. Therefore it is safe to assume that
Now, regarding the mysteriously clocked 12900KS: it was more of a comparison to the
12900k 5.7 3090ti DDR5 7200 30-41-40-28 1080p Low 928.18fps
12900k 5.5 3090 DDR5 6400 30-37-37-26 1080p Low 836.43fps
I was aiming to take the 39fps out of the equation because it was present in the "scuffed benchmark". Very well. A 12900KS at stock clocks (I've found other game benchmarks on the channel), which according to intel ark should boost to 5.5 give or take (which should be in line with the 5.5 12900k with the 3090), with DDR5 JEDEC and a 3060ti, scored 699fps at 1080p Low. So then you are telling me that simply going from DDR5 JEDEC to 6400 gives you 136 fps, and that 7200, +200 on the clock and dropping the efficiency cores nets you a 228fps total uplift. That is an australian christmas fucking miracle, don't you find? I may be delusional, but there is no way going from a 3060ti to a 3090 isn't responsible for at least 35% of that total gain at 1080p low.
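spelling the napkin math out in python so it's at least checkable (fps figures are the ones quoted above; pinning the 3060ti run at 699.76 instead of the rounded 699 is my own doing so the 136.67 gap quoted later lines up exactly):

```python
# 1080p Low, CSGO, numbers as quoted above
fps_3090ti_tuned = 928.18   # 12900k @5.7, DDR5 7200, E-cores off, 3090 Ti
fps_3090_6400    = 836.43   # 12900k @5.5, DDR5 6400, 3090
fps_3060ti_jedec = 699.76   # 12900KS ~stock, DDR5 JEDEC, 3060 Ti ("699fps" above)

# total uplift from the JEDEC/3060ti box to the fully tuned 3090 Ti box
print(round(fps_3090ti_tuned - fps_3060ti_jedec, 2))   # 228.42 fps
# uplift from the JEDEC/3060ti box to the 6400/3090 box
print(round(fps_3090_6400 - fps_3060ti_jedec, 2))      # 136.67 fps
```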
The funniest thing is: the best screen tech we have on hand today is 360hz, which we can drive with a 12/13600k, so not even I fully understand why I am shilling the 30 and 40 series when those cost more than a computer able to drive the game at 360hz. In my defence, however, I was romanticising the possibility of 480hz gaming in the very near future, and it appears to me that if such a monitor were to be released tomorrow, the only way to get the last remaining performance droplets is by getting a two fucking thousand dollar pcb with a side topping of silicon covered in two kilos of aluminium and copper.
SetsulSo you complained about 4K High benchmarks and pulled out 4K and 1440p Very High benchmarks instead. Not impressed, I must say.
What the fuck is a Very High? CSGO doesn't have rendering distance sliders, hairworks, weather, NPC density, ray tracing etc. There is only one possible thing that can be set to "Very High" and it is the model quality, and let me tell you right now, the difference that makes is 100% pure snake oil. All the reviewers and benchmarkers make up their own bougie way of saying Highest/Ultra Settings/Ultra High/Very High, so for all intents and purposes "High" in Source is essentially Ultra Settings.
SetsulHigher settings and higher resolution also mean more CPU load. The difference in GPU load is usually larger, so you're more likely to be GPU bound, but it's not always the case. Source engine is especially bad about this, where way too much is done on the CPU.
I disagree. If in GPU-heavy titles many different CPUs level out their performance at 4K (and half the time at 1440p), while at 1080p they show incremental differences, then why should a CPU-heavy game care whatsoever about your resolution, when no new objects are being drawn and each frame is only being rendered at a higher resolution by the GPU, increasing its load? So from my standpoint, in CPU-heavy titles, differences in performance between GPUs come either from a class step up/down or from architectural differences.
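to put the mental model I'm arguing from into something concrete (all numbers here are made up for illustration, and Setsul's point is precisely that in Source the CPU cost isn't fully resolution-independent):

```python
# simplistic frame cost model: fps is limited by whichever of CPU or GPU is slower per frame
def fps(cpu_ms: float, gpu_ms_per_mpix: float, megapixels: float) -> float:
    gpu_ms = gpu_ms_per_mpix * megapixels   # GPU cost scales with rendered pixels
    return 1000.0 / max(cpu_ms, gpu_ms)     # CPU cost (sim, draw calls) assumed constant

CPU_MS = 2.0  # hypothetical: ~500fps worth of CPU work per frame
for name, per_mpix in (("slow gpu", 1.5), ("fast gpu", 0.4)):
    for res, mpix in (("1080p", 2.07), ("4K", 8.29)):
        print(name, res, round(fps(CPU_MS, per_mpix, mpix)))
# a slow enough card is still the bottleneck even at 1080p; a fast one just hits the CPU wall
```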
SetsulLet's go through the benchmarks in reverse order:
We're not going to play some mystery guessing game, so the LTT benchmarks are completely worthless.
As you can see, at 4K the 4090 driver is fucked, at 4K the 6950XT is the fastest and at 1440p it falls behind even the 3090. So the only apples to apples comparison are the 3090 Ti and 3090 and their difference is ... 39 fps, regardless of resolution and total fps. Logically the prediction would be that at 1080p Very Low we'd see something like 1000 fps vs 1039 fps. Not very impressive.
Setsul>Driver fucked
Pretty obvious, that's why we didn't see any difference whatsoever in either your screenshot or LTT's at 4K.
Setsul>So the only apples to apples comparison are the 3090 Ti and 3090
I'm happy that you accept that comparison graciously and acknowledge the 39fps difference, but it is essentially two of the same card, except one clocks marginally higher and that's why they sell it overpriced with a Ti label slapped onto it. Shouldn't we also expect the same level of performance scaling down to the 3080 and 3080Ti? What about the 2080Ti, which has been found to be slower than the 3070Ti? What about the 1080Ti? Let's revisit your initial claim
E.g. if a 1070 gets 480 fps and a 1080 490 fps and a 2080 495 fps and a 2080 Ti 496 fps then you can damn well be sure that a 4090 isn't going to get 550 fps but rather 499 fps.
But then miraculously the 3090 and 3090ti have 39fps between them within the same generation, so what gives? According to that quote, the gap should have been 5fps, or, as you said, going from a 2080ti to a 4090 would be 496 -> 499 fps? Isn't it reasonable to deduce that the generational leap is much greater than that, especially starting with the 30 series, and now again with the 40 series? Because the 2080Ti was like intel in the past decade, feeding you 10% more at twice the price. The 30 series was a reasonable performance bump at the cost of heavily inflated power draw, but finally the new product actually was meaningfully faster. Now the same thing is happening with the 40 series, but the pricing is outrageous.
SetsulThen you got a bunch of benchmarks showing that lower details/resolution lead to higher fps. No one is surprised by that. See right at the start, GPUs aren't magic devices that pull more pixels out of the aether. Higher details/resolution also mean more CPU load.
Well, yes. That's the point in the first place: all of these show that at 1080p the performance is vastly different from 4K, and that settings also play a role, albeit a smaller one. Your initial post fed me 4K High results bordering on 480fps. Why are you explaining to me that the gains are negligible while showing 4K High results, when the point is that at 1080p, or even at 1440p, the performance difference is pronounced, not in your 5fps increments?
SetsulThe 3080 vs 4090 is pretty scuffed because the average the 4090 shows 450-500fps average on 1080p low and 500-550 average on high, no idea how the current fps are so much higher than the average for the entire run, the 3080 shows 650-ish on low and 670-ish on high. So I'm not sure what're you trying to show.
No idea where you're getting 46% more fps from, but it's most definitely not an identical setup and not even the same benchmark.
I was looking at the current framerate without looking at the average, forgot to mention that, and was fully prepared to discard that benchmark altogether because, again, it's not even uLLeticaL but a guy running around a bot-filled dust2, so I'm gonna concede this.
SetsulWhere could the difference be coming from? I don't know, maybe it's the fact that the 3080 was run with 4800 MHz RAM and the 4090 with 6200 MHz? CPU clock is also different.
In regards to the lower average, I think the NVIDIA driver is at play once again. So allegedly +200mhz on the all-core and faster, tighter RAM, yet the average is lower by 100fps while the current fps is 300 higher? The 1% and .1% lows driver shit is at play again.
Setsul12900K + 3090 vs 12900K + 3090 Ti.
Except one is DDR5 6400 30-37-37-26; 12900K (SP103) / P Cores 5.5Ghz / E Cores 4.3Ghz / Cache 4.5Ghz
the other is DDR5 7200 30-41-40-28 12900K (SP103) / P Cores 5.7Ghz / E Cores off / Cache 5.2Ghz
And there's your answer.
CS:GO getting scheduled on E-Cores sucks and source engine loves fast RAM and Cache. That's all there is to it.
https://www.youtube.com/watch?v=-jkgFwydrPA
Alright, 12900KS, unspecified clock, allegedly turbos to 5.5, unspecified DDR5 RAM, let's pretend that it's JEDEC 4800, and a 3060ti at 1080p Low.
So, according to data I consider previously agreed upon, a 3090 and a 3090ti have about 39fps between them in a sterile environment. Which should mean the difference the faster DDR5 kit, +200 allcore and E-Cores off provide is 52.75fps, with the remaining 39fps taken out of the equation. Then how does a 3060ti with an identically shitty JEDEC kit of RAM score 136.67 fps below a 3090 at the same resolution and graphical settings, with the main or even the only difference being a step down in GPU class?
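same numbers, split the way I'm arguing they should be split (the 39fps GPU step is the 3090 vs 3090Ti gap from the LTT data; crediting the entire remainder to RAM/clocks/E-cores is the claim I'm poking at):

```python
gap_tuned_setups  = 928.18 - 836.43   # 91.75 fps between the two tuned 12900k setups
gpu_step_ti       = 39.0              # 3090 -> 3090 Ti difference in the "sterile" comparison
ram_clock_ecores  = gap_tuned_setups - gpu_step_ti   # 52.75 fps left for DDR5 7200, +200MHz, E-cores off

gap_3090_vs_3060ti = 836.43 - 699.76  # 136.67 fps, where the GPU drops a whole class
print(round(ram_clock_ecores, 2), round(gap_3090_vs_3060ti, 2))
```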
dm me your discord or something, we are clogging the thread with this nerd shit. understandable should you choose not to engage with me, out of time practicality or concern for your own mental well-being
Alright, let's take the console output from the uLLeticaL benchmark test map as the indicator of performance, since that seems to be what every benchmark is basing its Average FPS on. Sorry, couldn't get 100% matching specs.
While this is neither a 12900k nor a 3090/3090ti/4090, this is finally a video that did test 1080p, 1440p and 4K, at High and Low each, all on one given setup.
Finally. While it's outside the aforementioned criteria (uLLeticaL), here is a video of an identical setup where the only difference is the GPU, but the testing methodology is questionable and I understand if you find it inadmissible.
Gamers Nexus didn't benchmark the 4090 in CSGO at all. When they do benchmark CSGO they do it for their CPU reviews, but with a 3090ti at 1080p High, so again not really throwing me a bone here.
Hardware Unboxed skimped out on CSGO in their 4090 review as well. Like you said, they did not want to skew their game averages with CSGO results. They did however benchmark the 4090 in CSGO in their 13900k review, but not on uLLeticaL, instead on a demo from a pro game, which obviously has a lot more variation than a static map benchmark.
LTT, again, did their own fancy benchmark that doesn't use uLLeticaL, nor does it test 1080p at all. But looking at the screenshots of their 3090ti and 4090 results at 4K, they look basically on par with those that you used in #697, don't you think?
But then, as soon as they drop the resolution to 1440p, it behaves predictably, with the 4090 on top, followed by the 3090ti and finally the 3090. Surely the GPU should not have been influencing framerates that much, by your account. Sure, the increase from 3090 to 4090 is just 12.3% and from 3090ti to 4090 only 5.8%, but we are talking a several dozen FPS difference, not "a couple more" FPS as I understood from #697, and that's not even taking Low settings or 1080p into account.
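and to be clear about what those percentages mean in absolute terms (baselines here are hypothetical round numbers, not LTT's actual chart values):

```python
# what a 12.3% (3090 -> 4090) or 5.8% (3090 Ti -> 4090) uplift means in raw fps
# at a few hypothetical 1440p baselines
for baseline_fps in (400, 500, 600):
    print(baseline_fps, round(baseline_fps * 0.123), round(baseline_fps * 0.058))
# even a "mere" 5-12% is already dozens of fps at these framerates
```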