dashner: If anyone is curious about how we got to this stage
https://thundergaming.com/news/thunder-gaming-partners-shadow-blade-debut-cloud-computing-esports-test-site/
TL;DR: there are no PCs because they think cloud PCs in a data center, with their added input lag and other severe problems, can replace physical PCs on-site at a LAN.
Jesus, what a great way to kill an esports center for actual tournaments. Competitive FPS? The added input lag (however small) will be noticeable and nobody will want to use it. Speedrunning? For games that rely on frame-perfect glitches/maneuvers it's going to be a nightmare for anyone who didn't practice on an identical setup. Fighting games? Again, people who practice frame-perfect maneuvers are fucked on this.
Now, this would be good for an MMO convention or something, though.
Beyond all of that, from a pure technical standpoint this is mind-blowingly stupid.
Now I'm going to use the number Dashner posted earlier: 85 PCs, which means up to 85 users all trying to cloud-stream games at once. Since this is a tournament venue I'd assume all the monitors are 144Hz+, and most users play at 1920x1080.
In order to get high-quality video at 1080p 144fps you'll need at minimum 30 Mbps* per user just to stream the game capture to them. For 85 users all streaming simultaneously that's ~2.6 Gbps, and depending on the implementation they might need that even when people are just sitting on the desktop browsing the web. Note this doesn't include any other traffic those 85 users generate beyond interfacing with the cloud system.
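If you want to sanity-check that figure yourself, the math is trivial (30 Mbps per user is my estimate, not anything Shadow has published):

```python
# Back-of-envelope aggregate bandwidth for cloud-streaming a LAN.
# Assumptions (mine): 85 users, ~30 Mbps per 1080p 144fps stream.
USERS = 85
PER_USER_MBPS = 30  # conservative estimate for 1080p @ 144fps

aggregate_mbps = USERS * PER_USER_MBPS
print(f"Aggregate stream traffic: {aggregate_mbps} Mbps "
      f"(~{aggregate_mbps / 1000:.2f} Gbps)")
# -> Aggregate stream traffic: 2550 Mbps (~2.55 Gbps)
```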
Now I'm going to assume the cloud server is on-site, both for minimal input lag and for lower network cost to whoever their ISP is, since they would need a minimum of 5 Gbps for just 85 users (imagine 200 or more, rofl). Beyond the raw bandwidth of the connection, you also have to account for those 85 users pulling ~17.7 Mb every ~7 ms; the kind of equipment that can handle the packets per second generated by that traffic is NOT cheap, and it can easily overwhelm more affordable switches that would be "close" to being able to support it. Basically this is bordering on being a shit show if that kind of traffic happens.
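Here's where the burst number comes from (the packet-rate figure assumes a standard 1500-byte Ethernet MTU, which is my assumption; nothing about their setup is published):

```python
# Rough per-frame burst size and packet rate at the switch,
# assuming all 85 streams tick together at 144fps and a
# standard 1500-byte Ethernet MTU (my assumption).
USERS = 85
PER_USER_MBPS = 30
FPS = 144
MTU_BITS = 1500 * 8  # bits per full-size packet

frame_interval_ms = 1000 / FPS              # ~6.94 ms between frames
burst_mb = USERS * PER_USER_MBPS / FPS      # Mb moved per frame tick
packets_per_sec = USERS * PER_USER_MBPS * 1e6 / MTU_BITS

print(f"Every ~{frame_interval_ms:.1f} ms the network moves ~{burst_mb:.1f} Mb")
print(f"Sustained packet rate: ~{packets_per_sec / 1000:.0f}k packets/sec")
# -> Every ~6.9 ms the network moves ~17.7 Mb
# -> Sustained packet rate: ~212k packets/sec
```

That sustained packet rate is on top of whatever game, voice, and web traffic the venue already pushes, which is why cheap switches won't cut it.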
*30 Mbps is a highly conservative number here; the image quality would be noticeably worse during any kind of action than if it had been rendered on the local machine, and overall color quality would be degraded, to say the least. For reference, single-link DVI, which has a max data rate of 3.96 Gbps (approx. 130x 30 Mbps), can't even support 1080p 144Hz.
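To put a number on how aggressive that encoding has to be (assuming 24-bit color and ignoring blanking intervals):

```python
# Uncompressed bitrate of 1080p 144Hz vs. a 30 Mbps stream,
# assuming 24-bit color, no blanking overhead.
width, height, bpp, fps = 1920, 1080, 24, 144

uncompressed_gbps = width * height * bpp * fps / 1e9
compression_ratio = uncompressed_gbps * 1000 / 30  # vs. 30 Mbps

print(f"Uncompressed: ~{uncompressed_gbps:.2f} Gbps "
      f"(single-link DVI tops out at 3.96 Gbps)")
print(f"A 30 Mbps stream = ~{compression_ratio:.0f}:1 compression")
# -> Uncompressed: ~7.17 Gbps (single-link DVI tops out at 3.96 Gbps)
# -> A 30 Mbps stream = ~239:1 compression
```

A ~239:1 compression ratio in real time is exactly where the blocking and smearing in high-motion scenes comes from.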
In summary, there will be MASSIVE compromises.
Possible compromises:
- Resolutions will be much lower than what some people use.
- Color quality will be degraded.
- High motion scenes will be blurry/smeared.
- Locked framerates and/or sub-100fps gameplay.
- Input lag would be horrible.
- Horrible stutters from network congestion or oversaturation of server hardware.
- Most likely, most/all of the above.
Thunder Gaming: Thunder Gaming is at the forefront of gaming technology and will leverage the performance of Shadow Computing, which is fully compatible with Fiber, DSL, 4G, Ethernet and Wifi, starting from 15 Mbps.
To back up my numbers above, where 1080p 144fps would require ~30 Mbps (not including audio): my guess is that their 15 Mbps figure is based on 1080p 60fps, which would require ~12.8 Mbps with audio.
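That 12.8 Mbps comes from scaling my 30 Mbps estimate linearly with framerate and adding a bit for audio (both the linear scaling and the audio bitrate are assumptions on my part):

```python
# Scale the assumed 144fps video bitrate down to 60fps, add audio.
BITRATE_144FPS_MBPS = 30   # my earlier estimate, video only
AUDIO_MBPS = 0.3           # roughly a 256-320 kbps stereo stream (assumed)

video_60fps = BITRATE_144FPS_MBPS * 60 / 144
total = video_60fps + AUDIO_MBPS
print(f"1080p 60fps: ~{video_60fps:.1f} Mbps video, ~{total:.1f} Mbps with audio")
# -> 1080p 60fps: ~12.5 Mbps video, ~12.8 Mbps with audio
```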
Oh, and AFAIK the amount of lag added by cloud gaming is, at absolute minimum, frametime + network latency, so 144fps with a cloud server on LAN would be ~7.0-7.5 ms of added input lag in the absolute best-case scenario.
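The floor is just one frametime to produce/encode the frame plus a LAN round trip (the sub-millisecond LAN latency range here is my assumption):

```python
# Lower bound on added input lag for cloud gaming: one frametime
# plus network latency. Assumes the server is on the same LAN
# with ~0.1-0.5 ms round trips (my assumption).
FPS = 144
frametime_ms = 1000 / FPS          # ~6.94 ms per frame at 144fps
lan_latency_ms = (0.1, 0.5)        # optimistic LAN round-trip range

best, worst = (frametime_ms + l for l in lan_latency_ms)
print(f"Added input lag floor: ~{best:.1f}-{worst:.1f} ms")
# -> Added input lag floor: ~7.0-7.4 ms
```

And that's the theoretical best case; capture, encode, decode, and display pipelines in real systems add more on top.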
-edit-
So I found out the kind of hardware Shadow used/uses:
a 2.1GHz Intel Xeon E5-2620 v4 (Broadwell), 12GB of memory, and an Nvidia Quadro P5000
https://www.pcworld.com/article/3256318/gaming/hands-on-blades-shadow-cloud-gaming-service.html
Even with dxlevel 81, an fps config, performance settings in Windows, no hats, and no explosions, I'm pretty sure at 2.1GHz people would be dipping below 144fps in team fights.