NVIDIA’s GRID Game Streaming Service Rolls Out 1080p60 Support
by Ryan Smith on May 12, 2015 5:50 PM EST
Posted in: GPUs, Shield, NVIDIA, GRID, Cloud Gaming
Word comes from NVIDIA this afternoon that they are rolling out a beta update to their GRID game streaming service. Starting today, the service is adding 1080p60 streaming alongside its existing 720p60 option, with the new resolution initially going out to members of the SHIELD HUB beta group.
Today’s announcement from NVIDIA comes as the company is ramping up for the launch of the SHIELD Android TV and its accompanying commercial GRID service. The new SHIELD console is scheduled to ship this month, while the commercialization of the GRID service is expected to take place in June, with the current free GRID service for existing SHIELD portable/tablet users listed as running through June 30th. Given NVIDIA’s ambitions to begin charging for the service, it was only a matter of time until the company rolled out a higher-resolution tier, especially as the SHIELD Android TV will be hooked up to much larger screens where the limits of 720p are more easily noticed.
In any case, from a technical perspective NVIDIA has long had the tools necessary to support 1080p streaming – NVIDIA’s video cards already support 1080p60 streaming to SHIELD devices via GameStream – so the big news here is that NVIDIA has finally flipped the switch on their servers and clients. Though given that 1080p has 2.25x as many pixels as 720p, I’m curious whether part of this process has involved NVIDIA adding some faster GRID K520 cards (GK104) to their server clusters, as the lower-end GRID K340 cards (GK107) don’t offer quite the throughput or VRAM one traditionally needs for 1080p at 60fps.
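For reference, the pixel math behind that 2.25x figure is straightforward. Here is a quick back-of-the-envelope sketch (plain Python, nothing GRID-specific; the only assumption is the standard 1280x720 and 1920x1080 frame sizes):

```python
# Back-of-the-envelope pixel math for the 720p60 -> 1080p60 jump.
def pixels_per_frame(width: int, height: int) -> int:
    return width * height

p720 = pixels_per_frame(1280, 720)    # 921,600 pixels per frame
p1080 = pixels_per_frame(1920, 1080)  # 2,073,600 pixels per frame

print(p1080 / p720)   # 2.25 -> 1080p carries 2.25x the pixels of 720p
print(p1080 * 60)     # ~124.4 million pixels to capture and encode every second at 60fps
```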
But the truly difficult part of this rollout is on the bandwidth side. With SHIELD 720p streaming already requiring 5-10Mbps of bandwidth and NVIDIA opting for quality over efficiency on the 1080p service, the client bandwidth requirements for the 1080p service are enormous. 1080p GRID will require a 30Mbps connection, with NVIDIA recommending users have a 50Mbps connection to keep other network devices from compromising the game stream. To put this in perspective, no major video streaming service requires anything close to 30Mbps, and in fact Blu-ray itself tops out at 48Mbps for audio + video. NVIDIA in turn needs to run at a fairly high bitrate to make up for the fact that they have to do all of this encoding in real time with low latency (as opposed to highly optimized offline encoding), hence the significant bandwidth requirement. Meanwhile 50Mbps+ service in North America is still fairly rare – these requirements all but limit it to cable and fiber customers – so at least for now only a limited number of people will have the means to take advantage of the higher resolution.
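To put the encoding challenge in rough numbers, here is an illustrative bits-per-pixel comparison. The assumptions are mine, not NVIDIA’s: the full 30Mbps/48Mbps budgets are treated as video-only, and the Blu-ray reference is taken to be a 1080p24 title.

```python
# Rough bits-per-pixel comparison: real-time 1080p60 game stream vs. an offline Blu-ray encode.
# Assumption (not from NVIDIA): the entire bitrate budget goes to video.
def bits_per_pixel(bitrate_bps: float, width: int, height: int, fps: float) -> float:
    return bitrate_bps / (width * height * fps)

grid_1080p60   = bits_per_pixel(30e6, 1920, 1080, 60)  # ~0.24 bits per pixel
bluray_1080p24 = bits_per_pixel(48e6, 1920, 1080, 24)  # ~0.96 bits per pixel

print(f"GRID 1080p60:    {grid_1080p60:.2f} bpp")
print(f"Blu-ray 1080p24: {bluray_1080p24:.2f} bpp")
```

Under those assumptions, even at 30Mbps the real-time encoder has roughly a quarter of the per-pixel budget of a Blu-ray encode, and far less time per frame to spend it, which is why the stream still needs a connection most homes don’t have.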
| NVIDIA GRID System Requirements | 720p60 | 1080p60 |
|---|---|---|
| Minimum Bandwidth | 10Mbps | 30Mbps |
| Recommended Bandwidth | N/A | 50Mbps |
| Device | Any SHIELD, Native or Console Mode | Any SHIELD, Console Mode only (no 1080p60 to the tablet's screen) |
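As a hypothetical illustration of how a client could act on the table above (the 10/30/50Mbps thresholds and the console-mode restriction are NVIDIA’s published figures; the function itself is my own sketch, not actual SHIELD/GRID client logic):

```python
# Hypothetical sketch of a client-side quality check against the requirements table above.
def pick_stream_quality(measured_mbps: float, console_mode: bool) -> str:
    if measured_mbps >= 30 and console_mode:
        note = " (50Mbps recommended)" if measured_mbps < 50 else ""
        return "1080p60" + note
    if measured_mbps >= 10:
        return "720p60"
    return "below minimum - GRID streaming not recommended"

print(pick_stream_quality(55, console_mode=True))   # 1080p60
print(pick_stream_quality(35, console_mode=False))  # 720p60 (1080p60 is console mode only)
print(pick_stream_quality(8, console_mode=True))    # below minimum - GRID streaming not recommended
```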
As for game support, most, but not all, GRID games can be streamed at 1080p at this time. NVIDIA’s announcement says that 35 games support 1080p, out of a library of more than 50 games. Meanwhile I’m curious just what kind of graphics settings NVIDIA is using for some of these games. With NVIDIA’s top GRID card being the equivalent of an underclocked GTX 680, older games shouldn’t be an issue, but more cutting-edge games almost certainly require tradeoffs to maintain framerates near 60fps. So I don’t imagine NVIDIA is able to run every last game with all of its settings turned up to maximum.
Finally, NVIDIA’s press release also notes that the company has brought additional datacenters online, again presumably in anticipation of the commercial service launch. A Southwest US datacenter is now available, and a datacenter in Central Europe is said to be available later this month. This brings NVIDIA’s total datacenter count up to six: USA Northwest, USA Southwest, USA East Coast, Northern Europe, Central Europe, and Asia Pacific.
Source: NVIDIA
61 Comments
chizow - Thursday, May 14, 2015 - link
@yannigr2's usual backpedaling, deflecting, stupidity when called on his BS:
1) Yes it was a bug, but given AMD fanboys' low standards, they would rather have a buggy, faster solution that skipped an entire lighting pass! LOL. BF4 Mantle was another great example of this, I guess it should be fast if it's not rendering everything it should. Remember BF4 Fanboy Fog TM? :D
http://techreport.com/news/14707/ubisoft-comments-...
"In addition to addressing reported glitches, the patch will remove support for DX10.1, since we need to rework its implementation. The performance gains seen by players who are currently playing Assassin’s Creed with a DX10.1 graphics card are in large part due to the fact that our implementation removes a render pass during post-effect which is costly. "
2) Yes, you obviously are, because you have no idea what competition actually means, and when a company competes your fanboy favorite into the ground, suddenly competition is bad and they are competing too hard. Ouch! Stop it! Imma cry. Competition hurts! :'(
3) Mantle was a failure, to any non-Fanboy. Complete disaster for AMD. And for what? You're Greek, you should know damn well what a Pyrrhic Victory is. So AMD fanboys can claim dead Mantle lives on in spiritual successor Vulkan (hey look, another Greek reference!), but who gives a crap when Vulkan will be irrelevant as well and AMD pumped hundreds of millions into a dead-end API. Funds that would have been better spent elsewhere!!!!!
Pay for what? LOL. Like I've paid for super awesome monopoly-approved Intel processors for the last 9 years since AMD got Conroe'd!!! Let go of the fanboy reins and enjoy what the tech world has to offer! Free yourself from the bondage of the dying tech bottom feeders collectively known as AMD fanboys and enjoy!
yannigr2 - Thursday, May 14, 2015 - link
Oh my. Chizow the Nvidia Fanboy has just gone full overclock. So much BS in your comment, so much admiration for Nvidia, so much hate for AMD, so many favorable conclusions, so much one-sided (para)logic. DX 10.1 was a bug. Still hilarious. DX10.1 stopped being a bug after Nvidia supported it, of course.

chizow - Thursday, May 14, 2015 - link
Oh my yannigr2, ignorantly commenting as usual, ignoring relevant links with the entire back story with comments from both the vendors and the game developers. But this is the usual MO for AMD and their fanboys: launch a bunch of promises on a slide deck, let misinformation/FUD grow and bloom, then ignore relevant actual proof to the contrary.

I am sure you will tell me how you are enjoying how free FreeSync is, flashing all your monitors' firmware and enjoying 9-240Hz refresh rates on every cheap monitor on the market that has no additional hardware or cost, huh?
LMAO, idiots.
PS. Nvidia never supported DX10.1; like Mantle, it was another irrelevant early-adoption effort from AMD. After its features rolled into DX11, however, Nvidia did as they always do, they did DX11 done right and of course killed AMD in one of the main DX10.1 features AMD was trumpeting the whole time: tessellation. Then of course, suddenly tessellation isn't important to AMD, Nvidia is competing too hard, devs are using too much tessellation etc. etc. lol
QQ more AMD fanboy.
yannigr2 - Friday, May 15, 2015 - link
Yeah right. Game developers. Ubisoft. LOL LOL LOL and more LOLs. Everyone knows Ubisoft and everyone knows their relationship with Nvidia.

"PS. Nvidia never supported DX10.1"
Low-end Nvidia 200 series (205, 210, 220 and 240) and the OEM 300 series are DX10.1, moron. What? Did the technical department fail again to inform you, the marketing department?
chizow - Friday, May 15, 2015 - link
Except this happened long before GameWorks, and it is a DIRECT quote from the developer with independently verified links showing graphical anomalies, so yes, keep burying your fanboy head in the sand as I am sure you will once again stupidly claim AMD's buggy (literal) solutions are better lol.

PS: Clearly I don't care about low-end tech, so yeah great, low-end junk OEM parts supported DX10.1, but that doesn't change the fact Nvidia did not care to support it and it became irrelevant until it rolled into DX11, at which point suddenly DX10.1 features were bad for AMD fanboys because Nvidia did it better. :)
close - Wednesday, May 13, 2015 - link
An analogy isn't meant to be perfect, just similar enough. But still I don't think you understood the main issue and what people are complaining about. Nobody said anything about running Nvidia software on an AMD GPU. It's about being able to run Nvidia software on an Nvidia GPU while also having an AMD GPU in your system. Disabling Nvidia components that I paid for just because it detects that I also have an AMD card is plain wrong. It's a crappy way of fighting competition, forcing me to remove an AMD card from my system just so I can use something that I have already paid for. And no, you don't see this on the box.

What if Intel decides to disable some feature on their CPUs or the software support as soon as it detects an AMD card? How can that be considered fair practice?
So I'm not hoping to run BMW software on a Merc. I am demanding however to be allowed to run the BMW as I bought it, regardless of the fact that I also want to own another, different brand. I bought a complete set of components + features and I want to use them all for as long as promised.
D. Lister - Wednesday, May 13, 2015 - link
@close

"What if Intel decides to disable some feature on their CPUs or the software support as soon as it detects an AMD card? How can that be considered fair practice?"
An intel CPU with an AMD GPU isn't the same as two different GPUs in the same system, sharing the same OS, with their individual drivers competing for the same resources. Would Intel provide support for a 3rd party dual CPU board, which had one socket for Intel and the other for AMD? Now if Nvidia GPU was not doing CUDA with an AMD CPU, that would be a different matter altogether.
"I am demanding however to be allowed to run the BMW as I bought it regardless of the fact that I also want to own another different brand. I bought a complete set of components + features and I want to use them all for as long as promised."
Let me reiterate: a car is a standalone system, while a GPU is a component of a system; you can't compare apples with oranges.
It seems like you don't want to own two systems (two cars) and run them side by side; what your analogy suggests you really want to do is take a car, mod its engine with unauthorized parts, and then expect the OEM to let their engine timing software run in such a system, and your argument is that since some people managed to do that without the car catching fire, the software shouldn't be locking anyone out. From an individual's point of view, that seems to make sense, since it is your money spent and you should be able to do whatever you please with it. From a business' point of view though, things get a lot more complicated with liability thrown into the mix.
Also, if it was really that risk-free for GPUs from the two companies to work in a CUDA environment, you can bet that AMD would have sued Nvidia's bollocks off for anti-competitive business practices and won. If they still haven't, then it means that Nvidia may just have a legally sound excuse for doing what they're doing.
close - Wednesday, May 13, 2015 - link
@D. Lister: You forget that Intel also has the GPU on die. This is good enough reason to disable stuff on that die when detecting an AMD or Nvidia GPU, or to disable support for QuickSync, or Optimus, or whatever. Because. This should be a good enough analogy even for a completely non-IT person. Like someone who thinks the graphics subSYSTEM (sub because it's part of an even larger system) is not actually a SYSTEM, like a car... you know... ;) They just decided to call it that for fun.

Regardless, I think you're just trolling because it's impossible to "not get it" after so many tries. I paid for a card with certain features. As long as there's no clear statement on the box that those features will be arbitrarily disabled in case any competing product is found in the system, this is just crappy conduct put in place and supported by people with little to no respect for their customers. Just like the 970 4GB RAM issue, which was neither disclosed to buyers nor fully acknowledged by the company.
There is no excuse for crippling a product that I paid for in full just because I decided to also use a competing product, or a product that looks like a competing one.
AMD has no reason to sue Nvidia because they are not entitled to use that proprietary tech. Also, I don't care about AMD; you can replace this with any other driver, as other guys commented (like a USB monitor). The customers have a right to sue because they paid for something and it's been disabled. I paid for a card and it's not behaving as advertised. It's as simple as that. The problem is it's too much of a hassle for a consumer to sue a company like Nvidia for this. I can't use my card's features because it finds traces of other drivers in my system, so it artificially shuts down the whole thing.
And related to cars vs. GPUs, yet again you fail to realize that I AM talking about a system. It's composed of a GPU, graphics RAM, drivers and supporting software. It's called a graphics subsystem for God's sake; isn't that a good enough clue that it's a system? And in that GPU you have multiple blocks doing different things. One talks to the memory. One schedules instructions. Etc. It's a system, but since it's integrated (hence integrated circuits) people unrelated to the domain tend to think it's "just there". They forget about all the parts that make it work. The software just arbitrarily disables a feature present in the hardware after a simple system check: "IF non-Nvidia driver detected THEN disable X".
yannigr2 - Wednesday, May 13, 2015 - link
No no no. Either YOU DON'T GET IT, or you deliberately try to make it look completely different. In any case you are wrong.

Crunchy005 - Wednesday, May 13, 2015 - link
Hmmm, Nvidia hacking other computers on the LAN to see if any non-Nvidia cards are in use and turning off PhysX and CUDA on the Nvidia computer. I can see that happening.

No one here is asking Nvidia tech to run on AMD hardware, just that Nvidia disables their proprietary stuff if it detects anything other than Nvidia in the system. I wonder if this happens if you're running an AMD APU with an Nvidia card where the APU's graphics aren't being used at all but are detected. Bye bye PhysX and CUDA, because that makes sense.