NVIDIA’s GeForce 700M Family: Full Details and Specs
by Jarred Walton on April 1, 2013 9:00 AM EST

Introducing the NVIDIA GeForce 700M Family
With spring now well under way and Intel’s Haswell chips pending, OEMs like to have “new” parts across the board, and so once more we’re getting a new series of chips from NVIDIA: the 700M parts. We’ve already seen a few laptops shipping with the 710M and GT 730M; today NVIDIA is filling out the rest of the 700M family. Last year saw NVIDIA’s very successful launch of mobile Kepler, and since then the number of laptops shipping with NVIDIA dGPUs compared to AMD dGPUs appears to have shifted even further in NVIDIA’s favor.
Not surprisingly, with TSMC still on 28nm, NVIDIA isn’t launching a new architecture; instead they’ll be tweaking Kepler to keep it going through 2013. Today’s launch of the various 700M GPUs is thus very similar to what we saw with the 500M launch: in general, everything gets a bit faster than the previous generation. To that end, NVIDIA is taking the existing Kepler architecture and making a few moderate tweaks, improving their drivers (improvements that will also apply to existing GPUs), and as usual they’re continuing to proclaim the advantages of Optimus Technology.
Starting on the software side of things, we don’t really have anything new to add on the Optimus front, other than to say that in our experience it continues to work well on Windows platforms (Linux users may feel otherwise, naturally). On the bright side, projects like Bumblebee appear to be helping the situation, so it’s now at least possible to utilize both the dGPU and iGPU under Linux. As far as OEMs go, Optimus has matured to the point where I can’t immediately come up with any new laptop that has an NVIDIA GPU and doesn’t support it; for all practical purposes, an NVIDIA-equipped laptop now implies Optimus support.
The second software aspect is NVIDIA’s GeForce Experience software, which allows for automatic game configuration based on your hardware. You can see the full slide deck in the gallery at the end with a few additional details, but in short GeForce Experience is a new software tool designed to automatically adjust supported games to the “best quality for your hardware” settings. This may not seem like a big deal to enthusiasts, but for the average Joe who doesn’t know what all the technical names mean (e.g. antialiasing, anisotropic filtering, specular highlighting, etc.) it’s a step towards making PCs more gamer friendly: more like a console experience, only with faster hardware. ;-) GeForce Experience is already in open beta, with over 1.5 million downloads and counting, so it’s definitely something people are using.
Finally, NVIDIA has added GPU Boost 2.0 to the 700M family. This is basically the same as what’s found in GeForce Titan, though with some tuning specific to mobile platforms as opposed to desktops. We’re told GPU Boost 2.0 is the same core hardware as GPU Boost 1.0, with software refinements allowing for more fine-grained control of the clocks. Ryan has already covered GPU Boost 2.0 extensively, so we won’t spend much more time on it other than to say that over a range of titles, NVIDIA is getting a 10-15% performance improvement relative to GPU Boost 1.0.
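For those who want to see what boost is actually doing on their own hardware, the clock behavior NVIDIA describes can be watched from software. Below is a minimal sketch (not part of NVIDIA’s announcement or tools) that polls the graphics clock and GPU temperature through NVML; it assumes an NVIDIA GPU with a recent driver and the nvidia-ml-py (pynvml) Python package installed.

```python
# Minimal sketch: poll the graphics clock and GPU temperature via NVML to
# watch boost behavior on a live system. Assumes an NVIDIA GPU, a recent
# driver, and the nvidia-ml-py ("pynvml") package -- not NVIDIA's own tooling.
import time

import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first NVIDIA GPU in the system

try:
    for _ in range(10):
        # Current graphics clock vs. the maximum clock the part reports.
        current = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)
        maximum = pynvml.nvmlDeviceGetMaxClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)
        temp = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)
        print(f"graphics clock: {current}/{maximum} MHz, temperature: {temp} C")
        time.sleep(1.0)
finally:
    pynvml.nvmlShutdown()
```

Run a game or benchmark alongside the loop and the reported clock should climb toward the maximum and then settle wherever the temperature and power limits allow.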
Moving to the hardware elements, the hardware changes only apply to one of the chips. GK104 will continue as the highest performing option in the GTX 675MX and GTX 680M (as well as the GTX 680MX in the 27-inch iMac), and GK106 will likewise continue in the GTX 670MX (though it appears some 670MX chips also use GK104). In fact, for now NVIDIA isn’t announcing any new high-end mobile GPUs, so the GTX 600M parts will continue to fill that niche. The changes come for everything in the GT family, with some of the chips apparently continuing to use GK107 while a couple of options will utilize a new GK208 part.
While NVIDIA won’t confirm which parts use GK208, the latest drivers do refer to that part number so we know it exists. GK208 looks to be largely the same as GK107, and we’re not sure if there are any real differences other than the fact that GK208 will be available as a 64-bit part. Given the similarity in appearance, it may serve as a 128-bit part as well. Basically, GK107 was never available in a 64-bit configuration, and GK208 remedies that (which actually makes it a lower end chip relative to GK107).
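To put the 64-bit versus 128-bit distinction in perspective, here’s a quick back-of-the-envelope sketch of peak memory bandwidth as a function of bus width. The memory speeds used are illustrative placeholders rather than confirmed GK208 specifications; the takeaway is simply that halving the bus width halves peak bandwidth at a given memory speed.

```python
# Back-of-the-envelope peak memory bandwidth for a 64-bit vs. 128-bit bus.
# The memory speeds below are illustrative placeholders, NOT GK208 specs.

def peak_bandwidth_gbps(bus_width_bits: int, effective_rate_gtps: float) -> float:
    """Bus width in bytes times effective transfer rate (GT/s) gives GB/s."""
    return (bus_width_bits / 8) * effective_rate_gtps

for bus_width in (64, 128):
    for rate in (1.8, 4.0):  # e.g. DDR3-class vs. GDDR5-class rates (hypothetical)
        print(f"{bus_width}-bit @ {rate} GT/s -> "
              f"{peak_bandwidth_gbps(bus_width, rate):.1f} GB/s")
```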
91 Comments
crypticsaga - Monday, April 1, 2013
What possible reason would Intel have for pushing a product like that? In fact, if some sources are correct, they're trying to do the exact opposite by bottlenecking even internal dGPUs, limiting the available PCIe lanes in Broadwell.
shompa - Monday, April 1, 2013
That problem has been solved for over a year with Thunderbolt. Use a Thunderbolt PCIe enclosure with a graphics card.
fokka - Monday, April 1, 2013
afaik this is not entirely correct, since even thunderbolt is too slow to properly utilize a modern graphics card. this is not surprising, since thunderbolt is based on 4x pci-e 2.0 (2GB/s) and current desktop class graphics are using 16x pci-e 3.0 (~16GB/s), which is about eight times as fast.
so i wouldn't say the problem is completely solved throughput-wise, but thunderbolt sure was an important step in the right direction.
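For reference, here is the rough math behind that "about eight times" figure, as a short sketch using the published per-lane PCIe signaling rates and accounting only for line-encoding overhead (one direction, no other protocol overhead).

```python
# Rough, one-direction bandwidth comparison behind the "about eight times" figure,
# accounting only for PCIe line encoding overhead (8b/10b vs. 128b/130b).

pcie2_lane_gbytes = 5.0 * (8 / 10) / 8     # PCIe 2.0: 5 GT/s per lane -> 0.5 GB/s
pcie3_lane_gbytes = 8.0 * (128 / 130) / 8  # PCIe 3.0: 8 GT/s per lane -> ~0.985 GB/s

thunderbolt = 4 * pcie2_lane_gbytes   # original Thunderbolt carries a PCIe 2.0 x4 link
desktop_x16 = 16 * pcie3_lane_gbytes  # desktop graphics card in a PCIe 3.0 x16 slot

print(f"Thunderbolt (PCIe 2.0 x4): {thunderbolt:.2f} GB/s")
print(f"Desktop (PCIe 3.0 x16):    {desktop_x16:.2f} GB/s")
print(f"Ratio: ~{desktop_x16 / thunderbolt:.1f}x")
```

This works out to roughly 2 GB/s versus about 15.75 GB/s, or a factor of about 7.9x.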
MojaMonkey - Monday, April 1, 2013
No, shompa is correct, it has been solved with Thunderbolt and I'm personally using a GTX 680 connected externally. Works great.
Wolfpup - Monday, April 1, 2013
Ugh. You're either ignorant or reaaaaally generous with the hyperbole. "20+ lbs notebooks"? Really?
In real life, mid-range notebooks/GPUs do fine for gaming, and high end notebooks/GPUs do...REALLY fine. When you can max out today's games at 1080p, that isn't "performing poorly", and is orders of magnitude better than Intel's video.
If YOU guys don't want high end notebooks, fine, but I don't see how they're hurting you.
lmcd - Tuesday, April 2, 2013
My cheap A8m (Trinity) can play Rage at high texture res at panel res (1366x768), just for starters. And that's $400 level I think right now.
superjim - Wednesday, April 10, 2013
I can confirm this. An A8-4500M does really well for $400 or below on 1366x768. Now if the A10 would come down to $400 we'll really be in good shape.
xenol - Monday, April 1, 2013
I had a laptop with discrete graphics that lasted for over 9 hours on battery while surfing the web. It was a laptop with an early form of Optimus (you had to manually switch), but still, you can have graphical performance xor battery life if you don't need the performance. But asking for both? Now that's silly.
As for your issue with marketing the 680M as it is when it can't outperform a midrange desktop card... You do realize that this is a different market segment? Also, you should tell "shame on you" to all the display companies who mislead customers into thinking they're buying a panel that can do 16 million colors (which, last I checked, 18 bits is not) or that has a 1,000,000:1 contrast ratio (which you need to be in a pitch black room being shown a black/white checkerboard pattern to see).
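The arithmetic behind that color-depth point is easy to check; here's a quick sketch comparing a true 18-bit panel (6 bits per channel) to 24-bit color (8 bits per channel). The ~16.2 million color figures quoted for 18-bit panels typically rely on dithering rather than native depth.

```python
# Quick check of the color-depth claim: 6 bits per channel (an 18-bit panel)
# versus 8 bits per channel (24-bit color).
for bits_per_channel in (6, 8):
    total_bits = 3 * bits_per_channel
    print(f"{bits_per_channel} bits/channel ({total_bits}-bit): "
          f"{2 ** total_bits:,} colors")
```

That gives 262,144 colors for a native 18-bit panel versus 16,777,216 for true 24-bit color.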
Wolfpup - Monday, April 1, 2013
"Modest performance increase"? I wouldn't call my GTX 680m a "modest performance increase" over Intel video lolAre you KIDDING?!? Notebook hardware is ALWAYS worse than desktop. This applies obviously to CPUs too, which you're inexplicably not complaining about. You always pay more to get the same performance. That doesn't mean it's "dishonest" or the like.
And quite obviously integrated video can never catch up with a discrete part so long as they make high end discrete parts, so the time is "never", not "near".
****
Regarding the article...Optimus...eh, Nvidia's driver team is impressive as always, but literally the first program I ran that I wanted to run on the GPU wouldn't run on the GPU...thankfully my notebook lets you turn off Optimus.
JarredWalton - Monday, April 1, 2013
Which program did you run that you wanted on the GPU? Please be very specific.