codedivine - Friday, January 6, 2012 - link
Can you include the Chromium compilation test that you do for CPU tests? And I would very much like to see Blender too if possible.
JarredWalton - Friday, January 6, 2012 - link
I need to see if I can get Anand (or someone else) to walk me through the entire Chromium compilation process. I tried to get that working at one point and after wasting several days with nothing to show for it I gave up. LOL. (Yes, I followed a guide, but even then it didn't work for me.) If someone has a step-by-step guide that: A) they know works, B) uses free (open source) tools throughout, and C) works with Windows 7 64-bit... post a link. Remember, not only do I need to figure out how to run the compile test, but so do Dustin and anyone else we have doing laptop reviews. :-)
Conficio - Friday, January 6, 2012 - link
I'd second that. A compilation test would be great.
I'd even love to see some sort of Java test -- maybe an Eclipse test: import a large project from a zip and wait until it has finished updating the workspace.
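(For what it's worth, a compile test like this is usually wrapped in a small timing harness; below is a minimal sketch. The fetch/gclient/gn/autoninja sequence follows the modern depot_tools workflow, which is an assumption here -- it postdates the Windows 7 era of this thread -- so treat the exact commands as illustrative rather than a verified guide.)

```python
# Minimal sketch of a timed Chromium compile test. Assumes depot_tools is
# installed and on PATH; the command sequence is the modern fetch/gn/autoninja
# workflow, shown for illustration only. On Windows these tools are .bat
# wrappers, so shell=True (or the full script names) may be required.
import subprocess
import time

def run(cmd, cwd=None):
    """Echo and run a command, aborting the whole test on any failure."""
    print(">>", " ".join(cmd))
    subprocess.run(cmd, cwd=cwd, check=True)

# One-time setup: check out sources and generate build files.
run(["fetch", "--nohooks", "chromium"])       # creates ./src
run(["gclient", "sync"], cwd="src")           # dependencies and hooks
run(["gn", "gen", "out/Release"], cwd="src")  # ninja build files

# The timed portion: a clean full build of the 'chrome' target.
start = time.time()
run(["autoninja", "-C", "out/Release", "chrome"], cwd="src")
print("Compile time: %.1f seconds" % (time.time() - start))
```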
coolhardware - Friday, January 6, 2012 - link
I realize it is not as demanding as many of the other games, but so many people play it (and will play it for years to come) that it sure seems like it deserves a spot.
:-(
Still, thanks for keeping your benchmark suites fresh and relevant!
PS if anybody has tips/suggestions on the best <a href="http://www.jdhodges.com/2011/06/best-starcraft-2-g... for SC2</a> please let me know, thanks! [not talking about bells and whistles...but rather comfort, performance and reliability]
coolhardware - Friday, January 6, 2012 - link
Sorry about that malformed URL. Here's the correct one: http://www.jdhodges.com/2011/06/best-starcraft-2-g...
PS any other games people are missing, or games that you are excited to see WERE added?
JarredWalton - Friday, January 6, 2012 - link
Ultimately, Dustin and I decided to drop SC2 from the list because it:
1) doesn't scale at all with more than 2 CPU cores
2) is rather a pain to benchmark
3) is DX9 only
4) runs more than fast enough at all but the most demanding settings
Basically, it's more of a CPU benchmark in most cases, as even with lots of stuff happening on screen the two-core nature of the engine doesn't allow it to scale that well. Sure, at Ultra quality with 4xAA forced on in your drivers (another pain to deal with), it drops FPS down to around 23 on the G74SX, but that still doesn't address the other items I'm concerned with. I figure if Civ5 and TWS2 run on a laptop, SC2 will run fine as well.
coolhardware - Friday, January 6, 2012 - link
Cool, thanks for listing those factors.
It makes a lot more sense to me now to drop it, since it sounds like a real PITA to benchmark :-) Also, you raise a good point that if higher end games play okay on a particular laptop then SC2 should be good to go...
Appreciate all the hard work you guys do!
zebrax2 - Friday, January 6, 2012 - link
Maybe you could include at least Rage?
JarredWalton - Friday, January 6, 2012 - link
We had intended to include Rage, right up to the point where it was released and we discovered that it is largely useless as a benchmark:
http://www.anandtech.com/show/4970/rage-against-th...
ltcommanderdata - Saturday, January 7, 2012 - link
Could you consider including Brink? Admittedly it's id Tech 4, but heavily modified to make use of OpenGL 3.1. It also uses virtual texturing like RAGE. The upcoming Prey 2 also uses id Tech 4, so performance in Brink does have forward-looking relevance.
Ryan Smith - Saturday, January 7, 2012 - link
Brink's really not any better. It's a multiplayer game that no one plays (and I say that as an owner), and because it's an MP game there's no practical way to structure a repeatable benchmark.
We would have definitely liked to include an OpenGL game, but even I have to admit that OpenGL just isn't very important right now. Maybe Doom 4 or Prey 2 will change that, but with id not licensing Tech 5, OpenGL is quickly becoming an API that's only applicable to id games.
jjj - Friday, January 6, 2012 - link
"as where many wouldn’t notice the difference between a web page loading in two seconds and a web page loading in one second"
That's not all that true; there are some actual numbers from Google and Amazon about how page loading time relates to sales/searches:
- Amazon: every 100 ms increase in load time of Amazon.com decreased sales by 1%
- Google: a change from a 10-result page loading in 0.4 seconds to a 30-result page loading in 0.9 seconds decreased traffic and ad revenues by 20%.
In the end it might be a more useful test than some synthetic tests already on the list.
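(For reference, a repeatable version of such a test can be scripted; here is a minimal sketch using Selenium WebDriver plus the browser's Navigation Timing API. The tool choice, sites, and run count are assumptions for illustration, not AnandTech's methodology.)

```python
# Minimal sketch of a repeatable page-load timing test.
# Assumes Selenium and a matching ChromeDriver are installed; the sites
# and run count below are placeholders, not an actual test suite.
from selenium import webdriver

SITES = ["http://www.anandtech.com/", "http://www.amazon.com/"]
RUNS = 5

driver = webdriver.Chrome()
for url in SITES:
    timings = []
    for _ in range(RUNS):
        driver.get(url)
        # Navigation Timing API: navigation start to end of the load event.
        ms = driver.execute_script(
            "return performance.timing.loadEventEnd -"
            " performance.timing.navigationStart;")
        timings.append(ms)
    print("%s: median %.0f ms over %d runs" % (url, sorted(timings)[RUNS // 2], RUNS))
driver.quit()
```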
Intel low-power CPUs choke when both the GPU and CPU are active, and maybe the 17W Trinity SKUs will too. Maybe there should be a test that reflects that, besides the gaming ones, since it's a rather important piece of info.
Finally, the most important test that is missing is WiFi performance; it is a mobile device after all.
Conficio - Friday, January 6, 2012 - link
It's so funny that Google knows about the numbers and even provides tools to measure load times and make suggestions on how to improve. However, when I run the performance analysis against one of my websites, some of the suggestions for improvement are about the Google Analytics and Google AdSense scripts (like not allowing caching of the script, or only allowing caching for less than a week, or scripts loaded from redirections, etc.).
I also see many web pages waiting on servers that load Google AdSense or Analytics.
Is it just me, or should Google start to eat its own dog food?
JarredWalton - Friday, January 6, 2012 - link
What would you like us to test with WiFi? Maximum throughput? Part of that will be determined by the chipset, a larger part more than likely will come from the choice of router, and the rest is noise and interference between runs. I do make a point of listing the WiFi chipset, which tells you about 90% of what you need to know.
(Hint: 1x1:1 MIMO solutions are the bottom of the barrel and come in most laptops; the 2x2:2 solutions are okay, and if you have a 5GHz router that can really help remove interference. I've only tested about three laptops with 3x3:3 MIMO over the years, and sadly my current router can't even support the maximum throughput.)
jjj - Saturday, January 7, 2012 - link
You guys are already testing WiFi for phones and tablets; it's easy to just apply the same methodology. I do have a hard time understanding why you do it there and not here, and this internet thing is something that tends to be used a lot.
I guess testing notebooks started by doing the same thing as on desktops, and it didn't seem obvious that this should be tested. Listing the part used helps only folks that know what it means, and you get what you expect only if the OEM does things right. I don't have a laptop example, but look at the Asus Transformer Prime and its WiFi and GPS problems; can you be sure that there aren't a bunch of laptops that offer a lot less than they should? Poor WiFi performance could also be a deal breaker for many, if they knew about it, and maybe, just maybe, some manufacturers would pay more attention to it.
JarredWalton - Saturday, January 7, 2012 - link
For smartphones, you're looking at 1x1:1 2.4GHz solutions with a very small device that can easily run into connectivity problems if, for instance, the casing is all aluminum. For laptops, it's usually not much of a concern (e.g. they're plenty large to route wires in a sensible fashion for the antennae). WiFi speeds usually aren't a concern unless you're transferring large files. If you're doing that, then you're going to typically be at the limits of the chipset, not the laptop, and you'd be far better off with Ethernet regardless.
Anyway, there are other problems with trying to test laptop wireless speeds. One is that we have at least two different test locations, so that necessitates getting identical hardware on both sides -- router for sure, and testing location won't be identical. Even with the right hardware, outside interference (from neighboring networks) is a potential problem.
The better solution IMO is a separate review of wireless chipsets. I tried to do that with the Killer Wireless-N review a while back, and Brian is working on a more complete roundup of wireless chipsets. Outside of that review, I'll see what Anand thinks about getting us equipped with the necessary hardware to test wireless.
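(For readers who want a rough home version of such a test, a throughput check can be scripted around iperf3, as in the sketch below. iperf3, the server address, and the run count are assumptions for illustration -- it requires a second, wired machine near the router running "iperf3 -s" -- not AnandTech's setup.)

```python
# Minimal sketch of a WiFi throughput check built on iperf3.
# Assumes iperf3 is installed on the laptop and that a server is already
# running ("iperf3 -s") on a wired host near the router.
import json
import subprocess

SERVER = "192.168.1.10"   # placeholder address of the wired iperf3 server
RUNS = 3                  # repeat to average out interference between runs

results = []
for _ in range(RUNS):
    out = subprocess.check_output(
        ["iperf3", "-c", SERVER, "-t", "10", "-J"])  # 10 s TCP run, JSON output
    data = json.loads(out)
    bps = data["end"]["sum_received"]["bits_per_second"]
    results.append(bps / 1e6)

print("Throughput (Mbps) per run:", ["%.1f" % r for r in results])
print("Best of %d: %.1f Mbps" % (RUNS, max(results)))
```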
jjj - Saturday, January 7, 2012 - link
Maybe this helps: http://www.smallnetbuilder.com/wireless/wireless-f...
Anyway, design matters here too, and of course it impacts range as well, not only throughput. As for using Ethernet, you don't always have it; maybe you use your laptop mostly as a desktop replacement, but in the end that's not its main purpose for many.
The differences are there, and I don't know how many folks wouldn't care at least about range.
jalexoid - Friday, January 6, 2012 - link
Please, please, please add OpenGL benchmarks for professional users. OpenCL would be quite good also, specifically data transfer between main memory and graphics memory.
JarredWalton - Saturday, January 7, 2012 - link
We generally run SpecViewPerf on workstation GPUs (Quadro FX), but is there really a desire to see those tests run on consumer graphics as well? Even a basic Quadro FX will generally outperform the fastest consumer cards in professional OpenGL testing, simply because of the drivers/firmware.
For OpenCL, Ryan tests that on GPUs to some extent, but I'm not sure how many people are seriously looking at OpenCL performance on laptops. Would you suggest using the same tests as Ryan is using, or do you have some specific OpenCL benchmark you'd like us to run?
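(The main-memory-to-graphics-memory transfer test the original request mentions could look roughly like the sketch below: a minimal PyOpenCL micro-benchmark. PyOpenCL and the buffer size are assumptions for illustration, not a proposed official test.)

```python
# Minimal sketch of a host-to-device transfer bandwidth micro-benchmark.
# Assumes PyOpenCL and NumPy are installed and an OpenCL platform exists.
import time

import numpy as np
import pyopencl as cl

SIZE_MB = 256
data = np.random.rand(SIZE_MB * 1024 * 1024 // 8)  # float64 buffer of SIZE_MB

ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)
buf = cl.Buffer(ctx, cl.mem_flags.READ_ONLY, data.nbytes)

# Warm-up copy so driver/setup costs don't pollute the measurement.
cl.enqueue_copy(queue, buf, data)
queue.finish()

start = time.time()
cl.enqueue_copy(queue, buf, data)   # host -> device
queue.finish()                      # wait for the transfer to complete
elapsed = time.time() - start

print("Host->device: %.2f GB/s" % (data.nbytes / elapsed / 1e9))
```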
ltcommanderdata - Saturday, January 7, 2012 - link
PowerDirector 10 uses OpenCL to render its video effects. Video editing seems like a good use case that most users can relate to, rather than, say, fluid simulation or Monte Carlo calculations.
PowerDirector 10 also supports AMD APP Acceleration (which is OpenCL, I suppose), NVIDIA CUDA, and Intel QuickSync for final encoding, so it could be useful to compare each platform's ideal accelerated encoding method.
The upcoming WinZip 16.5 is supposed to be OpenCL accelerated for compression, decompression, and encryption, making it another benchmark with a use case that is applicable to most users.
JarredWalton - Saturday, January 7, 2012 - link
I don't know anyone that uses PowerDirector 10, so I'd be curious about how it's viewed (note: I'm not a video editor by any stretch). WinZip, on the other hand, is a far more interesting option; I'll keep an eye out for that one. :-)
Ryan Smith - Saturday, January 7, 2012 - link
I'd note that we dropped our video encoding benchmark from GPU Bench midway through last year, because GPU-accelerated video encoding was actually CPU limited. Performance quickly plateaued at GTX 460/Radeon 5750 levels, as at that point the GPUs outran the CPU.
Would it be possible to add the screen size to the specs listed for each system in Bench? It's kinda silly for it to be missing since that's one of the primary criteria people use to narrow down models.
ArKritz - Friday, January 6, 2012 - link
Wouldn't it make more sense to just use medium presets for the medium benchmark, high for high, ultra (or very high) for ultra, and just drop the "low" benchmarks altogether?
JarredWalton - Saturday, January 7, 2012 - link
Basically, we've done what you suggested, only we call the settings Low, Medium, and High rather than Medium, High, and Ultra. It just seems weird to call test settings Medium/High/Ultra--or take it to another level and test at High/Very High/Extreme--when we can call the settings Low/Med/High. It's just semantics. Anyway, the settings were selected for two things:
1) Get reasonable quality for the target res/setting (Min/Low is often insufficient)
2) Make sure there's a difference between the detail settings (this is why we don't test DiRT 3 at Ultra and Ultra + 4xAA for instance).
In several games, the difference between many of the settings is negligible, both in terms of quality and in terms of performance. We don't feel there's much point in testing at settings where games run at 100+ FPS unless there's no other option (e.g. Portal 2), and likewise we didn't want to have results where it was basically same quality, different resolution (unless we couldn't find a better option). Batman is another example of this, as 1366x768 at Low settings is only slightly faster than 1366x768 at Very High settings. Anyway, the main thing was to let people know exactly how we plan to test in one location, so that I can just link it in future reviews.
Gast - Saturday, January 7, 2012 - link
Have you looked at changing the names to provide some sort of meaning other than just the level of the test? Something along the lines of "Playable, High Quality, Max Quality".
Changing the names from Medium, High, and Ultra will be jarring for me. When skimming I will see "Low" and think of the minimum settings needed to run the game, which is different from the "playable" or "medium" settings you are presenting.
While I can learn to adjust to this change, irregular AT readers might not, and could walk away with the wrong impression of what the test was representing.
PolarisOrbit - Saturday, January 7, 2012 - link
I agree that when you use the terms "low / medium / high" there is an implication that you may be referring to the in-game settings rather than your interpretation of the different settings that are worth benchmarking. A careless reader may not notice the difference.
To me, it makes sense to compare a cheap laptop to the cheap level and an expensive laptop to the expensive level (obviously I mean an expensive gaming laptop, since it's for a gaming benchmark). So I would suggest dividing it by market segment like so:
low -> value settings
medium -> mainstream settings
high -> performance settings
JarredWalton - Saturday, January 7, 2012 - link
Our charts will continue to list the game settings we use for testing, plus I intend to link back to this article on the gaming section so that new readers can understand exactly what we're testing. We could also call the settings "Value/Mainstream/Performance" or something, but all that really says to me is "they are using custom settings so make sure you know what is being tested". Which isn't necessarily a bad thing.
I think at some point I need to go through the games and capture screenshots at our Low/Med/High settings as well to give a better indication of what the various settings mean -- and maybe add a "minimum" screenshot as well to show why we're skipping that in most titles. That probably won't happen until post-CES though.
Gast - Saturday, January 7, 2012 - link
"they are using custom settings so make sure you know what is being tested"
That's basically what I'm pushing for. It would be OK if your medium test was very similar to the medium setting, but since almost all of your tests have a naming conflict with in-game settings (low test = medium settings), I would find it helpful to call them something different.
JarredWalton - Saturday, January 7, 2012 - link
Okay, I've gone ahead and renamed the benchmarks to Value/Mainstream/Enthusiast, as I can see where confusion might otherwise result. Hopefully I caught all the references, but if not I'm sure someone will set me straight. :-)
bennyg - Friday, January 6, 2012 - link
When I read the headline... nice fancy name there for just a reshuffle of what you already do. And there's enough of that going around already, isn't there...
I skim through AT reviews despite their superior production quality.
I read notebookcheck reviews in full despite frequent editing and translation errors because (aside from much preferring monolithic-single-page-rendering...) they consider so many more aspects of the product in front of them.
They also make the effort to disassemble the unit, put up a lot more pictures, and do a thorough screen (incl. viewing angle) assessment. If I google a model for reviews, I want to read new stuff (not mostly the same stuff as every other review), and NBC gets my browser time most of the time for that reason.
And please, please, please incorporate some of the methodology from http://techreport.com/articles.x/21516 rather than just simplistic FPS measures, especially for the SLI/Xfire setups...
JarredWalton - Friday, January 6, 2012 - link
Fancy headline for a "reshuffle"? How about, I thought it would be useful to open up discussion to see if there's anything we missed that people would like to see us include, along with a detailed list of the benchmark settings we're using for games (so that I don't have to try to put this all into each laptop review)? And you do realize that this is an article specific to laptop testing, so we're not going to go into more detail on gaming performance, and we rarely test SLI/CF setups, right? As for the pictures, I'm not sure what more you'd want from us. We do everything you mention -- screen angle shots, pictures from lots of angles, etc.
Anyway, as I note in the article, we're open for suggestions on what you'd like to see added that isn't already there. Notebookcheck has a rundown of each laptop that's pretty much just regurgitating the spec sheet, so I think we're covered there. We run a standardized set of benchmarks that includes more detailed graphs, though perhaps some would prefer the NBC approach (e.g. just show the scores from the laptop being tested with a "heat map" below showing the spread and frequency of other scores)?
Consider the scope of the review and by all means let us know which aspects of laptop reviews you'd like us to cover more. About the only major test that NBC runs that we don't have is CrystalDisk, but I'm not sure how useful that really is. SSDs are much faster than HDDs, and the differences between HDDs are largely meaningless by comparison. I generally figure anyone after fast storage for a laptop will be looking to upgrade to an SSD regardless, and if that's the case they'll be reading our SSD reviews after determining which laptop they want. But let me throw this out there:
Are there others that would like us to run one of these "quick and dirty" storage benchmarks on the laptops we test? Is the PCMark 7 Storage score insufficient in what it reports? I'm not going to add a test because of one request for it, but if a lot of you would like some additional tests let me know.
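(For context, even a crude scripted check like the sketch below -- time a large sequential write with a flush to disk, then a read -- is enough to separate an SSD from an HDD. The file size and path are placeholders, and OS caching makes this only a rough proxy for what CrystalDiskMark or PCMark 7 actually measure.)

```python
# Crude sequential-throughput sketch: enough to separate an SSD from an HDD,
# though OS caching makes it only a rough proxy for real storage benchmarks.
import os
import time

PATH = "testfile.bin"        # placeholder: put this on the drive under test
SIZE = 1024 * 1024 * 1024    # 1 GiB
CHUNK = b"\0" * (4 * 1024 * 1024)

start = time.time()
with open(PATH, "wb") as f:
    for _ in range(SIZE // len(CHUNK)):
        f.write(CHUNK)
    f.flush()
    os.fsync(f.fileno())     # force data to disk before stopping the clock
print("Sequential write: %.1f MB/s" % (SIZE / (time.time() - start) / 1e6))

start = time.time()
with open(PATH, "rb") as f:
    while f.read(len(CHUNK)):
        pass                 # note: may be served from the OS cache
print("Sequential read:  %.1f MB/s" % (SIZE / (time.time() - start) / 1e6))
os.remove(PATH)
```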
kedesh83 - Saturday, January 7, 2012 - link
You would think most laptop gamers would be playing either World of Warcraft or Starcraft 2. I'm not being a fanboy or anything, but why would they not include those in the games list? Most of the kids I see playing games on my campus are playing Blizzard titles.
JarredWalton - Saturday, January 7, 2012 - link
They're older games and DX9 titles as well. If the gaming suite we test runs sufficiently fast on a laptop, I can pretty much guarantee SC2 and WoW will run.
ananduser - Saturday, January 7, 2012 - link
When testing the battery, please include a moderate amount of Flash-only sites, the likes of Tag Heuer or famous car brands' Flash minisites. Flash is an important part of the web and would make your tests more realistic.
signorRosso - Saturday, January 7, 2012 - link
In hardware or software? 10-bit is mentioned at the bottom of this AT article page...
http://www.anandtech.com/show/4380/discrete-htpc-g...
signorRosso - Saturday, January 7, 2012 - link
Disregard the previous comment entirely!
Paedric - Saturday, January 7, 2012 - link
I don't know if it is possible to do, or if it is useful, but what about testing the performance of virtual machines?
Also, you said you now use IE9 instead of IE8; is there a significant difference in battery life between Chrome/Firefox/IE9/Opera?
gero9mo - Saturday, January 7, 2012 - link
" I’m not sure most users would notice the difference between a 2GHz Core 2 Duo or Athlon X2 laptop and a quad-core i7-2760QM. This is why battery life is such an important element, as where many wouldn’t notice the difference between a web page loading in two seconds and a web page loading in one second, they’re far more likely to notice two hours of battery life versus four or eight hours. "
I can honestly tell you that one second feels like a lifetime in computing to me. If I were to load 100 pages during a surfing session, if I can call it that (my English ain't the best), I would most certainly prefer those pages to load in 1 second instead of two. And the very moment you get to a somewhat more complex page, you are without a doubt going to notice a difference between, let's say, an Intel Sandy Bridge based CPU and any AMD CPU. Most users also do other stuff than just surf web pages. They also extract zip, rar, and other files, and even here you're going to notice a difference between an Athlon X2 and an i7-2760QM. And if you're selling computers, be sure to look two to three years forward in time. I would prefer to sit with an i5-2310M versus any Athlon X2 laptop.
So even if most people won't notice a huge difference, a second here and a second there still counts.
PreacherEddie - Saturday, January 7, 2012 - link
On the third page, third paragraph from the bottom: "We are still early enough in 2011..." -- I think that should be "2012", unless you have also developed that time travel machine.
losonn - Saturday, January 7, 2012 - link
A few thoughts from someone who went out and bought an XPS 15 L502x immediately after reading your L501x / L502X reviews.
I've got to agree with an earlier poster that I always skim the mobile reviews here at AnandTech, but frequently find myself referring to laptop reviews elsewhere (including notebookcheck.net) for better formatted specs, more helpful product pics, denser information, specific noise / heat info, power adapter weight / pics, build quality opinions, etc. Your current article format is perfect for pro hardware and new tech reviews (7970 review, Vertex 3 review, etc.) but your consumer / mobile reviews could be a lot more dense ( http://www.notebookcheck.net/Review-Samsung-305U1A... ) or at least a little bit less ugly (see http://www.theverge.com/2012/1/4/2677801/hp-envy-1... ).
This may have more to do with your page layout formatting than anything else, though; even on my 1080p screen I have to do a -lot- of scrolling on your site, and the dropdown navigation for your articles is a bit clunky. This site has a solid reputation and I trust the recommendations of Anand more than any other tech site, but you could really use a facelift that was a little bit less all-business-all-the-time, a la the Asus Transformer Prime :)
As for benchmarks, sleep, hibernate, wake-from-sleep, and wake-from-hibernate times could be really helpful.
coolhardware - Sunday, January 8, 2012 - link
Nice points here.
+ Notebookcheck definitely does a great job. :-)
- To me though, The Verge's layout doesn't read/flow nearly as nicely as AnandTech's does right now... :-(
colonelclaw - Saturday, January 7, 2012 - link
Ok, it's very niche, but if you would like to try a real ball-busting CPU test I would recommend a combination of 3DSMax 2012 and VRay. It's the most popular combination in the 3D world at the moment and nothing sucks the life out of a PC better than a VRay render.
Plus these days VRay is available for Max, Maya, Sketchup, Rhino, Softimage and probably more. It also runs on OSX under the apps available for that platform. Something to think about.
RoninX - Sunday, January 8, 2012 - link
I'd like to request a benchmark for battery life when gaming. This wouldn't need to be the full set of gaming tests, but just one or two games run at value, mainstream, and enthusiast settings.
I own a high-end gaming desktop for gaming at home, but I also do a lot of business travel, and I like having a laptop that can run games during long waits at the airport. Right now, I have a Dell XPS 15, which does pretty well, but in the future, I'd be interested to know the tradeoffs in battery life and performance for dedicated gaming laptops (e.g. Alienware) as well as high-end mainstream laptops (e.g. XPS).
I realize that laptop battery life is pretty poor across the board when running demanding 3D games, but for me, there's a big difference between 90 minutes (which I get from my XPS 15), and say, 20 minutes. With a spare battery, the former gives you 3 hours of gaming time -- more than enough for typical flight delays.
JarredWalton - Sunday, January 8, 2012 - link
Besides tripling the number of battery life tests (ouch!), there are other factors to consider. First, you need a "game test" that's repeatable. If you play Batman for a couple hours on a laptop, and then play it again for a couple hours, I'm not sure the load will be the same. Then again, I doubt that the difference between Value/Mainstream/Enthusiast and even different games will be much -- they all activate the GPU and put a decent load on the CPU, so they're pretty much the worst-case scenario.
How about this one: start battery life timer, open Skyrim save, and let the laptop just sit there until it runs dead. (Note that Skyrim will have the camera start circling the player in third person mode after a minute or two of inactivity.) That's about as consistently repeatable as I can come up with, though not necessarily a realistic test case. I would also be willing to check battery life on one laptop at our three detail settings to verify whether there's a difference in battery life or not. I might also check out some other games to see if they show variance in battery life (and if I can find a good looping test). Let me see if we can come up with something reasonable -- and we'd also want to test performance on battery power, as most higher end GPUs really clamp down on maximum clocks when on DC power.
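(The timing side of such a rundown is easy to automate: the sketch below logs battery state to a CSV every 30 seconds until the machine dies, so the last row marks the achieved runtime. It assumes the third-party psutil library, purely as an illustration; the game itself would be launched separately.)

```python
# Minimal sketch of a battery rundown logger. Launch the game (e.g. the
# idling Skyrim save) separately; this script just records battery state
# until the laptop dies, so the last CSV row marks the achieved runtime.
# Assumes psutil is installed; sensors_battery() returns None on machines
# without a battery.
import csv
import time

import psutil

with open("rundown.csv", "w", newline="") as f:
    log = csv.writer(f)
    log.writerow(["elapsed_s", "percent", "plugged_in"])
    start = time.time()
    while True:
        batt = psutil.sensors_battery()
        log.writerow([round(time.time() - start), batt.percent, batt.power_plugged])
        f.flush()            # make sure rows survive the eventual power loss
        time.sleep(30)
```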
RoninX - Sunday, January 8, 2012 - link
That Skyrim test sounds good to me, and testing performance on battery power would also be very useful.
Thanks!
joelypolly - Monday, January 9, 2012 - link
It would be nice to have a web-based comparison table so you can compare things like viewing angles, general specs, and performance.
JamesAnthony - Thursday, January 12, 2012 - link
I was wondering if you would be able to add to the game video tests more notebooks with NVIDIA Quadro and ATI FireGL video cards, as Dell specifically (and several other vendors) tend to use those cards as the only option available in business-level or professional-level laptops, and it would be nice to know how well you could expect your $1,500 to $3,000 CAD business laptop to play some games when you get home with it.
For example:
AMD FirePro M8900 Mobility Pro
NVIDIA Quadro 3000M
NVIDIA Quadro 4000M
NVIDIA Quadro 5010M
AMD FirePro M5950 Mobility Pro
NVIDIA Quadro 1000M
NVIDIA Quadro 2000M
NVIDIA NVS 4200M