Great write up Josh. I'm looking forward to seeing the camera performance, and the Adreno 530 is a beast! Let's hope performance can be sustained without significant throttling.
Yes, somewhat disappointed that overheating wasn't addressed in Part 1 given Andrei's skewering of Samsung's use of a heatpipe and his comments that the S7 got pretty hot under load. Hope it gets addressed in detail for Part 2!
I seem to recall that real world battery life has traditionally been worse on Samsung SoCs than Qualcomm's mainly due to the modems using a lot more power - hopefully they've resolved that this generation... Personally, I'm disappointed that we get the Exynos version in Europe because it means there will probably never be a fully working AOSP port.
Would be nice to add CUBOT H1 results in there as well (I've had the phone for a month or two now, and 2 days of active use is lovely without having to use an external battery that normally would still only last me a day on my last phone).
In my use I am typically getting 8 hours screen-on time (I use GPS a lot so I never expect to get the 12 hours screen-on time, or the probably mythical 15-18 hours if AnandTech benched it for power).
... as usual. Samsung and Qualcomm just can't balance their systems. They just look for a benchmark number to impress people and try to be competitive with Apple. But that's just a smoke screen...
There are some complications though: the device is just 1080p, there's the 60FPS limit, and no heatpipe. Even keeping all that in mind, we see a 37% drop, and this test is not the worst-case scenario. Hopefully Andrei will have a better test based on OpenGL ES 3.x, or test with some actual games, since T-Rex is becoming outdated.
Oh wow, you had one must-do here, run GFXBench's long-term performance test, and you failed to do so, keeping everybody in the dark about the one thing we don't know. That one key result has 10 times more value today than this article...
If true, that's a bad business decision, especially when you did run 2 different battery tests. The fact that the article tries to dismiss the GPU throttling by claiming that mobile gaming sessions are short does seem to suggest a lack of understanding. Sure, casual gamers that game just to waste time will go for short sessions and less demanding games, but for them the perf is also of little to no relevance. Users that should care about GPU perf are the ones that play longer sessions. It's also of dubious ethics to label the GPU perf in any way without knowing this key metric, and measuring it is essential in differentiating AT from others.
Why don't you just start your own tech-related website and write your own reviews, "jjj". You don't seem to do anything other than bitch and moan about every single conceivable thing you can come up with. Just gtfo and do things better by yourself, as you seem to know how everything should be done. If you write some good stuff, maybe you'll earn a buck or two, too.
As for battery tests, as long as you don't simulate a bunch of social, IM, and news apps in the background, you can't get close to good results when you've got so many different core configs.
"As long as you don't simulate a bunch of social, IM, news apps in the background, you can't get close to good results when you got so many diff core configs" — You have to have a consistent methodology to test against other phones. The point is not to show what you might get with your particular social/newsfeed apps running; the point is to test against other phones under the same set of circumstances so you know which one gets better life.
Sorry, but you failed to understand my points. It would be consistent if it is simulated. The point is to have relatively relevant data, and only by including at least that can you get such data. These apps have a major impact on battery life, and it's not just FB, even if they were the only one that got a lot of bad press recently. The different core configs (2 big cores, 4 big cores in 2+2, then 4+4, 2+4+4, or just 8) will result in very different power usage for these background apps as a % of total battery life, sometimes changing the rankings. Here, for example, chances are the Exynos would get a significant boost vs the SD820 if such tasks were factored in.
How many such simulated tasks should be included in the test is debatable and also depends on the audience, chances are that the AT reader has a few on average.
And you are missing mine as well... If you have a million users, you will have 10,000 different sets of apps. You can't just randomly pick one and call it a benchmark. The methodology and the point is to measure a simple test against other phones without adding too many variables. I get what you want, but it's a bit like saying "I wish your test suite tested my exact configuration", and that just isn't logical from a test perspective.
What I want is results with some relevance. The results have no relevance as they are, as actual usage is significantly different. In actual usage the rankings change because the core configs are so different. The difference is like what VW had in testing vs road conditions: a huge difference. To obtain relevant results you need somewhat realistic scenarios with a methodology that doesn't ignore big things that can turn the rankings upside down. Remember that the entire point of big.LITTLE is power, and these background tasks are just the right thing for the little cores.
Relevant to whom? I use zero social media apps and have no news feeds running at all until I launch Feedly. Relevant to you is not relevant to everyone, or even to most people. This site is a fairly high-traffic site (for tech anyhow) and they have to test for the many, not the few. The methodology is sound. I see what you want and why you want it, but it doesn't work for "the many".
Relevant to the average user. I don't use social at all but that has no relevance as the tests should be relevant to the bulk of the users. And the bulk of the users do use those (that's a verifiable fact) and more. This methodology just favors fewer cores and penalizes bigLITTLE.
It's still too unpredictable. One person's Facebook feed may go nuts all day while another's is relatively calm. This is also why specific battery ratings are never given by manufacturers... because usage varies too much. This is why sites test a (mostly) controllable methodology against other phones to see which fares the best. I find it highly useful, and when you get into the nuts and bolts, it's necessary. If you had a bunch of phones and actually started trying to test as you mentioned, you would find a can of worms and inconsistent results at the end of your work...
"It's still too unpredictable" - that's the case with browsing usage too, even more so, but you can try to select a load that is representative. You might have forgotten that I said simulate such apps; there wouldn't be any difference between runs. Yes, testing battery life is complex and there are a bunch of other things that could be done for better results, but these apps are pretty universal and a rather big drain on resources. They could be ignored if we didn't have so many core configs, but we do, and that matters. Complexity for the sake of it is not a good idea, but complexity that results in much better results is not something we should be afraid of. 10 years later, smartphone benchmarking is just terrible. All synthetic, and many of those apps are not even half decent. Even worse, after last year's mess, 99% of reviews made no attempt to look at throttling. That wouldn't have happened in PC even 10 years ago.
I think you are a little too hung up on benchmarks. It is just a sampling, an attempt at measuring with the goal of comparing to others to help people decide, but what really matters is what it does and how well it does it. I find it highly useful even if not exact to my own usage. If unit A lasts 20 hours and unit B lasts 16 hours in the standard tests, but I know my own usage gets 80% of standard, I can estimate my usage will be 16 hours on unit A and 12.8 hours on unit B (give or take). It really doesn't need to be more complicated than that, since testing batteries is not an exact science, not even on PCs/laptops, as usage varies there just the same. That is why there are no exact guarantees. "Up to X hours" is the standard explanation. It is what it is.
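The scaling estimate described in that comment is simple proportional arithmetic; here's a minimal sketch (the 0.8 personal-usage factor is just an assumed example, not measured data):

```python
def estimate_my_battery_life(standard_hours, personal_factor):
    """Scale a standardized battery benchmark result by a personal usage factor."""
    return standard_hours * personal_factor

# Assumed example: my usage gets 80% of the standardized test's runtime.
unit_a = estimate_my_battery_life(20, 0.8)  # 16.0 hours
unit_b = estimate_my_battery_life(16, 0.8)  # 12.8 hours
```

The key point is that the standardized test still ranks devices usefully, as long as your personal factor applies roughly equally to both phones.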
One last time: these apps, due to the different core configs, can change the rankings. The one that gets 20h could go to 12h and the one with 16h to 14h. That was the main point here. Because the vast majority of users have such apps, you can't reach relevant results without factoring them in. A phone can get good results in the current test and poor ones in actual usage.
That is an extremely unlikely example. In real-world usage, octa-cores have had very little difference in any area (performance or efficiency) vs. their quad-core counterparts unless there is an issue with the ROM where it's simply bleeding power. Whatever though, if you feel so strongly about it, by all means, do some testing on various phones and submit to a few sites and see how worthwhile it is.
I'm no fan of the bitching at the AT team, but he has a point with the background tasks - an average modern phone has a lot of them, and testing endurance with them all disabled isn't realistic. Moreover, a dual core would indeed probably perform worse with more background tasks than a quad-core or octa-core, as those can spread the load and keep their cores at an energy-efficient optimal speed.
A repeatable, fixed background load would be good to have. It would also be hard to write, certainly across iOS and Android, due to their different capabilities and limitations with regard to background tasks. So I get why it hasn't been done, and if jjj is up for writing a tool that does it, I bet the AnandTech team would take it... Yeah.
The only thing I find with AnandTech's battery tests is that they do not follow real-world battery use (to get correct results, divide by about 2 or 2.5 and there is your real battery use). When you look at AnandTech reviews, all other tests are good. Most phones last around 4 hours screen-on time (max tends to be 4 hours but can be as low as 1.5 hours).
The only exceptions to that rule tend to be the Motorola RAZR MAXX phones (low-end CPU with a 3200mAh battery), the CUBOT H1 (1GHz quad core, 720p screen {basically Samsung Note 2 spec} with a massive 5200mAh battery, which I own and love, and it only costs £110!! I still miss the HTC One's stereo speakers, but I'll take 8 hours over 2 hours without having to use an external battery case that turns it into a brick), and the Huawei Mate 8 (4000mAh battery).
GFXBench long term performance is currently not useful to anybody as it hits Vsync in the current T-Rex test. In the full review we'll have a proper test which will be of use for users.
Seems that you might be wrong, Andrei. I am guessing you got the Mi 5, but that's 1080p and you forgot to factor that in? Here at 1440p you don't hit the 60FPS wall even in the first run. If the Mi 5 doesn't fall below 60FPS at 1080p, it would be relatively good news, as it means less than 33%-ish throttling with a 1080p display and no heatpipe.
This German review posted some GFXbench results you might be interested in, however ofc their phone uses Exynos. Hard to say that Snapdragon would behave the same way, but it's also clear that having a heatpipe alone in the phone doesn't mean it won't throttle.. significantly at some points.
Thanks a lot, ofc has nothing to do with SD820 but having this for the Exynos is nice too. Digitimes was claiming that only the SD820 models have the heatpipe, no clue if true or not, guess it's something else that has to be verified.
Is that what you do with your phone? Run continuous benchmarks all day? I understand we want an idea about throttling but using those benchmarks as proxy for expected performance won't get you that info about day to day usage.
It will let us know if throttling is a big issue. Benchmarks are just that, tests to see how well the phone does.
Of course we don't run benches all day, but if you do not run benches, then how do you know how well the phone does when pushed in, say, a mobile game like Asphalt 8 for more than 5 minutes? It's useful information to have.
Well then, according to that measure, throttling is a big issue with the iPhone 6s. How do you tell how well it works with Asphalt 8? I'd probably play Asphalt 8 for a while. These benchmarks aren't a good proxy, as they are designed to push devices to their limits.
Note that the Note 5 did not throttle in the GPU test, but its final run was still only 60% of the throttled S7 score (and about 50% of the S7's starting score).
I just mentioned that it didn't throttle since jjj seems to be an exynos fan, so I was trying to forestall "BUT THE EXYNOS DIDN'T THROTTLE", even though it had lower results throughout.
Great write-up. Glad to see the Snapdragon 820 is properly flagship level. I look forward to the rest of the review, details on throttling, and hopefully at some point a look at Samsung's new Exynos SoC in the other S7 model.
It's kind of interesting that Apple has a solid win in CPU performance against Qualcomm, but a loss in GPU, even though they get their GPU from a company that should specialize in them. Twister truly is an astounding architecture, but they're probably at the end of the easy, huge performance boosts. It will be interesting to see how well Apple can go about completely redesigning their core when it happens.
Faster GPU is not as interesting as faster CPU because it's trivial to make a faster GPU --- just throw more GPU "cores" at the problem. If Apple wanted to double the GPU performance of the A9 tomorrow, all they have to do is use 12 GT7000 cores instead of their current 6. So the decision as to how many to use is an economic/use case decision, not a technical one. Given the pixels Apple is interested in driving, it looks like they used the right number of cores. QC sells into a broader market (and, in particular, a market that, whether or not it makes sense, uses crazy-high pixel densities) so their incentives align with throwing in more cores.
If you want to run a technically interesting GPU performance competition between Apple and Adreno (or Mali, or anyone else), raw performance is not the interesting metric. The interesting metrics are - performance/watt - performance/mm^2 I don't know who would win either of these, but my guess is Apple. We don't yet have enough information to answer the first question, but the mutterings about GPU throttling in the comments suggest that QC gets high GPU numbers by burning lotsa power, not by having some super-advanced GPU that's a cleverer design than anyone else. In a sense, what you're seeing is what Apple could copy if they wanted by putting the A9X inside the iPhone 6s.
" The interesting metrics are - performance/watt - performance/mm^2"
Nah, the really interesting metric would be how much power it consumes to run (and sustain) the average game at 60FPS at the same resolution. Read: efficiency.
If it were that trivial to make a faster GPU, I'm pretty sure Apple would have gotten a faster GPU in their phones. A faster GPU in mobile, where battery life is really relevant, is probably the trickiest thing to succeed with. As you said, it basically requires you to add a ton of extra cores to make it noticeable on whichever hardware platform you're doing it on. GPUs are, after all, the biggest power consumers in tech these days. Doubling the size of the Apple GPU would literally wreck the iPhone's battery life, which I'm pretty sure you're also aware of yourself. They did, after all, gimp the GPU on the iPhone compared to the iPad for this exact reason.
You forgot the user experience. Pushing 30 fps onscreen is the minimum requirement; there's no need for higher than that unless Apple joins the resolution war. Apple has always been a resolution fan in their Macs, but they still think crazy resolutions aren't needed, and I think that's still true.
Looks like Qualcomm spent their time creating a quality core instead of just adding more cores. Half the cores of the 8890 and by all accounts at least as fast (CPU, not GPU where it's quicker).
I just wish you did a series of tests to check throttling.
Nice... Mine is on pre-order now. I almost can't believe Samsung finally got it together and started making better devices with better build quality, bigger batteries, and much less bloat, but I am very happy they did.
The lack of an IR blaster is the only thing holding me back from buying the s7 edge. I use the IR blaster every day on my G4, and I assume everyone with young kids would as well. Remotes just seem to always be missing...
There is no such thing as 'shot noise'. All digital noise is just a lack of light hitting the sensor. Having a larger pixel size means a larger area for light to hit the photo sensor, hence reduced 'sensor noise'.
Because the sensing area is larger per pixel, the number of photons incident on each pixel increases with it. Those incident photons are what give the picture data. So weird quantum effects that somewhat simulate adding or removing photons have less significance when there are more photons to begin with. More specifically, comparing 1.1µm vs 1.4µm, 1.1µm is quite comparable to the wavelength of visible light, which causes some extra anomalous effects too.
Yea, I knew 1.1 um was the bare minimum due to quantum effects. But say, going from 1.4 um to 2.0 um, would that make much of a difference?
After all, the total amount of light collected by the sensor would be roughly the same, right? A flower can be composed of 10 million pixels of size X or 5 million pixels of size 2X; the total area of the flower will still have collected the same light?
I don't think it does, but it decreases noise caused by chance: with smaller pixels you get noise in low-light situations partly just due to the chance of one pixel randomly catching a few more light photons than correctly represents the scene, and another a few less. With bigger pixels you smooth that out a bit, and thus get less random noise. It is only ONE source of noise, but it helps.
Just imagine you take a pic of the same scene with two sensors, one so small it catches 5 photons average per pixel cell, the other one is twice as big and catches, on average, 10. A random one photon difference in a given pixel cell gives 20% brighter or darker pixels on the small, 5 photon-catching sensor and only 10% on the bigger one.
Again, it is only one source of random noise, but a pretty fundamental one you can hardly calculate your way out of.
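The photon-counting argument in these comments is essentially Poisson (shot-noise) statistics: the standard deviation of the photon count is sqrt(N), so noise relative to signal falls as 1/sqrt(N). A quick sketch under that idealized assumption (ignoring read noise, quantum efficiency, and the sub-wavelength effects mentioned above):

```python
import math

def relative_shot_noise(mean_photons):
    # Poisson arrivals: std dev = sqrt(N), so relative noise = sqrt(N)/N = 1/sqrt(N)
    return 1.0 / math.sqrt(mean_photons)

small_pixel = relative_shot_noise(5)    # ~45% frame-to-frame pixel variation
large_pixel = relative_shot_noise(10)   # ~32% -- twice the light helps by sqrt(2)

# Going from 1.1 um to 1.4 um pixel pitch grows the light-gathering area by
# (1.4/1.1)^2, roughly 1.62x, improving per-pixel SNR by about 27%.
area_gain = (1.4 / 1.1) ** 2
snr_gain = math.sqrt(area_gain)
```

This matches the 5-photon vs 10-photon example above: the bigger pixel doesn't eliminate the randomness, it just makes each random photon a smaller fraction of the total.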
@Joshua Ho. Do you mind if I ask a favor? Does the S7 support AC (802.11ac) tethering? Do you know any other Android phone with AC tethering? I am not talking about connecting the S7 to 5GHz WiFi such as home internet; these days most phones can connect to 5GHz WiFi anyway. What I am asking is: can the S7 itself become a 5GHz WiFi hotspot? This could be a very useful feature for me for transferring files between a connected device and the S7's hotspot. I'd appreciate it if you could share a screenshot of the network connection speed from a PC to the S7's hotspot in the second part of the review. Thanks!
I have a question about that. Is Chrome supposed to be an equalizer between the platforms? From a practical standpoint, the browser that Samsung ships is in some ways better optimized so I use that instead. Am I really missing out by not running Chrome on my S6?
I wanna see a test done on RBrowser or a Qualcomm-optimized browser with the CAF Aurora files (it does make your browsing up to 40% faster, you're welcome)
Chrome is very poorly optimized for Exynos processors, so you aren't missing out on your S6, but for devices with Qualcomm processors, Chrome typically runs pretty well.
I'm wondering if the situation has changed any with Android 6. I'm assuming that all the test results for the S7 were with Marshmallow while for the S6 they are with Lollipop. I'm pointing this out because my Note 5 just got the Marshmallow update and after the update, Chrome has improved so much that I'm now using it as my default when I never did before.
Might be interesting to see a test that included the S6 with Lollipop, the S6 with Marshmallow, and the S7 to see how much of the difference is actually attributable to the OS update?
Unfortunately Samsung's browser is not available on our Verizon-branded sample phone. The phone only ships with Chrome, and it is not possible to install Samsung's browser at this time.
The Verge is reporting that this is a Verizon decision, and that all Verizon S7s are like this.
What an interesting turn of events. Wonder why would Verizon be irked by Samsung's browser. Any thoughts Ryan? Maybe default adblocking since Verizon is also a media company.
I'm assuming my Verizon Note 5 would be the same as the Verizon S6? My Note 5 has the browser that just says "Internet". Is that the one you are calling the Samsung browser? Either way, Chrome is not the only browser on my phone. I got the 6.0.1 update last Friday and still have both browsers available after the update.
To me it still feels like Kryo doesn't aim high enough, given the A9 has been shipping for months, in volume, and we're likely closer to the A10. Even if the A10 is a "tick" with a modest 20% performance boost, the cycle of staying behind it continues.
That's the benefit of large profit margins, I guess, and since just about no one but Samsung was making Android handset profits, no one was probably ordering a huge die with high IPC. Kryo comes a long way, and I'm glad they're going 2+2 with higher IPC rather than 4+4, but I would have liked it to go further and actually leapfrog the A9.
Samsung does have their own custom core in development; I wonder how that compares to Qualcomm's Kryo.
Samsung's core is already done, I think; it's the Mongoose core in the 8890, the one in the international variants. It's a slightly weaker core than Kryo.
Unlike what most reviewers want to believe, when designing application processor cores, companies like ARM, Qualcomm and Samsung aim for a "sweet spot" of load-to-efficiency ratios, not MAX single threaded performance.
Their benchmark is common Android workloads (which btw, rarely saturates a Cortex A57 at 1.8GHz), since it's what makes the vast majority of the mobile application processor market. They measure the average/mean workload needs and optimize efficiency for that.
Android isn't as efficient as iOS and Windows Phone/10 Mobile in hardware acceleration and GPU compositing; it's much more CPU bound. It doesn't benefit as much from race to sleep in mobile devices. CPU cores remain significantly more active when rendering various aspects of the UI and scrolling.
By measuring CPU and RAM utilization when performing said tasks. More efficient implementations would offload more of the work to dedicated co-processors, (in this case, the GPU) and would use less RAM.
Generally, the more CPU utilization you need for these tasks, the less efficient the implementation. Android uses more CPU power and more RAM for basic UI rendering than iOS and WP/10M.
How do you measure this so that you can ignore differences in the system (like textures chosen)? Then you'd have to make sure they're running on the same hardware. The best you can do is probably test Android and Windows on the same phone (this will put Windows at a bit of a disadvantage as Android allows very close coupling of drivers since their HAL is pretty permissive). Then you run a native game on each. If you've found a way to do this, I, and Google, would love to see the results. Other than for 2D (which NOBODY, including DirectDraw/Direct2D or Quartz, fully accelerates), Google really hammers the GPU through use of shared memory, overlays, and whatever else may be of use. There's obviously more optimization for them to do, as they still overdraw WAY too much in certain apps, and they've obviously got a serious issue with their input latency, but it's a modern system. Probably the most modern, as it's been developed from scratch most recently.
I also thought the difference in battery life between the S6 and S6 Edge was off. They either posted wrong data, or something wrong happened while testing.
I'd agree. When one of the phones goes from being upper middle of the pack on the old benchmark to being dead last--and woefully so--then I would have to wonder if something is really wrong with the new test. I've used the G4 for 6 months and have rarely had battery concerns over a day of "regular" use. I've owned several phones, and the G4 is a trooper.
Any chance you'll be doing an in depth review of the Exynos 8890 equipped model. I would really like to see an in depth review that compared it to the SD820. Preliminary benchmarks seem to suggest it is comparable in the CPU department but behind in the GPU department. Since it looks like us Canadians will be getting the Exynos variant I'd like to see if we'll actually notice a difference in everyday use...
It is pretty odd as Samsung usually ships the same hardware to both Canada and the US. It seems in this case they are only shipping the SD820 version in CDMA markets and since Canada is all GSM we are getting the Exynos version. I don't know that this has been 100% confirmed yet though, that's just what the rumour mill is saying.
Still completely stupefied as to Samsung's (and Google's) decision to continue with this whole black text on white background thing, knowing full well the OLED consumes more power when pushing white. Also, it's ugly.
It's less straining for your eyes to read black text on white (which is why you're reading this in black on white right now :D ). Also, the brightness of the system settings is, to be fair, completely irrelevant in the long run. Once you've tinkered with whichever settings you prefer, you rarely ever go into those menus again. Making the UI look consistent clearly carries more weight in that design choice.
Pretty good write-up. I have one request: In your charts, can you put the SoC powering the device (in smaller letters) underneath the name of the device? It would be helpful to understand why a device performs like it does (is it SoC related or subsystem or software), and it's pretty standard info in PC based reviews. Otherwise, keep up the good work!
Please Please Please can you do a review of the Exynos 8890 variant. I have a sneaky suspicion battery life on this SoC will be significantly better than SD820.
Great review as always. Looking forward to part deux :-)
Samsung has proven that they can build really great hardware. Great. Now can we go back to getting a GPE version of their phones? I love my Nexus 6's user interface. Every time I try to help a friend with a Samsung phone it feels like the interface has been hit with an ugly stick and all the useful little things have been taken away. But the Nexus 6 hardware is only 'meh', and the 6P isn't better enough to justify a change. If the S7 were a Nexus device I'd be saying TAKE MY MONEY right now.
No one was interested in the GPE models. Samsung has had no interest in being a Nexus manufacturer for half a decade. That's why. Like it or not, buyers want the smart features pre-installed at the factory.
Throughout the review, Mr. Ho refers to the Note 5 but I do not see any data pertaining to the Note 5 in the charts?
I agree with him about the camera hump. I had no issue with the S6's camera hump, both aesthetically and practically. I think this provides an opportunity for the tech media to take a pause and self-reflect. In every freaking review of the S6/Edge, reviewers incessantly cried over that hump as if Samsung had committed an unspeakable sin. It looks like Samsung took the criticism to heart, but unfortunately the criticism was an unwarranted one to begin with. Reviewers should think for themselves before following the herd and parroting sensational nitpicking of a non-issue.
The note 5 will perform essentially the same as an S6/S6 edge. I've got a Note 5 and personally I'm not seeing anything in the new generation that would make me lust after an upgrade.
Well, at least they addressed the issue in the right way IMHO: by making an only slightly thicker phone and taking the opportunity to put a bigger battery in it.
"In the interest of providing another data point and some validation of our testing results, I ran both devices through our old web browsing test to see what the results would be for something that should be display-bound. Here, it’s obvious that the Galaxy S7 edge holds a significant lead over the iPhone 6s Plus"
But the actual delta in both cases is exactly the same! 53 mins! Am I calculating this wrong?
9:58 - 9:05 = 53 mins 14:03 - 13:10 = 53 mins
So the delta in the 2013 and 2016 tests is exactly the same
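For anyone checking the arithmetic, converting the quoted runtimes to minutes confirms the two deltas really are identical:

```python
def minutes(hours, mins):
    """Convert an h:mm runtime to total minutes."""
    return 60 * hours + mins

# First pair of results: 9:58 vs 9:05
delta_a = minutes(9, 58) - minutes(9, 5)    # 53 minutes
# Second pair of results: 14:03 vs 13:10
delta_b = minutes(14, 3) - minutes(13, 10)  # 53 minutes
```

Note the identical absolute delta is a larger *percentage* gap in the shorter test, which may be why the article frames them differently.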
And I'm still wondering... how is it possible that, years after various websites emphasized the better alignment and design of the connectors and perforations on the iPhone, Galaxies are still aligning them with no care at all? Check that top side with the slots and holes thrown there randomly, and that bottom side with the four holes (or groups thereof) aligned in FOUR different ways.
But hey at least Apple supporters won't say they are copying everything...
Fair enough. But don't then complain that it's "unfair" when Apple sucks up 80% of the profits in this sector. It's attention to details (ALL details) that allows a company to charge more...
Attention to detail hardly has anything to do with the port alignment at the bottom of the phone, and more so with the antenna bands on the back and the cheap choice of aluminum alloys in previous models (changed only after "bending" to public pressure, pun intended).
A nice finish =/= quality. A polished piece of cheap glass looks better than a rough diamond.
Most of Apple's attention to detail goes to media, perception, image, supply chain, and money making business models. Well, at least more so than their attention to hardware.
You have good points here and there, but your fanboyish attitude ruins the good parts...
Exactly. And there is a pretty good reason why the SD/SIM slot is on the side on the top. It can't be in the same location as the camera's. Apple have displaced the camera to a corner, that's not very symmetric in my eyes. And like you, I've never been a fan of those plastic separators on the back on all-metal phones. End of the day, manufactures are always doing deliberate design differentiations to make sure their hardware is distinguishable from a distance. Most people can easily identify a Samsung phone when someone's using it, simply because they've stuck to the same camera design/location since the Galaxy S2.
That was about the GS6 and only just started making its rounds, after Samsung finally tried to make nicer designs. I mean, I agree with you, it's such a bizarre miss, but when you say "years and years", it's really, "year", or less.
Well if the Galaxy phones had bezels the size of Texas like the iPhones, I'm sure they could align the ports better because they'd have more room to work with.
"While not quite going from zero to hero, Qualcomm has come close, and that definitely deserves some credit."
I disagree. Giving them credit because of the large improvement over the awful SD810 doesn't make sense.
Instead of a comparison to last year's garbage, give them credit for how the SD820 performs compared to today's best SoCs. It turns out the SD820 isn't really leading that much. It's mostly still behind a year-old Apple chip.
"Always-On Display is nice to have, but for some reason it only polls the ambient light sensor, so the display won’t actually turn off in your pocket."
This is strange and disappointing. I wonder why it does not use the proximity sensor.
...and as a stab at answering my own question: I assume constantly polling the prox sensor would cause a greater battery hit than simply leaving the display on all the time (including in pocket)?
Would be interesting to chart battery life with AOD on vs. off (in some kind of controlled way, of course).
Very good review. The S7 is a beast, but so is my iPhone 6s Plus. I wish I could afford to have 2 high-end smartphones, because I would definitely buy the S7 if I could; it's finally the Android phone I've been waiting for.
I'm very glad you updated your battery testing methodology. I have the LG G4 and my personal experience with its battery life is MUCH closer to your 2016 test results than the 2013. The older test would have you believe it has much better than average battery life, but I personally find myself charging it much more often than I want to.
I'm curious about the microSD card slot performance. If you put a fast (>80 MB/s) microSD card in the GS7, are you going to see the difference over a slower card?
Also, how fast is the USB MTP performance? Did Samsung re-introduce a USB3 connection?
It would depend on what you put on the card. Videos and music would probably see no difference. Apps, and if the camera can write directly to the card, maybe.
I put a SanDisk 64GB Extreme in mine and A1 SD Bench returns 60 MB/s read and 40 MB/s write. For comparison, the internal storage works at 300 read and 150 write.
...Then I put in a crappy $7 32GB SD card and it scored 1032 write and 1.02 read, so yeah, there was something wrong with that one XD
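For anyone who wants to sanity-check numbers like these, a crude sequential-throughput test is easy to script. This is only a rough sketch (filesystem caching, block size, and thermal state all skew results, and real benchmarks like AndroBench use direct I/O); the temp-file target is just an example:

```python
import os
import tempfile
import time

def sequential_write_mb_s(path, total_mb=32, block_kb=512):
    """Time a sequential write and report MB/s (very rough)."""
    block = os.urandom(block_kb * 1024)
    blocks = total_mb * 1024 // block_kb
    start = time.perf_counter()
    with open(path, "wb") as f:
        for _ in range(blocks):
            f.write(block)
        f.flush()
        os.fsync(f.fileno())  # force data out of the page cache to storage
    elapsed = time.perf_counter() - start
    os.remove(path)
    return total_mb / elapsed

# Example: point path at a file on the card under test; here, a temp file.
target = os.path.join(tempfile.gettempdir(), "sdtest.bin")
print(round(sequential_write_mb_s(target), 1), "MB/s")
```

Without the `fsync` the number mostly measures RAM, which is one way a cheap card can "score" absurd figures like the ones above.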
"Of course, other than the workload the device setup has been held constant across these tests by equalizing ... disabling all background sync to the best of my ability."
Is this really a good idea? I'd argue that part of what you are buying when you buy these devices is proper setup. (Certainly, for example, that's part of what Apple would say they are selling.) As such, I'd consider that the right way to benchmark them is in some sort of "as close to out-of-the-box" state as possible. Sign up for everything the device asks you to sign up for out of the box. (For Apple that will be to enter an Apple ID; for Android I assume it will always ask for a Google ID and then some random collection of additional logins that people have paid the phone vendor to request.)
Then see how the phone behaves under those conditions. The Apple phone will presumably occasionally sync with iCloud. The Android phone will sync with various Google services. And if the vendor asked you to sign up for "MyWannabe Social Network" and "MyWannabe Social Network" delivers ads to the device every three minutes, constantly sucking up power and CPU, THAT IS THE EXPERIENCE THE PHONE IS SELLING YOU.
Vendors that sell crap like that should accept the consequences in reviews. It's not AnandTech's job to spend an hour scrubbing some phone of useless crap. It is, in fact, AnandTech's job to run the phone with precisely all that crap enabled --- and then let us know the results.
By background syncing Josh is mostly referring to disabling application auto-updates and other such services which can have an impact on battery life tests. The usual small sync services and GMS have little to no noticeable impact on these tests.
I disagree on your viewpoint about out-of-box software settings simply because the phones have different software and services depending on your region. North American units from carriers will have different settings and services than the international units. We don't always get samples from the same carrier even. AnandTech has been first and foremost a hardware site so I think it's correct to try to minimize the effect of such services to get a better representation of what the device itself is capable of, not what the carriers choose to add in or not.
Again, this is all overblown, as in practice we see little to no effect on our tests, and again we're mostly referring to auto-updates and the like, which can eat up a significant amount of CPU cycles.
" Always-On Display is nice to have, but for some reason it only polls the ambient light sensor, so the display won’t actually turn off in your pocket. As a result I turned it off as it’s clearly going to be contributing to idle battery drain in situations where it shouldn’t."
Like I said... I think it is not helpful to improving phones generally when reviewers accept stupidity like this. If Samsung ship with a feature that's not ready for primetime, the review numbers (in this case battery life) should show it. I don't understand why so many phone users are willing to make excuses for manufacturers and just accept babysitting their machines, manually switching on and off GPS, Bluetooth, WiFi, NFC, etc as the situation demands.
My devices should damn well operate themselves, not rely on me to do it for them.
If you do a quick google search, you'll see that a lot of users are still facing keyboard lag. Check reddit and xda. I would like to hear your thoughts on the same and whether you are also facing such issues.
Still disappointed that Samsung didn't at least allow enabling using the SD card as extended internal memory. Yes, some people want to swap the cards in and out, but others have found they have run out of internal memory, and have no choice but to buy a new phone without this feature.
As a brand-agnostic consumer, I had gone with the LG G2/G3 for my previous phones but recently picked up an S6 due to a deal I couldn't refuse... I'm impressed with how Samsung, beginning with the S6, significantly toned down the bloat of TouchWiz and also got rid of the cartoonish oversaturated colors of the screen. With the screen setting in Basic mode, colors are very accurate... Its only weakness is average battery life, which is silly: given how much the camera sticks out anyway, there was no reason to make it so thin. Even with the thin case that I prefer, it STILL sticks out... Also, expandable memory and waterproofing were sorely missed in the S6. With the S7 they addressed these issues and beefed up processing and RAM even more. Great job, Samsung! Now just make the battery removable next time! With the LG G5 coming out soon, it's nice to see such good competition in the Android flagship market!
I made the jump from iPhone to S6 3 months ago. The only thing I miss and had hoped to see on a successor is the physical mute switch on Apple products.
Samsung's solution has always been to use the sensors as a "physical mute". Place your phone face down on a table and it'll auto-mute, even speaker phone calls.
It's a shame that Samsung haven't improved the fingerprint scanner. Using an iPhone 6S and Galaxy S6 Edge+, it's frustrating how much the S6 Edge+ shows the 'no match' message when trying to unlock the phone quickly. Definitely room for improvement and something they need to sort out.
I just went from an iPhone 6s Plus to an S7 Edge and so far I have no regrets. That's not to say it's without issues. The actual edge parts of the screen seem to be, if anything, a detriment, because they cause image distortion when looking at it head-on. Web browsing is not as fast as on the iPhone, though it doesn't feel slow. This could be due to the superior single-threaded performance of the iPhone, or it could be due to Chrome not having adblock like Safari does now. Then finally, the fingerprint reader is not as good as the iPhone's, with constant fails, though I wonder if that's because of its smaller footprint and not a software issue.
Otherwise I am quite happy with it. AMOLED kicks LCD's ass. Anyone trying to argue for LCD over AMOLED is insane in my book. The colors, the blacks... the edges of this screen might be distorted, but everything still looks far better on it than on the iPhone. TouchWiz is no longer laggy; I've yet to experience any animation that didn't feel fluid. Having a back button again is like having had your left arm fall asleep and then wake up. Apple really needs this basic control.
Now I just have to wait for my VR headset to actually get here! Got my phone a week ago, and it still hasn't shipped, and it was already released and my phone wasn't!
You could have at least measured the difference in efficiency between Chrome and Samsung's stock browser... Sigh. Why insist on using Chrome? MOST GALAXY USERS AROUND THE WORLD USE THE STOCK BROWSER.
That's right! And the stock browser is much better optimized for Samsung devices most of the time (particularly those equipped with an Exynos processor): everything is faster (and less energy-consuming).
Unfortunately Samsung's browser is not available on our Verizon-branded sample phone. The phone only ships with Chrome, and it is not possible to install Samsung's browser at this time.
The Verge is reporting that this is a Verizon decision, and that all Verizon S7s are like this.
And this is yet another reason I don't miss my Android phones after making the switch to the iPhone. The dysfunctional relationship between manufacturers, carriers, and Google is such an enormous pain in the ass, between performance hits, weird permission choices, and crapware from everyone who has touched the phone's creation process.
But I don't understand why the maximum brightness would have been lowered when other websites report the contrary - and Samsung usually increases the maximum brightness on every flagship device.
Moreover, the web battery life test is not representative of actual battery life, because AMOLED displays are very disadvantaged in this test on white web pages, whereas the battery life would be much higher on websites with a lot of dark areas (photos, dark background, etc.). Some browsers allow you to use a "night mode" which inverts the colors of websites background if the APL is high.
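The APL point is easy to make concrete. A toy model (assuming, simplistically, that AMOLED power is a fixed overhead plus an emissive term proportional to average picture level; real panels are more complicated, and the 0.2 base figure is an invented assumption):

```python
def apl(pixels):
    """Average picture level: mean luminance of the frame, scaled 0.0-1.0."""
    return sum(pixels) / len(pixels)

def relative_amoled_power(pixels, base=0.2):
    """Toy model: fixed overhead plus emission proportional to APL."""
    return base + (1 - base) * apl(pixels)

white_page = [1.0] * 100                    # black text on white: APL near 1.0
night_mode = [1.0 - p for p in white_page]  # inverted colors: APL near 0.0

print(relative_amoled_power(white_page))  # 1.0 -> worst case for AMOLED
print(relative_amoled_power(night_mode))  # 0.2 -> only the fixed overhead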
I hope you will review the Exynos version as well - I guess it will be much smoother and with a better battery life, as always.
Our new web test has some pages with dark themes currently. However, the overwhelming majority of webpages and UI have a high average picture level. In order to reflect this the vast majority of our webpages are black text on a white background.
We are hoping to get an Exynos unit to compare with the Snapdragon 820.
I am totally for prioritizing energy efficiency and battery life over a 100% fluid UI. But I also find it hard to believe that at this stage in the smartphone evolution, a flagship device still can't achieve that goal without an adverse effect on battery life. I feel like in the S7's case, it's more likely than not that Samsung and their proprietary UI are at fault for the janky UI performance. Would love it if you guys have time to investigate this further! After all, I am sure I am not alone in saying a smooth UI is a big part of the everyday smartphone experience.
Same old question; maybe in part 2 of the review we can get an answer: is unlimited HEVC encoding at 2160p60 with both HDR and OIS supported? Thanks a lot.
While its always interesting to see proper deep dive reviews, I find that these days there is nothing that would get me to buy another Samsung "flagship". Totally overpriced by strategically riding the slipstream of Apple's own boundless greed - except these ones do not hold value at all.
You say people will find the stock Android interface "rather spartan." That strikes me as odd since I went from TouchWiz to an essentially stock Android on a Moto X Pure. I would never go back. Settings are located where they should be, not moved around at random. Customization is much easier. And stability is no longer an issue. I had, for instance, one major audiobook app that reliably crashed the system under TouchWiz to the point of needing a battery removal to get it restarted. I came to understand that TouchWiz, not the app, was the point of failure during the brief time I used Cyanogenmod and it worked fine. Also works fine on my Motorola.
TouchWiz and other skins are not about operability. They are exercises in branding, equivalent to logos. Rather than being nonspartan, they are actually an impediment to usability. If you go to Google to find out how to do something, the instructions will be different from how your skinned phone operates. Sometimes it is easy to translate, sometimes not.
For a lot of people, that probably doesn't matter. But for a lot of people, introducing a skin introduces another potential point of failure, and another opportunity for vendors to point at each other in a circle should there ever be a problem.
You were lucky. Back when I used an S4 mini I flashed CyanogenMod, because TouchWiz was too ugly with the lack of a theme store, and it crashed like hell. For my S6 Edge I just got a Material Dark theme to fix the look and everything else was fine; I never objected to the placement of the settings.
I'd love to see a comparison between the different camera modules inside the S7s if possible, since they ship with either the Sony IMX260 or Samsung's Britecell sensor. I'd like to know whether one is superior. Not that I have a choice in which sensor I receive, but...
"And to top things off the camera hump has now been almost entirely eliminated." Why the obsession with the camera hump? It's a design element! Would you rather there were only a small hole visible in the middle of the shiny metallic finish? That's closer to Xiaomi's approach with its Note 5 ripoff, but honestly, that's MUCH uglier than the hump.
Actually, Xiaomi had those design elements and rounded back in another of its phones before the Note 5, btw. Not that it matters much, since everyone copies everyone these days
"The thickness does result in a noticeably reduced camera hump, but on a personal level I never really cared about the camera hump in the Galaxy S6, so I’m not sure I care about the reduction in the camera hump here." Alright I should've finished reading before replying.
What I find shockingly stupid is the release of these new $600~900 phones, including the latest Moto X, that DO NOT include a USB-C connector. It's been available since last year.
Apple does things quickly: they come out with technology and release it, as with the iPhone 5 and its reversible Lightning port. How hard is it for other companies to do the same?
Motorola (Lenovo) could have done this with their New X to make a bold statement on how they are going to run their business.
Probably because Samsung has usage data on what we actually use the USB port for these days. And I guess it's used almost exclusively as a charger. Why force the consumers to buy a bunch of new cables and chargers just because there is a new port out there? I know Apple would do that in a heartbeat, like you said, as they simply see it as a new way to increase earnings on licensing accessories.
Because USB-C is much easier and quicker to connect? (I certainly find the 5X and 6P much easier to connect in a darkened room - good thing since Google nicked Qi charging.)
Because USB-C cables and sockets should be (probably too early to say, but by design) far less prone to failure than Micro-B? (Micro-B cables, and not cheap ones - OEM LG/Nokia/Sony/Moto cables, die on me on a weekly basis. About half of the cables I own only work for charging now.)
I don't know, why don't we still connect our keyboards with the AT connector or PS/2, and our digital video cameras by firewire?
You don't need new chargers. If you've got dozens of USB-A power supplies, just use an A-C cable.
I'm not suffering from long connection times when connecting my mobile to the charger. Sure, it would be nice with a more uniform connector, but if it comes at the expense of having to throw away all my old cables, having to bring adapters, and generally making life more expensive, I can easily live with a micro USB connector until connectors are entirely a thing of the past. Physical connectors for data transfer are really not essential these days. These phones have WiFi and LTE connectivity at speeds close enough to any USB connection that we don't bother transferring anything by wire any more.
Uh, you just need to replace the cable, or it comes with the phone... not difficult. A flippable cable is VERY handy, especially in the dark. Unless the end is marked or molded in a different shape, you have to LOOK which side is up. Apple changed the cable ONE time, because they wanted a much smaller and better connector.
So for a top end phone, I want a state of the art connector too. Hence, I bought a new Moto G for $220... I lose the stereo speakers, but I saved $200 and have two free color covers I switch out for when I'm in the mood. So maybe I'll stick with the Gs.
For the average cell phone buyer, most of the information in this review goes right over their heads. I hope that part 2 of the review will address in simple language the following:
Can I read the display in direct sunlight? (Giving me the number of nits emitted by the display does not tell me this.)
What happens if I drop the phone into a sink full of water?
What happens if I drop the phone onto a concrete floor?
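On the sunlight question, the missing piece is reflectance, not just nits: what matters is the contrast left over once ambient light reflects off the glass. A back-of-envelope sketch (the lux, reflectance, and brightness figures below are illustrative assumptions, not measurements of any phone):

```python
import math

def effective_contrast(white_nits, black_nits, ambient_lux, reflectance):
    """Contrast ratio once reflected ambient light washes over the panel.
    A matte (Lambertian) surface under E lux reflects E * R / pi cd/m^2."""
    reflected = ambient_lux * reflectance / math.pi
    return (white_nits + reflected) / (black_nits + reflected)

# Indoors (500 lux) a 500-nit panel with 4.5% reflectance still looks great...
print(round(effective_contrast(500, 0.01, 500, 0.045), 1))
# ...but in direct sunlight (~100,000 lux) contrast collapses toward 1:1,
# which is why screen reflectance matters as much as peak brightness.
print(round(effective_contrast(500, 0.01, 100_000, 0.045), 2))
```

This is why a reviewer can report both nits and reflectance and still not answer "can I read it outside" without trying it: the perceived result depends on the ratio, not either number alone.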
AnandTech is not for the "average cell phone buyer". It's for people who *really want to know*. If there were more of us, there would be fewer rip-off products on the market, everything would be easier, more money would be devoted to R&D instead of marketing, and more would be achieved.
I think what the article needs to highlight more is that the Kirin 950 handily beats the SD820 in most of the tests. The ARM Cortex-A72 is a great core; it was released quite some time ago and is still doing pretty well. I am sure ARM has new CPUs in the pipeline which will be released soon.
I wonder if it makes sense for Qualcomm from business perspective to continue designing their own cores, while ARM already offers stock cores with great performance.
Yeah, I'm getting similar results to you (~400 read, 140 write, 80/15 random) on the S7, both with AndroBench and several other benchmarks. Those numbers in the article look... off
ARMv8 has AES instructions built in, like AESE (AddRoundKey, SubBytes, ShiftRows) and AESMC (MixColumns). With this kind of hardware support, FDE should not affect NAND performance much, if Samsung actually uses them.
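As a back-of-envelope check on that claim: if decryption runs in series with the NAND read, the effective throughput is the harmonic combination of the two rates. The figures here are illustrative assumptions, not measured S7 numbers:

```python
def serial_throughput(nand_mb_s, aes_mb_s):
    """Effective rate when each byte must be read, then decrypted, in series."""
    return 1 / (1 / nand_mb_s + 1 / aes_mb_s)

# Hardware AES (ARMv8 AESE/AESMC) can run in the GB/s range, so the
# penalty on a ~300 MB/s NAND read stays small...
print(round(serial_throughput(300, 2000)))  # 261 MB/s
# ...while a slow software AES fallback would dominate the cost.
print(round(serial_throughput(300, 150)))   # 100 MB/s
```

In other words, FDE overhead is only negligible when the crypto rate is several times the storage rate, which is exactly what the hardware instructions buy you.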
Anyone else not impressed with the old USB 2.0 connector? USB 3.0 Type-C has been available for quite a while. It's faster, it handles more power for faster charging, and there is no UP or DOWN; just plug it in.
This from a $600~800 flagship phone in 2016? Apple has been shipping phones and tablets with such a connector since 2012.
202 Comments
Michael Wilding - Tuesday, March 8, 2016 - link
Great write up Josh. I'm looking forward to seeing the camera performance, and the Adreno 530 is a beast! Let's hope performance can be sustained without significant throttling.

warreo - Tuesday, March 8, 2016 - link
Yes, somewhat disappointed that overheating wasn't addressed in Part 1 given Andrei's skewering of Samsung's use of a heatpipe and his comments that the S7 got pretty hot under load. Hope it gets addressed in detail for Part 2!

Ethos Evoss - Wednesday, March 9, 2016 - link
It is awesome that the EU will get the Exynos 8890 rather than the crappy, battery-draining SD820.

Azurael - Thursday, March 10, 2016 - link
I seem to recall that real world battery life has traditionally been worse on Samsung SoCs than Qualcomm's, mainly due to the modems using a lot more power - hopefully they've resolved that this generation... Personally, I'm disappointed that we get the Exynos version in Europe because it means there will probably never be a fully working AOSP port.

leexgx - Monday, March 14, 2016 - link
It would be nice to add CUBOT H1 results in there as well (I've had the phone for a month or 2 now, and 2 days of active use is lovely without having to use an external battery; that norm would still only last me a day on my last phone).

With my use I am typically getting 8 hours screen-on time (I use GPS a lot, so I never expect to get the 12 hours screen-on time, or the probably mythical 15-18 hours if AnandTech benched it for power).
leexgx - Monday, March 14, 2016 - link
AnandTech review: http://www.anandtech.com/show/9868/cubot-h1-smartp...
Not sure if the phone is fully suitable for the USA due to the 3G and (real) 4G bands used on the phone.
Ethos Evoss - Wednesday, March 9, 2016 - link
Huawei Mate 8 rocks... still!

jjj - Friday, March 11, 2016 - link
A throttling result for the Exynos variant of the Edge with a 42% drop: http://www.techspot.com/articles-info/1147/images/...

MaxIT - Monday, March 14, 2016 - link
... as usual. Samsung and Qualcomm just can't balance their systems. They just look for a benchmark number to impress people and try to be competitive with Apple. But that's just a smoke screen...

jjj - Wednesday, March 9, 2016 - link
There is finally a GFXBench long term perf result for the MI5: https://gfxbench.com/device.jsp?benchmark=gfx40&am...
There are some complications though: the device is just 1080p, the 60FPS limit, no heatpipe. While keeping all that in mind, we do see a 37% drop, and this test is not the worst case scenario. Hopefully Andrei will have a better test based on OpenGL ES 3.x, or test with some actual games, since T-Rex is becoming outdated.
jjj - Tuesday, March 8, 2016 - link
Oh wow, you had 1 must-do here, run GFXBench long term perf, and you failed to do so, keeping everybody in the dark about the 1 thing we don't know. That 1 key result has 10 times more value today than this article...
Ryan Smith - Tuesday, March 8, 2016 - link
There was enough time to run one battery life benchmark or the other. We picked the web benchmark, as that's the more useful of the two.

hans_ober - Tuesday, March 8, 2016 - link
Add it for the full review. Hoping for a deep dive on the S820/Exynos/Kirin 950/A9, like the E7420 deep dive you guys did.
Ryan Smith - Tuesday, March 8, 2016 - link
You'll see a GFXBench rundown in part 2. As for a deep dive, we'll be doing some architectural information, though power logging a la the E7420 review will be a separate, larger writeup.
hans_ober - Tuesday, March 8, 2016 - link
You've made my day :)

jjj - Tuesday, March 8, 2016 - link
If true, that's a bad business decision, especially when you did run 2 different battery tests. The fact that the article tries to dismiss the GPU throttling by claiming that mobile gaming is short sessions does seem to suggest a lack of understanding. Sure, casual gamers that game just to waste time will go for short sessions and less demanding games, but for them the perf is also of little to no relevance. Users that should care about the GPU perf are the ones that play longer sessions. It's also of dubious ethics to label the GPU perf in any way without knowing this key metric, and knowing it is essential in differentiating AT from others.
Cygni - Tuesday, March 8, 2016 - link
Mad about telephone reviews.

Kepe - Tuesday, March 8, 2016 - link
Why don't you just start your own tech-related website and write your own reviews, "jjj"? You don't seem to do anything other than bitch and moan about every single conceivable thing you can come up with. Just gtfo and do things better by yourself, as you seem to know how everything should be done. If you write some good stuff, maybe you'll earn a buck or two, too.

maximumGPU - Wednesday, March 9, 2016 - link
Second that. Never read anything except criticism from that guy on every post. Gets old very quickly.

retrospooty - Thursday, March 10, 2016 - link
3rd it... I went back and forth with him on a separate thread. Seriously hard headed, and overthinking it.

jjj - Tuesday, March 8, 2016 - link
As for battery tests, as long as you don't simulate a bunch of social, IM, and news apps in the background, you can't get close to good results when you have so many different core configs.

retrospooty - Tuesday, March 8, 2016 - link
"As long as you don't simulate a bunch of social, IM, news apps in the background, you can't get close to good results when you got so many diff core configs"

You have to have a consistent methodology to test against other phones. The point is not to show what you might get with your particular social/newsfeed apps running; the point is to test against other phones to see how it compares under the same set of circumstances, so you know which one gets better life.
jjj - Tuesday, March 8, 2016 - link
Sorry, but you failed to understand my points. It would be consistent, if it is simulated.
The point is to have relatively relevant data, and only by including at least that can you get such data. These apps have a major impact on battery life - and it's not just FB, even if they were the only one that got a lot of bad press recently.
The different core configs - 2 big cores, 4 big cores in 2+2, then 4+4, 2+4+4, or just 8 will result in very different power usage for these background apps, as a % of total battery life, sometimes changing the rankings. Here for example, chances are the Exynos would get a significant boost vs the SD820 if such tasks were factored in.
How many such simulated tasks should be included in the test is debatable and also depends on the audience, chances are that the AT reader has a few on average.
retrospooty - Tuesday, March 8, 2016 - link
And you are missing mine as well... If you have a million users, you will have 10,000 different sets of apps. You can't just randomly pick and call it a benchmark. The methodology and the point is to measure a simple test against other phones without adding too many variables. I get what you want, but it's a bit like saying "I wish your test suite tested my exact configuration", and that just isn't logical from a test perspective.

jjj - Tuesday, March 8, 2016 - link
What I want is results with some relevance. The results have no relevance as they are, as actual usage is significantly different. In actual usage the rankings change because the core configs are so different. The difference is like what VW had in testing vs road conditions: a huge difference.
To obtain relevant results you need somewhat realistic scenarios with a methodology that doesn't ignore big things that can turn the rankings upside down. Remember that the entire point of bigLITTLE is power and these background tasks are just the right thing for the little cores.
retrospooty - Tuesday, March 8, 2016 - link
Relevant to who? I use zero social media apps and have no news feeds running at all until I launch Feedly. Relevant to you is not relevant to everyone, or even to most people. This site is a fairly high traffic site (for tech, anyhow) and they have to test for the many, not the few. The methodology is sound. I see what you want and why you want it, but it doesn't work for "the many".

jjj - Tuesday, March 8, 2016 - link
Relevant to the average user. I don't use social at all, but that has no relevance, as the tests should be relevant to the bulk of the users. And the bulk of the users do use those (that's a verifiable fact) and more. This methodology just favors fewer cores and penalizes bigLITTLE.

retrospooty - Tuesday, March 8, 2016 - link
It's still too unpredictable. One person's Facebook feed may go nuts all day while another's is relatively calm. This is also why specific battery ratings are never given by manufacturers: because usage varies too much. This is why sites test a (mostly) controllable methodology against other phones to see which fares the best. I find it highly useful, and when you get into the nuts and bolts, it's necessary. If you had a bunch of phones and actually started trying to test as you mentioned, you would find a can of worms and inconsistent results at the end of your work...

jjj - Wednesday, March 9, 2016 - link
"It's still too unpredictable" - that's the case with browsing usage too, even more so there, but you can try to select a load that is representative. You might have forgotten that I said to simulate such apps; there wouldn't be any difference between runs. Yes, testing battery life is complex, and there are a bunch of other things that could be done for better results, but these apps are pretty universal and a rather big drain on resources. They could be ignored if we didn't have so many core configs, but we do, and that matters. Complexity for the sake of it is not a good idea, but complexity that results in much better results is not something we should be afraid of.

10 years later, smartphone benchmarking is just terrible. It's all synthetic, and many of those apps are not even half decent. Even worse, after last year's mess, 99% of reviews made no attempt to look at throttling. That wouldn't have happened in PC even 10 years ago.
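For what it's worth, the repeatable simulated load being argued for here isn't hard to specify: a deterministic schedule of fake "app wakeups" would make the background drain identical between runs. This is only a sketch of the idea (the app names and CPU budgets are invented for illustration):

```python
# Each simulated app wakes at a fixed period and burns a fixed CPU budget,
# so every benchmark run sees exactly the same background activity.
APPS = [
    ("social-feed", 180, 400),  # (name, period in s, CPU ms per wakeup)
    ("im-client",    60, 150),
    ("news-sync",   600, 900),
]

def wakeup_schedule(duration_s):
    """Deterministic (time, app, cpu_ms) events over a test run."""
    events = []
    for name, period, cpu_ms in APPS:
        t = period
        while t <= duration_s:
            events.append((t, name, cpu_ms))
            t += period
    return sorted(events)

events = wakeup_schedule(3600)  # one hour of simulated background load
total_cpu_ms = sum(cpu for _, _, cpu in events)
print(len(events), "wakeups,", total_cpu_ms, "ms of CPU time")
```

Because the schedule is fixed rather than driven by live feeds, two runs on the same phone (or on phones with different core configs) would see identical background work, which is exactly the repeatability the thread is arguing about.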
retrospooty - Wednesday, March 9, 2016 - link
I think you are a little too hung up on benchmarks. It is just a sampling, an attempt at measuring with the goal of comparing to others to help people decide, but what really matters is what it does and how well it does it. I find it highly useful even if it's not exact to my own usage. If unit A lasts 20 hours and unit B lasts 16 hours in the standard tests, but I know my own usage gets 80% of standard, I can estimate my usage will be 16 hours on unit A and 12.8 hours on unit B (give or take). It really doesn't need to be more complicated than that, since testing batteries is not an exact science, not even on PCs/laptops, as usage varies there just the same. That is why there are no exact guarantees. "Up to X hours" is a standard explanation. It is what it is.

jjj - Wednesday, March 9, 2016 - link
One last time: these apps, due to the different core configs, can change the rankings. The one that gets 20h could go to 12, and the one with 16 to 14h. That was the main point here. Because the vast majority of users have such apps, you can't reach relevant results without factoring them in. A phone can get good results in the current test and poor ones in actual usage.

retrospooty - Wednesday, March 9, 2016 - link
That is an extremely unlikely example. In real world usage, octa-cores have had very little difference in any area (performance or efficiency) vs. their quad-core counterparts, unless there is an issue with the ROM where it's simply bleeding power. Whatever though; if you feel so strongly about it, by all means do some testing on various phones and submit it to a few sites and see how worthwhile it is.

jospoortvliet - Friday, March 11, 2016 - link
I'm no fan of the bitching at the AT team, but he has a point with the background tasks: an average modern phone has a lot of them, and testing endurance with them all disabled isn't realistic. Moreover, a dual core would indeed probably perform worse with more background tasks than a quad-core or octa-core, as those can spread the load and keep their cores at an energy-efficient optimal speed. A repeatable, fixed background load would be good to have. It is also hard to write, certainly across iOS and Android, due to their different capabilities and limitations with regards to background tasks. So I get why it hasn't been done, and if jjj is up for writing a tool that does it, I bet the AnandTech team would take it... Yeah.

leexgx - Monday, March 14, 2016 - link
leexgx - Monday, March 14, 2016 - link
The only thing I find with AnandTech battery tests is that they do not follow real world battery use (to get realistic results when you look at AnandTech reviews, divide by about 2 or 2.5; all other tests are good). Most phones last around 4 hours screen-on time (max tends to be 4 hours but can be as low as 1.5 hours). The only exceptions to that rule tend to be the
Motorola RAZR MAXX phones (low-end CPU with a 3200 battery),
CUBOT H1: 1GHz quad core, 720p screen (basically Samsung Note 2 spec) with a massive 5200 battery, which I own and love, and it only costs £110! I still miss the HTC One stereo speakers, but I'll take 8 hours over 2 hours (without having to use an external battery case that turns it into a brick),
and the Huawei Mate 8 (4000 battery)
Andrei Frumusanu - Tuesday, March 8, 2016 - link
GFXBench long term performance is currently not useful to anybody, as it hits Vsync in the current T-Rex test. In the full review we'll have a proper test which will be of use to readers.

jjj - Tuesday, March 8, 2016 - link
Seems that you might be wrong Andrei. I am guessing you got the MI5 but that's 1080p and you forgot to factor that in? Here at 1440p you don't hit the 60FPS wall even in the first run. If the MI5 doesn't fall below 60FPS at 1080p, it would be relatively good news, as it means less than 33%-ish throttling with a 1080p display and no heatpipe.
hansmuff - Tuesday, March 8, 2016 - link
This German review posted some GFXBench results you might be interested in, however ofc their phone uses the Exynos. Hard to say that the Snapdragon would behave the same way, but it's also clear that having a heatpipe in the phone alone doesn't mean it won't throttle... significantly at some points.
http://www.computerbase.de/2016-03/samsung-galaxy-...
jjj - Tuesday, March 8, 2016 - link
Thanks a lot, ofc it has nothing to do with the SD820 but having this for the Exynos is nice too. Digitimes was claiming that only the SD820 models have the heatpipe, no clue if true or not, guess it's something else that has to be verified.
jjj - Tuesday, March 8, 2016 - link
The Exynos models do have the heatpipe too.
tuxRoller - Tuesday, March 8, 2016 - link
Is that what you do with your phone? Run continuous benchmarks all day? I understand we want an idea about throttling, but using those benchmarks as a proxy for expected performance won't get you that info about day to day usage.
TheinsanegamerN - Wednesday, March 9, 2016 - link
It will let us know if throttling is a big issue. Benchmarks are just that, tests to see how well the phone does. Of course we don't run benches all day, but if you don't run benches, then how do you know how well the phone does when pushed, say in a mobile game like Asphalt 8 for more than 5 minutes? It's useful information to have.
tuxRoller - Saturday, March 12, 2016 - link
Well then, according to that measure, throttling is a big issue with the iPhone 6s. How do you tell how well it works with Asphalt 8? I'd probably play Asphalt 8 for a while.
These benchmarks aren't a good proxy as they are designed to push the devices to their limits.
tuxRoller - Tuesday, March 8, 2016 - link
Also, XDA ran some graphics tests continuously, if you're interested. All of the phones throttled, save the Note 5.
frenchy_2001 - Tuesday, March 8, 2016 - link
here is the xda article:
http://www.xda-developers.com/s7-edge-throttling-t...
Note that the Note 5 did not throttle in the GPU test, but its final run was still only 60% of the throttled S7's score (and about 50% of the S7's starting score).
tuxRoller - Tuesday, March 8, 2016 - link
I just mentioned that it didn't throttle since jjj seems to be an Exynos fan, so I was trying to forestall "BUT THE EXYNOS DIDN'T THROTTLE", even though it had lower results throughout.
gijames1225 - Tuesday, March 8, 2016 - link
Great write up. Glad to see the Snapdragon 820 is properly flagship level. I look forward to the rest of the review, details on throttling, and hopefully at some point a look at Samsung's new Exynos SoC in the other S7 model.
Drumsticks - Tuesday, March 8, 2016 - link
It's kind of interesting that Apple has a solid win in CPU performance against Qualcomm, but a loss in GPU, even though they get their GPU from a company that should specialize in them. Twister truly is an astounding architecture, but they're probably at the end of the easy, huge performance boosts. It may be interesting to see how well Apple can go about completely redesigning their core when it happens.
name99 - Tuesday, March 8, 2016 - link
A faster GPU is not as interesting as a faster CPU because it's trivial to make a faster GPU --- just throw more GPU "cores" at the problem. If Apple wanted to double the GPU performance of the A9 tomorrow, all they'd have to do is use 12 GT7000 cores instead of their current 6. So the decision as to how many to use is an economic/use case decision, not a technical one. Given the pixels Apple is interested in driving, it looks like they used the right number of cores. QC sells into a broader market (and, in particular, a market that, whether or not it makes sense, uses crazy-high pixel densities) so their incentives align with throwing in more cores.
If you want to run a technically interesting GPU performance competition between Apple and Adreno (or Mali, or anyone else), raw performance is not the interesting metric. The interesting metrics are
- performance/watt
- performance/mm^2
I don't know who would win either of these, but my guess is Apple. We don't yet have enough information to answer the first question, but the mutterings about GPU throttling in the comments suggest that QC gets high GPU numbers by burning lotsa power, not by having some super-advanced GPU that's a cleverer design than anyone else. In a sense, what you're seeing is what Apple could copy if they wanted by putting the A9X inside the iPhone 6s.
lilmoe - Tuesday, March 8, 2016 - link
" The interesting metrics are- performance/watt
- performance/mm^2"
Nah, the really interesting metric would be how much power it consumes to run (and sustain) the average game at 60FPS at the same resolution. Read: efficiency.
sor - Tuesday, March 8, 2016 - link
Efficiency... So performance/watt.
lilmoe - Thursday, March 10, 2016 - link
+ resolution
+ drivers
+ software/engine
+ thermal headroom
+ etc, etc, etc.....
theduckofdeath - Tuesday, March 8, 2016 - link
If it's that trivial to make a faster GPU, I'm pretty sure Apple would have put a faster GPU in their phones. Faster GPUs in mobile, where battery life is really relevant, are probably the trickiest thing to succeed with. As you said, it basically requires you to add a ton of extra cores to make it noticeable on whichever hardware platform you're doing it on. GPUs are after all the biggest power consumers in tech these days. Doubling the size of the Apple GPU would literally wreck the iPhone's battery life, which I'm pretty sure you're also aware of yourself. They did after all gimp the GPU on the iPhone compared to the iPad for this exact reason.
tuxRoller - Tuesday, March 8, 2016 - link
Do you mean Apple vs Adreno or PVR vs Adreno? I'd expect PVR to always be the most efficient GPU due to being a true TBDR.
http://blog.imgtec.com/powervr/a-look-at-the-power...
realbabilu - Wednesday, March 9, 2016 - link
You forgot the user experience. Throwing 30 fps onscreen is the minimum requirement, no need to go higher than that, unless Apple follows the resolution war. Apple has always been a resolution fan in their Macs, but they still think that crazy resolutions aren't needed, which I think is still true.
ciderrules - Tuesday, March 8, 2016 - link
Looks like Qualcomm spent their time creating a quality core instead of just adding more cores. Half the cores of the 8890 and by all accounts at least as fast (CPU, not GPU, where it's quicker). I just wish you had done a series of tests to check throttling.
tipoo - Tuesday, March 8, 2016 - link
He said part 2.
tuxRoller - Tuesday, March 8, 2016 - link
XDA already did a throttling test with a bunch of phones (6s, Note 5, Nexus 6P, S7, etc). The results were that there was no throttling for the S7.
retrospooty - Tuesday, March 8, 2016 - link
Nice... Mine is on pre-order now. I almost can't believe Samsung finally got it together and started making better devices with better build quality and bigger batteries and much less bloat, but I am very happy they did.
anactoraaron - Tuesday, March 8, 2016 - link
The lack of an IR blaster is the only thing holding me back from buying the S7 edge. I use the IR blaster every day on my G4, and I assume everyone with young kids would as well. Remotes just seem to always be missing...
Speedfriend - Tuesday, March 8, 2016 - link
No IR blaster!!!!! Samsung have just lost me as a customer. It is one of the best features of my S6.
fanofanand - Tuesday, March 8, 2016 - link
I have to agree with this assessment, I won't buy another phone without an IR blaster. I have 3 kids, and the remotes are ALWAYS missing.
iheresss - Tuesday, March 8, 2016 - link
There is no such thing as 'shot noise'. Every bit of digital noise is just a lack of light hitting the sensor. Having a larger pixel size means a larger area for light to hit the photo sensor, hence reducing the 'sensor noise'.
ah06 - Tuesday, March 8, 2016 - link
But unless the total size of the sensor is increased, isn't increasing the pixel size making only a minor difference? A 1/2.5" sensor is only going to collect X amount of light whether it collects it over 16 million 1.1 um pixels or 12 million 1.4 um pixels.
The only (very slight) gain over the higher pixel count is the loss at pixel boundaries due to pixel pitch.
Am I wrong?
frostyfiredude - Tuesday, March 8, 2016 - link
Because the sensing area is larger per pixel, the number of photons incident on each pixel will increase with it. Those incident photons are what give the picture data. So weird quantum effects that somewhat simulate adding or removing photons have less significance when there are more photons to begin with. More specifically, at 1.1um vs 1.4um, 1.1um being quite comparable to the wavelength of visible light causes some extra anomalous effects too.
ah06 - Wednesday, March 9, 2016 - link
Yea I knew 1.1 um was the bare minimum due to quantum effects. But say going from 1.4 um to 2.0 um, would that make much of a difference? After all, the total amount of light collected by the sensor would be roughly the same, right?
A flower can be composed of 10 million pixels of size X or 5 million pixels of size 2X; the total area of the flower will still have collected the same light?
Where am I going wrong with this :P?
arayoflight - Tuesday, March 8, 2016 - link
Actually no. The sensor in the S7 is a 4:3 1/2.5" sensor while the one in the S6 is a 16:9 1/2.6" one. What it means is that it collects about 21.49% more light than the one in the S6.
ah06 - Wednesday, March 9, 2016 - link
You're right about the aspect ratio difference, hope more reviewers cover that there is no "real" loss of resolution. However, does increasing pixel size really affect the total light collected by the sensor?
jospoortvliet - Friday, March 11, 2016 - link
I don't think it does, but it decreases noise caused by chance: with smaller pixels you get noise in low light situations partly simply due to the chance of one pixel randomly catching a few more light photons than correctly represents the scene, and another a few less. With bigger pixels you smooth that out a bit and thus get less random noise. It is only ONE source of noise, but it helps. Just imagine you take a pic of the same scene with two sensors, one so small it catches 5 photons on average per pixel cell, the other one twice as big, catching, on average, 10. A random one-photon difference in a given pixel cell gives 20% brighter or darker pixels on the small, 5-photon-catching sensor and only 10% on the bigger one.
Again, it is only one source of random noise, but a pretty fundamental one you can hardly calculate your way out of.
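You can actually check the 5-vs-10-photon intuition with a quick simulation (a rough sketch only: photon arrival is modelled as a Poisson process, and the photon counts are illustrative, not real sensor figures):

```python
import numpy as np

rng = np.random.default_rng(0)

def relative_noise(mean_photons, n_pixels=1_000_000):
    """Model photon arrival per pixel cell as Poisson-distributed and
    return the relative noise (std / mean) across all pixels."""
    counts = rng.poisson(mean_photons, n_pixels)
    return counts.std() / counts.mean()

small = relative_noise(5)    # small pixel: ~5 photons per cell on average
large = relative_noise(10)   # pixel with twice the area: ~10 photons

# Shot noise grows as sqrt(N), so *relative* noise falls as 1/sqrt(N):
# gathering twice the light per pixel cuts it by roughly 30%.
print(f"small pixel: {small:.3f}  large pixel: {large:.3f}")
```

The larger pixel always comes out cleaner, which is exactly the smoothing effect described above.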
adamto - Tuesday, March 8, 2016 - link
@Joshua Ho. Do you mind if I ask a favor? Does the S7 support AC tethering? Do you know any other Android phone with AC tethering? I am not talking about connecting the S7 to 5GHz wifi such as home internet. These days most phones can connect to 5GHz wifi anyway. What I am asking is: can the S7 itself become a 5GHz WiFi hotspot? This can be a very useful feature for me for transferring files between devices connected to the S7's tethering. I'd appreciate it if you could share a screenshot of the network connection speed from a PC to the S7's tethering in the second part of the review. Thanks!
nerd1 - Tuesday, March 8, 2016 - link
Web browsing bench using Chrome AGAIN????? How many times has anandtech been criticized for this?
hansmuff - Tuesday, March 8, 2016 - link
I have a question about that. Is Chrome supposed to be an equalizer between the platforms? From a practical standpoint, the browser that Samsung ships is in some ways better optimized, so I use that instead. Am I really missing out by not running Chrome on my S6?
lilmoe - Tuesday, March 8, 2016 - link
You're not missing out at all, the stock browser is much better. I use it, and have it synced with my Firefox account (bookmarks and tabs).
zeeBomb - Tuesday, March 8, 2016 - link
I wanna see a test done on RBrowser or a Qualcomm optimized browser with the CAF Aurora files (does make your browsing up to 40% faster, you're welcome)
grayson_carr - Wednesday, March 9, 2016 - link
Chrome is very poorly optimized for Exynos processors, so you aren't missing out on your S6, but on devices with Qualcomm processors, Chrome typically runs pretty well.
Ratman6161 - Tuesday, March 8, 2016 - link
I'm wondering if the situation has changed any with Android 6. I'm assuming that all the test results for the S7 were with Marshmallow, while for the S6 they are with Lollipop. I'm pointing this out because my Note 5 just got the Marshmallow update and, after the update, Chrome has improved so much that I'm now using it as my default when I never did before. Might be interesting to see a test that included the S6 with Lollipop, the S6 with Marshmallow and the S7, to see how much of the difference is actually attributable to the OS update?
Ryan Smith - Tuesday, March 8, 2016 - link
Unfortunately Samsung's browser is not available on our Verizon-branded sample phone. The phone only ships with Chrome, and it is not possible to install Samsung's browser at this time. The Verge is reporting that this is a Verizon decision, and that all Verizon S7s are like this.
id4andrei - Tuesday, March 8, 2016 - link
What an interesting turn of events. Wonder why would Verizon be irked by Samsung's browser. Any thoughts Ryan? Maybe default adblocking since Verizon is also a media company.Ryan Smith - Tuesday, March 8, 2016 - link
At this point it's open to speculation. But I believe Verizon pulled the Samsung browser from the S6 as well, in which case this is nothing new.
Ratman6161 - Wednesday, March 9, 2016 - link
I'm assuming my Verizon Note 5 would be the same as the Verizon S6? My Note 5 has the browser that just says "Internet". Is that the one you are calling the Samsung browser? Either way, Chrome is not the only browser on my phone. I got the 6.0.1 update last Friday and still have both browsers available after the update.
tipoo - Tuesday, March 8, 2016 - link
To me it still feels like Kryo doesn't aim high enough, given the A9 has been shipping for months, in volume, and we're likely closer to the A10. Even if the A10 is a "tick" with a modest 20% performance boost, the cycle of staying behind it continues. That's the benefit of large profit margins I guess, and since just about no one but Samsung was making Android handset profits, no one was probably ordering a huge die with high IPC. Kryo comes a long way, and I'm glad they're going 2+2 with higher IPC rather than 4+4, but I guess I would have just liked it to go further and actually leapfrog the A9.
Samsung does have their own custom core in development, wonder how that compares to Qualcomm's Kryo.
ah06 - Tuesday, March 8, 2016 - link
Samsung's core is already done I think, it's the Mongoose core in the 8890, the one in the international variants. It's a slightly weaker core than Kryo.
Speedfriend - Tuesday, March 8, 2016 - link
A7 to A8 tock was 15%, and that was partly higher clock speed in a bigger body. Will be interesting to see what this tock brings.lilmoe - Tuesday, March 8, 2016 - link
Unlike what most reviewers want to believe, when designing application processor cores, companies like ARM, Qualcomm and Samsung aim for a "sweet spot" of load-to-efficiency ratios, not MAX single threaded performance. Their benchmark is common Android workloads (which, btw, rarely saturate a Cortex A57 at 1.8GHz), since that's what makes up the vast majority of the mobile application processor market. They measure the average/mean workload needs and optimize efficiency for that.
Android isn't as efficient as iOS and Windows Phone/10 Mobile in hardware acceleration and GPU compositing; it's much more CPU bound. It doesn't benefit as much from race to sleep in mobile devices. CPU cores remain significantly more active when rendering various aspects of the UI and scrolling.
tuxRoller - Tuesday, March 8, 2016 - link
Can you explain how you measure the relative "efficiencies" of the "hardware acceleration and GPU compositing"?
lilmoe - Wednesday, March 9, 2016 - link
By measuring CPU and RAM utilization when performing said tasks. More efficient implementations would offload more of the work to dedicated co-processors (in this case, the GPU) and would use less RAM. Generally, the more CPU utilization you need for these tasks, the less efficient the implementation. Android uses more CPU power and more RAM for basic UI rendering than iOS and WP/10M.
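As a rough desktop analogue of that comparison (a sketch only: real mobile profiling would use platform tools, and the two "renderers" here are hypothetical stand-ins), the CPU-time gap between doing work on the CPU and waiting on offloaded work can be measured with the standard library:

```python
import resource
import time

def cpu_seconds(task):
    """Run `task` and return the user + system CPU time it consumed --- a
    crude stand-in for the per-task CPU utilization comparison above."""
    before = resource.getrusage(resource.RUSAGE_SELF)
    task()
    after = resource.getrusage(resource.RUSAGE_SELF)
    return (after.ru_utime - before.ru_utime) + (after.ru_stime - before.ru_stime)

# A "CPU-bound renderer" burns cycles; an "offloaded renderer" mostly
# sleeps while the work happens elsewhere (e.g. on the GPU).
cpu_bound = lambda: sum(i * i for i in range(2_000_000))
offloaded = lambda: time.sleep(0.05)

assert cpu_seconds(cpu_bound) > cpu_seconds(offloaded)
```

The less CPU time a task burns for the same result, the more of the work has been offloaded, which is the efficiency argument being made here.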
tuxRoller - Saturday, March 12, 2016 - link
How do you measure this so that you can ignore differences in the system (like the textures chosen)? Then you'd have to make sure they're running on the same hardware. The best you can do is probably test Android and Windows on the same phone (this will put Windows at a bit of a disadvantage, as Android allows very close coupling of drivers since their HAL is pretty permissive). Then you run a native game on each.
If you've found a way to do this I, and Google, would love to see the results.
Other than for 2d (which NOBODY, including directdraw/2d or quartz, fully accelerates), Google really hammers the GPU through use of shared memory, overlays and whatever else may be of use. There's obviously more optimization for them to do as they still overdraw WAY too much on certain apps, and they've obviously got a serious issue with their input latency, but it's a modern system. Probably the most modern as its been developed from scratch most recently.
Dobson123 - Tuesday, March 8, 2016 - link
In the 2016 web browsing battery life test, the S6 edge is 20% worse than the S6, and the LG G4's number is also way too low.
lilmoe - Tuesday, March 8, 2016 - link
I also thought the difference in battery life between the S6 and S6 edge was off. They either posted wrong data, or something went wrong while testing.
MonkeyPaw - Tuesday, March 8, 2016 - link
I'd agree. When one of the phones goes from being upper middle of the pack on the old benchmark to being dead last--and woefully so--then I would have to wonder if something is really wrong with the new test. I've used the G4 for 6 months and have rarely had battery concerns over a day of "regular" use. I've owned several phones, and the G4 is a trooper.
Ryan Smith - Tuesday, March 8, 2016 - link
We're re-checking the S6 edge. We've had issues before with that specific phone.
bodonnell - Tuesday, March 8, 2016 - link
Any chance you'll be doing an in depth review of the Exynos 8890 equipped model? I would really like to see an in depth review that compares it to the SD820. Preliminary benchmarks seem to suggest it is comparable in the CPU department but behind in the GPU department. Since it looks like us Canadians will be getting the Exynos variant, I'd like to see if we'll actually notice a difference in everyday use...
tipoo - Tuesday, March 8, 2016 - link
That's really strange! Usually it's NA vs world, but why would Canada get the Exynos while the US gets the SD820?
bodonnell - Tuesday, March 8, 2016 - link
It is pretty odd, as Samsung usually ships the same hardware to both Canada and the US. It seems in this case they are only shipping the SD820 version in CDMA markets, and since Canada is all GSM we are getting the Exynos version. I don't know that this has been 100% confirmed yet though, that's just what the rumour mill is saying.
phoenix_rizzen - Monday, April 11, 2016 - link
It's true, the Canadian version uses the Exynos 8890 SoC (as shown by CPU-Z and CPU Spy on this Telus S7).
zeeBomb - Tuesday, March 8, 2016 - link
OMG...! JOSH!!! you made my dreams come true with a blessed review.
nathanddrews - Tuesday, March 8, 2016 - link
Still completely stupefied as to Samsung's (and Google's) decision to continue with this whole black-text-on-white-background thing, knowing full well that OLED consumes more power when pushing white. Also, it's ugly.
Qwertilot - Tuesday, March 8, 2016 - link
Less net area of text than background, surely?
theduckofdeath - Tuesday, March 8, 2016 - link
It's less straining for your eye to read black text on white (which is why you're reading this in black on white right now :D ). Also, the brightness of the system settings is, to be fair, completely irrelevant in the long run. Once you've tinkered with whichever settings you prefer, you rarely ever go into those menus again. Making the UI look consistent clearly has a bigger weight in that design choice.
WagonWheelsRX8 - Tuesday, March 8, 2016 - link
Pretty good write-up. I have one request: in your charts, can you put the SoC powering the device (in smaller letters) underneath the name of the device? It would be helpful in understanding why a device performs like it does (is it SoC related, or subsystem, or software), and it's pretty standard info in PC based reviews. Otherwise, keep up the good work!
foneAddict - Tuesday, March 8, 2016 - link
Please Please Please can you do a review of the Exynos 8890 variant? I have a sneaky suspicion battery life on this SoC will be significantly better than on the SD820. Great review as always. Looking forward to part deux :-)
theduckofdeath - Tuesday, March 8, 2016 - link
I expect them to do a comprehensive test of the Exynos model, as that one seems to be the one they're going to sell everywhere not-USA.
NonSequitor - Tuesday, March 8, 2016 - link
Samsung has proven that they can build really great hardware. Great. Now can we go back to getting a GPE version of their phones? I love my Nexus 6's user interface. Every time I try to help a friend with a Samsung phone it feels like the interface has been hit with an ugly stick and all the useful little things have been taken away. But the Nexus 6 hardware is only 'meh', and the 6P isn't better enough to justify a change. If the S7 were a Nexus device I'd be saying TAKE MY MONEY right now.
R. Hunt - Tuesday, March 8, 2016 - link
Funny, that's exactly what I think every time I've got to deal with stock Android: "Where have all the features gone?".
Cooe - Thursday, March 24, 2016 - link
Lol what you call "features" I call terrible gimmicks. Stock Android FTW.
theduckofdeath - Tuesday, March 8, 2016 - link
No one was interested in the GPE models, and Samsung has had no interest in being a Nexus manufacturer for half a decade. That's why. Like it or not, buyers want the smart features pre-installed at the factory.
lopri - Tuesday, March 8, 2016 - link
Throughout the review, Mr. Ho refers to the Note 5, but I do not see any data pertaining to the Note 5 in the charts? I agree with him about the camera hump. I had no issue with the S6's camera hump, both aesthetically and practically. I think this provides an opportunity for the tech media to take a pause and self-reflect. In every freaking review of the S6/edge, reviewers incessantly cried over that hump as if Samsung had committed an unspeakable sin. It looks like Samsung took the criticism to heart, but unfortunately the criticism was an unwarranted one to begin with. Reviewers should think for themselves before following the herd in parroting sensational nitpicking of a non-issue.
Ratman6161 - Tuesday, March 8, 2016 - link
The Note 5 will perform essentially the same as an S6/S6 edge. I've got a Note 5 and personally I'm not seeing anything in the new generation that would make me lust after an upgrade.
R. Hunt - Tuesday, March 8, 2016 - link
Well, at least they addressed the issue in the right way IMHO: by making an only slightly thicker phone and taking the opportunity to put a bigger battery in it.
"In the interest of providing another data point and some validation of our testing results, I ran both devices through our old web browsing test to see what the results would be for something that should be display-bound. Here, it’s obvious that the Galaxy S7 edge holds a significant lead over the iPhone 6s Plus"
But the actual delta in both cases is exactly the same! 53 mins!
Am I calculating this wrong?
9:58 - 9:05 = 53 mins
14:03 - 13:10 = 53 mins
So the delta in the 2013 and 2016 tests is exactly the same
Andrei Frumusanu - Tuesday, March 8, 2016 - link
Those are decimal hours: 9.58h = 9 hours 34 minutes.
buxe2quec - Tuesday, March 8, 2016 - link
And I'm still wondering... how is it possible that years after various websites emphasized the better alignment and design of the connectors and perforations on the iPhone, Galaxies still align them with no care at all? Check that top side with the slots and holes thrown there randomly, and that bottom side with the four holes (or groups thereof) aligned in FOUR different ways. But hey, at least Apple supporters won't say they are copying everything...
Dobson123 - Tuesday, March 8, 2016 - link
I couldn't care less.
name99 - Tuesday, March 8, 2016 - link
Fair enough. But don't then complain that it's "unfair" when Apple sucks up 80% of the profits in this sector. It's attention to details (ALL details) that allows a company to charge more...lilmoe - Tuesday, March 8, 2016 - link
Attention to detail hardly has anything to do with the port alignment at the bottom of the phone, and more so with the antenna bands on the back and the cheap choice of aluminum alloys in previous models (changed only after "bending" to public pressure, pun intended). A nice finish =/= quality. A polished piece of cheap glass looks better than a rough diamond.
Most of Apple's attention to detail goes to media, perception, image, supply chain, and money making business models. Well, at least more so than their attention to hardware.
You have good points here and there, but your fanboyish attitude ruins the good parts...
theduckofdeath - Tuesday, March 8, 2016 - link
Exactly. And there is a pretty good reason why the SD/SIM slot is on the side or the top: it can't be in the same location as the cameras. Apple has displaced the camera to a corner, and that's not very symmetric in my eyes. And like you, I've never been a fan of those plastic separators on the back of all-metal phones. At the end of the day, manufacturers are always making deliberate design differentiations to make sure their hardware is distinguishable from a distance. Most people can easily identify a Samsung phone when someone's using it, simply because they've stuck to the same camera design/location since the Galaxy S2.
tipoo - Tuesday, March 8, 2016 - link
I think you mean this?
http://www.imore.com/difference-apple-samsung-indu...
That was about the GS6 and only just started making its rounds, after Samsung finally tried to make nicer designs. I mean, I agree with you, it's such a bizarre miss, but when you say "years and years", it's really, "year", or less.
grayson_carr - Wednesday, March 9, 2016 - link
Well if the Galaxy phones had bezels the size of Texas like the iPhones, I'm sure they could align the ports better because they'd have more room to work with.
syxbit - Tuesday, March 8, 2016 - link
"While not quite going from zero to hero, Qualcomm has come close, and that definitely deserves some credit."
I disagree. Giving them credit because of the large improvement over the awful SD810 doesn't make sense.
Instead of a comparison to last year's garbage, give them credit for how the SD820 performs compared to today's best SoCs. It turns out the SD820 isn't really leading that much. It's mostly still behind a year old Apple chip.
whiteiphoneproblems - Tuesday, March 8, 2016 - link
"Always-On Display is nice to have, but for some reason it only polls the ambient light sensor, so the display won’t actually turn off in your pocket."
This is strange and disappointing. I wonder why it does not use the proximity sensor.
whiteiphoneproblems - Tuesday, March 8, 2016 - link
...and as a stab at answering my own question: I assume constantly polling the prox sensor would cause a greater battery hit than simply leaving the display on all the time (including in pocket)? Would be interesting to chart battery life with AOD on vs. off (in some kind of controlled way, of course).
Arch_Fiend - Tuesday, March 8, 2016 - link
Very good review. The S7 is a beast, but so is my iPhone 6s Plus. I wish I could afford to have 2 high-end smartphones, because I would definitely buy the S7 if I could; it's finally the Android phone I've been waiting for. Can't wait for review part 2.
misteroh - Tuesday, March 8, 2016 - link
I'm very glad you updated your battery testing methodology. I have the LG G4 and my personal experience with its battery life is MUCH closer to your 2016 test results than the 2013. The older test would have you believe it has much better than average battery life, but I personally find myself charging it much more often than I want to.stlc8tr - Tuesday, March 8, 2016 - link
I'm curious about the microSD card slot performance. If you put a fast (>80MB/s) microSD card in the GS7, are you going to see the difference over a slower card? Also, how fast is the USB MTP performance? Did Samsung re-introduce a USB3 connection?
tipoo - Tuesday, March 8, 2016 - link
It would depend on what you put on the card. Videos and music would probably see no difference. Apps, and if the camera can write directly to the card, maybe.H4CTOR96 - Friday, March 18, 2016 - link
I put a Sandisk 64gb Extreme on mine and A1 SD Bench returns 60Mb/s read and 40Mb/s write. For comparison the Internal storage works at 300 read and 150 write....Then I put in a crappy $7 32gb sd card and it scored 1032 write and 1.02 read, so yeah there was something wrong with that one XD
name99 - Tuesday, March 8, 2016 - link
"Of course, other than the workload the device setup has been held constant across these tests by equalizing ... disabling all background sync to the best of my ability."
Is this really a good idea? I'd argue that part of what you are buying when you buy these devices is proper setup. (Certainly, for example, that's part of what Apple would say they are selling.) As such, I'd consider that the right way to benchmark them is in some sort of "as close to out-of-the-box" state as possible. Sign up for everything the device asks you to sign up for out of the box. (For Apple that will be to enter an Apple ID; for Android I assume it will always ask for a Google ID and then some random collection of additional logins that people have paid the phone vendor to request.)
Then see how the phone behaves under those conditions. The Apple phone will presumably occasionally sync with iCloud. The Android phone will sync with various Google services. And if the vendor asked you to sign up for "MyWannabe Social Network" and "MyWannabe Social Network" delivers ads to the device every three minutes, constantly sucking up power and CPU, THAT IS THE EXPERIENCE THE PHONE IS SELLING YOU.
Vendors that sell crap like that should accept the consequences in reviews. It's not AnandTech's job to spend an hour scrubbing some phone of useless crap. It is, in fact, AnandTech's job to run the phone with precisely all that crap enabled --- and then let us know the results.
Andrei Frumusanu - Tuesday, March 8, 2016 - link
By background syncing Josh is mostly referring to disabling application auto-updates and other such services which can have an impact on battery life tests. The usual small sync services and GMS have little to no noticeable impact on these tests. I disagree with your viewpoint about out-of-box software settings simply because the phones have different software and services depending on your region. North American units from carriers will have different settings and services than the international units. We don't always get samples from the same carrier even. AnandTech has been first and foremost a hardware site, so I think it's correct to try to minimize the effect of such services to get a better representation of what the device itself is capable of, not what the carriers choose to add in or not.
Again, this is all overblown as in practice we see little to no effect on our tests, and again we're mostly referring to auto-updates and the like, which can eat up a significant amount of CPU cycles.
name99 - Tuesday, March 8, 2016 - link
" Always-On Display is nice to have, but for some reason it only polls the ambient light sensor, so the display won’t actually turn off in your pocket. As a result I turned it off as it’s clearly going to be contributing to idle battery drain in situations where it shouldn’t."Like I said...
I think it does not help improve phones generally when reviewers accept stupidity like this.
If Samsung ship with a feature that's not ready for primetime, the review numbers (in this case battery life) should show it. I don't understand why so many phone users are willing to make excuses for manufacturers and just accept babysitting their machines, manually switching on and off GPS, Bluetooth, WiFi, NFC, etc as the situation demands.
My devices should damn well operate themselves, not rely on me to do it for them.
adityarjun - Tuesday, March 8, 2016 - link
If you do a quick Google search, you'll see that a lot of users are still facing keyboard lag. Check Reddit and XDA. I would like to hear your thoughts on this and whether you are also facing such issues.
jhh - Tuesday, March 8, 2016 - link
Still disappointed that Samsung didn't at least allow using the SD card as extended internal memory. Yes, some people want to swap cards in and out, but others have found they have run out of internal memory and have no choice but to buy a new phone without this feature.
danbfree1 - Tuesday, March 8, 2016 - link
As a brand-agnostic consumer, I had gone with the LG G2/G3 for my previous phones but recently picked up an S6 due to a deal I couldn't refuse... I'm impressed with how Samsung, beginning with the S6, significantly toned down the bloat of TouchWiz and also got rid of the cartoonish oversaturated colors of the screen. With the screen setting in Basic mode, colors are very accurate... Its only weakness is average battery life, which is silly given how much the camera sticks out anyway; there was no reason to make it so thin. Even with the thin case that I prefer, it STILL sticks out... Also, the lack of expandable memory and waterproofing was sorely missed in the S6. With the S7 they addressed these issues and beefed up processing and RAM even more. Great job, Samsung! Now just make the battery removable next time! With the LG G5 coming out soon, it's nice to see such good competition in the Android flagship market!
Homerr - Tuesday, March 8, 2016 - link
I made the jump from iPhone to S6 3 months ago. The only thing I miss and had hoped to see on a successor is the physical mute switch on Apple products.
theduckofdeath - Tuesday, March 8, 2016 - link
Samsung's solution has always been to use the sensors as a "physical mute". Place your phone face down on a table and it'll auto-mute, even speaker phone calls.
mrochester - Tuesday, March 8, 2016 - link
It's a shame that Samsung haven't improved the fingerprint scanner. Using an iPhone 6S and Galaxy S6 Edge+, it's frustrating how often the S6 Edge+ shows the 'no match' message when trying to unlock the phone quickly. Definitely room for improvement and something they need to sort out.
Sttm - Tuesday, March 8, 2016 - link
I just went from an iPhone 6s Plus to an S7 Edge and so far I have no regrets. That is not to say it's without issues. The actual edge parts of the screen seem to be, if anything, a detriment, because they cause image distortion when viewed head-on. Web browsing is not as fast as on the iPhone, though it doesn't feel slow. This could be due to the superior single-threaded performance of the iPhone, or it could be due to Chrome not having adblock like Safari does now. Then finally, the fingerprint reader is not as good as the iPhone's, with constant fails, though I wonder if that is because of its smaller footprint and not a software issue. Otherwise I am quite happy with it. AMOLED kicks LCD's ass. Anyone trying to argue for LCD over AMOLED is insane in my book. The colors, the blacks - the edges of this screen might be distorted, but everything still looks far better on it than on the iPhone. TouchWiz is no longer laggy; I've yet to experience any animation that didn't feel fluid. Having a back button again is like having had your left arm fall asleep and then wake up; Apple really needs this basic control.
Now I just have to wait for my VR headset to actually get here! I got my phone a week ago and the headset still hasn't shipped, even though it was released before my phone was!
grayson_carr - Wednesday, March 9, 2016 - link
If you switch over to using Samsung's browser instead of Chrome, you can install Ad Block for it from the Galaxy Apps store.
lilmoe - Tuesday, March 8, 2016 - link
You could have at least measured the difference in efficiency between Chrome and Samsung's stock browser... Sigh. Why insist on using Chrome? MOST GALAXY USERS AROUND THE WORLD USE THE STOCK BROWSER.
sachouba - Tuesday, March 8, 2016 - link
That's right! And the stock browser is much better optimized for Samsung devices most of the time (particularly those equipped with an Exynos processor): everything is faster (and less energy-consuming).
Ryan Smith - Tuesday, March 8, 2016 - link
Unfortunately Samsung's browser is not available on our Verizon-branded sample phone. The phone only ships with Chrome, and it is not possible to install Samsung's browser at this time. The Verge is reporting that this is a Verizon decision, and that all Verizon S7s are like this.
phexac - Wednesday, March 9, 2016 - link
And this is yet another reason I don't miss my Android phones after making the switch to the iPhone. The dysfunctional relationship between manufacturers, carriers, and Google is such an enormous pain in the ass between performance hits, weird permission choices, and crapware from everyone who has touched the phone's creation process.
sachouba - Tuesday, March 8, 2016 - link
Thanks for this review, it's quite good. But I don't understand why the brightness would have been lowered when other websites report the contrary - and Samsung usually increases the maximum brightness on every flagship device.
Moreover, the web battery life test is not representative of actual battery life, because AMOLED displays are very disadvantaged in this test on white web pages, whereas the battery life would be much higher on websites with a lot of dark areas (photos, dark background, etc.).
Some browsers allow you to use a "night mode" which inverts the background colors of websites when the APL is high.
I hope you will review the Exynos version as well - I guess it will be much smoother and with a better battery life, as always.
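As an aside, the APL point above can be sketched numerically. This is a rough, hypothetical model - the 2.2 gamma and the pixel values are assumptions, not panel measurements - but it shows why a mostly-white page costs an emissive display far more than its inverted counterpart:

```python
# Average Picture Level (APL): mean luminance as a fraction of full white.
# On an emissive AMOLED panel, power scales roughly with the sum of
# per-pixel luminance, so a dark-themed page draws much less power than
# a white one. The 2.2 exponent approximates display gamma (an
# assumption; real panels differ per channel and per drive level).

def apl(pixels):
    """APL of an 8-bit grayscale frame, 0.0 (black) to 1.0 (white)."""
    return sum(p / 255 for p in pixels) / len(pixels)

def relative_emissive_power(pixels, gamma=2.2):
    """Rough relative panel power: mean linear-light luminance."""
    return sum((p / 255) ** gamma for p in pixels) / len(pixels)

white_page = [240] * 90 + [20] * 10   # mostly-white text page
dark_page  = [20] * 90 + [240] * 10   # same content, colors inverted

print(apl(white_page), apl(dark_page))                     # ~0.85 vs ~0.16
print(relative_emissive_power(white_page)
      / relative_emissive_power(dark_page))                # roughly 8-9x
```

The estimated panel power differs by roughly 8-9x between the two pages in this toy model, which is exactly the effect a "night mode" inversion exploits on OLED.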
JoshHo - Wednesday, March 9, 2016 - link
Our new web test has some pages with dark themes currently. However, the overwhelming majority of webpages and UI have a high average picture level. In order to reflect this, the vast majority of our webpages are black text on a white background. We are hoping to get an Exynos unit to compare with the Snapdragon 820.
lilmoe - Wednesday, March 9, 2016 - link
Please use the stock browser with Ad Block enabled for that test on the Exynos variant. Thanks.
deskjob - Tuesday, March 8, 2016 - link
I am totally for prioritizing energy efficiency and battery life over a 100% fluid UI. But I also find it hard to believe that at this stage in smartphone evolution, a flagship device still can't achieve that goal without an adverse effect on battery life. I feel like in the S7's case, it's more likely than not that Samsung and their proprietary UI are at fault for the janky UI performance. Would love it if you guys have time to investigate this further! After all, I'm sure I'm not alone in saying a smooth UI is a big part of the everyday smartphone experience.
heartinpiece - Tuesday, March 8, 2016 - link
Wow, finally! Will you be reviewing the Exynos 8890 as well?
Seems like a comparison between the 820 and the 8890 would be interesting!
10basetom - Tuesday, March 8, 2016 - link
I can't wait for a detailed comparison between the Snapdragon 820 and Exynos 8890 once you have both S7 models in hand.
SydneyBlue120d - Wednesday, March 9, 2016 - link
Same old question; maybe in part 2 of the review we can get an answer: is unlimited HEVC encoding at 2160p60 with both HDR and OIS supported? Thanks a lot.
Osamede - Wednesday, March 9, 2016 - link
While it's always interesting to see proper deep-dive reviews, I find that these days there is nothing that would get me to buy another Samsung "flagship". Totally overpriced by strategically riding the slipstream of Apple's own boundless greed - except these ones do not hold value at all.
s.yu - Friday, March 11, 2016 - link
You must be blind, choosing the recent iCrap over this. Apple's last good-looking phone was the 5th generation.
pjcamp - Wednesday, March 9, 2016 - link
You say people will find the stock Android interface "rather spartan." That strikes me as odd since I went from TouchWiz to an essentially stock Android on a Moto X Pure. I would never go back. Settings are located where they should be, not moved around at random. Customization is much easier. And stability is no longer an issue. I had, for instance, one major audiobook app that reliably crashed the system under TouchWiz to the point of needing a battery removal to get it restarted. I came to understand that TouchWiz, not the app, was the point of failure during the brief time I used Cyanogenmod and it worked fine. Also works fine on my Motorola.TouchWiz and other skins are not about operability. They are exercises in branding, equivalent to logos. Rather than being nonspartan, they are actually an impediment to usability. If you go to Google to find out how to do something, the instructions will be different from how your skinned phone operates. Sometimes it is easy to translate, sometimes not.
For a lot of people, that probably doesn't matter. But for a lot of people, introducing a skin introduces another potential point of failure, and another opportunity for vendors to point at each other in a circle should there ever be a problem.
s.yu - Friday, March 11, 2016 - link
You were lucky. Back when I used an S4 mini I flashed CyanogenMod, because TouchWiz was too ugly with the lack of a theme store, and it crashed like hell. For my S6 Edge I just got a Material Dark theme to fix the look and everything else was fine; I never objected to the placement of the settings.
Shadowmaster625 - Wednesday, March 9, 2016 - link
For shame. If you don't have a robot actually tapping the screen every 6 seconds, your test is a FAIL.
hugoleal85 - Wednesday, March 9, 2016 - link
Excellent article. Looking forward to identical reviews of the LG G5, Xiaomi Mi 5 Pro, and Galaxy S7 (Exynos 8890).
donalddumb - Thursday, March 10, 2016 - link
I'm wondering about the NAND performance results, particularly 4K, which are seriously lower than earlier benchmark results: http://www.anandtech.com/show/9146/the-samsung-gal...
Galaxy S6 results with AndroBench 4.0:
256K sequential write: 58.83
4K random read: 73.15
4K random write: 18.9
donalddumb - Thursday, March 10, 2016 - link
As I guessed, AnandTech was still benching with an older version of AndroBench, which doesn't support UFS 2.0. Shame on you.
soh.0 - Thursday, March 10, 2016 - link
I'd love to see a comparison between the different camera modules inside the S7s if possible, since they ship with either the Sony IMX260 or Samsung's Britecell. I'd like to know whether one is superior or not. Not that I have a choice in which sensor I receive, but...
s.yu - Friday, March 11, 2016 - link
They just started production of an equivalent sensor with dual pixel PDAF, AFAIK.
s.yu - Friday, March 11, 2016 - link
"And to top things off the camera hump has now been almost entirely eliminated."What's to be so obsessed with the camera hump? It's a design element! Would you rather there's only a small hole visible in the middle of the shiny metallic finish? That's closer to Xiaomi's approach to the Note 5 ripoff, but honestly, that's MUCH uglier than the hump.
H4CTOR96 - Friday, March 18, 2016 - link
Actually, Xiaomi had those design elements and rounded back in another of its phones before the Note 5, btw. Not that it matters much, since everyone copies everyone these days.
s.yu - Friday, March 11, 2016 - link
"The thickness does result in a noticeably reduced camera hump, but on a personal level I never really cared about the camera hump in the Galaxy S6, so I’m not sure I care about the reduction in the camera hump here."Alright I should've finished reading before replying.
Belard - Friday, March 11, 2016 - link
What I find shockingly stupid is the release of these new $600-900 phones, including the latest Moto X, that DO NOT include a USB-C connector?! It's been available since last year. Apple does things quickly: they come out with technology and release it, such as with the iPhone 5 and its reversible port. How hard is it for other companies?
Motorola (Lenovo) could have done this with their New X to make a bold statement on how they are going to run their business.
theduckofdeath - Friday, March 11, 2016 - link
Probably because Samsung has usage data on what we actually use the USB port for these days. And I guess it's used almost exclusively as a charger. Why force consumers to buy a bunch of new cables and chargers just because there is a new port out there? I know Apple would do that in a heartbeat, like you said, as they simply see it as a new way to increase earnings on licensing accessories.
Azurael - Tuesday, March 15, 2016 - link
Because USB-C is much easier and quicker to connect? (I certainly find the 5X and 6P much easier to connect in a darkened room - good thing since Google nicked Qi charging.)Because USB-C cables and sockets should be (probably too early to say, but by design) far less prone to failure than Micro-B? (Micro-B cables, and not cheap ones - OEM LG/Nokia/Sony/Moto cables, die on me on a weekly basis. About half of the cables I own only work for charging now.)
I don't know, why don't we still connect our keyboards with the AT connector or PS/2, and our digital video cameras by firewire?
You don't need new chargers. If you've got dozens of USB-A power supplies, just use an A-C cable.
theduckofdeath - Tuesday, March 15, 2016 - link
I'm not suffering from long connection times when connecting my mobile to the charger. Sure, it would be nice to have a more uniform connector, but if it comes at the expense of me having to throw away all my old cables, having to bring adapters, and generally making life more expensive, I can easily live with a micro USB connector until connectors are entirely a thing of the past. Physical connectors for data transfer are really not essential these days. These phones have Wi-Fi and LTE connectivity at speeds close enough to any USB connection that we don't bother transferring anything by wire any more.
Belard - Tuesday, March 15, 2016 - link
Uh, you just need to replace the cable, or it comes with the phone... not difficult. A reversible cable is VERY handy, especially in the dark. Unless the end is marked or molded in a different shape, you have to LOOK which side is up. Apple changed the cable ONE time, because they wanted a much smaller and better connector. So for a top-end phone, I want a state-of-the-art connector too. Hence, I bought a new Moto G for $220... I lose the stereo speakers, but I saved $200 and have two free color covers I switch out for when I'm in the mood. So maybe I'll stick with the Gs.
Bruce Dunn - Friday, March 11, 2016 - link
For the average cell phone buyer, most of the information in this review goes right over their heads. I hope that part 2 of the review will address in simple language the following:
Can I read the display in direct sunlight? (Giving me the number of nits emitted by the display does not tell me this.)
What happens if I drop the phone into a sink full of water?
What happens if I drop the phone onto a concrete floor?
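On the sunlight question specifically, peak nits alone indeed don't answer it; what matters is contrast against reflected ambient light. A hedged sketch of the usual calculation - the ~4.5% screen reflectance and 100,000 lux sunlight figures are illustrative assumptions, not S7 measurements:

```python
# Readability in sunlight depends on contrast against reflected ambient
# light, not on peak brightness alone. For a roughly diffuse screen,
# reflected luminance (cd/m^2) is about ambient_lux * reflectance / pi.
import math

def effective_contrast(peak_nits, ambient_lux, reflectance=0.045):
    """Contrast ratio of full-white against the screen's own reflections."""
    reflected = ambient_lux * reflectance / math.pi  # cd/m^2
    return (peak_nits + reflected) / reflected

print(round(effective_contrast(500, 100_000), 2))  # dimmer panel: ~1.35
print(round(effective_contrast(850, 100_000), 2))  # boosted mode: ~1.59
```

An effective contrast near 1.0 is unreadable, which is why DisplayMate-style shootouts report exactly this kind of ratio in high ambient light rather than raw nits.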
peedroo - Saturday, March 12, 2016 - link
http://www.displaymate.com/Galaxy_S7_ShootOut_1.ht...
Maximum screen brightness in high ambient light results
s.yu - Thursday, March 17, 2016 - link
AnandTech is not for the "average cell phone buyer". It's for people who *really want to know*. If there were more of us there would be fewer rip-off products on the market and everything would be easier; more money would be devoted to R&D instead of marketing and more would be achieved.
peedroo - Saturday, March 12, 2016 - link
Loved the review till now. But here:
http://www.displaymate.com/Galaxy_S7_ShootOut_1.ht...
...they have totally different results about screen brightness levels when we compare it to the S6. It's better.
karthik.hegde - Sunday, March 13, 2016 - link
I think what the article needs to highlight more is that the Kirin 950 handily beats the SD820 in most of the tests. The ARM Cortex-A72 is a great core; released quite some time ago, it is still doing pretty well. I am sure ARM has new CPUs in the pipeline which will be released soon. I wonder if it makes sense for Qualcomm, from a business perspective, to continue designing their own cores while ARM already offers stock cores with great performance.
kamhagh - Sunday, March 13, 2016 - link
Another terrible phone :S
gfieldew - Sunday, March 13, 2016 - link
I have an Exynos variant of the S7. I ran the browser-based tests for you:
Kraken 1.1 - 2553
Octane 2.0 - 12602
WebXPRT - 168
gfieldew - Sunday, March 13, 2016 - link
Sorry, I should have mentioned that I used the Samsung version of the AOSP Browser, called simply Internet.
lilmoe - Sunday, March 13, 2016 - link
Andrei,
Just got my GS7 Edge and ran your NAND benchmark (4K random, 256K sequential).
I'm getting:
*sequential: 524.43 MB/s read, 149.03 MB/s write
*random: 76.76 MB/s read, 14.99 MB/s write
What are your exact settings in AndroBench?
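For anyone comparing these figures across reviews, converting random-access MB/s into IOPS is a one-liner. A small sketch using the numbers posted above - it assumes 4 KiB transfers and decimal megabytes, which is how AndroBench-style tools typically report, so worth verifying against the app's own settings:

```python
# Convert random-access throughput in MB/s to IOPS, assuming fixed-size
# transfers (4 KiB blocks) and decimal MB (10^6 bytes).
def mbps_to_iops(mb_per_s, block_bytes=4096):
    return mb_per_s * 1_000_000 / block_bytes

# The GS7 Edge figures quoted in the comment above:
print(round(mbps_to_iops(76.76)))  # random read  -> ~18,740 IOPS
print(round(mbps_to_iops(14.99)))  # random write -> ~3,660 IOPS
```

Expressing results as IOPS makes it easier to spot tool-version discrepancies like the AndroBench 4.0 vs UFS 2.0 issue raised elsewhere in this thread.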
lilmoe - Sunday, March 13, 2016 - link
Sorry, I meant to address Josh.
qasdfdsaq - Wednesday, March 23, 2016 - link
Yeah, I'm getting similar results to you (~400 read, 140 write, 80/15 random) on the S7, both with AndroBench and several other benchmarks. Those numbers in the article look... off.
qasdfdsaq - Wednesday, March 23, 2016 - link
Here's a point: AnandTech are testing the Snapdragon 820 variant, whereas my results are from the Exynos variant. Which one were you testing?
lilmoe - Friday, April 15, 2016 - link
Exynos, of course.
Aritra Ghatak - Wednesday, March 16, 2016 - link
Can anybody provide the link to the material wallpaper in the Software UX second screenshot?
Bluetooth - Wednesday, March 16, 2016 - link
In the NAND storage tests, are some phones using encryption and others not?
As far as I can tell, the S7 comes encrypted out of the box and you cannot turn encryption off, so FDE is its natural and only state.
jerrylzy - Wednesday, March 16, 2016 - link
ARMv8 has AES instructions built in, like AESE (AddRoundKey, SubBytes, ShiftRows) and AESMC (MixColumns). With this kind of hardware support, FDE should not affect NAND performance much, if Samsung actually uses them.
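To make that argument concrete: with FDE in the I/O path, effective throughput is bounded by the slower of the raw NAND path and the cipher path. A toy model - every number here is an illustrative assumption, not an S7 measurement:

```python
# Back-of-the-envelope model of FDE overhead: when every block must pass
# through AES, sustained throughput is bounded by the slower of the raw
# storage path and the crypto path.

def effective_mb_s(nand_mb_s, aes_mb_s):
    """Upper bound on encrypted-volume throughput, MB/s."""
    return min(nand_mb_s, aes_mb_s)

raw_read = 520.0   # assumed UFS 2.0 sequential read, MB/s
hw_aes   = 1500.0  # assumed ARMv8 AESE/AESMC path, order of GB/s
sw_aes   = 150.0   # assumed pure-software AES on a mobile core

print(effective_mb_s(raw_read, hw_aes))  # NAND-bound:   520.0
print(effective_mb_s(raw_read, sw_aes))  # crypto-bound: 150.0
```

With the hardware AES path sustaining on the order of GB/s per core, the bottleneck stays on the NAND side - which is the point above about FDE costing little when the instructions are actually used.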
Ars Technica got something near 100MB/s for random 4K reads, sup with that?
SirKronan - Saturday, April 30, 2016 - link
Please remove spammer...
XmppTextingBloodsport - Saturday, March 19, 2016 - link
Good job ignoring both the audio subsystem and antenna functionality of this PHONE.
XmppTextingBloodsport - Saturday, March 19, 2016 - link
All LTE phone reviews need an eye to SIP (VoLTE) and their respective audio paths and/or black magic deviations.
guyhindle - Monday, March 21, 2016 - link
I just *can't* bring myself to drop >£600 on a handset that will last <3 years.
maxnix - Wednesday, March 23, 2016 - link
Dumb comment time: one has to note that if Samsung had not abandoned field-replaceable batteries, page one would be a mere curiosity. Coincidentally, I cannot find a review for the original Note Edge. Did AnandTech whiff on this?
s.yu - Monday, March 28, 2016 - link
It's been some time now... no second part to this?
Belard - Monday, March 28, 2016 - link
Anyone else not impressed with the old USB 2.0-type connector? USB 3.0 Type-C has been available for quite a long while. It's faster, it handles more power for faster charging, and there is no UP or DOWN - just plug it in.
This from a $600-800 flagship phone in 2016? Apple has been shipping phones and tablets with a reversible connector since 2012.
makuchaku - Wednesday, March 30, 2016 - link
Hey @Joshua, when is the 2nd part of this review coming? Eagerly awaited! Thanks.
kreacher - Wednesday, April 6, 2016 - link
kreacher - Wednesday, April 6, 2016 - link
I can't find part 2 of the review; has it been published?
gfieldew - Saturday, April 30, 2016 - link
Where can I read Part 2?
SirKronan - Saturday, April 30, 2016 - link
Yes, Joshua! We are all eagerly awaiting part 2. :)
vortmax2 - Thursday, May 5, 2016 - link
Not to complain, but why is Part 2 taking over 2 months??
Aritra Ghatak - Saturday, May 7, 2016 - link
I guess the project is dead... It's been over 2 months now.
oryanh - Thursday, May 19, 2016 - link
Will there be a Part 2 of this review?
soh.0 - Wednesday, June 15, 2016 - link
Now that the buzz around the GS7 has died, I guess part 2 of this review means nothing to you.
Eden-K121D - Monday, June 27, 2016 - link
AnandTech is incompetent for not being able to complete a review.
tfouto - Tuesday, July 5, 2016 - link
What about PWM on AMOLED displays? I can notice the PWM quite easily when the phone is moving.