Hey Anand, are you planning to tear down the gamepad itself? I've only been reading about it for the last 3 hours, so I'm wondering how big the battery is; I can't find that info anywhere and there's no teardown of the gamepad either. Thanks.
That's not a bad idea. I'm tickled that Mr. Shimpi went through the trouble of tearing the Wii U down just to take a peek at its processors, but I suppose now I wouldn't mind seeing the inside of the controller as well. You give us an inch and we want a mile.
Smaller battery capacity than many new smartphones, with a larger screen. That's where the battery life of 3-4 hours comes in. I expect it will be very little time before third party extended batteries come out.
They are not comparable in that regard. The Wii U controller is not running the games; it is streaming the video from the console. So it only has decode hardware for the stream, it powers the display itself, and of course the control surfaces.
Haha, what's up with all the empty space? There's a clear message that it's not for less casual gamers, because your controller will be dead before you're done gaming.
It's a 'safety feature': no one paid attention to the warnings before about taking a 3 hour break every 20 minutes, so now they're forcing our hands.
N.B: The information in this post is either completely made up or extremely exaggerated.
It was $15 at launch, if you cared that much about a review, go buy it and review it yourself.
I'm pretty sure you can still get it for $15 (don't know when the promo for "buying a pc with windows 7" to upgrade to 8 stopped, but they did not require any proof of purchase or anything else).
Thank you for this teardown, Anand. Though I'm not going to be jumping on this bandwagon, it was a good, informative read, and one I'd prefer over a Windows 8 review (something anyone has had access to for over half a year via the release previews and, like I said, a $15 launch cost).
Ryan and Jarred are working on the Windows 8 Performance Guide, we already posted a take on the modern UI in our Windows RT Review which hit shortly after the Surface review. Lots of stuff coming, just all takes time.
If it was an Apple iOS review, Anand would have it up the second it went live.
I don't understand this second-class treatment of Win8 when it will sell millions of copies over the next few years, AND it is a major UI change from the biggest PC OS software company in the world.
I do like Windows 8 a lot. The major UI change we really covered in our Windows RT review (RT has the same UI as Windows 8). The rest is coming in our Windows 8 Performance Guide. I don't remember the last time I wrote an iOS review, and the same is true for a Windows review. Other folks are actively working on filling in the blanks on our Windows 8 coverage now :)
Isn't 6GB/s ridiculously low considering the PS3 and 360 are over 20 each? I've read on other sites that the memory bandwidth is closer to 17GB/s, also based on the markings.
And holy moly, that CPU die is small! Even smartphone SoCs are larger than that. With three cores seeming most likely, each core is pretty tiny, and that's with the eDRAM on die. I wonder if there is truth to the CPU not out-muscling the PS360 CPUs then, even with their age. The Cell on the same 45nm fab is still over 100mm2 if I remember right, and that's only 200 million transistors.
Well... at the end of the day the console is pretty powerful. Here, Call of Duty: Black Ops 2 looks better on the Wii U than on the Xbox 360/PS3: https://www.youtube.com/watch?v=bZO33bCFwks
The sad part about this, though, is that they're making the same mistake they did last time - this hardware is lacking in the present day, and basically catches up with 6-year old consoles.
The Wii U will be forgotten next year when Sony and Microsoft come out with their consoles. The technology in those will hopefully be a bit more forward looking...
By the end of next year, they might have that MCM fully integrated and shrunk to 32nm. They could be selling consoles for $139, and even turning a profit on each one. Meanwhile, the other consoles will be selling for $300-$400. What does that extra money really get you? Everything is still pretty much stuck at 1080p.
I totally agree; 1080p will be the standard for a long while. Beyond that, price is very important. It took me 5 years before I bought my first PS3 at $250 (at a discount). I got my Wii during its first year of deployment.
In case you hadn't noticed, resolution isn't the sole reason graphics power keeps increasing.
Games could look massively better at 1080p. The next consoles from MS and Sony will have way more RAM and processing power. No more blurry textures, poor AA, crap physics, and low polygon counts like on all current consoles, including the Wii U.
We aren't anywhere near photorealistic graphics at 1080p. PCs could get close to it right now, but they're completely held back by all the console ports. You're being extremely narrow-minded and short-sighted.
That's a VERY poor comparison. They don't even say which other version is being shown, and the frames look messed up, as if it's interlaced video for whatever console they're comparing the Wii U against.
Wow, I wonder if even a large eDRAM cache can offset that much speed difference from the PS3 and 360. It's larger in capacity so it would be doing less loading/unloading than them, but some things still depend on memory streaming.
Judging from the pictures, the part number is H5TQ4G63MFA-12C correct?
From the part number and this PDF (http://www.skhynix.com/inc/pdfDownload.jsp?path=/u... it is indeed rated at 800 MHz. I believe that is the base clock, though, so the transfer rate is actually 1600 MT/s. Note that GDDR5 speeds, which go up to 7 GHz effective, are not represented in that decoding table. That puts bandwidth for the Wii U at 12.8 GB/s.
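To sanity-check that reading, here's the arithmetic spelled out. The 64-bit total bus is an assumption (four x16 chips, per the markings discussed in this thread), not a confirmed Nintendo spec:

```python
# Sanity check of the thread's 12.8 GB/s figure.
base_clock_mhz = 800          # DDR3-800 base clock per the Hynix datasheet
transfers_per_clock = 2       # double data rate -> 1600 MT/s effective
bus_width_bits = 64           # 4 chips x 16 bits each (assumed layout)

transfers_per_s = base_clock_mhz * 1e6 * transfers_per_clock
bandwidth_gb_s = transfers_per_s * (bus_width_bits / 8) / 1e9
print(bandwidth_gb_s)  # 12.8
```

Halve the bus width and you land at the 6.4 GB/s figure from the article; the 800-vs-1600 confusion is the whole disagreement.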
RV740 means a 4770, i.e. only 640 (VLIW5, DX10) Radeon shaders. If it was based on the 5770 it would have 800 (VLIW5, DX11) shaders, but then the die would have to be about 166mm^2, and Anand only measured about 156mm^2.
And a CPU with such a small die, only 32.76mm^2... I'm pretty sure the Tegra 3's CPU area is larger than this (once you remove the area dedicated to the GPU and the companion core).
He just said the size is a bit bigger than the RV740, that doesn't mean it's an RV740 in there. With the supposedly pretty large eDRAM in there that throws off estimates, the GPU core could be pretty customized.
Considering the Xbox 360 GPU is roughly on par with the Radeon 6450 (at least in compute power) this seems to be a pretty solid upgrade on the GPU side.
I do agree on the CPU though. I think the 3DS silicon may actually be larger than that CPU die. Given the use of gamepads, I'd have thought the CPU would need to be pretty strong. Guess we'll have to see what happens later on.
Low-end graphics cards have always been so poor at gaming performance on the PC side, and with every iteration they seem to keep the same core performance and only add some compatibility stuff. Always disappointing.
From my research, the Xenos (Xbox 360 GPU) puts out about 240 GFLOPS single-precision, and the Radeon 6450 tops out at about 240 GFLOPS as well.
Of course that doesn't tell the entire story, due to the extra eDRAM and a few other tidbits including microcode optimizations, but yeah...
Low-end GPUs exist nowadays to upgrade older PCs for playing HD video well, or otherwise to upgrade from older (pre-Intel HD 3000) integrated graphics, mainly for HTPC use. Because its target market doesn't rely on performance, there's little point in making an entire new core unless there's a fancy new video decoder or encoder.
The XBox 360 has 48 shaders that would best be described as pre-Radeon HD 2000 series class. They're not DirectX 10 compliant, at least with what shipped with Vista. The XBox 360 was originally supposed to be DX10 compliant hardware but Vista was delayed and the PC spec changed while the XBox 360 GPU hardware was already completed.
Anyway, in terms of performance, this puts the GPU between a Radeon X1950GT and Radeon HD 2900GT in terms of capabilities and performance. GPU efficiency has crept upward with AMD's VLIW5 designs over time which would actually mean that the Radeon 6450 would be slightly faster than the XBox 360's GPU.
It appears that the Wii U's GPU has 32 MB of eDRAM on the GPU. More than likely this amount of eDRAM takes up half the GPU die but should be well worth it in terms of performance. I suspect that there are 96 VLIW5 shaders (480 ALU's), 32 TMU's and 16 ROP's. While everyone is drawing comparison with the RV740, chances are that this design incorporates many of the efficiency improvements found in AMD's VLIW5 architecture that made it into the Barts design (Radeon 6870).
The PS3's RSX is basically a 7900GT with half the memory bandwidth and half the ROPs. So it's much slower than the desktop 7900GT.
The GPU in the 360 is probably at best as fast as X1800XT despite unified shader architecture because it is also memory bandwidth crippled.
If it has an HD 4770, that is at least 2.5-3x faster already than either the PS3/360's GPUs. There is probably little doubt that the next PS4 and Xbox will be more powerful but this console is definitely more powerful than current generation.
Well, now that I read the article more carefully, the 12.8GB/s memory bandwidth would cripple even an HD 4770 GPU. Looks like the GPU is hopeless for next-gen gaming.
The PowerPC is not competitive anymore. I doubt IBM has been throwing any R&D at it since Apple jumped ship. And yes, it is a derivative of POWER, which IBM is still developing, but there never was that much synergy between POWER and PPC, probably less so these days.
It's pretty safe not to compare it with Sandy/Ivy Bridge. But I'm pretty sure even a dual-core Clover Trail or even Brazos would outperform this thing.
So, my question is, why is IBM still winning these contracts? Is it just ISA compatibility? I'm not sure that's such a big concern these days, considering how quickly even the PS3 got rid of its PS2 compatibility mode. I mean, does anyone actually want to play low-res Wii games on the Wii U? What for?
And if we are to put ISA compatibility aside, why is IBM still winning these contracts? Is it because Intel isn't even bidding for them, due to low margins?
It's because Intel won't licence chips out. IBM wins because they sell the design of the chip, then X console maker can modify it and shrink it on their own schedule. And also, there are whispers that the PS4 and Nextbox may be using AMD CPUs (or APUs), and AMD does already have experience catering to consoles.
By the way, I'm not sure about IBM not being competitive anymore either. The POWER7 CPUs are still extremely competitive with Intel in high-throughput scenarios, so I hear.
The weird thing about AMD's new strategy is that it doesn't necessarily mean an x86 chip inside of an APU from them anymore. They already have an ARM license and I'm sure that a licensing deal can be worked out with IBM to include a PowerPC core if the console maker desires (note that the Xbox 360 now uses a triple core PowerPC + Radeon based GPU on one die).
Actually, the differences between POWER and PowerPC nowadays are purely marketing. The divergence pretty much ended with the POWER5 (though Altivec SIMD wasn't added until the POWER6). The distinctions between POWER and PowerPC in terms of hardware support are in SoC features like on-die cryptography, TCP/IP offload and accelerated memory compression that fall outside of the CPU core.
And IBM has been developing new PowerPC-based cores. The Wii U could use the embedded PowerPC 476 core or the PowerPC A2 core. These cores would be a minor step up from what is found in the XBox 360 or the Cell PPE inside the PS3. Sandy Bridge is a far more aggressive core design than the PPE, PPC 476 or PPC A2, but it is also larger and consumes more power. The main reason IBM gets these contracts is their willingness to design and manufacture a custom chip.
As for POWER, the new POWER7+ takes the performance crown in the high end server world. It manages to top the 10 core Westmere-EX and the 8 core Sandy Bridge-E.
Reading an interview on the development of the Wii U, it seems that both the CPU die and the GPU die have some form of embedded memory. Not sure what that tiny chip is for, though.
That's what I've been hearing too, although the CPU's isn't nearly as large; possibly something like 32MB on the GPU and 3MB on the CPU. The CPU's eDRAM is just used as an L2 cache replacement.
You mention Hynix DDR3-800 devices, but I guess the 800 means 800MHz, which would translate into DDR3-1600. So that's 12.8GB/s instead of 6.4GB/s. Almost every laptop sold these days is equipped with DDR3-1600 and it's dirt cheap; I would even assume DDR3-800 might be more expensive these days.

The AMD GPU also functions as the northbridge and southbridge (just as on the Wii), so don't forget to take that into account. The RV740 was 137mm², which doesn't leave much space for the northbridge, southbridge and eDRAM. The RV740 also had a power consumption of around 60-80W; even with a decreased clock speed that wouldn't fit into the 33W of the Wii U. Either it's much smaller than an RV740 or it's not 40nm.

I see two possible candidates for the CPU. My first guess would be the PowerPC 470: multicore capable, very low power consumption, very small, customizable, but the speed is more in the range of an ARM core. It would make sense, I guess, since many developers mention the lack of CPU performance. My second guess would be the PowerPC A2: multicore capable, low power consumption, small... but not really meant for something like a console (though still possible).
See my comment below: they are 256M x16 parts in 96-ball FBGA packages, so unless 4 more are on the underside of the mainboard, we have just found the OS-only memory, not the dedicated video memory.
And yes, the data is in that PDF you linked; the xxC parts have the exact same specs, only there are 2 models: the 12C (which the Wii U uses) at 800MHz (x2 for DDR) and the 11C at 900MHz (x2 for DDR).
That's why the sheet has the xxC listing, because those are the only differences for that part number.
Thanks for the info. I don't know where you're seeing the 12C speed grade in that PDF, however.
It didn't even cross my mind that the Wii U might have that much eDRAM.
From watching the teardown PCPer did, I can confirm there are no further DRAM ICs on the underside of the PCB. (I could identify what I believe to be the SLC NAND chip for the OS that the Wii U is supposed to have, which was mentioned in this teardown, plus a bunch of smaller ICs whose role isn't obvious to me.)
Those Hynix chips are H5TQ4G63MFR-12C: 800MHz clock, 1.6Gbps/pin, 96-ball FBGA. Here's the interesting part nobody has caught onto yet...
They are 256M x16 chips... so with 4 chips in the array we are looking at 1GB total RAM... so unless 4 more are hiding underneath, the video RAM is somewhere else, and this RAM is the RAM being used by the OS, which is fine seeing as most PCs today use DDR3-1600 for OS use...
What is on the underside??? In any case, your estimate of 4Gb (512MB) per chip is flat-out wrong, as they are 256MB per chip with that model number.
If there are none below, then this might mean that the video RAM is that small extra chip near the GPU/CPU, and that system RAM is separate from video RAM, like a proper gaming PC, so to speak. Which would debunk the slow-RAM-bandwidth theory, because if there is only 1GB of DDR3 present, this is the RAM Nintendo said would be reserved for OS and system use only and not be available to developers.
It's not possible, as they only manufacture 256M x16 bus-width parts under that part number. If the chips were 512MB each they would be the x8 bus-width, 78-ball FBGA type.
I will say it again: we have a Hynix datasheet with that specific part number on it, and it is specifically listed as being 4Gbit. So if you have a problem with the size, take it up with Hynix.
I just think you are having a problem with basic math. 256M x16 bus width per die means the entire RAM allocated is 1GB total. There is no 4GB on the Wii U to begin with, so touting that number makes you look insane. Nintendo said themselves that 1GB would be dedicated to the OS (from what we can see, those Hynix chips) and 1GB for developers; that is 2GB.
But hey, it's not the first time in this article alone that you guys have posted wrong technical data.
It's as bad as NeoGAF originally claiming the Samsung chip was the system RAM; turns out that was the 8GB eMMC chip, aka the built-in hard drive...
256Mx16 doesn't directly refer to the density of the chip, but rather to the organization of the RAM itself. That's 256M addresses x 16 bits, or 512MB.
The datasheet you linked to confirms this (check out page 3 under the description header - emphasis mine):
...are a ***4,294,967,296-bit*** CMOS Double Data Rate III (DDR3) Synchronous DRAM...Hynix ***4Gb*** DDR3 SDRAMs offer fully synchronous operations referenced to both rising and falling edges of the clock.
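For anyone still tripping over the organization-vs-density notation, here's a quick sketch of how 256Mx16 maps to the 4Gbit figure in the datasheet quoted above:

```python
# 256M x16 describes organization (addresses x data width), not 256 MB.
words = 256 * 2**20        # 256M addressable locations
bits_per_word = 16         # x16 data interface

chip_bits = words * bits_per_word
chip_mb = chip_bits / 8 / 2**20
print(chip_bits)                  # 4294967296 -> the 4 Gbit in the datasheet
print(chip_mb)                    # 512.0 MB per chip
print(chip_mb * 4 / 1024, "GB")  # four chips -> 2.0 GB total
```

Which matches the 2GB total system memory figure, with no extra chips needed on the underside of the board.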
Yikes, talk about conflicting info within their own technical sheets.
If they are 512MB each, then yes, that's all 2GB of RAM (video and OS), which paints a very, very bleak picture for overall memory speed and layout.
Looks like no amount of frame buffer RAM will fix it being 50% slower than even the 7-year-old Xbox 360...
To be honest this is disheartening, especially given the system price and hopes of seeing proper ports from this gen and next gen (720/PS4).
I hope Nintendo hasn't cheaped out to the point of no return.
Sorry for the hassle earlier, guys, but a misread of those numbers was, quite honestly from a hardware point of view, the Wii U's last hope of being viable when the big 2 hit the arena. With that settled, and no other RAM on the device aside from eDRAM, there's not much else to contemplate.
No worries, it always helps to have another set of critical eyes because I know I do miss stuff like this from time to time. I appreciate the effort to keep us on our toes :)
The Wii U is sort of an in-between generation console in my eyes. The next gen systems should feature much faster GPUs (and CPUs I hope).
Unless there are 8 chips, there is only 1GB being shown on that board, and from what I have seen on other breakdowns there is no ram on the rear of the board. Something is up?
Anand is spot on; it's actually written on the chips themselves (emphasis added): H5TQ ***4G*** 63MFR-12C. Hynix uses that code to differentiate their RAM capacities; the same code with "2G" or "1G" replacing the "4G" refers to the 2Gb (gigabit, not gigabyte) and 1Gb models respectively. Evidence on the chip and from Hynix here:
I agree, the tech sheets conflict a lot, which is why it was confusing. That being said, I can't help but feel a bit sick about the Wii U's memory layout. They have shot themselves in the foot again...
Sorry again for the hassle. I just wanted 100% backing on the RAM because frankly I was hoping there would be more to it, but Nintendo clearly cheaped out, so not much to look forward to hardware-wise, lol. Cheers.
It's a shame Nintendo didn't choose to modify something in the mid-tier Radeon 5xxx series like the 5770. It might put them out of the running for keeping up with the next-gen MS and Sony consoles; not supporting DirectX 11 is a big problem for the future.
We don't know whether it does or doesn't yet, though; there are still no hard facts on the GPU family as far as I know. And it wouldn't use DX11 anyway, that's a Microsoft API, but I get that you mean DX11-class physical GPU features used by Nintendo's own API. Off to Chipworks or someone to look at the GPU under a microscope!
It seems difficult for me to believe that something like a 4770 would be residing in there; the TDP of the whole system was around 33W while running a game, which is far below the TDP of a 4770.
Keep in mind that the 4770 was among the very first products produced on a very troubled TSMC 40nm process. Anything produced now (nearly 3.5 years later) is going to have the benefit of the process maturing and lots of design experience to fall back on for optimizing the layout and transistor leakage.
Do we even know that the GPU is still being manufactured at the original process size? Wouldn't it be possible that they've worked with AMD on a die shrink?
AMD has been using the 32nm process in mass production for their IGPs for well over a year, after all.
You have a point about the refining of the process, but the TDPs of the later cards manufactured at that process size (i.e. HD 5 and HD 6) don't really support that, as power consumption seems to have largely stayed level on GPUs with similar transistor counts and similar raw GFLOPS performance.
The unknown is that many hint at it being DX10.1, or not even a DX instruction set (as mentioned by some indie devs). Also, the inclusion of DDR3 with a 64-bit bus width puts it really far back in the 4000-series era, like the 4550 range of the R700 chips.
The E6760 would be epic, but it uses DDR with a 128-bit bus width, which is why it's the rumored chip for either the PS4 or 720.
The wattage is definitely massively lower, though; I mean, the launch PS3 pushed 180 watts, whereas the Wii U is hitting under 40 consistently.
I can't wait until it gets the X-ray done so we can see how much or how little is really in there.
I think what is throwing us off is whether the RAM is shared: is it RAM for general use while the discrete GPU has more of its own (making the GPU you suggested completely viable), or is it shared as in being used by a much older, inferior GPU?
I personally hope we learn good news, something along the lines of the E6760, rather than some horrible low-wattage E4xxx-series GPU with shared RAM.
You're forgetting the massive amount of eDRAM on the GPU die with regard to bandwidth. The bus there could easily be 1024 bits wide (or wider). Bandwidth to that 32MB of eDRAM should not be an issue.
Case in point: the PS2's Graphics Synthesizer had a 2560-bit-wide bus to its 4MB of on-die eDRAM, and that was over 10 years ago.
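A rough sketch of why on-die bus width matters so much. The 2560-bit width and 150MHz clock are the commonly cited PS2 figures, not numbers from this teardown:

```python
# Commonly cited PS2 eDRAM figures: 2560-bit bus at 150 MHz.
bus_bits = 2560
clock_hz = 150e6

bandwidth_gb_s = (bus_bits / 8) * clock_hz / 1e9
print(bandwidth_gb_s)  # 48.0 GB/s, dwarfing the Wii U's external DDR3
```

If the Wii U's eDRAM is anything like that, the narrow external DDR3 bus matters much less for frame buffer traffic.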
The 32 MB figure has been floating around for awhile as one of the few confirmed specs out there (2 GB of system memory was the other hard figure that has been out in the public).
The 32 MB wouldn't be for texture storage in most cases. Rather it would be used to hold various frame buffers for quick read/write operations. That is enough for four 32 bit 1920 x 1080 buffers.
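The "four 32-bit 1920x1080 buffers" claim above checks out, assuming no tiling or alignment overhead:

```python
# Size of a 32-bit-per-pixel 1080p buffer vs the rumored 32 MB of eDRAM.
width, height, bytes_per_pixel = 1920, 1080, 4

buffer_mb = width * height * bytes_per_pixel / 2**20
print(round(buffer_mb, 2))      # ~7.91 MB per buffer
print(round(buffer_mb * 4, 2))  # ~31.64 MB: four buffers just fit in 32 MB
```

That's front buffer, back buffer, and room for a depth buffer and an auxiliary render target, for example.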
Well, that's the first time I've heard about it (I did hear the 2GB rumors beforehand, though; I also heard numerous other rumors about different memory sizes, so one of them was bound to be correct), and as far as I'm aware Nintendo hasn't commented on it, so I would be careful with the word "confirmed".
I know that wireless is great and all, but why not include at least a 10/100, if not 10/100/1000, Ethernet port? Wireless can be flaky at times, and when the Wii was first released it was awful for me. Is there really no room for them to add that? Is it really that costly?
If it *is* the eDRAM cache and it's not actually on the GPU chip as expected, that would mean more space for GPU logic, so size comparisons are not thrown off by the eDRAM.
Regarding the GPU: it looks like the die size estimate would put it closer to the HD 4670? That was a low-ball rumor floating around for a while before being replaced by the E6760. Any chance you can give some insight here?
The eDRAM throws off any estimates based on size though, I think it will be hard to say until someone takes pictures under a microscope so the units can be counted out.
Earlier this year, weren't devs saying the system is 4-6% more powerful than the current gen, and the next Xbox is about (rough estimate) 10-15% more powerful than the Wii U? It's not surprising really; did anyone expect Nintendo to release a system that's really graphically powerful and has a mini tablet to boot, at a reasonable price?
Nintendo claims they are losing money per console, because of the gamepad no doubt, but as someone posted earlier they'll be able to get out of the red much faster than what Sony and Microsoft have planned. Besides, what Nintendo is betting on in the long run is that because all games will now be at equal resolution, the average console gamer is going to find it difficult to tell which game looks better than the other.
So the next Xbox will be 20% faster than the current one? That would be a huge let down. Can you link to where you read that? I would expect at least a doubling of performance if they stay at the same price point.
I think he mistyped. The original rumor was that the Wii U was 5-6x more powerful than an Xbox 360, and that the 720 would be 20% faster than the Wii U. Too lazy to link, but there are tons of articles from 6 months ago that came from a Microsoft document that circulated, based on hardware projections.
That leaked hardware document said that the nextbox *should* (not would; it was a very early document apparently, even IF it wasn't fake, so likely it hadn't been determined yet) be 6-8x as powerful as the Xbox 360. There was no comparison to the Wii there.
This was the nextbox info, and most articles claim that it would be only 20% more powerful than the Wii U. The original source came from a leaked internal Microsoft document.
I would be incredibly surprised if this was 5-6x more powerful than the xbox 360. The whole thing draws 30 watts, the memory bandwidth is half that of the 360, the CPU is smaller than Xenon or Cell on 45nm. None of those things point to total performance by themselves, but they do seem to me like Nintendo went for an econobox.
4-6%? Performance differences that small are hard to measure accurately even on PCs with dedicated benchmark tools and similar architectures; on consoles it will be next to impossible with their very different hardware architectures.
The most accurate statements you will get are something like "1.5 times more powerful", like you got when the Gamecube was compared to the Wii. It's not really anything to go by, but it gives a very, very rough estimate.
With a chip comparable to the RV740, the Wii U would be somewhere near 3-4 times as powerful as the last gen, but that's only counting the performance of the GPU. Comparing CPU performance is even harder, especially when factoring in the Cell CPU of the PS3.
It is weird that I get 1175.6 in the SunSpider test with my S3 (international) but here it's getting over 1.4k. Despite countless runs, I always get below 1.2k, even if I purposely try to do something on the device to slow it down.
Very interesting article; I also like to "see what's inside". I did chuckle a bit at how Anand wrote it as if it were an instruction manual for how YOU can tear your own Wii U console apart and most likely render it inoperable :D
Came in handy when I eventually had to take apart my 360 :P
I wouldn't take it apart while it's under warranty, but a few years down the line, who knows what could go wrong. Although with only 30W of heat to dissipate, it won't flex like the 360 mobo.
I do like how you gave instructions on how to take the Wii U apart. I'm guessing not many people are going to buy a $300+ Wii U and rip it open and void the warranty. I would have liked more info comparing the Wii U to the Xbox, PS and PC.
The Wii U CPU is about as big as a *single* core Atom at 45nm, while packing three cores. Size isn't everything, but there is only so much you can do with a certain number of transistors. If three cores is true, each core has about a third the transistor budget of one Atom core. That's crazy small. Even six years later, I don't find it likely something as big as one Atom core can do more work than the Cell or Xenon, as inefficient as those were.
3x PowerPC 400/476FP-type cores with added Broadway-fied extensions and a custom 3MB eDRAM cache.
The 3MB eDRAM cache is the same size and heat as a 1MB SRAM cache, and the custom 3-core PowerPC 400 is about the same size as the Broadway CPU in the Wii.
The PowerPC 476FP is 2x the per-clock power of an ARM A9, and the Wii U CPU is again upgraded over that core with game-centric upgrades, real-time data compression/decompression and graphics burst pipes/buffers, AKA A NEW VERSION OF GEKKO/BROADWAY.
The Wii U CPU core is the most powerful 32-bit PowerPC core ever made, which makes it the most powerful 32-bit RISC core on earth. A standard PowerPC 476FP is 2x the chip of an ARM A9 at the same speed, and the Wii U Espresso cores are a step up again.
A tri-core Wii U CPU at, say, 1.6GHz will eat a 2.0GHz quad-core ARM A9 for breakfast with ease.
The PowerPC 476FP and Wii U CPU are 5-instructions-per-clock efficient; the ARM A9 is 2. The PowerPC 476FP runs on a 128-bit bus; ARM A9s run on a 64-bit bus. ARM A9s have cache issues and max out at 1MB of SRAM cache.
The Wii U CPU has an eDRAM cache, and it's 3MB. It's the fastest 32-bit RISC core on earth. PROVE ME WRONG.
Wii U RAM is either 64-bit x2, or 128-bit x1, or 128-bit x1 plus a secondary 32-bit RAM bus for the OS only, LIKE THE WII AND GAMECUBE DUAL BUS.
Wii: 64-bit plus 32-bit. Gamecube: 64-bit plus 8-bit.
So the Wii U is either 128-bit plus 32-bit, or just 128-bit, or dual 64-bit. THAT'S COMMON SENSE.
lol, AnandTech and all the other haters: you lied about Gamecube vs Xbox all those years ago and now you're lying about the Wii U.
So the Wii U has a 16-bit to 32-bit bus and main-RAM bandwidth of 6.4 to 12.8GB? That is complete crap, AnandTech...
32-bit PowerPC 400s at 45nm have a 128-bit ring bus, not a 64-bit FSB like the Wii and Gamecube. Also, the main RAM bus was 64-bit in the Gamecube and Wii, and the secondary bus was 8-bit on the Gamecube and 32-bit on the Wii, WHICH YOU ALREADY KNOW, ANANDTECH.
So why the anti-Nintendo nonsense? The Wii U has 6.4GB, then it's 12.8GB, WHEN YOU KNOW 100% THE GAMECUBE AND WII WERE 64-BIT BUS, NOT 32-BIT LIKE YOU'RE NOW TRYING TO SAY.
The 2GB of RAM is one of the 3 setups that follow, and nothing like what you're saying, OUT OF NINTENDO HATE I MAY ADD.
2GB of DDR3-1600, 800MHz bus, 128-bit = 28GB, not 12.8 or 6.4; it = 28GB.
So 2x 64-bit = 28GB and 1x 128-bit = 28GB. SO WHERE IN HELL DID YOU GET 6.4 OR 12.8GB?
The RAM is likely 1600MHz (800 dual channel); the bus is likely 128-bit or dual 64-bit. The PowerPC 400 range runs on a 128-bit bus at 800MHz.
So the information we ALL HAVE is: 1600MHz RAM, 800 bus, 800 GPU and 1600 CPUs.
So at dual 64-bit or single 128-bit, the bandwidth = 28GB, not 6.4 or 12.8 or 17GB.
17 was a lie, 6.4 was a lie and 12.8 was a lie.
PowerPC at 45nm runs at 1600MHz with an 800MHz bus.
The Wii and Gamecube were built around the bus speed, so if that continues, the LOGICAL conclusion at this point =
CPU: 1600MHz customized PowerPC 400 with 3MB of IBM eDRAM cache.
GPU: customized RV7 4670 with 32MB of eDRAM at 800MHz.
Ring bus: 128-bit at 800MHz.
RAM: DDR3 or GDDR3 at 800MHz dual channel = 1600MHz and 28GB bandwidth.
The 32MB eDRAM buffer/cache on the GPU will have massive bandwidth, and the 3MB cache on the CPU will have 2x-plus the bandwidth of SRAM under the same conditions.
So high bandwidth, low latency, like the Gamecube all over again.
Oh, and that Xbox vs Gamecube piece you did years ago was ALSO FULL OF SHIT.
"Weak fixed-function GPU"? PLEASE, ANANDTECH, STOP LYING.
The Xbox GPU was 4 texture layers and 8 texture stages.
The GC's Flipper was 8 texture layers and 16 stages.
The Xbox GPU was 8x4 real-time lighting.
Flipper was 8x8 real-time lighting.
Flipper had 2.5x the internal bandwidth of the Xbox GPU.
There are many more facts I can add.
lol at your blatantly anti-Nintendo Wii U teardown, as if the Wii U only has 6.4GB bandwidth, or 12.8GB as the Wii had.
28GB bandwidth, eDRAM, and 4Gb of 1T-SRAM and 4Gb of GDDR3.
The main RAM of the Wii was 4Gb x2 = 8, so you're saying the Wii U has less bandwidth than the Wii, or only 50% more? Please stop, AnandTech, you're losing all credibility.
No, the bit interface checks out. The model number on the chips (which you can see in the pictures; they're not made up) leads to DRAM chips by Hynix with a 16-bit-wide interface. Just because the basic CPU model has a wider interface doesn't mean it's being used, or even exists, on the custom Wii U CPU; besides, wider interfaces require more space on the die, and as you can see, space is already pretty scarce.
Dual-channel mode only works if there's actually capacity left to run it with. If you have a PC with 2 DRAM modules, for example, each module has a 64-bit-wide interface, so if you split the data evenly between the 2 modules you effectively get a 128-bit-wide bus. This is not possible on the Wii U, since the DRAM chips with their 16-bit buses already run at their maximum capacity; it's effectively running quad channel with each channel being 16 bits wide.
also your maths are complete shite 2x64bit=28gb? what kind of screwed up calculation is that? even if that was the way you'd calculate bus transfer rates (which it isn't) it doesn't add up at all (if anything 2x64bit would be 16 byte, but that's as close to "28gb" as i can get)
800 MHz modules running in dual channel mode will achieve 1600*10^6Hz (1600 Mhz effective frequency at 800Mhz actual frqequency, thus the name double data rate) times 128 bit (dual channel bus) = 204800000000 bit/s = 25600000000 byte/s=25.6GB/s (decimal) half of that is, surprise surprise, exactly 12.8GB/s which is the speed you get when only having a 64bit wide bus which apparently is the case with the wii-u
the rest of your post is mostly a collection of speculation, complete off topic stuff, misunderstood technical data and outright wrong information all of it wrapped into a writing style that makes my toenails roll themselves into a sushi maki so i'm not even going to go into that
please do yourself a favor and don't ever get a job that requires even basic mathematic knowledge
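The dual-channel arithmetic in the comment above can be checked with a quick sketch (the 800 MHz clock, double data rate, and 64/128-bit bus widths are the figures from the comment, not confirmed Wii U specs):

```python
# DDR3 peak bandwidth: actual clock x 2 (double data rate) x bus width in bytes.
def ddr_bandwidth_gbs(clock_mhz: float, bus_bits: int) -> float:
    """Peak transfer rate in decimal GB/s for a DDR memory bus."""
    transfers_per_sec = clock_mhz * 1e6 * 2   # DDR: two transfers per clock
    return transfers_per_sec * (bus_bits / 8) / 1e9

print(ddr_bandwidth_gbs(800, 128))  # 128-bit dual channel: 25.6 GB/s
print(ddr_bandwidth_gbs(800, 64))   # single 64-bit bus: 12.8 GB/s
```

Neither configuration gets anywhere near the 28 GB/s claimed above.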
why did you go for the secondary bus of wii as the main bus of wiiu!!! you deliberately ignored the main 64 bit powerpc bus that gamecube and wii had!!... and went for the 32 bit secondary bus that the secondary gddr3 ram was on in wii!!!! ALREADY KNOWING FULL WELL THAT WASN'T THE MAIN BUS. so you deliberately went for the lowest bit bus to make it look like wiiu is weak, DON'T DENY THIS ANANDTECH, THAT'S WHAT YOU DID....
why would the secondary bus be the main bus next gen? THAT'S TOTAL CRAP. so gamecube and wii had a 64 bit main bus and all of a sudden the bus went 32 bit the 3rd time round? IN YOUR DREAMS ANANDTECH
the powerpc 32 bit core that makes up the 3x core broadway 2 ESPRESSO cpu in wiiu is based on the powerpc 476fp. all 32 bit powerpcs now run on a 128 bit ring bus; the 64 bit fsb is of the past, it's no longer used
so why ignore the 64 bit bus and the new 128 bit bus AND TELL LIES about a 32 bit bus? you're exposed as anti-nintendo i think (weak cpu lie, i'll debunk that with ease too)
isn't it likely the wiiu, with its mcm and a 45nm powerpc broadway-fied tri core, WOULD ALSO BE USING THE SAME 128 BIT RING BUS? AH-HHHMMMMmmmmmm, wiiboy cannot be fooled like a silly ps3 fanboy
another point you fail to understand: only powerpc 400s on this 128 bit ring bus SUPPORT MULTI CORE. the powerpc 750 of wii and gamecube DOES NOT support multi core!!!!!!!!!!!!!
so common sense is the ddr3 (which is actually gddr3; samsung don't show the G in their specs, so it's gddr3. obviously they market the ram as gDDR3 not GDDR3, they drop the G as it's meaningless, they're the same ram)
at 1600mhz it's perfectly in line with 45nm powerpc thinking, as the bus is 800mhz 128 bit and the recommended ram is ddr2/ddr3 1600mhz (you said yourselves the ram looks 1600)
so the bus is most likely 128 bit. YOU SAID 32 BIT OUT OF ANTI-NINTENDO SPITE DIDN'T YOU, TO MAKE WIIU LOOK BAD. lol i see thru this pc fanboy nonsense like superman looking thru clear glass lol
if the ram is 1600mhz then it's highly likely the cpu is 1600mhz (exactly lining up with ibm powerpc 476fp) and the ring bus in the mcm is 128 bit 800mhz, and as the wii and gamecube were both balanced exactly to the bus speed, then no doubt the wiiu is also balanced to the bus. IF SO:
CPU = 1600MHZ, GPU = 800MHZ AND RAM = 800MHZ DUAL CHANNEL = 1600MHZ, AND THE EDRAM ON GPU IS 800MHZ
2-TO-1 BALANCE REPLACES THE 3-TO-1 BALANCE OF WII AND GAMECUBE. REMINDER: GAMECUBE WAS ORIGINALLY 2-TO-1, 404 CPU CLOCK AND 202 GPU CLOCK. IT WAS CHANGED TO 3-TO-1 WHEN IBM COULDN'T GET THE OLD 64 BIT G3 BUS TO 202MHZ, SO THEY DROPPED TO A 3-TO-1 BALANCE INSTEAD OF 2-TO-1
SURELY YOU REMEMBER THIS GUYS, I DO, I'M A CORE GAMER, IT'S OUR JOB TO REMEMBER THIS!!!!!!!
so has nintendo returned to the originally wanted tighter 2-to-1 balance now that ibm have a half speed bus on their 32 bit powerpc cpu? I THINK THEY HAVE!!!!!
if the wiiu is still a 64 bit bus and not a 128 bit ring bus for whatever reason, then is it not likely the 2gb ram is on a dual 64 bit bus, still giving single 128 bit bus levels of bandwidth
and isn't it safe to say, if they're not using a 128 bit ring bus, then there's still dual memory buses just like wii and gamecube had, so again it's still way higher than the 32 bit THAT YOU'RE SAYING
likely speed of ram is 800mhz x 2 = 1600, and as nintendo make clock-balanced systems BECAUSE THEY'RE NOT STUPID ENOUGH TO TRASH AND WASTE CLOCKS, COMBINED WITH LOW LATENCY FAST RAM = EFFICIENT
those speeds might not be exact but they're ballpark, UNLIKE YOURS
likely edram to gpu = 512 bit or higher. REMEMBER GUYS, WII AND GAMECUBE HAD A 512 BIT TEXTURE CACHE AND A, WHAT WAS IT, 360 BIT FRAME/Z BUFFER. JESUS, CHECK YOUR OWN ARTICLES GUYS BEFORE TEARING THE WIIU DOWN
so a minimum of 512 bit for edram if not more, and a 128 bit bus for main ram. LOL, 32 BIT....
just to add, ibm edram as level 2 cache has more than 2x the bandwidth of sram, so the 3mb cache of wiiu has twice if not more the bandwidth of the same cache made from sram
and the edram on the gpu is over 10x wii's and looks very alike, so has wiiu got an edram texture/shader cache as well as a frame and z buffer?
frame and z buffer = 22mb and texture/shader cache = 10mb? just guessing here, it may not use a texture cache like wii and gamecube, BUT IT WOULD MAKE SENSE. the developers have said texture and shader data can be loaded into the gpu, THAT SAYS BIG EDRAM CACHE TO ME, LIKE GAMECUBE
GAMECUBE: 10GB TEXTURE CACHE AND A 7-PLUS GB FRAME AND Z BUFFER
SO WIIU COULD BE 100GB TEXTURE CACHE AND 75GB FRAME AND Z BUFFER. BUT THEN AGAIN, WHAT ABOUT FREE AA? likely using a system like x360 but with more ram and bandwidth
Excellent article but I was very disappointed that the engravings on the CPU or GPU weren't shown! Why didn't you guys clean off the thermal paste before posting pics? This is what a lot of people have been wishing to know, so I'll assume that you guys didn't show it because of legal reasons and not because you forgot.
People are still using this as some sort of fact sheet, so I figured I'd drop you guys a line.
You can't determine bus size by looking at a RAM chip's specification documentation. It's not in there, as the memory architecture is part of the machine, not the RAM product.
Also, while you got the Hynix RAM right, the Samsung RAM is the exact same chip used in OG 360s, down to the serial number/nomenclature. By your logic of chip type = bus width by serial number, the Wii U would have a 256-bit bus. Which also isn't true.
And finally, we have the die picture of the GPU, and can clearly see the DDR3 I/O where the RAM bus plugs directly into the GPU.
If the Wii U only has a 64-bit bus, why are there 158 pins on the DDR3 I/O?
And if the Wii U only had half the main memory bandwidth of the PS3/360, how is Need for Speed possible?
Bigger texture assets and better framerate/performance, at half the bandwidth? eDRAM can't fix that; every access to main memory is at main memory's bandwidth.
You are looking at a bandwidth of a little over 30GB/s.
132 Comments
devilfriend - Sunday, November 18, 2012 - link
Hey Anand, are you planning to tear down the gamepad itself? I've only been reading about it for the last 3 hours, so I'm wondering how big the battery is; can't find that info anywhere and there's no teardown of the gamepad either. Thanks.

ImSpartacus - Sunday, November 18, 2012 - link
That's not a bad idea. I'm tickled that Mr. Shimpi went through the trouble of tearing the Wii U down just to take a peek at its processors, but I suppose now I wouldn't mind seeing the inside of the controller as well. You give us an inch and we want a mile.

Anand Lal Shimpi - Sunday, November 18, 2012 - link
I hadn't planned on it, but to answer your question it's a 5.6Wh battery (3.7V chemistry, 1500mAh).

tipoo - Sunday, November 18, 2012 - link
Smaller battery capacity than many new smartphones, with a larger screen. That's where the battery life of 3-4 hours comes in. I expect it will be very little time before third party extended batteries come out.

Stuka87 - Monday, November 19, 2012 - link
They are not comparable in that regard. The WiiU controller is not running the games, it is streaming the video from the console. So it only has decode hardware for the stream, it powers the display itself, and of course the control surfaces.

tipoo - Friday, April 11, 2014 - link
Which still equates to a scant 3 hour battery life.

redchar - Sunday, November 18, 2012 - link
In case you wanted to see how small the battery is relative to the gamepad, this link has teardown images of that.
http://goo.gl/SiOic
(wouldn't let me post normal link?)
piroroadkill - Monday, November 19, 2012 - link
Haha, what's up with all the empty space? There's a clear message that it's not for less casual gamers, because your controller will be dead before you're done gaming.
Peanutsrevenge - Monday, November 19, 2012 - link
It's a 'safety feature': as no one paid attention to the warnings previously about taking a 3 hour break every 20 minutes, they're now forcing our hands.

N.B: The information in this post is either completely made up or extremely exaggerated.
Flyingcircus - Monday, November 19, 2012 - link
the extra space would make it easy for third party companies to offer much bigger batteries in the future, however

Spivonious - Monday, November 19, 2012 - link
Wow, the pics down the side of that site make it definitely NSFW. How about a warning next time?

randinspace - Thursday, November 29, 2012 - link
By definition the last thing that would occur to any self-respecting NEET is whether or not something is safe for work.

cervantesmx - Friday, January 4, 2013 - link
I agree, NSFW at all man!

Stuka87 - Monday, November 19, 2012 - link
Yeah, this site is certainly NSFW!!

IdBuRnS - Friday, November 30, 2012 - link
Wow NSFW link...

michal1980 - Sunday, November 18, 2012 - link
is that ever going to happen here? Or are you too busy with Apple these days?
martyrant - Sunday, November 18, 2012 - link
It was $15 at launch; if you cared that much about a review, go buy it and review it yourself.

I'm pretty sure you can still get it for $15 (don't know when the promo for "buying a PC with Windows 7" to upgrade to 8 stopped, but they did not require any proof of purchase or anything else).
Thank you for this teardown Anand. Though I'm not going to be jumping on this bandwagon, it was a good, informative read, and one I'd prefer over a Windows 8 review (something anyone has had access to for over half a year with release previews and, like I said, a $15 launch cost).
Anand Lal Shimpi - Sunday, November 18, 2012 - link
Ryan and Jarred are working on the Windows 8 Performance Guide; we already posted a take on the modern UI in our Windows RT Review, which hit shortly after the Surface review. Lots of stuff coming, it just all takes time.

Take care,
Anand
kyuu - Monday, November 19, 2012 - link
Looking forward to your guys' takes on Windows Phone 8 and the Nokia Lumia 920. Hope those reviews are incoming soon!

Anand Lal Shimpi - Monday, November 19, 2012 - link
Brian did a great piece on WP8 and the WP8X here:
http://www.anandtech.com/show/6415/windows-phone-8...
920 is coming though :)
Take care,
Anand
kyuu - Monday, November 19, 2012 - link
Oh, that was the full Windows Phone 8 review? I thought the whole thing was just a preview, but I guess the preview part was just the 8X then. Well, still looking forward to the 920, anyway!
Belard - Monday, November 19, 2012 - link
Huh? It's simple. Windows 8 = who cares.

michal1980 - Monday, November 19, 2012 - link
Says a moron. If it was an Apple iOS review, anand would have it up the second it went live.
I don't understand this second-hand treatment of Win8 when it will sell millions of copies over the next few years, AND it is a major UI change from the biggest PC OS software company in the world.
Anand Lal Shimpi - Monday, November 19, 2012 - link
I do like Windows 8 a lot. The major UI change we really covered in our Windows RT review (RT has the same UI as Windows 8). The rest is coming in our Windows 8 Performance Guide. I don't remember the last time I wrote an iOS review, and the same is true for a Windows review. Other folks are actively working on filling in the blanks on our Windows 8 coverage now :)

Take care,
Anand
tipoo - Sunday, November 18, 2012 - link
Isn't 6GB/s ridiculously low considering the PS3 and 360 are over 20 each? I've read on other sites that the memory is closer to 17GB/s, also based on the markings.

And holy moly, that CPU die is small! Even smartphone SoCs are larger than that. With three cores seeming most likely, each core is pretty tiny, and that's with the eDRAM on die. I wonder if there is truth to the CPU not out-muscling the PS360 CPUs then, even with their age. The Cell on the same 45nm fab is still over 100mm2 if I remember right, and that's only 200 million transistors.
GoodBytes - Sunday, November 18, 2012 - link
Well... at the end of the day the console is pretty powerful. Here is Call of Duty: Black Ops 2 looking better on the Wii U than the Xbox 360/PS3:
https://www.youtube.com/watch?v=bZO33bCFwks
tipoo - Sunday, November 18, 2012 - link
Which makes me think the 6GB/s is mistaken, as even with a large eDRAM lots would have to be streamed from memory.

chucknelson - Monday, November 19, 2012 - link
The sad part about this, though, is that they're making the same mistake they did last time - this hardware is lacking in the present day, and basically catches up with 6-year old consoles.

The Wii U will be forgotten next year when Sony and Microsoft come out with their consoles. The technology in those will hopefully be a bit more forward looking...
Shadowmaster625 - Monday, November 19, 2012 - link
By the end of next year, they might have that MCM fully integrated and shrunk to 32nm. They could be selling consoles for $139, and even turning a profit on each one. Meanwhile, the other consoles will be selling for $300-$400. What does that extra money really get you? Everything is still pretty much stuck at 1080p.

amdwilliam1985 - Monday, November 19, 2012 - link
I totally agree, 1080p will be the standard for a longgg while. Beyond that, price is very important. It took me 5 years before I bought my first PS3, at $250 (with a discount).
I got my Wii during its first year of deployment.
B3an - Wednesday, November 21, 2012 - link
In case you hadn't noticed, resolution isn't the sole reason graphics power keeps increasing. Games could look massively better at 1080p. The next consoles from MS and Sony will have way more RAM and processing power. No more blurry textures, poor AA, crap physics, and low polygon counts like on all current consoles, including the Wii U.

We aren't anywhere near photorealistic graphics at 1080p. PCs could get near it right now, but are completely held back by all the console ports. You're extremely narrow minded and short sighted.
B3an - Wednesday, November 21, 2012 - link
That's a VERY crap comparison. They don't even say what other version is being shown, and the frames look messed up, as if it's interlaced video from whatever console they're comparing the Wii U against.

tipoo - Sunday, November 18, 2012 - link
800MHz * 4 modules * 16 bits per module * 2 (double data rate) = 102400 Mb/s, or 12800 MB/s

Anand Lal Shimpi - Sunday, November 18, 2012 - link
So if I'm reading the Hynix datasheets correctly, that's an 800MHz data rate, which is where the 6.4GB/s comes from.

tipoo - Sunday, November 18, 2012 - link
Wow, I wonder if even a large eDRAM cache can offset that much speed difference from the PS3 and 360. It's larger in capacity so it would be doing less loading/unloading than them, but some things still depend on memory streaming.

Anand Lal Shimpi - Sunday, November 18, 2012 - link
Nope, you were right in the first place. Hynix lists GDDR5 at data rate, but DDR3 is listed pre-DDR-rate. So 12.8GB/s is correct.

Kaleid - Monday, November 19, 2012 - link
That little? That's absolutely awful, that's like HD 5450 speeds.

Kevin G - Sunday, November 18, 2012 - link
Judging from the pictures, the part number is H5TQ4G63MFA-12C, correct?

From their part number and this PDF (http://www.skhynix.com/inc/pdfDownload.jsp?path=/u... it is indeed 800MHz rated. I believe that is the base clock though, so the data rate is actually 1600 MT/s. Note that the GDDR5 speeds, which go up to 7GHz effective, are not represented in that decoding table. Thus bandwidth for the Wii U would be 12.8 GByte/s.
Roland00Address - Sunday, November 18, 2012 - link
RV740 means a 4770, or only 640 (VLIW5, DX10.1) Radeon shaders. If it were based off the 5770 it would have 800 (VLIW5, DX11) Radeon shaders, but then the die size would have to be about 166mm^2, and Anand only found about 156mm^2.

And a CPU that has such a small die size, only 32.76mm^2... I am pretty sure the Tegra 3 CPU die size is larger than this (once you remove the die area that is dedicated to the GPU and the companion core).
Talk about underwhelming in the specs department.
tipoo - Sunday, November 18, 2012 - link
He just said the size is a bit bigger than the RV740, that doesn't mean it's an RV740 in there. With the supposedly pretty large eDRAM in there that throws off estimates, the GPU core could be pretty customized.Zodiark1593 - Sunday, November 18, 2012 - link
Considering the Xbox 360 GPU is roughly on par with the Radeon 6450 (at least in compute power), this seems to be a pretty solid upgrade on the GPU side.

I do agree on the CPU though. I think the 3DS silicon may actually be larger than that CPU die. Given the use of gamepads, I'd have thought that the CPU would need to be pretty strong. Guess we'll have to see what happens later on.

EnzoFX - Sunday, November 18, 2012 - link
EnzoFX - Sunday, November 18, 2012 - link
A 6450? Is that true? Just sounds low to me.

The low end graphics cards have always been so poor in gaming performance on the PC side; with every iteration, too, they seem to keep the same core performance and only add some compatibility stuff. Always disappointing.
tipoo - Sunday, November 18, 2012 - link
25.6-28.8 GB/s (GDDR5) memory bandwidth and 200-240 GFLOPS single precision compute power sounds about right as an equivalent to the 360's GPU.

Zodiark1593 - Sunday, November 18, 2012 - link
From my research, the Xenos (Xbox 360 GPU) puts out about 240 GFLOPS single precision, and the Radeon 6450 tops out at 240 GFLOPS as well. Of course that doesn't tell the entire story due to the extra eDRAM and a few extra tidbits including microcode optimizations, but yeah...

Low end GPUs exist nowadays to upgrade older PCs for playing HD video well, or otherwise to upgrade from older (pre-Intel HD 3000) integrated graphics, mainly for HTPC use. Because its target market doesn't rely on performance, there's little point in making an entirely new core unless there's a fancy new video decoder or encoder.
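Those throughput figures follow from the usual ALU lanes × 2 FLOPs (one multiply-add per lane per clock) × clock estimate. A quick sketch, with lane counts and clocks taken from commonly cited public spec sheets rather than from this teardown:

```python
# Peak single-precision throughput = scalar ALU lanes x 2 FLOPs (MAD) x clock.
# Lane counts/clocks below are commonly cited specs, not teardown findings.
def gflops(lanes: int, clock_ghz: float) -> float:
    return lanes * 2 * clock_ghz

# Xenos: 48 shader ALUs, each a 5-wide vector unit -> 240 scalar lanes at 500 MHz
print(gflops(48 * 5, 0.5))    # 240.0
# Radeon 6450: 160 stream processors at 750 MHz
print(gflops(160, 0.75))      # 240.0
```

Which is why the two land on the same ~240 GFLOPS figure despite very different architectures.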
Arbee - Monday, November 19, 2012 - link
Remember, 360 silicon was finalized in early 2005. A *lot* has happened at AMD since then :)

Kevin G - Sunday, November 18, 2012 - link
The XBox 360 has 48 shaders that would best be described as pre-Radeon HD 2000 series class. They're not DirectX 10 compliant, at least with what shipped with Vista. The XBox 360 was originally supposed to be DX10 compliant hardware, but Vista was delayed and the PC spec changed while the XBox 360 GPU hardware was already completed.

Anyway, in terms of performance, this puts the GPU between a Radeon X1950GT and Radeon HD 2900GT in terms of capabilities and performance. GPU efficiency has crept upward with AMD's VLIW5 designs over time, which would actually mean that the Radeon 6450 would be slightly faster than the XBox 360's GPU.
It appears that the Wii U's GPU has 32 MB of eDRAM on the GPU. More than likely this amount of eDRAM takes up half the GPU die but should be well worth it in terms of performance. I suspect that there are 96 VLIW5 shaders (480 ALU's), 32 TMU's and 16 ROP's. While everyone is drawing comparison with the RV740, chances are that this design incorporates many of the efficiency improvements found in AMD's VLIW5 architecture that made it into the Barts design (Radeon 6870).
RussianSensation - Monday, November 19, 2012 - link
Still miles faster than the GPUs in PS3/360.

HD 4770 1GB = 51
vs.
X1950Pro 512mb = 20.4
X1800XT 512mb = 16.7
7900GT 256 mb = 16.1
http://alienbabeltech.com/abt/viewtopic.php?p=4117...
PS3's RSX = a 7900GT with half the memory bandwidth and half the ROPs. So it's much slower than the desktop 7900GT.
The GPU in the 360 is probably at best as fast as X1800XT despite unified shader architecture because it is also memory bandwidth crippled.
If it has an HD 4770, that is at least 2.5-3x faster already than either the PS3/360's GPUs. There is probably little doubt that the next PS4 and Xbox will be more powerful but this console is definitely more powerful than current generation.
RussianSensation - Saturday, December 1, 2012 - link
Well, now that I read the article more carefully, the 12.8GB/sec memory bandwidth would cripple even an HD 4770 GPU. Looks like the GPU is hopeless for next gen gaming.

Zink - Sunday, November 18, 2012 - link
Nice pictures. I can't wait for an in depth look at the high end hardware next year's consoles bring.

aryonoco - Sunday, November 18, 2012 - link
The PowerPC is not competitive anymore. I doubt IBM has been throwing any R&D at it since Apple jumped ship. And yes, it is a derivative of POWER, which IBM is still developing, but there never was that much synergy between POWER and PPC, probably less so these days.

It's pretty safe not to compare it with Sandy/Ivy Bridge. But I'm pretty sure even a dual core Clover Trail or even Brazos would outperform this thing.
So, my question is, why is IBM still winning these contracts? Is it just ISA compatibility? I'm not sure that's such a big concern these days, considering how quickly even the PS3 got rid of its PS2 compatibility mode. I mean, does anyone actually want to play low-res Wii games on the Wii U? What for?
And if we are to put ISA compatibility aside, why is IBM still winning these contracts? Is it because Intel isn't even bidding for them, due to low margins?
tipoo - Sunday, November 18, 2012 - link
It's because Intel won't license chips out. IBM wins because they sell the design of the chip, and then console maker X can modify it and shrink it on their own schedule. Also, there are whispers that the PS4 and Nextbox may be using AMD CPUs (or APUs), and AMD already has experience catering to consoles.

tipoo - Sunday, November 18, 2012 - link
By the way, I'm not sure about IBM not being competitive anymore either. The POWER7 CPUs are still extremely competitive with Intel in high throughput scenarios, so I hear.

Kevin G - Monday, November 19, 2012 - link
The weird thing about AMD's new strategy is that it doesn't necessarily mean an x86 chip inside an APU from them anymore. They already have an ARM license, and I'm sure that a licensing deal could be worked out with IBM to include a PowerPC core if the console maker desires (note that the Xbox 360 now uses a triple core PowerPC + Radeon based GPU on one die).

Kevin G - Sunday, November 18, 2012 - link
Actually, the differences between POWER and PowerPC nowadays are purely marketing. The divergence pretty much ended with the POWER5 (though Altivec SIMD wasn't added until the POWER6). The distinctions between POWER and PowerPC in terms of hardware support are in SoC features like on-die cryptography, TCP/IP offload and accelerated memory compression that fall outside of the CPU core.

And IBM has been developing new PowerPC based cores. The Wii U could use the embedded PowerPC 476 core or the PowerPC A2 core. These cores would be a minor step up from what is found in the XBox 360 or the Cell PPE inside the PS3. Sandy Bridge is a far more aggressive core design than either the PPE, PPC 476 or PPC A2, but it is also larger and consumes more power. The main reason IBM gets these contracts is their willingness to design and manufacture a custom chip.
As for POWER, the new POWER7+ takes the performance crown in the high end server world. It manages to top the 10 core Westmere-EX and the 8 core Sandy Bridge-E.
tipoo - Sunday, February 10, 2013 - link
According to the hacker Marcan it's actually using PowerPC 750 cores, so something related to the PowerPC G3.

Zodiark1593 - Sunday, November 18, 2012 - link
Reading an interview on the development of the Wii U, it seems that both the CPU die and GPU die have some form of embedded memory. Not sure what that tiny chip is for though.

tipoo - Sunday, November 18, 2012 - link
That's what I've been hearing too, although the CPU one isn't nearly as large; possibly something like 32MB on the GPU and 3MB on the CPU. The CPU one is just used as an L2 cache replacement.

MrMilli79 - Sunday, November 18, 2012 - link
You mention Hynix DDR3-800 devices, but I guess the 800 means 800MHz, which would translate into DDR3-1600. So that's 12.8GB/s instead of 6.4GB/s. Almost every laptop sold these days is equipped with DDR3-1600 and it's dirt cheap. I would even assume that DDR3-800 might be more expensive these days.

The AMD GPU also functions as the northbridge and southbridge (just as on the Wii), so don't forget to take that into account. RV740 was 137mm². That doesn't leave too much space for the northbridge, southbridge and eDRAM. RV740 also had a power consumption of around 60-80W. Even with a decreased clock speed that wouldn't fit into the 33W of the Wii U. Either it's much less than an RV740 or it's not 40nm.
I see two possible candidates for the CPU.
My first guess would be the PowerPC 470. Multicore capable, very low power consumption, very small, customizable but the speed is more in the range of an ARM core. It would make sense I guess since many developers mention the lack of CPU performance.
My second guess would be the PowerPC A2. Multicore capable, low power consumption, small, ... but not really meant for something like a console (but still possible).
Flyingcircus - Sunday, November 18, 2012 - link
i'm a bit confused about the model number of the memory chips

according to your pictures we're dealing with H5TQ4G63MFR-xxC
however this data sheet states that there is no model of this chip with 512MB capacity
http://goo.gl/itla1
also speed grades in this pdf don't match with the 12 we're seeing here in the model number
i couldn't find that exact model number in any data sheet
am i missing something?
Sniffynose - Sunday, November 18, 2012 - link
See my comment below: they are 256Mb x16 for the 96-ball FBGA chips, so unless 4 more are on the underside of the mainboard, we have just found the OS-only memory, not the dedicated video memory.

Sniffynose - Sunday, November 18, 2012 - link
And yes, the data is in that PDF you linked; the xxC has the exact same specs, only there are 2 models: the 12C (which the Wii U uses) is 800MHz (x2 DDR) and the 11C is 900MHz (x2 DDR).

That's why the sheet has the xxC listing, because those are the only differences with that number.
Flyingcircus - Monday, November 19, 2012 - link
thanks for the info

i don't know where you're seeing the 12C speed grade in that pdf however
it didn't even cross my mind that the wii-u might have that much eDRAM
from watching the teardown PCPer did, i can confirm that there's no further DRAM ICs on the underside of the PCB (i could identify what i believe to be the SLC NAND chip for the OS that the wii-u is supposed to have, which has been mentioned in this teardown, and a bunch of smaller ICs the role of which isn't obvious to me)
http://de.twitch.tv/pcper/b/341042388
minute 18+ is the interesting part
can see the backside of the PCB past minute 20
this much eDRAM seems odd however.. the GPU die doesn't seem big enough to even hold that much embedded memory
Flyingcircus - Monday, November 19, 2012 - link
nevermind, i read the other comments now (should have done that before, d'uh)

not exactly a great outcome
tipoo - Monday, November 19, 2012 - link
I don't think there is dedicated video memory, just the unified memory pool + the eDRAM, sort of like the 360.

Sniffynose - Sunday, November 18, 2012 - link
Those Hynix chips are H5TQ4G63MFR-12C: clock frequency 800MHz, 1.6Gbps/pin, 96-ball FBGA chips. Here's the interesting part nobody has caught onto yet...

They are 256Mb x16 chips... so with 4 chips in the array we are looking at 1GB total RAM... so unless 4 more are hiding underneath, the video RAM is somewhere else, and this RAM is the RAM being used by the OS, which is fine seeing as most PCs today use DDR3-1600 for OS use...
What is on the underside??? In any case your estimate of 4Gb (512MB) per chip is flat out wrong, as they are 256Mb per chip with that model number.

If there are none below, then this might mean that the video RAM is that small extra chip near the GPU/CPU, and that the system RAM is separate from the video RAM, like a proper gaming PC per se. Which would debunk the slow RAM bandwidth theory, because if there is only 1GB of DDR3 present, this is the RAM Nintendo said would be reserved for OS and system use only and not be available to developers.
Ryan Smith - Sunday, November 18, 2012 - link
4Gb is the right density for those chips. We have the specific Hynix datasheet that confirms it.

http://www.hynix.com/inc/pdfDownload.jsp?path=/dat...
Sniffynose - Sunday, November 18, 2012 - link
It's not possible, as they only manufacture 256Mb x16 bus width under that part number. If the chips were 512MB each they would be the x8 bus width, 78-ball FBGA type.

Sniffynose - Sunday, November 18, 2012 - link
Use this data sheet, not the generic one: http://goo.gl/itla1

Ryan Smith - Monday, November 19, 2012 - link
I will say it again: we have a Hynix datasheet with that specific part number on it, and it is specifically listed as being 4Gbit. So if you have a problem with the size, take it up with Hynix.

Sniffynose - Monday, November 19, 2012 - link
I just think you are having a problem with basic math. 256Mb x16 bus width per die means that the entire RAM allocated is 1GB total. There is no 4GB on the Wii U to begin with, so touting that number makes you look insane. Nintendo said themselves that 1GB would be dedicated to the OS (from what we can see, those Hynix chips) and 1GB for developers; that is 2GB.

But hey, it's not the first time in this article alone you guys have posted wrong technical data. It's as bad as NeoGAF claiming the Samsung chip was the system RAM originally; turns out that was the 8GB eMMC chip, aka the built-in hard drive...
Anand Lal Shimpi - Monday, November 19, 2012 - link
The density on those chips is 4Gb, or 512MB.

256Mx16 doesn't directly refer to the density of the chip, rather the organization of the RAM itself. That's 256M words x 16 bits, or 512MB.

The datasheet you linked to confirms this (check out page 3 under the description header - emphasis mine):
...are a ***4,294,967,296-bit*** CMOS Double Data Rate III (DDR3) Synchronous DRAM... Hynix ***4Gb*** DDR3 SDRAMs offer fully synchronous operations referenced to both rising and falling edges of the clock.
Take care,
Anand
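The organization-to-density arithmetic in Anand's reply is easy to verify with a quick sketch (the 256M x16 figures are the ones from the datasheet discussion above):

```python
# DRAM density from its organization: word count x word width.
# "256M x16" means 256M addressable words, each 16 bits wide.
words = 256 * 2**20              # 256M addresses
bits_per_word = 16
total_bits = words * bits_per_word
print(total_bits)                # 4294967296 bits, i.e. a 4Gb part
print(total_bits // 8 // 2**20)  # 512 MB per chip; 4 chips -> 2GB total
```

The 4,294,967,296-bit figure matches the datasheet quote exactly.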
Sniffynose - Monday, November 19, 2012 - link
Yikes, talk about conflicting info within their own technical sheets.

If they are 512MB each, then yes, that's all 2GB of RAM (video and OS), which paints a very very bleak picture for overall memory speed and layout.
Looks like no amount of frame buffer RAM will fix it being 50% less than even the 7-year-old Xbox 360...
To be honest this is disheartening, especially given the system price and hopes of seeing proper ports from this gen and next gen (720/PS4).
I hope Nintendo hasn't cheaped out to the point of no return.
Sorry for the hassle earlier guys, but a misread of those numbers was quite honestly from a hardware point of view the wii u's last hope of being viable when the big 2 hit the arena. With that settled and no other ram on the device aside from edram there's not much else to contemplate.
Thanks again for clarifying this for us.
Anand Lal Shimpi - Monday, November 19, 2012 - link
No worries, it always helps to have another set of critical eyes because I know I do miss stuff like this from time to time. I appreciate the effort to keep us on our toes :)
The Wii U is sort of an in-between generation console in my eyes. The next gen systems should feature much faster GPUs (and CPUs, I hope).
Take care,
Anand
Sniffynose - Sunday, November 18, 2012 - link
Unless there are 8 chips, there is only 1GB being shown on that board, and from what I have seen on other breakdowns there is no RAM on the rear of the board. Something is up?
Anand Lal Shimpi - Monday, November 19, 2012 - link
Nothing is up, those are 4 x 512MB (4Gb) DRAM devices.
Take care,
Anand
Thisis4me - Monday, November 19, 2012 - link
Anand is spot on; it's actually written on the chips themselves (emphasis added): H5TQ ***4G*** 63MFR-12C. Hynix uses that in their code to differentiate their capacities of RAM; the same code with "2G" or "1G" replacing the "4G" refers to the 2Gb (gigabit, not byte) and 1Gb models respectively. Evidence on the chip and from Hynix here:
http://www.hynix.com/inc/pdfDownload.jsp?path=/dat...
Sniffynose - Monday, November 19, 2012 - link
I agree, the tech sheets conflict a lot, which is why it was confusing. That being said, I can't help but feel a bit sick about the Wii U's memory layout. They have shot themselves in the foot again... Sorry again for the hassle. I just wanted 100% backing on the RAM because frankly I was hoping there would be more to it, but Nintendo cheaped out clearly, so not much to look forward to hardware-wise lol. Cheers
roltzje - Monday, November 19, 2012 - link
It's a shame Nintendo didn't choose to modify something in the mid-tier Radeon 5xxx series like the 5770. It might put them out of the running for being able to keep up with next gen MS and Sony consoles... because if it doesn't support DirectX 11, that is a big problem for the future.
tipoo - Monday, November 19, 2012 - link
We don't know that it does or it doesn't yet though; there are still no hard facts on the GPU family as far as I know. And it wouldn't use DX11 anyways, that's a Microsoft API, but I get that you mean DX11-like physical GPU features used by Nintendo's own API. Off to Chipworks or someone to look at the GPU under a microscope!
roltzje - Monday, November 19, 2012 - link
It seems difficult for me to believe that something like a 4770 would be residing in there; the TDP of the whole system was around 33W while running a game, which is far below the TDP of a 4770...
Ryan Smith - Monday, November 19, 2012 - link
Keep in mind that the 4770 was among the very first products produced on a very troubled TSMC 40nm process. Anything produced now (nearly 3.5 years later) is going to have the benefit of the process maturing and lots of design experience to fall back on for optimizing the layout and transistor leakage.
tipoo - Monday, November 19, 2012 - link
Plus the embedded GPUs draw quite a bit less power, i.e. the E6760 mentioned below.
Flyingcircus - Monday, November 19, 2012 - link
do we even know that the GPU is still being manufactured in the original structure size? wouldn't it be possible that they've worked with AMD on a die shrink?
AMD has been using the 32nm process in mass production for their IGPs for well over a year after all
you have a point with the refining of the process but the TDPs of the later cards manufactured in that structure size (i.e. HD 5 and HD 6) don't really support that as power consumption seems to have been largely staying level on GPUs with similar transistor numbers and similar raw GFLOPs performance
tarv - Monday, November 19, 2012 - link
Read somewhere there was a leaked email confirming this was the part that the Wii U GPU was based off of:
http://www.amd.com/us/products/embedded/graphics-p...
tarv - Monday, November 19, 2012 - link
That gives it 480 shaders and tops out at 576 GFLOPS. It also is 40nm and runs at 35-40 watts.
Sniffynose - Monday, November 19, 2012 - link
The unknown is that many hint at it being DX10.1, or not even a DX instruction set (as mentioned by some indie devs); also the inclusion of DDR3 with a 64-bit bus width puts it really far back in the 4000 series era, like the 4550 range of the R700 chips. The E6760 would be epic, but it uses DDR with a 128-bit bus width, which is why it's the rumored chip for either the PS4 or 720.
The wattage is definitely massively lower though; I mean the launch PS3 pushed 180 watts, whereas the Wii U is hitting under 40 consistently.
I can't wait until it gets the X-ray done so we can see how much or how little is really in there.
I think what is throwing us off is that the RAM is shared. Is it RAM for general use while the discrete GPU has more of its own (making the GPU you suggested completely viable), or is it shared as in being used on a much older, inferior GPU?
I personally hope we learn of good news, like something along the lines of the E6760, rather than some horrible low-wattage E4xxx series GPU with shared RAM.
Kevin G - Monday, November 19, 2012 - link
You're forgetting the massive amount of eDRAM on the GPU die with regards to bandwidth. The bus there could easily be 1024 bits wide (or wider). Bandwidth for that 32 MB of eDRAM should not be an issue. Case in point: the PS2's Graphics Synthesizer had a 2560-bit wide bus to its 4 MB of eDRAM on die, and that was over 10 years ago.
Flyingcircus - Monday, November 19, 2012 - link
do we already know the size of the eDRAM on the GPU die? i wasn't aware.
if it's big enough that could be the saving grace of the GPU; 32MB does seem a bit small however if you take today's high-res textures into account
Kevin G - Monday, November 19, 2012 - link
The 32 MB figure has been floating around for a while as one of the few confirmed specs out there (2 GB of system memory was the other hard figure that has been out in the public). The 32 MB wouldn't be for texture storage in most cases. Rather it would be used to hold various frame buffers for quick read/write operations. That is enough for four 32-bit 1920 x 1080 buffers.
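Kevin G's four-buffer figure checks out with quick arithmetic (a sketch of my own; the 32-bit color depth and 1080p resolution are as stated above):

```python
# Size of one 32-bit 1920x1080 frame buffer, in MiB
width, height = 1920, 1080
bytes_per_pixel = 4                           # 32-bit color
buffer_mib = width * height * bytes_per_pixel / 2**20
print(round(buffer_mib, 2))                   # ~7.91 MiB per buffer
print(round(4 * buffer_mib, 2))               # ~31.64 MiB -> four buffers just fit in 32 MB
```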
Flyingcircus - Tuesday, November 20, 2012 - link
well that's the first time i heard about it (i did hear about the 2GB rumors beforehand though... i also heard numerous other rumors regarding different memory sizes, one of them was bound to be correct) and as far as i'm aware nintendo hasn't made any comment about it, so i would be careful with the word "confirmed"
Kevin G - Monday, November 19, 2012 - link
And here is a source:
http://www.extremetech.com/gaming/137746-nintendo-...
Flyingcircus - Tuesday, November 20, 2012 - link
it is unclear however why or how they "know" there's 32MB of eDRAM; i'm still sceptical.
don't get me wrong, 32MB sounds reasonable and is probably entirely possible, but i wouldn't rule out other sizes just yet
Flyingcircus - Monday, November 19, 2012 - link
that would be neat but the e6760 is an MCM in itself:
http://alaintechupdate.blogspot.de/2011/05/amd-lau...
it's not on-die RAM but on-module RAM, which pretty much eliminates that possibility :/
piroroadkill - Monday, November 19, 2012 - link
The closest comparison in GPUs available I could see is the 6750M:
http://www.notebookcheck.net/AMD-Radeon-HD-6750M.4...
tipoo - Sunday, February 10, 2013 - link
We have a die shot now; no way it's an e6760:
http://www.neogaf.com/forum/showthread.php?t=51162...
laytoncy - Monday, November 19, 2012 - link
I know that wireless is great and all, but why not include at least a 10/100, if not 10/100/1000, Ethernet port? Wireless can be flaky at times, and when the Wii was first released it was awful for me. Is there really no room for them to add that on? Is it really that costly?
abrowne1993 - Monday, November 19, 2012 - link
This annoyed me too. I guess the best we can do is use a USB to Ethernet adapter. It's obviously not ideal, though.
Flyingcircus - Tuesday, November 20, 2012 - link
wouldn't the wii-u have to support that software side though? doesn't sound like it could work.
maybe with custom firmware
Zink - Monday, November 19, 2012 - link
You guys should really put Windows Phone 8 on the SunSpider chart to help demonstrate the software dependence of SunSpider results.
Anand Lal Shimpi - Monday, November 19, 2012 - link
Done :)
Take care,
Anand
tipoo - Monday, November 19, 2012 - link
Wasn't there supposed to be a dedicated ARM chip to run the OS in the U?
Arbee - Monday, November 19, 2012 - link
The original Wii had an ARM9TDMI system controller which handled security and also emulated some of the GameCube hardware for GC back-compat.
But yes, I would think possibly that 3rd die is a similar ARM core. Guess we'll find out when Chipworks gets their hands on it.
tipoo - Tuesday, November 20, 2012 - link
If it *is* the eDRAM cache and it's not actually on the GPU chip as expected, that would mean more space for GPU stuff, so size comparisons are not thrown off by the eDRAM.
MistressMouse - Monday, November 19, 2012 - link
In regards to the GPU: it looks like the die size estimate would put it closer to the HD4670? That was a low-ball rumor floating around for a while before being replaced by the E6760.
Any chance you can give some insight here?
tipoo - Tuesday, November 20, 2012 - link
The eDRAM throws off any estimates based on size though; I think it will be hard to say until someone takes pictures under a microscope so the units can be counted out.
Ares1521 - Monday, November 19, 2012 - link
Would be very nice to have some inside-die photos to confirm the number of cores of the processor, something like that: http://en.wikipedia.org/wiki/File:Cell-Processor.j...
Of course, it will break the console... maybe you can do a partnership with Blendtec: remove the die for the photo, then let them blend it...
tipoo - Monday, November 19, 2012 - link
I think Chipworks will get around to that soon. They often partner with iFixit for that.
Conner64 - Monday, November 19, 2012 - link
Earlier this year, wasn't a dev saying the system is 4-6% more powerful than current gen, and the next Xbox is about (rough estimate) 10-15% more powerful than the Wii U? It's not surprising really; did anyone expect Nintendo to release a system that's really graphically powerful and has a mini tablet to boot, at a reasonable price? Nintendo claim they are losing money per console, because of the GamePad no doubt, but as someone posted earlier they'll be able to get out of the red much faster than what Sony and Microsoft have planned. And besides, what Nintendo is betting on in the long run is that because all games will now be at equal resolution, the average console gamer is going to find it difficult to tell which game looks better than the other...
Death666Angel - Monday, November 19, 2012 - link
So the next Xbox will be 20% faster than the current one? That would be a huge letdown. Can you link to where you read that? I would expect at least a doubling of performance if they stay at the same price point.
tarv - Monday, November 19, 2012 - link
I think he mistyped. The original rumor was the Wii U was 5-6x more powerful than an Xbox 360, and that the 720 would be 20% faster than the Wii U. Too lazy to link, but there are tons of articles from 6 months ago that came from a Microsoft document that circulated based off of the hardware projections.
Flyingcircus - Tuesday, November 20, 2012 - link
that leaked hardware document said that the nextbox *should* (not would... it was a very early document apparently, even IF it wasn't fake, so likely it hadn't been determined yet) be 6-8x times as powerful as the xbox360.
there was no comparison to the wii there
tarv - Tuesday, November 20, 2012 - link
This was the nextbox info, and most articles claim that it would be only 20% more powerful than the Wii U. The original source came from a leaked internal Microsoft document:
http://www.ign.com/articles/2012/01/24/xbox-720-wi...
tipoo - Tuesday, November 20, 2012 - link
I would be incredibly surprised if this was 5-6x more powerful than the Xbox 360. The whole thing draws 30 watts, the memory bandwidth is half that of the 360, and the CPU is smaller than Xenon or Cell on 45nm. None of those things point to total performance by themselves, but they do seem to me like Nintendo went for an econobox.
Flyingcircus - Tuesday, November 20, 2012 - link
4-6%? performance differences that small are hard to measure accurately even on PCs with dedicated benchmark tools and similar architectureson consoles it will be next to impossible with their very differing hardware architecture
the most accurate statements you will get is something like "1.5 times more powerful" like you got when the gamecube was compared to the wii.. it's not really anything to go by but it gives a very very rough estimate
with a chip comparable to the RV 740 the wii-u would be somewhere near 3-4 times as powerful as the last gen, but that's only counting the performance of the GPU
comparing the CPU performance is even harder especially when factoring in the cell CPU of the PS3
kopicha - Tuesday, November 20, 2012 - link
It is weird that I get 1175.6 for the SunSpider test with my S3 (international), but here it's getting over 1.4k. Despite countless times I try to run it, I am always getting below 1.2k, even if I purposely try to do something on the device to slow it down.
Penti - Tuesday, November 20, 2012 - link
Try running Chrome, as that is what is benchmarked here on the site. I hope they will make sure it's a fairly robust browser on the Wii U.
matty3.0 - Tuesday, November 20, 2012 - link
iFixit have put up their Wii U teardown (gamepad included):
http://www.ifixit.com/Teardown/Nintendo+Wii+U+Tear...
dalewb - Tuesday, November 20, 2012 - link
Very interesting article - I also like to "see what's inside". I did chuckle a bit at how Anand wrote it as if it were an instruction manual for how YOU can tear your own Wii U console apart and most likely make it inoperable :D
tipoo - Tuesday, November 20, 2012 - link
Came in handy when I eventually had to take apart my 360 :P
Wouldn't take it apart while it's in warranty, but a few years down the line who knows what could go wrong. Although with only 30W of heat to dissipate it won't flex like the 360 mobo.
jonjonjonj - Wednesday, November 21, 2012 - link
i do like how you gave instructions on how to take the wii apart. im guessing not many people are going to go buy a $300+ wii and rip it open and void the warranty. i would have liked to have more info comparing the wii to xbox, ps and pc.
tipoo - Wednesday, November 21, 2012 - link
The Wii U CPU is about as big as a *single*-core Atom at 45nm, while packing three cores. Size isn't everything, but there is only so much you can do with a certain number of transistors. If three cores is true, each core has about a third the transistor budget of one Atom core. That's crazy small. Even six years later, I don't find it likely something as big as one Atom core can do more work than Cell or Xenon, as inefficient as those were.
wiiboy101uk - Wednesday, November 21, 2012 - link
3x PowerPC 400/476FP-type cores with added Broadway-fied extensions and a custom 3MB eDRAM cache. the 3MB eDRAM cache is the same size and heat as a 1MB SRAM cache, and the 3-core PowerPC 400 custom is about the same size as a Broadway CPU in the Wii
the PowerPC 476FP is 2x per clock the power of an ARM A9, and the Wii U CPU is again upgraded over that core with game-centric upgrades: real-time data decompression/compression and graphics burst pipes/buffer, AKA A NEW VERSION OF GEKKO/BROADWAY
the Wii U CPU core is the most powerful 32-bit PowerPC core ever made, so that makes it the most powerful 32-bit RISC core on earth; a standard PowerPC 476FP is 2x the chip of an ARM A9 at the same speed, and the Wii U Espresso cores are a step up again
a tri-core Wii U CPU at say 1.6GHz will eat a 2.0GHz ARM A9 quad core for breakfast with ease
PowerPC 476FP and the Wii U CPU are 5 instructions per clock; the ARM A9 is 2 instructions. PowerPC 476FP runs on a 128-bit bus, ARM A9s run on a 64-bit bus; ARM A9s have cache issues and max 1MB SRAM cache
the Wii U CPU has an eDRAM cache and it's 3MB; it's the fastest 32-bit RISC core on earth, PROVE ME WRONG
Wii U RAM is either 64-bit x 2, or 128-bit x 1, or 128-bit x 1 plus a secondary 32-bit RAM bus for the OS only, LIKE WII AND GAMECUBE'S DUAL BUS
wii: 64-bit plus 32-bit; gamecube: 64-bit plus 8-bit
so the Wii U is either 128-bit plus 32-bit, or just 128-bit, or dual 64-bit, THAT'S COMMON SENSE
the main RAM is not 32-bit, LOL anandtech
tipoo - Sunday, November 25, 2012 - link
It won't be competing with ARM cores...wiiboy101uk - Wednesday, November 21, 2012 - link
lol anandtech and all other haters, you lied about gamecube vs xbox all those years ago and now you're lying about the Wii U. so the Wii U has a 16-bit to 32-bit bus and has bandwidth in main RAM of 6.4 to 12.8 GB? that is complete crap anandtech....
PowerPC 32-bit 400s @ 45nm have a 128-bit ring bus, not a 64-bit FSB like Wii and GameCube; also the main RAM bus was 64-bit in GameCube and Wii, and the secondary bus was 8-bit on GameCube and 32-bit on Wii, WHICH YOU ALREADY KNOW ANANDTECH
so why the anti-Nintendoism nonsense? Wii U has 6.4GB, then it's 12.8GB, WHEN YOU KNOW 100% GAMECUBE AND WII WERE 64-BIT BUS, NOT 32-BIT LIKE YOU'RE NOW TRYING TO SAY
the 2GB RAM is either of the 3 setups that follow, and nothing like what you're saying, OUT OF NINTENDO HATE I MAY ADD
2GB 1600 DDR3, 800MHz bus, 128-bit = 28GB, not 12.8 or 6.4; it = 28GB
so 2x 64-bit = 28GB and 1x 128-bit = 28GB. SO WHERE IN HELL DID YOU GET 6.4 OR 12.8 GB
the RAM is likely 1600MHz (800 dual channel); the bus is likely 128-bit or dual 64-bit; the PowerPC 400 range runs on a 128-bit bus at 800MHz
so the information we ALL HAVE is 1600MHz RAM, 800 bus, 800 GPU and 1600 CPUs
so at dual 64-bit or single 128-bit, the bandwidth = 28GB, not 6.4 or 12.8 or 17GB
17 was a lie, 6.4 was a lie and 12.8 was a lie
PowerPC at 45nm runs at 1600MHz with an 800MHz bus
the Wii and GameCube were based around the bus speed, so if that continues, the LOGICAL conclusion at this point =
CPU: 1600 customized PowerPC 400 with IBM eDRAM cache, 3MB
GPU: customized RV7 4670 with 32MB eDRAM, at 800MHz
ring bus: 128-bit, 800MHz
RAM: DDR3 or GDDR3 at 800MHz dual channel = 1600MHz and 28GB bandwidth
the 32MB eDRAM buffer/cache to the GPU will have massive bandwidth, and the 3MB cache to the CPU will be 2x-plus the bandwidth of SRAM under the same conditions
so high bandwidth, low latency, like GameCube all over again
oh, that xbox vs gamecube comparison you did years ago? ALSO FULL OF SHIT
weak GPU, fixed function? PLEASE ANANDTECH, STOP LYING
the xbox GPU was 4 texture layers and 8 texture stages
GC Flipper was 8 texture layers and 16 stages
the xbox GPU was 8x4 real-time lighting
Flipper was 8x8 real-time lighting
Flipper had 2.5x the internal bandwidth of the xbox GPU
there's many more facts i can add
lol at your blatant anti-Nintendo Wii U teardown, as if the Wii U only has 6.4GB bandwidth, or 12.8GB as Wii had
28GB bandwidth, eDRAM and 4GB 1T-SRAM and 4GB GDDR3
the main RAM of Wii was 4GB x 2 = 8, so you're saying the Wii U has less bandwidth than Wii, or only 50% more? please stop anandtech, you're losing all credibility
Flyingcircus - Thursday, November 22, 2012 - link
no, the bit width of the interface checks out.
the model number on the chips (which you can see in the pictures, they're not made up) leads to DRAM chips by hynix with a 16-bit wide interface
just because the basic cpu model has a wider interface, that doesn't mean it's being used or even exists on the custom wii-u cpu; besides, wider interfaces require more space on the die, and as you can see space is already pretty rare
dual channel mode only works if there's actually any capacity left to run it with.. if you have a PC with 2 DRAM modules for example.. each module will have a bandwidth of 64 bits.. so if you split the data evenly between the 2 modules you effectively get a 128bit wide bus.. this is not possible on the wii-u since the DRAM chips with their 16bit bus already run at their maximum capacity.. it's effectively running quad channel with each channel being 16 bits wide
also your maths are complete shite
2x64bit=28gb? what kind of screwed-up calculation is that? even if that was the way you'd calculate bus transfer rates (which it isn't), it doesn't add up at all (if anything 2x64bit would be 16 bytes, but that's as close to "28gb" as i can get)
800 MHz DDR modules will achieve 1600*10^6 transfers per second (1600 MHz effective frequency at 800 MHz actual frequency, thus the name double data rate)
times 128 bit (dual channel bus) = 204,800,000,000 bit/s = 25,600,000,000 byte/s = 25.6 GB/s (decimal)
half of that is, surprise surprise, exactly 12.8 GB/s, which is the speed you get when only having a 64-bit wide bus, which apparently is the case with the wii-u
the rest of your post is mostly a collection of speculation, complete off topic stuff, misunderstood technical data and outright wrong information all of it wrapped into a writing style that makes my toenails roll themselves into a sushi maki so i'm not even going to go into that
please do yourself a favor and don't ever get a job that requires even basic mathematic knowledge
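For anyone following along, the peak-bandwidth formula Flyingcircus is using can be written out explicitly (a sketch of my own; the DDR3-1600 speed is the thread's assumption, not a confirmed spec):

```python
def ddr3_peak_gbps(effective_mts, bus_bits):
    """Peak transfer rate in decimal GB/s: transfers per second x bytes per transfer."""
    return effective_mts * 1e6 * bus_bits / 8 / 1e9

print(ddr3_peak_gbps(1600, 64))    # 12.8 GB/s on a 64-bit bus
print(ddr3_peak_gbps(1600, 128))   # 25.6 GB/s if the bus were 128-bit
```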
donttryandblockmii - Thursday, November 22, 2012 - link
why did you go for the secondary bus of Wii as the main bus of Wii U?!!! you deliberately ignored the main 64-bit PowerPC bus that GameCube and Wii had!!... and went for the 32-bit secondary bus that the secondary GDDR3 RAM was on in Wii!!!! ALREADY KNOWING FULL WELL THAT WASN'T THE MAIN BUS. so you deliberately went to the lowest-bit bus to make it look like the Wii U is weak. DON'T DENY THIS ANANDTECH, THAT'S WHAT YOU DID.... why would the secondary bus be the main bus next gen? THAT'S TOTAL CRAP. so GameCube and Wii had a 64-bit main bus, and all of a sudden the bus went 32-bit the 3rd time round? IN YOUR DREAMS ANANDTECH
the 32-bit PowerPC core that makes up the 3-core Broadway 2 ESPRESSO CPU in the Wii U is based on the PowerPC 476FP; all 32-bit PowerPCs now run on a 128-bit ring bus, the 64-bit FSB is of the past, it's no longer used
so why ignore the 64-bit bus and the new 128-bit bus AND TELL LIES about a 32-bit bus? you're exposed as anti-Nintendo i think (weak cpu lie, i'll debunk that with ease too)
isn't it likely the Wii U, with its MCM and a 45nm PowerPC Broadway-fied tri core, WOULD ALSO BE USING THE SAME 128-BIT RING BUS, AH-HHHMMMMmmmmmm. wiiboy cannot be fooled like a silly ps3 fanboy
another point you fail to understand: only PowerPC 400s on this 128-bit ring bus SUPPORT MULTI-CORE; the PowerPC 750 of Wii and GameCube DOES NOT support multi-core !!!!!!!!!!!!!<<<<<suggestion<<<<<
so common sense is the DDR3 (which is actually GDDR3; Samsung don't show the G in their specs, so it's GDDR3, obviously they market the RAM as gDDR3 not GDDR3, they drop the G as it's meaningless, they're the same RAM)
at 1600MHz is perfectly in line with 45nm PowerPC thinking, as the bus is 800MHz 128-bit and the recommended RAM is DDR2/DDR3 1600MHz (you said yourselves the RAM looks 1600)
so the bus is most likely 128-bit. YOU SAID 32-BIT OUT OF ANTI-NINTENDO SPITE, DIDN'T YOU, TO MAKE THE WII U LOOK BAD. lol, i see through this pc fanboy nonsense like superman looking through clear glass lol
if the RAM is 1600MHz then it's highly likely the CPU is 1600MHz (exactly lining up with the IBM PowerPC 476FP) and the ring bus in the MCM is 128-bit 800MHz, and as the Wii and GameCube were both balanced to the bus speed exactly, then no doubt the Wii U is also balanced to the bus. IF SO:
CPU = 1600MHZ, GPU = 800MHZ, AND RAM = 800MHZ DUAL CHANNEL = 1600MHZ, AND THE EDRAM ON THE GPU IS 800MHZ
A 2-TO-1 BALANCE REPLACES THE 3-TO-1 BALANCE OF WII AND GAMECUBE. REMINDER: GAMECUBE WAS ORIGINALLY 2-TO-1, 404 CPU CLOCK AND 202 GPU CLOCK; IT WAS CHANGED WHEN IBM COULDN'T GET THE OLD 64-BIT G3 BUS TO 202MHZ, SO THEY DROPPED IT TO A 3-TO-1 BALANCE INSTEAD OF 2-TO-1
SURELY YOU REMEMBER THIS GUYS, I DO, I'M A CORE GAMER, IT'S OUR JOB TO REMEMBER THIS!!!!!!!
so has Nintendo returned to the originally wanted, tighter 2-to-1 balance now that IBM have a half-speed bus on their 32-bit PowerPC CPU? I THINK THEY HAVE !!!!!
if the Wii U is still a 64-bit bus and not a 128-bit ring bus for whatever reason, then is it not likely the 2GB RAM is on a dual 64-bit bus, still giving single 128-bit bus levels of bandwidth
and isn't it safe to say, if they're not using a 128-bit ring bus, then there are still dual memory buses just like Wii and GameCube had, so again it's still way higher than the 32-bit THAT YOU'RE SAYING
likely speed of RAM is 800MHz x 2 = 1600, and as Nintendo make clock-balanced systems, BECAUSE THEY'RE NOT STUPID ENOUGH TO TRASH AND WASTE CLOCKS, COMBINED WITH LOW-LATENCY FAST RAM = EFFICIENT
those speeds might not be exact but they're ballpark, UNLIKE YOURS
likely eDRAM to GPU = 512-bit or higher. REMEMBER GUYS, WII AND GAMECUBE HAD A 512-BIT TEXTURE CACHE AND A, WHAT WAS IT, 360-BIT FRAME/Z BUFFER. JESUS, CHECK YOUR OWN ARTICLES GUYS BEFORE TEARING THE WII U DOWN
so a minimum 512-bit for the eDRAM if not more, and a 128-bit bus for main RAM. LOL, 32-BIT ....
tipoo - Tuesday, November 27, 2012 - link
Are you the same guy as wiiboy? Same sentence structure, same rant style.
donttryandblockmii - Thursday, November 22, 2012 - link
just to add: IBM eDRAM as a level 2 cache has more than 2x the bandwidth of SRAM, so the 3MB cache of the Wii U has twice if not more the bandwidth of the same cache made from SRAM. and the eDRAM to the GPU is over 10x Wii's and looks very alike, so has the Wii U got an eDRAM texture/shader cache as well as a frame and Z buffer?
frame and Z buffer = 22MB and texture/shader cache = 10MB, just guessing here, it may not use a texture cache like Wii and GameCube, BUT IT WOULD MAKE SENSE; the developers have said texture and shader data can be loaded into the GPU, THAT SAYS BIG EDRAM CACHE TO ME, LIKE GAMECUBE
donttryandblockmii - Friday, November 23, 2012 - link
GAMECUBE: 2.6GB BANDWIDTH MAIN RAM
WII: 4GB X 2 = 8GB MAIN RAM
WII U: LIKELY 28GB OR THEREABOUTS, GDDR3
GAMECUBE: 10GB TEXTURE CACHE AND A 7-PLUS GB FRAME AND Z BUFFER
SO WII U COULD BE 100GB TEXTURE CACHE AND 75GB FRAME AND Z BUFFER. BUT THEN AGAIN, WHAT ABOUT FREE AA? likely using a system like the x360 but with more ram and bandwidth
lpedraja2002 - Tuesday, November 27, 2012 - link
Excellent article, but I was very disappointed that the engravings on the CPU or GPU weren't shown! Why didn't you guys clean off the thermal paste before posting pics? This is what a lot of people have been wishing to know, so I'll assume that you guys didn't show it because of legal reasons and not because you forgot.
tipoo - Sunday, February 10, 2013 - link
A couple of us on NeoGAF got the die shot from Chipworks and have started analyzing it:
http://www.neogaf.com/forum/showthread.php?t=51162...
shortpiped - Thursday, April 4, 2013 - link
People are still using this as some sort of fact sheet, so I figured I'd drop you guys a line.
You can't determine bus size by looking at a RAM chip's specification documentation. It's not in there, as the memory architecture is part of the machine, not the RAM product.
Also, while you got the Hynix RAM right, the Samsung RAM is the same exact RAM chip used in OG 360s, down to the serial number/nomenclature. By your logic of chip type = bus width by serial number, the Wii U would have a 256-bit bus. Which also isn't true.
And finally, we have the die picture of the GPU, and can clearly see the DDR3 I/O where the RAM bus plugs directly into the GPU.
If the Wii U only has a 64-bit bus, why are there 158 pins on the DDR3 I/O?
And if the Wii U only had half the main memory bandwidth of the PS3/360, how is Need for Speed possible?
Bigger assets for textures, and better framerate/performance, at half the bandwidth? eDRAM can't fix that; every access to main memory is at main memory's bandwidth.
You are looking at a bandwidth a little over 30GB/s.
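Inverting the same DDR3 math gives the bus width a given bandwidth would imply (my own back-of-envelope, assuming DDR3-1600; none of these widths are confirmed specs):

```python
def implied_bus_bits(gbps, effective_mts=1600):
    # bus width in bits = (bytes/s * 8 bits/byte) / transfers per second
    return gbps * 1e9 * 8 / (effective_mts * 1e6)

print(implied_bus_bits(12.8))   # 64.0  -> the 64-bit reading in the article
print(implied_bus_bits(25.6))   # 128.0 -> a 128-bit interface
print(implied_bus_bits(30.0))   # 150.0 -> "a little over 30GB/s" would need >128 bits
```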