Intel Broadwell Architecture Preview: A Glimpse into Core M
by Ryan Smith on August 11, 2014 12:01 PM EST

With Haswell Refresh fully behind us and 2014 now into its second half, Intel is turning their attention to their next generation of products and processes. Intel’s tick-tock methodology, coupled with the long development periods of new products, means that the company has several projects in flight at any given time. So while we have seen the name Broadwell on Intel’s roadmaps for some time now, the reality of the situation is that we know relatively little about Intel’s next generation architecture and the 14nm process that it is the launch vehicle for.
Typically Intel unveils the bulk of the technical details of their forthcoming products at their annual Intel Developer Forum, and with the next IDF scheduled for the week of September 9th we’ll see just that. Today, however, Intel is breaking from their established practice a bit by not waiting until IDF to deliver everything at once. In a presentation coinciding with today’s embargo, dubbed Advancing Moore’s Law in 2014, Intel will be offering a preview of sorts for Broadwell while detailing their 14nm process.
Today’s preview and Intel’s associated presentation are going to be based around the forthcoming Intel Core M microprocessor, using the Broadwell configuration otherwise known as Broadwell-Y. The reason for this is a combination of several factors, and in all honesty it’s probably driven as much by investor relations as it is consumer/enthusiast relations, as Intel would like to convince consumers and investors alike that they are on the right path to take control of the mobile/tablet market through superior products, superior technology, and superior manufacturing. Hence today’s preview will be focused on the part and the market Intel feels is the most competitive and most at risk for the next cycle: the mobile market that Core M will be competing in.
To that end Intel’s preview is very much a preview; we will see bits and pieces of Broadwell’s CPU architecture, GPU architecture, and packaging, along with information about Intel’s 14nm process. However this isn’t a full architecture preview or a full process breakdown. Both of those will have to wait for Intel’s usual forum of IDF.
Diving into matters then, Core M will be the launch vehicle for Broadwell and will be released for the holiday period this year. In fact, Intel is already in volume production of the Broadwell-Y CPU, and production units are shipping to Intel’s customers (the OEMs) to begin production and stockpiling of finished devices for the holiday launch.
Intel’s decision to initially focus Broadwell on the mobile market comes as the company takes the next step in their plan to extend the Core processor series into these devices. Arguably, Intel has been slow to respond to the rise of ARM devices, which has undercut traditional PC sales and quickly become the biggest threat to Intel’s processor dominance in years. Intel is far from doomed right now, but even they see the potential threat farther down the line if they do not act.
Intel for their part has responded, but it has been a step-by-step (multi-year) process that has seen the company progressively build smaller and less power-hungry CPUs to fit the needs of the mobile market. Since Intel integrated their graphics on-die with Sandy Bridge in 2011, the company has continued to tweak the designs of their products, with the Ivy Bridge and Haswell generations introducing further optimizations and new manufacturing processes. Now on their latest iteration with Broadwell, the company believes they’re turning a corner and have the technology they need to be a leader in the high performance mobile market. It’s important to note that despite Intel’s best intentions here, Broadwell and Core M remain targeted at premium devices; you won’t see these parts in cheap tablets. The duty of doing battle with ARM at the low end remains Atom’s alone.
Many of these changes ultimately amount to boosting performance and reducing power consumption to the point where power and heat are where they need to be for mobile form factors, either through process efficiency improvements or through better power management and wider dynamic ranges – boosting where it matters and doing a better job of idling between tasks. However, as Intel has discovered, they not only need to meet the TDP requirements of a tablet, they need to meet the size requirements too. That is a particularly daunting task when the entire thickness of a device needs to be under 10mm, and the CPU thinner yet.
As a result, coupled with Core M’s performance improvements and power reductions is a strong emphasis on the size of the processor package itself and what Intel could do to reduce it. Intel calls this an outside-in system design, with various parts of Intel focusing on everything from the size of the logic board needed to hold the processor to the thickness of the processor die itself. In the following pages we’ll take a look at Intel’s efforts to get slim, but to kick things off we have a picture of Broadwell-Y from Computex 2014.
From left to right: Broadwell-Y (Core M), Broadwell ULT/ULX and Haswell ULT/ULX
Intel wants a greater foothold in the mobile market and they want it badly. And with Broadwell-Y they believe they finally have what they need to accomplish that goal.
158 Comments
wurizen - Monday, August 11, 2014 - link
well, an fx-8350 is toe-to-toe with an i7-2600k, which is no slouch even today. and comparing the fx-8350 with today's i7-4770k would be a little unfair since the 4770k is 22nm while the 8350 is at 32nm. and we're not even considering software optimizations from the OS and/or programs that are probably bent towards intel chips due to their ubiquity.

so, i think, you're wrong that the fx-8350 doesn't provide good enough performance. i have both an i7-3770k oc'd to 4.1 ghz and an fx-8320 at stock and the amd is fine. it's more than good enough. i've ripped movies using handbrake on both systems and to me, both systems are fast. am i counting milliseconds? no. does it matter to me if the fx-8320 w/ let's say an amd r9-290 has 85 fps for so and so game and an i7-4770k w/ the same gpu has a higher fps of 95, let's just say? i don't think so. that extra 10 fps cost that intel dude $100 more. and 10 extra frames at an average of 85-95 fps is imperceptible. it's only when the frames drop below 60 that one notices, since most monitors are at 60 hz.
so what makes the fx not good enough for you again? are you like a brag queen? a rich man?
frostyfiredude - Monday, August 11, 2014 - link
Not fair to compare against a 22nm from Intel? Bogus, I can go to the store and buy a 22nm Intel so it should be compared against AMD's greatest. An i5-4670K matches or exceeds the performance of even the FX-9590 in all but the most embarrassingly threaded tasks while costing $50 more. Cost to operate the machine through the power bill makes up for that price difference at a fairly standard 12c per kWh when used heavily 2 hours per day for 4 years, or idling 8 hours per day for the same 4 years.

Your argument for gaming with the 8350 being good enough is weak too when the $10 cheaper i3-4430 keeps up. Or spend $125 less to get a Pentium G3258 AE, then mildly overclock it to again have the same good enough gaming performance if >60FPS is all that matters. The i3 and Pentiums are ~$70 cheaper yet when power use is counted again.
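The power-bill math in the comment above is easy to sanity check. Below is a minimal back-of-the-envelope sketch in Python: the 136W load delta is an assumed figure for illustration (roughly the gap between the FX-9590's 220W TDP and the i5-4670K's 84W TDP), not a measured draw; the 12c/kWh rate and 2 hours/day of heavy use come straight from the comment.

```python
# Back-of-the-envelope check of the power-bill argument above.
# The load power gap is an assumption for illustration (~220W vs ~84W TDP);
# real-world draw will differ.
load_delta_w = 136      # extra watts drawn under load (assumed)
rate_per_kwh = 0.12     # 12 cents per kWh, as stated in the comment
hours_per_day = 2       # "used heavily 2 hours per day"
years = 4

extra_kwh = load_delta_w / 1000 * hours_per_day * 365 * years
extra_cost = extra_kwh * rate_per_kwh
print(f"Extra energy: {extra_kwh:.0f} kWh, extra cost: ${extra_cost:.2f}")
# ~397 kWh and ~$48 over four years under these assumptions
```

Under those assumptions the extra electricity comes to roughly $48 over four years, which is indeed in the neighborhood of the ~$50 price gap cited.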
wurizen - Tuesday, August 12, 2014 - link
well, if a pentium g3258 is good enuff for gaming, then so is an fx-8350. whaaaaaat? omg we know intel is king. i acknowledge and understand that. intel rules. but, amd is not bad. not bad at all is the point i'm trying to make. /omg
wetwareinterface - Monday, August 11, 2014 - link
wow...first off you are assuming a lot and not bothering to check any published benchmarks out there so,
1. 8350 isn't even equal to 2500 i5 let alone 2600 i7.
2. 32nm vs. 22nm means nothing at all when comparing raw performance in a desktop. it limits the thermal ceiling, so in a laptop the larger-process chip will run hotter and therefore be unable to hit higher clocks, but in a desktop it means nil.
3. handbrake ripping relies on the speed of the dvd/blu-ray drive; handbrake transcoding relies on cpu performance, and the 8350 gets spanked there by a dual core i3 not by milliseconds but by tens of seconds. against an i5 the gap gets to the level of minutes, against an i7 more so.
4. let's say you're pulling framerates for an r9-290 out of somewhere other than the ether... reality is an i5 is faster than the 8350 in almost any benchmark i've ever seen by roughly 15% overall. in certain games with lots of ai you get crazy framerate advantages with the i5 over the 8350, things like rome total war and starcraft 2 and diablo 3 etc...
i'll just say fx8350 isn't good enough for me and i'm certainly not a rich man. system build cost for what i have vs. what the 8350 system would have run was a whopping $65 difference
wurizen - Tuesday, August 12, 2014 - link
#3 is B.S. a dual-core i3 can't rip faster than an fx-8350 in handbrake.

#4 the r9-290 was an example to pair a fairly high end gpu with an fx-8350. a fairly high end gpu helps in games. thus, pairing it with an fx-8350 will give you a good combo that is more than good enough for gaming.
#2 22nm vs. 32nm does matter in desktops. the fx-8350 is 32nm. if it goes to 22nm, the die shrink would enable the chip to either go higher in clockspeed or lower its tdp.
u sound like a benchmark queen or a publicity fatso.
wurizen - Tuesday, August 12, 2014 - link
oh and #1--i am not saying the fx-8350 is better than the i7-2600k. i said "toe-to-toe." the i5-2500k can also beat the fx-8350 b/c of intel's IPC speed advantage. but, i think the reason for that is programs not being made to be multithreaded and not using the fx-8350's 8 cores to their potential. since amd trails intel in IPC performance by a lot--this means that a 4-core i5-2500k can match it or sometimes even beat it in games. in a multithreaded environment, the 8-core fx-8350 will always beat the i5-2500k. although it might still trail the 4-core + 4 fake cores i7-2600k. just kidding. lol.

i said toe to toe with the 2600k, which means it's "competitive" with an i7-2600k even though the AMD is handicapped with slower IPC speed and most programs/OS not optimized for multithreading. so, to be 10-20% behind in most benchmarks against an i7-2600k is not bad considering how programs take advantage of intel's higher IPC performance.
do u understand what im trying to say?
Andrew Lin - Tuesday, August 26, 2014 - link
i'm sorry, is your argument here that the FX-8350 is better because it's inferior? because that's all i'm getting out of this. Of course a benchmark is going to take advantage of higher IPC performance. That's the point of a benchmark: to distinguish higher performance. The way you talk about benchmarks it's as if you think benchmarks only give higher numbers because they're biased. That's not how it works. The benchmarks give the i7-2600k higher scores because it is a higher performance part in real life, which is what anyone buying a CPU actually cares about. Not to mention the significantly higher efficiency, which is just an added benefit.

Also, it's really hard to take you seriously when your posts make me think they're written by a teenage girl.
wurizen - Tuesday, August 12, 2014 - link
also, if the fps disparity is so huge btwn fx-8350 and say i5-2500k in games u mention like starcraft 2, then something is wrong with that game. and not the fx-8350. i actually have sc2 and i have access to a pc w/ an fx-8320. so i am going to do a test later tonight. my own pc is an i7-3770k. so i could directly compare 2 different systems. the only thing is that the amd pc has an hd5850 gpu, which should be good enuff for sc2 and my pc has a gtx680 so it's not going to be a direct comparison. but, it should still give a good idea, right?

wurizen - Tuesday, August 12, 2014 - link
i just played starcraft 2 on a pc with an fx-8320 (stock clockspeed), 8GB 1600MHz RAM, 7200rpm HDD and an old AMD HD5850 w/ 1GB VRAM. the experience was smooth. the settings were 1080P, all things at ultra or high and antialiasing set to ON. i wasn't looking at FPS since i don't know how to do it with starcraft 2, but the gameplay was smooth. it didn't deter my experience.

i also play this game on my own pc, which is an i7-3770k OC'd to 4.1, 16GB 1600MHz RAM, 7200rpm HDD and an Nvidia GTX680 FTW w/ 2GB VRAM, and i couldn't tell the difference as far as the smoothness of the gameplay is concerned. there are some graphical differences between the AMD GPU and the Nvidia GPU but that is another story. my point is that my experience was seamless going from the FX chip pc to my own pc with the 3770k.
to make another point, i also have this game on my macbook pro and that is where the experience of playing this game goes down. even in low settings. the MBP just can't handle it. at least the one i have with the older gt330m dGpu and dual-core w/ hyperthreading i7 mobile cpu.
so.... there.... no numbers or stats. just the experience, which is what counts to me, and it did not change with the pc that had the amd fx cpu.
wurizen - Tuesday, August 12, 2014 - link
well, i should point out that my macbook pro (mid-2010 model) can handle starcraft 2. but, it's not a "fun" experience. or as smooth.