The Xbox One: Hardware Analysis & Comparison to PlayStation 4
by Anand Lal Shimpi on May 22, 2013 8:00 AM EST
It’s that time of decade again. Time for a new Xbox. It took four years for Microsoft to go from the original Xbox to the Xbox 360. The transition from Xbox 360 to the newly announced Xbox One will take right around eight years, and the 360 won’t be going away anytime soon either. The console business demands long upgrade cycles in order to make early investments in hardware (often sold at a loss) worthwhile. This last round was much longer than it ever should have been, so the Xbox One arrives to a very welcoming crowd.
Yesterday Microsoft finally took the covers off the new Xbox, a platform it hopes will last for many years to come. At a high level, here’s what we’re dealing with:
- 8-core AMD Jaguar CPU
- 12 CU/768 SP AMD GCN GPU
- 8GB DDR3 system memory
- 500GB HDD
- Blu-ray drive
- 2.4/5.0GHz 802.11 a/b/g/n, multiple radios with WiFi Direct support
- 4K HDMI in/out (for cable TV passthrough)
- USB 3.0
- Available later this year
While Microsoft was light on technical details, I believe we have enough to put together some decent analysis. Let’s get to it.
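A quick sanity check on those GPU specs: GCN allots 64 shader processors per compute unit, so 12 CUs works out to exactly the 768 SPs listed above. Microsoft didn’t disclose a GPU clock at the reveal, so the sketch below assumes a purely hypothetical 800MHz to illustrate the math.

```python
# Back-of-the-envelope math for the Xbox One's GCN GPU.
compute_units = 12
sps_per_cu = 64          # fixed by the GCN architecture
assumed_clock_ghz = 0.8  # hypothetical - no clock was announced at the reveal

shader_processors = compute_units * sps_per_cu  # 12 x 64 = 768, matching the spec list
# Each SP can retire one fused multiply-add (2 FLOPs) per clock.
peak_gflops = shader_processors * 2 * assumed_clock_ghz
print(f"{shader_processors} SPs, ~{peak_gflops:.0f} GFLOPS at {assumed_clock_ghz} GHz")
# -> 768 SPs, ~1229 GFLOPS under these assumptions
```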
Chassis
The Xbox 360 was crafted during a time that seems so long ago. Consumer electronics styled in white were all the rage, and we were still a few years away from the aluminum revolution that engulfs us today. Looking at the Xbox One tells us a lot about how things have changed.
Microsoft isn’t so obsessed with size here, at least initially. Wired reports that the Xbox One is larger than the outgoing 360, although it’s not clear whether we’re talking about the new slim or the original design. Either way, given what’s under the hood, skimping on cooling and ventilation wouldn’t have been a good idea.
The squared-off design and glossy black chassis scream entertainment center. Microsoft isn’t playing for a position in your games cabinet; the Xbox One is just as much about consuming media as it is about playing games.
In its presentation Microsoft kept referencing how the world has changed. Smartphones, tablets, even internet connectivity are very different today than they were when the Xbox 360 launched in 2005. It’s what Microsoft didn’t mention that really seems to have played a role in its decision making behind the One: many critics didn’t see hope for another generation of high-end game consoles.
With so much of today’s focus on mobile, free-to-play and casual gaming on smartphones and tablets - would anyone even buy a next-generation console? For much of the past couple of years I’ve been going around meetings saying that before consolidation comes great expansion. I’ve been saying this about a number of markets, but I believe the phrase is very applicable to gaming. Casual gaming, the advent of free-to-play and even the current mobile revolution won’t do anything to diminish demand for high-end consoles today or in the near term - they simply expand the market for gamers. Eventually those types of games and gaming platforms will grow to the point where they start competing with one another, and then the big console players might have an issue to worry about, but I suspect that’s still some time away. The depth offered by big gaming titles remains unmatched elsewhere. You can argue that many games are priced too high, but the Halo, BioShock, Mass Effect and CoD experience still drives a considerable portion of the market.
The fact that this debate is happening at all, however, has to have impacted Microsoft. Simply building a better Xbox 360 wasn’t going to guarantee success, and I suspect a not-insignificant contingent within the company felt that making the Xbox One even as much of a gaming machine as it is would be a mistake. What resulted was a subtle pivot in strategy.
The Battle for the TV
Last year you couldn’t throw a stone without hitting a rumor of Apple getting into the TV business. As of yet those rumors haven’t gone anywhere other than to point to continued investment in the Apple TV. Go back even further and Google had its own TV aspirations, although it met with far less success. More recently, Intel threw its hat into the ring. I don’t know for sure how things have changed with the new CEO, but as far as I can tell he’s a rational man, and things should proceed with Intel Media’s plans for an IPTV service. All of this is a roundabout way of saying that TV is clearly important, and viewed by many as one of the next ecosystem battles in tech.
Combine the fact that TV is important with the fact that the Xbox 360 has evolved into a Netflix box for many, add a dash of uncertainty about the future of high-end gaming consoles, and you end up with the formula behind the Xbox One. If the future doesn’t look bright for high-end gaming consoles, turning the Xbox into something much more than that will hopefully guarantee its presence in the living room. At least that’s what I suspect Microsoft’s thinking was going into the Xbox One. With that in mind, everything about the One makes a lot of sense.
Comments
JDG1980 - Wednesday, May 22, 2013
In terms of single-threaded performance *per clock*, Thuban > Piledriver. Sure, if you crank up the clock rate *and the heat and power consumption* on Piledriver, you can barely edge out Deneb and Thuban on single-threaded benchmarks. But if you clock them the same, the Thuban uses less power, generates less heat, and performs better. Tom's Hardware once ran a similar test with Netburst vs Pentium M, and the conclusion was quite blunt: the test called into question the P4's "right to exist". The same is true of the Bulldozer/Piledriver line.
And I don't buy the argument that K10 is too old to be fixable. Remember that Ivy Bridge and Haswell are part of a line stretching all the way back to the original Pentium Pro. The one time Intel tried a clean break with the past (Netburst) it was an utter failure. The same is true of AMD's excavation equipment line, and for the same reason - IPC is terrible, so the only way to get acceptable performance is to crank up clock rate, power, noise, and thermals.
silverblue - Wednesday, May 22, 2013
It's true that K10 is generally more effective per clock, but look at it this way - AMD believed that the third AGU was unnecessary as it was barely used, much like when VLIW4 took over from VLIW5 because average slot utilisation within a streaming processor was 3.4 at any given time. Put simply, they made trade-offs where it made sense to make them. Additionally, K10 was most likely hampered by its 3-issue front end, but it also lacked a whole load of ISA extensions - SSE4.1 and 4.2 are good examples.
Thuban compares well with the FX-8150 in most cases, and favourably so when we're considering lighter workloads. The work done to rectify some of Bulldozer's ills shows that Piledriver is not only about 7% faster per clock, but can clock higher within the same power envelope. AMD was obviously aiming for more performance within a given TDP. The FX-83xx series is out of reach of Thuban in terms of performance.
The 6300 compares with the 1100T BE as such:
http://www.cpu-world.com/Compare/316/AMD_FX-Series...
Oddly, one of your arguments for having a Thuban in the first place was power consumption. The very reason a Thuban isn't clocked as high as the top X4s is to keep power consumption in check. Those six cores perform very admirably against even a 2600K in some circumstances, and generally with Bulldozer and Piledriver you'd look to the FX-8xxx CPUs if comparing with Thuban. However, I expect the FX-6350 will be just enough to edge out the 1100T BE in pretty much any area:
http://www.cpu-world.com/Compare/321/AMD_FX-Series...
The two main issues with the current "excavation equipment line", as you put it, are a lack of single-threaded power, plus the inherent inability to switch between threads more than once per clock - clocking Bulldozer high may offset the latter in some way, but at the expense of power usage. The very idea that Steamroller fixes the latter with some work done to help the former, and that Excavator improves IPC whilst (supposedly) significantly reducing power consumption, should be evidence enough that whilst it started off bad, AMD truly believes it will get better. In any case, how much juice does anybody expect eight cores to use at 4GHz with a shedload of cache? Does anybody remember how hungry Nehalem was, let alone the P4?
I doubt that Jaguar could come anywhere near even a downclocked A10-4600M. The latter has a high-speed dual channel architecture and a 4-issue front end; to be perfectly honest, I think that even with its faults, it would easily beat Jaguar at the same clock speed.
Tacking bits onto K10 is a lost cause. AMD doesn't have the money, and even if it did, Bulldozer isn't actually a bad idea. Give them a chance - how much faster was Phenom II over the original Phenom once AMD worked on the problem for a year?
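For a rough sense of how silverblue's per-clock and clock-speed points combine, here's a minimal sketch. The ~7% IPC figure comes from the comment above; the clock speeds are illustrative stock values (1100T vs. FX-8350), not measured data.

```python
# Illustration: per-clock gains and clock headroom compound.
thuban_ipc = 1.00           # Phenom II X6 per-clock throughput as the baseline
piledriver_ipc = 1.07       # ~7% faster per clock, per the comment above
thuban_clock_ghz = 3.3      # 1100T base clock
piledriver_clock_ghz = 4.0  # FX-8350 base clock

relative = (piledriver_ipc * piledriver_clock_ghz) / (thuban_ipc * thuban_clock_ghz)
print(f"Piledriver ~{(relative - 1) * 100:.0f}% faster in this scenario")
# -> ~30% faster once both the IPC and clock advantages are counted
```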
Shadowmaster625 - Wednesday, May 22, 2013
Yeah but AMD would not have stood still with K10. Look at how much faster Regor is compared to the previous Athlon:
http://www.anandtech.com/bench/Product/121?vs=27
The previous Athlon had a higher clock speed and the same amount of cache, but Regor crushes it by almost 30% in Far Cry 2. It is 10% faster across the board despite being clocked lower and consuming far less power. Had they continued with Thuban, it is possible they would have continued to squeeze 10% per year out of it as well as reducing power consumption by 15%, which, if you do the math, leaves us with something relatively competitive today. Not to mention they would have saved a LOT of money. They could have easily added AVX or any other extensions to it.
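Doing that math with Shadowmaster625's hypothetical rates (the 10%/year gain and the 2010 starting point are the comment's assumptions, not measured results):

```python
# Compounding a hypothetical 10%/year IPC gain from a 2010 Thuban baseline.
yearly_gain = 0.10
years = 3  # 2010 -> 2013

projected = (1 + yearly_gain) ** years
print(f"~{(projected - 1) * 100:.0f}% faster than the 2010 baseline")
# -> ~33% cumulative - in the same ballpark as Piledriver's combined
#    IPC + clock advantage over Thuban
```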
Hubb1e - Wednesday, May 22, 2013
Per clock, Thuban > Piledriver, but power consumption favors Piledriver. Compare two chips of similar performance: the PhII 965 is a 125W CPU and the FX-4300 is a 95W CPU, and they perform similarly, with the FX-4300 actually beating the PhII by a small margin.
kyuu - Wednesday, May 22, 2013
... Lol? You can't simply clock a low-power architecture up to 4GHz. Even if you could, a 4GHz Jaguar-based CPU would still be slower than a 4GHz Piledriver-based one.
Jaguar is a low-power architecture. It's not able (or meant) to compete with full-power CPUs in raw processing power. It's being used in the Xbox One and PS4 for two reasons: power efficiency and cost. It's not there because of its processing power (although it's still a big step up from the CPUs in the 360/PS3).
plcn - Wednesday, May 22, 2013
BD/PD have plenty of viability in big-power-envelope, big/liquid-cooler desktop PC arrangements. Consoles aspire to be much quieter, cooler, and more energy efficient - thus the sensible Jaguar selection. Even the best ITX gaming builds out there are still quite massive and relatively unsightly versus what seems achievable with Jaguar... Now for laptops, on the other hand, a dual Jaguar 'netbook' could be very, very interesting. You can probably cook your eggs on it, too, but still interesting.
lmcd - Wednesday, May 22, 2013
It isn't a step in the right direction in IPC. Piledriver is 40% faster than Jaguar at the same clocks, and it also clocks higher.
Stop spreading the FUD about Piledriver -- my A8-4500M is a very solid processor with very strong graphics performance and excellent CPU performance for all but the most taxing tasks.
lightsout565 - Wednesday, May 22, 2013
Pardon my ignorance, but what is the "Embedded Memory" used for?
tipoo - Wednesday, May 22, 2013
It's a fast memory pool for the GPU. It could help by holding the framebuffer or caching textures, etc.
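As a quick sizing check on that framebuffer example: the Xbox One's embedded pool is 32MB of ESRAM, and a 1080p render target is comfortably smaller than that. The 32-bit color format below is an assumption for illustration.

```python
# How much of a 32MB ESRAM pool does a 1080p framebuffer occupy?
width, height = 1920, 1080
bytes_per_pixel = 4  # assumed 32-bit RGBA color target

framebuffer_mb = width * height * bytes_per_pixel / (1024 ** 2)
print(f"1080p color buffer: ~{framebuffer_mb:.1f} MB")
# -> ~7.9 MB; color plus a depth buffer fits easily, but multiple
#    render targets at 1080p start to squeeze the 32MB budget
```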
BSMonitor - Wednesday, May 22, 2013
Embedded memory latency is MUCH closer to L1/L2 cache latency than system memory latency. System memory is Brian and Stewie taking an airline flight to Vegas vs. the teleporter to Vegas that would be cache/embedded memory...