The Xbox One: Hardware Analysis & Comparison to PlayStation 4
by Anand Lal Shimpi on May 22, 2013 8:00 AM EST

CPU & GPU Hardware Analyzed
Although Microsoft did its best to minimize AMD’s role in all of this, the Xbox One features a semi-custom 28nm APU designed with AMD. If this sounds familiar it’s because the strategy is very similar to what Sony employed for the PS4’s silicon.
The phrase semi-custom comes from the fact that AMD is leveraging much of its already developed IP for the SoC. On the CPU front we have two Jaguar compute units, each with four independent processor cores and a shared 2MB L2 cache. The combination of the two gives the Xbox One its 8-core CPU. This is the same basic layout as the PS4’s SoC.
If you’re not familiar with it, Jaguar is the follow-on to AMD’s Bobcat core - think of it as AMD’s answer to the Intel Atom. Jaguar is a 2-issue, out-of-order architecture with roughly 20% higher IPC than Bobcat thanks to a number of tweaks. In ARM terms we’re talking about something that’s faster than a Cortex A15. I expect Jaguar to come close to, but likely fall behind, Intel’s Silvermont, at least at the highest shipping frequencies. Jaguar is the foundation of AMD’s Kabini and Temash APUs, where it will ship first. I’ll have a deeper architectural look at Jaguar later this week. Update: It's live!
Inside the Xbox One, courtesy Wired
There’s no word on clock speed, but Jaguar at 28nm is good for up to 2GHz depending on thermal headroom. Current rumors point to both the PS4 and Xbox One running their Jaguar cores at 1.6GHz, which sounds about right. On the CPU side you’re likely looking at a TDP of around 30W with all cores fully loaded.
The move away from PowerPC to 64-bit x86 cores means the One breaks backwards compatibility with all Xbox 360 titles. Microsoft won’t be pursuing any sort of backwards compatibility strategy, although if a game developer wanted to, it could port an older title to the new console. Interestingly enough, the first Xbox was also an x86 design - from a hardware/ISA standpoint the new Xbox One is backwards compatible with its grandfather, although Microsoft would have to enable that as a feature in software - something that’s quite unlikely.
Microsoft Xbox One vs. Sony PlayStation 4 Spec Comparison

| | Xbox 360 | Xbox One | PlayStation 4 |
|---|---|---|---|
| CPU Cores/Threads | 3/6 | 8/8 | 8/8 |
| CPU Frequency | 3.2GHz | 1.6GHz (est) | 1.6GHz (est) |
| CPU µArch | IBM PowerPC | AMD Jaguar | AMD Jaguar |
| Shared L2 Cache | 1MB | 2 x 2MB | 2 x 2MB |
| GPU Cores | - | 768 | 1152 |
| Peak Shader Throughput | 0.24 TFLOPS | 1.23 TFLOPS | 1.84 TFLOPS |
| Embedded Memory | 10MB eDRAM | 32MB eSRAM | - |
| Embedded Memory Bandwidth | 32GB/s | 102GB/s | - |
| System Memory | 512MB 1400MHz GDDR3 | 8GB 2133MHz DDR3 | 8GB 5500MHz GDDR5 |
| System Memory Bus | 128-bit | 256-bit | 256-bit |
| System Memory Bandwidth | 22.4 GB/s | 68.3 GB/s | 176.0 GB/s |
| Manufacturing Process | - | 28nm | 28nm |
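The system memory bandwidth figures in the table fall directly out of bus width and data rate: peak bandwidth is the bus width in bytes multiplied by the effective transfer rate. A quick sanity check (a minimal sketch; it treats the quoted memory frequencies as effective data rates in MT/s):

```python
# Peak memory bandwidth = bus width (bytes) x effective data rate.
# Figures taken from the spec table above; rates treated as MT/s.

def bandwidth_gbps(bus_bits: int, data_rate_mtps: float) -> float:
    """Peak bandwidth in GB/s from bus width (bits) and data rate (MT/s)."""
    return bus_bits / 8 * data_rate_mtps / 1000

print(bandwidth_gbps(128, 1400))   # Xbox 360, GDDR3:  22.4 GB/s
print(bandwidth_gbps(256, 2133))   # Xbox One, DDR3:  ~68.3 GB/s
print(bandwidth_gbps(256, 5500))   # PS4, GDDR5:      176.0 GB/s
```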
On the graphics side it’s once again obvious that Microsoft and Sony are shopping at the same store, as the Xbox One’s SoC integrates an AMD GCN based GPU. Here’s where things start to get a bit controversial. Sony opted for an 18 Compute Unit GCN configuration, totaling 1152 shader processors/cores/ALUs. Microsoft went with a far smaller configuration: 768 (12 CUs).
Microsoft can’t make up the difference in clock speed alone (AMD’s GCN seems to top out around 1GHz on 28nm), and based on current leaks it looks like both MS and Sony are running their GPUs at the same 800MHz clock. The result is a 33% reduction in compute power, from 1.84 TFLOPS in the PS4 to 1.23 TFLOPS in the Xbox One. We’re still talking about over 5x the peak theoretical shader performance of the Xbox 360, likely even more given increases in efficiency thanks to AMD’s scalar GCN architecture (MS quotes up to 8x better GPU performance) - but there’s no escaping the fact that Microsoft has given the Xbox One less GPU hardware than Sony gave the PlayStation 4. Note that unlike the Xbox 360 vs. PS3 era, Sony's hardware advantage here won't need any clever developer work to extract - the architectures are near identical, Sony just has more resources available to use.
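Those TFLOPS figures are easy to verify: each GCN ALU can retire one fused multiply-add (two floating-point operations) per clock, so peak throughput is just ALU count x 2 x clock. A quick sketch using the rumored 800MHz clock:

```python
# Peak single-precision shader throughput for a GCN GPU:
# each ALU retires one FMA (2 FLOPs) per clock.
# ALU counts from the article; 800MHz is the rumored clock for both consoles.

def peak_tflops(alus: int, clock_ghz: float) -> float:
    """Peak shader throughput in TFLOPS."""
    return alus * 2 * clock_ghz / 1000

print(peak_tflops(768, 0.8))    # Xbox One, 12 CUs: ~1.23 TFLOPS
print(peak_tflops(1152, 0.8))   # PS4, 18 CUs:      ~1.84 TFLOPS
```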
Remember all of my talk earlier about a slight pivot in strategy? Microsoft seems to believe that throwing as much power as possible at the next Xbox wasn’t the key to success and its silicon choices reflect that.
245 Comments
tipoo - Wednesday, May 22, 2013 - link
I wonder how close the DDR3 plus small, fast eSRAM can get to the GDDR5's peak performance in the PS4. The GDDR5 will be better in general for the GPU, no doubt, but how much will be offset by the eSRAM? And how much will GDDR5's high latency hurt the CPU in the PS4?

Braincruser - Wednesday, May 22, 2013 - link
The CPU is running at a low frequency, ~1.6GHz, which is half the frequency of most mainstream processors. And GDDR5's latency shouldn't be more than double the DDR3 latency. So in effect the latency stays the same, relatively speaking.

MrMilli - Wednesday, May 22, 2013 - link
GDDR5 actually has around 8-10x worse latency compared to DDR3. So the CPU in the PS4 is going to be hurt. Everybody's talking about bandwidth, but the Xbox One is going to have such a huge latency advantage that maybe in the end it's going to be better off.

mczak - Wednesday, May 22, 2013 - link
GDDR5 having much worse latency is a myth. The underlying memory technology is all the same after all; just the interface is different. Though yes, memory controllers of GPUs are more optimized for bandwidth than latency, that's not inherent to GDDR5. The latency may be very slightly higher, but it probably won't be significant enough to be noticeable (no way for a factor of even 2, let alone 8 as you're claiming). I don't know anything about the specific memory controller implementations of the PS4 or Xbox One (well, other than one using DDR3 and the other GDDR5...) but I'd have to guess latency will be similar.
shtldr - Thursday, May 23, 2013 - link
Are you talking latency in cycles (i.e. relative to the memory's clock rate) or latency in seconds (absolute)? Latency in cycles is going to be worse; latency in seconds is going to be similar. If I understand it correctly, the absolute (objective) latency expressed in seconds is the deciding factor.
I got my info from Beyond3D, but I went to dig into whitepapers from Micron and Hynix and it seems that my info was wrong.

Micron's DDR3-2133 has a CL14 read latency specification, though it could possibly be set as low as CL11 on the Xbox. Hynix's GDDR5 (I don't know which brand of GDDR5 the PS4 will use, but they'll all be more or less the same) is specified at CL18 up to CL20 for GDDR5-5500.

So even though this doesn't give actual latency information, since that depends a lot on the memory controller, it probably won't be worse than 2x.
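For illustration, here is how those CL figures translate to absolute time. This is a rough sketch under stated assumptions: it takes DDR3's CAS latency as counted against its I/O clock (half the data rate) and GDDR5's as counted against its command clock (typically a quarter of the data rate); real latency depends heavily on the memory controller, which neither console has detailed.

```python
# Convert CAS latency from clock cycles to nanoseconds.
# Assumptions (not confirmed for either console): DDR3's CL is counted
# against its I/O clock (data rate / 2); GDDR5's CL is counted against
# its command clock (data rate / 4).

def cas_ns(cl_cycles: int, clock_mhz: float) -> float:
    """Absolute CAS latency in nanoseconds."""
    return cl_cycles / clock_mhz * 1000

print(cas_ns(14, 2133 / 2))   # DDR3-2133 CL14  -> ~13.1 ns
print(cas_ns(20, 5500 / 4))   # GDDR5-5500 CL20 -> ~14.5 ns
```

Under those assumptions the absolute latencies land within a nanosecond or two of each other, consistent with the "similar in seconds" argument above.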
tipoo - Monday, May 27, 2013 - link
Nowhere near as bad as I thought GDDR5 would be, given what everyone is saying about it to defend DDR3. And given that it runs at such a high clock rate, the CL effect will be reduced even more (that's measured in clock cycles, right?).

Riseer - Sunday, June 23, 2013 - link
For game performance, GDDR5 has no equal at the moment. There is a reason why it's used in GPUs. MS is building a media center, while Sony is building a gaming console. Sony won't need to worry so much about latency for a console that puts games first and everything else second. Overall the PS4 will play games better than the Xbone. Also, eSRAM isn't a good thing; the only reason Sony didn't use it is because it would complicate things more than they should be. This is why Sony went with GDDR5 - it's a much simpler design that will streamline everything. This time around it will be MS with the more complicated console.

Riseer - Sunday, June 23, 2013 - link
Also, let's not forget you only have 32MB worth of eSRAM. At 1080p, devs will push for more demanding effects. On the PS4 they have 8 gigs of RAM with around 70GB/s more bandwidth. Since DDR3 isn't good for doing graphics, that only leaves 32MB of true VRAM. That said, the Xbone can use the DDR3 RAM for graphics, the issue being that DDR3 has low bandwidth. MS had no choice but to use eSRAM to claw back some performance.

CyanLite - Sunday, May 26, 2013 - link
I've been a long-term Xbox fan, but the silly Kinect requirement scares me. It's only a matter of time before somebody hacks that. And I'm a casual sit-down kind of gamer. Who wants to stand up and wave arm motions playing Call of Duty? Or shout multiple voice commands that are never recognized the first time around?

If the PS4 eliminates the camera requirement, gets rid of the phone-home Internet connections, and lets me buy used games, then I'm willing to reconsider my console loyalty.