UL Benchmarks: PCMark and 3DMark

This section deals with a selection of the UL Futuremark benchmarks - PCMark 10, PCMark 8, and 3DMark. While the first two evaluate the system as a whole, 3DMark focuses on the graphics capabilities.

PCMark 10

UL's PCMark 10 evaluates computing systems for various usage scenarios (generic / essential tasks such as web browsing and starting up applications, productivity tasks such as editing spreadsheets and documents, gaming, and digital content creation). We benchmarked select PCs with the PCMark 10 Extended profile and recorded the scores for various scenarios. These scores are heavily influenced by the CPU and GPU in the system, though the RAM and storage device also play a part. The power plan was set to Balanced for all the PCs while processing the PCMark 10 benchmark.
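As a rough illustration of how a single overall number emerges from the scenario scores above: PCMark-style composites are typically weighted geometric means of the category scores, so a weak category drags the overall result down proportionally. The weights and sub-scores below are illustrative assumptions, not UL's published constants or measured results from this review:

```python
from math import prod

def composite_score(scores, weights):
    """Weighted geometric mean of scenario scores.

    A geometric mean rewards balanced systems: doubling one
    sub-score raises the composite less than doubling them all.
    """
    total = sum(weights)
    return prod(s ** (w / total) for s, w in zip(scores, weights))

# Hypothetical scenario scores (not measured results from this review)
essentials, productivity, gaming = 9000.0, 7200.0, 4100.0
overall = composite_score([essentials, productivity, gaming], [1, 1, 1])
print(round(overall))
```

This is one reason a system that excels in Essentials but stumbles in Gaming (like the Frost Canyon) cannot make up the difference in the Extended composite.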

The Frost Canyon NUC comes in the middle of the pack, bettered by systems with either higher-TDP CPUs or better-performing storage. Its hexa-core CPU does not deliver any substantial benefit across the various PCMark 10 scenarios, and the absence of Iris Plus Graphics / eDRAM pulls down the 'Gaming Score'.

Futuremark PCMark 10 - Essentials

Futuremark PCMark 10 - Productivity

Futuremark PCMark 10 - Gaming

Futuremark PCMark 10 - Digital Content Creation

Futuremark PCMark 10 - Extended

PCMark 8

We continue to present PCMark 8 benchmark results (as those have more comparison points) while our PCMark 10 scores database for systems grows in size. PCMark 8 provides various usage scenarios (home, creative and work) and offers ways to benchmark both baseline (CPU-only) as well as OpenCL accelerated (CPU + GPU) performance. We benchmarked select PCs for the OpenCL accelerated performance in all three usage scenarios. These scores are heavily influenced by the CPU in the system, and the scores roughly track what was observed in the PCMark 10 workloads.

Futuremark PCMark 8 - Home OpenCL

Futuremark PCMark 8 - Creative OpenCL

Futuremark PCMark 8 - Work OpenCL

3DMark

UL's 3DMark comes with a diverse set of graphics workloads that target different Direct3D feature levels. Correspondingly, the rendering resolutions are also different. We use 3DMark 2.4.4264 to get an idea of the graphics capabilities of the system. In this section, we take a look at the performance of the Intel NUC10i7FNH (Frost Canyon) across the different 3DMark workloads.
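3DMark's overall scores generally combine the graphics and CPU/physics sub-scores with a weighted harmonic mean, which is dominated by the weaker component - one reason an iGPU-limited system lands low even with a strong CPU. A minimal sketch, with illustrative weights rather than UL's actual per-workload constants:

```python
def overall_3dmark(graphics, physics, w_graphics=0.85, w_physics=0.15):
    """Weighted harmonic mean of sub-scores (3DMark-style composite).

    The harmonic mean is pulled toward the smaller term, so a weak
    GPU score caps the overall result regardless of CPU strength.
    """
    return (w_graphics + w_physics) / (w_graphics / graphics + w_physics / physics)

# Hypothetical sub-scores: strong CPU, weak integrated GPU
print(round(overall_3dmark(graphics=1200.0, physics=8000.0)))
```

With these assumed numbers, the composite lands near the graphics sub-score despite the much higher physics result, mirroring the behavior discussed for the Frost Canyon below.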

3DMark Ice Storm

This workload has three levels of varying complexity - the vanilla Ice Storm, Ice Storm Unlimited, and Ice Storm Extreme. It is a cross-platform benchmark (which means that the scores can be compared across different tablets and smartphones as well). All three use DirectX 11 (feature level 9) / OpenGL ES 2.0. While the Extreme version renders at 1920 x 1080, the other two render at 1280 x 720. The graphs below present the various Ice Storm workloads' numbers for the different systems that we have evaluated.

UL 3DMark - Ice Storm Workloads

3DMark Cloud Gate

The Cloud Gate workload is meant for notebooks and typical home PCs, and uses DirectX 11 (feature level 10) to render frames at 1280 x 720. The graph below presents the overall score for the workload across all the systems being compared. The absence of eDRAM / Iris Plus Graphics results in the Frost Canyon performing significantly worse than the Bean Canyon.

UL 3DMark Cloud Gate Score

3DMark Sky Diver

The Sky Diver workload is meant for gaming notebooks and mid-range PCs, and uses DirectX 11 (feature level 11) to render frames at 1920 x 1080. The graph below presents the overall score for the workload across all the systems being compared. At 1080p, the Frost Canyon falls further behind and drops below the Baby Canyon's performance - pointing to its lack of GPU prowess.

UL 3DMark Sky Diver Score

3DMark Fire Strike Extreme

The Fire Strike benchmark has three workloads. The base version is meant for high-performance gaming PCs. Similar to Sky Diver, it uses DirectX 11 (feature level 11) to render frames at 1920 x 1080. The Ultra version targets 4K gaming systems and renders at 3840 x 2160. However, we only deal with the Extreme version in our benchmarking; it renders at 2560 x 1440 and targets multi-GPU systems and overclocked PCs. The graph below presents the overall score for the Fire Strike Extreme benchmark across all the systems being compared. The results are similar to those of the Sky Diver workload.

UL 3DMark Fire Strike Extreme Score

3DMark Time Spy

The Time Spy workload has two levels with different complexities. Both use DirectX 12 (feature level 11). However, the plain version targets high-performance gaming PCs with a 2560 x 1440 render resolution, while the Extreme version renders at 3840 x 2160 resolution. The graphs below present both numbers for all the systems that are being compared in this review, with results being similar to the 1080p Sky Diver workload.

UL 3DMark - Time Spy Workloads

3DMark Night Raid

The Night Raid workload is a DirectX 12 benchmark test. It is less demanding than Time Spy, and is optimized for integrated graphics. The graph below presents the overall score in this workload for different system configurations.

UL 3DMark Night Raid Score

Overall, for CPU-bound graphics workloads, the Frost Canyon performs well; in other cases, the absence of eDRAM and the need to share the TDP budget with a hexa-core CPU show their effects. In almost all cases, the Bean Canyon NUC either vastly outperforms the Frost Canyon NUC or is neck and neck with it.


  • HStewart - Tuesday, March 3, 2020 - link

    I will say that the evolution of Windows has hurt the PC market; with more memory and such, Microsoft adds a lot of fat into the OS. As a point-of-sale developer through all these OSes, I wish Microsoft had a way to reduce the stuff one does not need.

    Just for information, the original Doom was written totally differently from today's games - back in the old days, Michael Abrash (a leader in original game graphics) worked with John Carmack of id Software on Doom and Quake. Back then we did not have GPU-driven graphics, and code was done in assembly language.

    Over time, development got fat as higher-level languages plus GPUs and drivers came into the picture. This also occurred in the OS area; in 1992 I had to change companies because assembly language developers started becoming a dying breed.

    I think part of this is that Microsoft started adding so many features to the OS, and there is a lot of bulk driving the Windows interface, which was much simpler in older versions.

    If I were with Microsoft, I would have an option in Windows for a super-trim version of the OS, reducing overhead as much as possible. Maybe dual boot to it.

  • HStewart - Tuesday, March 3, 2020 - link

    I have some of Abrash's original books - quite a collector's item nowadays

    https://www.amazon.com/Zen-Graphics-Programming-2n...
  • HStewart - Tuesday, March 3, 2020 - link

    And even more so with the Graphics Programming Black Book - almost $1000 now

    https://www.amazon.com/Michael-Abrashs-Graphics-Pr...
  • Qasar - Tuesday, March 3, 2020 - link

    You do know there are programs out there that can remove some of the useless bloat that Windows auto-installs, right? Maybe not to the extent that you are referring to, but it is possible. On a fresh reinstall of Win 10, I usually remove almost 500 MB of apps that I won't use.
  • erple2 - Saturday, March 14, 2020 - link

    This is an age-old argument that ultimately falls flat in the face of history. "Bloated" software today is VASTLY more capable than the "efficient" code written decades ago. You could make the argument that we might not need all of the capabilities of software today, but I rather like having the incredibly stable OSes of today over what I had to deal with in the past. And yes, OSes today are much more stable than they were in 1992 (not to mention vastly more capable).
  • Lord of the Bored - Thursday, March 5, 2020 - link

    My recollection is that was Windows Vista, not XP. XP was hitting 2D acceleration hardware that had stopped improving much around the time Intel shipped their first graphics adapter.
    Vista, however, had a newfangled "3D" compositor that took advantage of all the hardware progress that had happened since 1995... and a butt-ugly fallback plan for systems that couldn't use it(read as: Intel graphics).
    And then two releases later, Windows 8 dialed things way back because those damnable Intel graphics chips were STILL a significant install base and they didn't want to keep maintaining multiple desktop renderers.
    ...
    Unless the Vista compositor was originally intended for XP, in which case I eat my hat.
  • TheinsanegamerN - Monday, March 2, 2020 - link

    You don't need a 6-core CPU for back-office systems or report machines either, so they wouldn't buy this at all.

    Dell, HP, etc. make small systems with better CPU power at a lower price than this. The appeal of the NUCs was good CPUs with Iris-level GPUs instead of the UHD graphics that everyone else used.
  • PeachNCream - Monday, March 2, 2020 - link

    The intention of the NUC was to provide a fairly basic computing device in a small and power-efficient package. Iris models were something of an aberration among more recent models. In fact, the first couple of NUC generations used some of Intel's slowest processors available at the time.
  • niva - Tuesday, March 3, 2020 - link

    The point is that if you're making a basic computing device, why even go beyond 4 cores? I kind of want a NUC as a basic browsing computer that takes up little space. I can see these being used in the office too - there are many use cases for a device like this with 6 or more cores in the office, especially for folks in engineering fields running Matlab or doing development/compiling. However, in almost all of these use cases, having a stronger graphics package helps, never mind gaming. Taking a step back on the GPU side, especially given what AMD is doing right now and this being a response to the competition, doesn't make much sense. Perhaps this is just to hold them over until Intel fully transitions to using AMD GPUs in the future?
  • Lord of the Bored - Thursday, March 5, 2020 - link

    Can I just say how much I love that four cores is now considered a "basic" computing device? It leaves me suffused with a warm glow of joy.
