BAPCo SYSmark 2018

The Intel NUC10i7FNH (Frost Canyon) was evaluated using our Fall 2018 test suite for small-form factor PCs. In the first section, we will be looking at SYSmark 2018.

BAPCo's SYSmark 2018 is an application-based benchmark that uses real-world applications to replay the usage patterns of business users in the areas of productivity, creativity, and responsiveness. The 'Productivity Scenario' covers office-centric activities including word processing, spreadsheet usage, financial analysis, software development, application installation, file compression, and e-mail management. The 'Creativity Scenario' represents media-centric activities such as digital photo processing and the use of AI and ML for face recognition in photos and videos for content creation. The 'Responsiveness Scenario' evaluates the system's ability to react quickly to user inputs in areas such as application and file launches, web browsing, and multi-tasking.

Scores are meant to be compared against a reference desktop (the SYSmark 2018 calibration system: a Dell OptiPlex 5050 tower with a Core i3-7100, 4GB of DDR4-2133 memory, and a 128GB M.2 SATA III SSD). The calibration system scores 1000 in each of the scenarios, so a score of, say, 2000 implies that the system under test is twice as fast as the reference system.
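The scoring model described above can be sketched as a simple normalization. This is only an illustration of how to read the numbers, not BAPCo's actual scoring formula:

```python
# Sketch: interpreting SYSmark 2018 scores relative to the calibration system.
# Per the text, the Dell OptiPlex 5050 reference scores 1000 in every scenario,
# and scores are assumed to scale linearly with speed.

CALIBRATION_SCORE = 1000  # reference system's score in each scenario, by definition

def relative_speed(score: float) -> float:
    """Speedup of the system under test versus the calibration desktop."""
    return score / CALIBRATION_SCORE

print(relative_speed(2000))  # 2.0 -> twice as fast as the reference system
```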

SYSmark 2018 - Productivity

SYSmark 2018 - Creativity

SYSmark 2018 - Responsiveness

SYSmark 2018 - Overall

SYSmark 2018 also adds energy measurement to the mix. A high score in the SYSmark benchmarks is nice to have, but potential customers also need to weigh performance against power consumption. For example, in the average office scenario, it might not be worth purchasing a noisy and power-hungry PC just because it ends up with a 2000 score in the SYSmark 2018 benchmarks. In order to provide a balanced perspective, SYSmark 2018 also allows vendors and decision makers to track the energy consumption during each workload. In the graphs below, we present the total energy consumed by the PC under test for a single iteration of each SYSmark 2018 workload. For reference, the calibration system consumes 5.36 Wh for productivity, 7.71 Wh for creativity, 5.61 Wh for responsiveness, and 18.68 Wh overall.
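One rough way to combine the score and energy numbers is a points-per-watt-hour figure. The sketch below uses the calibration-system energy values quoted above; the efficiency metric itself is our illustration, not an official SYSmark 2018 output:

```python
# Rough "points per Wh" efficiency sketch. The energy figures are the
# calibration-system values quoted in the text; the metric itself is an
# illustration, not part of SYSmark 2018's official reporting.

reference_energy_wh = {
    "productivity": 5.36,
    "creativity": 7.71,
    "responsiveness": 5.61,
    "overall": 18.68,
}

def efficiency(score: float, energy_wh: float) -> float:
    """Benchmark points delivered per watt-hour consumed (higher is better)."""
    return score / energy_wh

# The calibration system scores 1000 in every category by definition:
for workload, wh in reference_energy_wh.items():
    print(f"{workload}: {efficiency(1000, wh):.1f} points/Wh")
```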

SYSmark 2018 - Productivity Energy Consumption

SYSmark 2018 - Creativity Energy Consumption

SYSmark 2018 - Responsiveness Energy Consumption

SYSmark 2018 - Overall Energy Consumption

The 'Creativity' workload benefits from the extra cores in the Frost Canyon NUC compared to the Core i7-8559U in the Bean Canyon. However, our Bean Canyon review configuration is equipped with a WD Black 3D NVMe SSD (PCIe 3.0 x4) that delivers much better performance than the PCIe 3.0 x2 Kingston A1000-class SSD in the Frost Canyon configuration. As a result, the responsiveness score for the NUC10i7FNH comes in the middle of the pack compared to the other systems in the sample set, pulling the Frost Canyon NUC's overall score well below the Bean Canyon NUC's. Its energy consumption is also worse.

Comments

  • The_Assimilator - Monday, March 2, 2020 - link

    It's not, but the point is still valid: nobody buying these things is doing so because they expect them to be graphics powerhouses.
  • HStewart - Monday, March 2, 2020 - link

    But some people are so naive and don't realize the point. I came up in the days when the card you purchased didn't even have a GPU on it. Not sure what level today's iGPUs are at, but they can surely run business graphics fine, and even games from a couple of years ago.
  • notb - Thursday, March 5, 2020 - link

    These iGPUs can drive 3 screens with maybe 1-2W power draw. Show me another GPU that can do this.

    This is an integrated GPU made for efficient 2D graphics. There's very little potential to make it any better.
  • PaulHoule - Monday, March 2, 2020 - link

    Well, Intel's horrible iGPUs forced Microsoft to walk back the graphical complexity of Windows XP. They kept the GPU-dependent architecture, but had to downgrade to "worse than cell phone" visual quality because Intel kneecapped the graphics performance of the x86 platform. (Maybe you could get something better, but developers couldn't expect you to have it.)
  • HStewart - Monday, March 2, 2020 - link

    I think we need actual proof for these biased statements. I think there is a big difference between running a screen at 27 or more inches and one at 6 to 8 inches, no matter what the resolution.
  • Korguz - Monday, March 2, 2020 - link

    we need proof of your biased statements as well, but you very rarely provide any.. point is??
  • Samus - Monday, March 2, 2020 - link

    What does screen size have to do with anything? Intel can't make an iGPU that can drive a 4K panel fluidly, while mainstream Qualcomm SoCs have GPUs able to drive 4K panels using a watt of power.
  • HStewart - Tuesday, March 3, 2020 - link

    Can Qualcomm actually drive, say, a 32-inch 4K screen efficiently? Also, what is being measured here, videos or actual games? That depends on how they are written.
  • erple2 - Saturday, March 14, 2020 - link

    I'm not sure that I understand your statement here, as it doesn't seem to make any sense. I was not aware that the physical dimensions of the screen mattered at all to the GPU, apart from how many pixels it has to individually manage/draw. If your implication is that the complexity and quantity of information that can be made significant on a 32" screen is different from a 5.7" screen, then I suppose you can make that argument. However, I have to guess at what you meant to come to that conclusion.

    Generally the graphical load to display 4k resolution is independent of whether the actual screen is 6" or 100". Unless I'm mistaken?
  • PeachNCream - Monday, March 2, 2020 - link

    For once, I agree with HStewart (feels like I've been shot into the Twilight Zone to even type that). To the point though, Windows XP was released in 2001. Phones in that time period were still using black-and-white LCD displays. Intel's graphics processors in that time period were the Intel Extreme series built into the motherboard chipset (where they would remain until around 2010, after the release of Windows 7). Sure, those video processors are slow compared to modern cell phones, but nothing a phone could do when XP was in development was anything close to what a bottom-feeder graphics processor could handle. I mean crap, Doom ran (poorly) on a 386 with minimal video hardware and that was in the early 1990s, whereas phones eight years later still didn't have color screens.
