Conclusions

No one ever said that integrated graphics solutions had to be good. Nonetheless there is always the desire for something better, something higher performance, and something suitable for end-users. At the beginning of the era of integrated graphics, the focus was on simply providing something basic for 2D work - applications on an operating system, with no real graphics rendering to speak of. That solution is simple enough; however, the demands on integrated graphics have grown over time, especially as the demands we put on our devices have also grown.

A modern system, especially a commercial system or one designed for work, has to do it all. Anyone not working in graphics might depend on an integrated solution to navigate the complex, graphics-heavy web interfaces of the tools they use, or rely upon the acceleration features now baked into those platforms. There is also, perhaps, some mild gaming from time to time, if not outright use of the graphics compute features for transcoding or AI. These demands are most heavily focused on mobile platforms, and as a result mobile platforms from Intel tend to get the best integrated graphics solutions, especially in thin-and-light designs where a discrete graphics solution is too power hungry. Intel's mobile Tiger Lake-U series offers a sizeable 96 execution units of the latest generation graphics architecture, compared to the desktop processors we are testing today, which only have 32.

So what use is a desktop processor with integrated graphics?

AMD and Intel both have product lines with integrated graphics. From Intel, integrated graphics is in almost everything the company sells to consumers. AMD used to be the same way in the mid-2010s, until it launched Ryzen; now it offers separate CPU-only and CPU+graphics options. This is where the two companies' philosophies differ.

AMD's desktop processors with integrated graphics are primarily intended to be a whole-system replacement, with users relying on the integrated graphics for all their graphics needs. As a result AMD puts a lot more processing hardware into its desktop integrated graphics solutions, which results in a good entry-level gaming experience.

Intel's route, on the other hand, is a bit more basic. Its desktop integrated graphics has two main directions: first, as the basic graphics needed for an office system, or second, as a fall-back option for when the discrete card doesn't work or fails in more premium desktop systems. The power isn't there for serious gaming of any note, but it is certainly enough to cover the basics.

Despite this, with the new Xe-LP graphics solution, Intel has some aces up its sleeve. First is AV1 decoding, which allows users to watch AV1 content without putting stress on the CPU. Second is video encoding acceleration through QuickSync, which has actually been a part of Intel's graphics for a number of years. Third is a relatively new feature: Intel's 'additional processor' mentality. Normally when a system has a discrete graphics card, the integrated graphics is disabled. But now, in its latest mobile devices for example, when Intel pairs a mobile processor's integrated graphics with a second graphics solution of about the same performance, the right software allows both to work asynchronously on two different problems. The limiting factor in the past has been dictating which GPU drives the video output rather than simply acting as a compute accelerator, but Intel believes it has worked this out. However, this is of relatively little use for gaming, the topic of today.
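
As a rough illustration of the QuickSync encode path, the sketch below drives ffmpeg's QSV decoder and encoder from Python. It assumes an ffmpeg build compiled with QuickSync support and working Intel graphics drivers; the file names and bitrate are placeholders rather than anything tested in this review, and AV1 decode support in particular depends on the ffmpeg and driver versions installed.

    import subprocess

    # Hedged sketch: hardware transcode on the iGPU via QuickSync (QSV).
    # Requires an ffmpeg build with QSV support; "input.mp4" and
    # "output.mp4" are placeholder file names.
    cmd = [
        "ffmpeg",
        "-hwaccel", "qsv",       # hand decoding off to the QuickSync engine
        "-c:v", "h264_qsv",      # H.264 decode on the iGPU
        "-i", "input.mp4",
        "-c:v", "hevc_qsv",      # HEVC encode on the iGPU
        "-b:v", "5M",            # illustrative target bitrate
        "output.mp4",
    ]
    subprocess.run(cmd, check=True)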

Results Summary

In this review, we highlighted that Intel has now implemented its new Xe-LP graphics architecture in its desktop processor line, and tested the new solutions against our traditional CPU gaming test suite. What we saw, in terms of a generational uplift from the i9-10900K to the i9-11900K, is actually quite impressive:

In our 720p testing, there's a clear generational gain across the board for Rocket Lake, and in most cases the games become a lot more playable. The average gain is 39%. If we flip to our gaming results at the higher resolution and settings:


(Games with under 10 FPS across the board are left out.)

For these titles, the average gain is 153%, showing that Xe-LP is certainly a step up regardless of the workload.
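
For clarity on how a figure like that average is derived, here is a minimal sketch that takes per-game frame rates for two chips and averages the per-game percentage gains. The FPS values are made-up placeholders purely for illustration, not the review's data, and the simple arithmetic mean is an assumption about the method.

    # Minimal sketch of averaging per-game percentage gains.
    # The FPS values below are illustrative placeholders, not review data.
    baseline = {"Game A": 42.0, "Game B": 55.0, "Game C": 31.0}   # e.g. older chip
    new_chip = {"Game A": 60.0, "Game B": 74.0, "Game C": 45.0}   # e.g. newer chip

    gains = [(new_chip[g] / baseline[g] - 1.0) * 100.0 for g in baseline]
    print(f"Average generational gain: {sum(gains) / len(gains):.0f}%")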

The Future of Integrated Graphics

A key talking point about integrated graphics is whether a company should leverage a strong CPU product at the expense of graphics, or aim for something with strong integrated graphics as a more complete chip at the expense of the mid-range graphics market. The console market, for example, relies fully on integrated graphics designs, especially as it keeps the manufacturing simpler and the number of chips lower. But in the desktop space, because discrete graphics are an option (well, when we're not in a mining craze or semiconductor shortage), there seems to be no impetus for companies to build a full-fat integrated graphics solution that competes on the same stage as a mid-range graphics card. AMD could do it, but it might overlap with its console agreements, and Intel hasn't done anything serious since Broadwell.

A nod here to Broadwell, Intel's 5th Gen processor: its integrated graphics was so powerful at the time that we still use it today as a comparison point against other Intel solutions. Broadwell dedicated 48% of the die area of its top processor to graphics, and for that product Intel also added some really fast cache memory. Intel's focus on integrated graphics as a proportion of die size has decreased over time, and with Rocket Lake it now sits at around 20% of the silicon. It hasn't been this low since Intel first introduced its integrated graphics solutions. For that 20%, we get 32 execution units alongside eight processor cores. Tiger Lake dedicates around 33% of its overall die to 96 EUs, but has only four cores. If Intel were focused on graphics performance in the same way as it was with Broadwell, we might be looking at a 256+ EU solution.

With Intel taking a renewed approach to graphics with its Xe portfolio, stretching from entry-level up to high-performance compute, there is room here for Intel to develop integrated-graphics-focused solutions. Intel has detailed that it is moving to chiplets with its future mainstream processors under its 'Client 2.0' strategy, and part of that is allowing customers to select how many IP blocks they want of cores, IO, memory, security, and graphics. In the image above, the Gamer option has half of the die area dedicated to graphics. At the end of the day, this could be a target that sees Intel make desktop integrated graphics a focus again.

Comments

  • mode_13h - Friday, May 7, 2021 - link

    > I didn't see that the title question was answered in the article

    I think they presume that piece of meat behind your eyes is doing more than keeping your head from floating away. Look at the graphs, and see the answer for yourself.

    However, the article does in fact sort of answer it, in the title of the final page:

    "Conclusions: The Bare Minimum"
  • mode_13h - Friday, May 7, 2021 - link

    > unless Dr. Ian Cutress is asking whether Intel's current IGPs are "competitive"
    > with older Intel IGPs...which would seem to be the case.

    As is often the case, they're comparing it with previous generations that readers might be familiar with, in order to get a sense of whether/how much better it is.

    And it's not as if that's *all* they compared it against!
  • dwillmore - Friday, May 7, 2021 - link

    So your choices are postage stamp or slide show? No thank you.
  • Oxford Guy - Friday, May 7, 2021 - link

    My favorite part of the Intel CPU + Intel GPU history is Atom, where serious hype was created over how fabulously efficient the chip was, while it was sold with a GPU+chipset that used — what was it? — three times the power — negating the ostensible benefit from paying for the pain of an in-order CPU (a time-inefficient design sensibly abandoned after the Pentium 1). The amazing ideological purity of the engineering team’s design goal (maximizing the power efficiency of the CPU) was touted heavily. Netbooks were touted heavily. I said they’re a mistake, even before I learned (which wasn’t so easy) that the chipset+GPU solution Intel chose to pair with Atom (purely to save the company money) made the whole thing seem like a massive bait and switch.
  • mode_13h - Friday, May 7, 2021 - link

    > fabulously efficient the chip was, while it was sold with a GPU+chipset that used
    > — what was it? — three times the power

    Well, if they want to preserve battery life, maybe users could simply avoid running graphically-intensive apps on it? I think that's a better approach than constraining its graphics even further, which would just extend the pain.

    I'm also confused which Atoms you mean. I'm not sure, but I think they didn't have iGPUs until Silvermont, which was already an out-of-order core. And those SoCs only had 4 EUs, which I doubt consumed 3x the power of the CPU cores, and certainly not 3x the power of the rest of the chip.

    What I liked best about Intel's use of their iGPUs in their low-power SoCs is that the drivers just work. Even in Linux, these chips were well-supported, pretty much right out of the gate.
  • TheinsanegamerN - Friday, May 7, 2021 - link

    Graphically intensive apps, you mean like Windows Explorer and a web browser? Because that was enough to obliterate battery life.

    The original atom platform was awful. Plain and simple.
  • 29a - Friday, May 7, 2021 - link

    This^. Atoms were awful; turning the computer on would be considered graphically intensive.
  • mode_13h - Friday, May 7, 2021 - link

    I still don't follow the logic of the Oxford dude. Would it really have been a good solution to put in even worse graphics, further impinging on the user experience, just to eke out a little more battery life? I'm not defending the overall result, but that strikes me as an odd angle on the issue.

    Indeed, if explorer and web browser were as much as their GPU could handle, then it seems the GPU was well-matched to the task.
  • Oxford Guy - Sunday, May 9, 2021 - link

    You should learn about the Atom nonsense before posting opinions about it.

    The power consumption of the chipset + GPU completely negated the entire point of the Atom CPU, from its design philosophy to the huge hype placed behind it by Intel, tech media, and companies peddling netbooks.

    It is illustrative of large-scale bait and switch in the tech world. It happened purely because Intel wanted to save a few pennies, not because of technological restriction. The chipset + GPU could have been much more power-efficient.
  • Spunjji - Monday, May 10, 2021 - link

    You don't follow because you're trying to assess what he said by your own (apparently incomplete) knowledge, whereas what would make sense here would be to pay more attention to what he said - because, in this case, it's entirely accurate.

    Intel paired the first 45nm Atom chips with one of two chipsets - either the recycled 180nm 945 chipset, designed for Pentium 4 and Core 2 processors, or the 130nm Poulsbo chipset. The latter had an Imagination Technologies mobile-class GPU attached, but Intel never got around to sorting out working Windows drivers for it. In either case, it meant that they'd built an extremely efficient CPU on a cutting-edge manufacturing process and then paired it with a hot, thirsty chipset. It was not a good look; this was back when they were absolutely clobbering TSMC on manufacturing, too, so it was a supreme own-goal.
