Conclusions

No one ever said that integrated graphics solutions had to be good. Nonetheless, there is always the desire for something better, something higher performance, and something suitable for end-users. At the beginning of the integrated graphics era, the focus was simply on providing something basic for 2D work: applications on an operating system, with no real graphics rendering to speak of. That is simple enough to deliver; however, the demands on integrated graphics have grown over time, especially as the demands we put on our devices have also grown.

A modern system, especially a commercial system or one designed for work, has to do it all. Anyone not working in graphics might depend on an integrated solution to navigate the complex, graphics-heavy web interfaces of the tools they use, or rely on the acceleration features now baked into those platforms. Perhaps there is some mild gaming from time to time as well, if not outright use of the graphics compute features for transcoding or AI. These demands fall most heavily on mobile platforms, and as a result Intel's mobile platforms tend to get the best integrated graphics solutions, especially in thin-and-light designs where a discrete graphics solution is too power hungry. Intel's mobile Tiger Lake-U series offers a sizeable 96 execution units of the latest generation graphics architecture, compared to the desktop processors we are testing today, which have only 32.

So what use is a desktop processor with integrated graphics?

AMD and Intel both have product lines with integrated graphics. Intel puts integrated graphics in almost everything it sells to consumers. AMD did much the same through the mid-2010s, but since launching Ryzen it has offered separate CPU-only and CPU+graphics options. This is where the two companies' philosophies differ.

AMD's desktop processors with integrated graphics are primarily intended as a whole-system replacement, with users relying on the integrated graphics for all their graphics needs. As a result, AMD puts a lot more processing hardware into its desktop integrated graphics, and the result is a good entry-level gaming experience.

Intel's route, on the other hand, is a bit more basic. Its desktop integrated graphics has two main roles: first, as the basic graphics needed for an office system, and second, as a fall-back option for when the discrete card fails or misbehaves in more premium desktop systems. The power isn't there for gaming of any serious note, but it is certainly enough to cover the basics.

Despite this, with the new Xe-LP graphics solution, Intel has some aces up its sleeve. First is AV1 decoding, which allows users to watch AV1 content without putting stress on the CPU. Second is video encoding acceleration through QuickSync, which has been a part of Intel's graphics for a number of years. Third is a relatively new feature: Intel's 'additional processor' mentality. Normally, when a system has a discrete graphics card, the integrated graphics is disabled. Now, on its latest mobile devices for example, when Intel pairs a mobile processor's integrated graphics with a second graphics solution of about the same performance, the right software allows both to work asynchronously on two different problems. The limiting factor in the past has been dictating which graphics adapter drives the video output rather than acting simply as a compute accelerator, but Intel believes it has worked this out. However, this is of relatively little use for gaming, the topic of today's review.
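
As a rough illustration of how the QuickSync block is typically exercised from software, the sketch below shells out to ffmpeg and requests its QSV decoder and HEVC encoder. It assumes an ffmpeg build with QuickSync support and a working Intel media driver, and the file names and bitrate are placeholders rather than anything from our testing.

    # Minimal sketch: offload a transcode to Intel QuickSync via ffmpeg's QSV plugins.
    # Assumes an ffmpeg build with QSV support and an Intel iGPU with a working media
    # driver; the file names and bitrate below are placeholders.
    import subprocess

    def quicksync_transcode(src: str, dst: str, bitrate: str = "5M") -> None:
        cmd = [
            "ffmpeg",
            "-hwaccel", "qsv",       # decode on the iGPU where possible
            "-i", src,
            "-c:v", "hevc_qsv",      # encode with the QuickSync HEVC encoder
            "-b:v", bitrate,
            "-c:a", "copy",          # pass the audio stream through untouched
            dst,
        ]
        subprocess.run(cmd, check=True)

    if __name__ == "__main__":
        quicksync_transcode("input.mp4", "output_hevc.mp4")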

Results Summary

In this review, we highlighted that Intel has now brought its new Xe-LP graphics architecture to its desktop processor line, and we tested the new solutions against our traditional CPU gaming test suite. What we saw, in terms of a generational uplift from the i9-10900K to the i9-11900K, is actually quite impressive:

In our 720p testing, there's a clear generational gain across the board for Rocket Lake, and in most cases the games become a lot more playable. The average gain is 39%. If we flip to our gaming results at the higher resolution and settings:


Games with under 10 FPS across the board are left out

For these titles, the average gain is 153%, showing that Xe-LP is certainly a step up regardless of the workload.
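
For clarity, the averages quoted here are plain arithmetic means of the per-game percentage gains from the i9-10900K to the i9-11900K. A minimal sketch of that calculation is below; the FPS pairs in it are hypothetical placeholders, not our measured results.

    # How the quoted average uplift is computed: per-game percentage gain from the
    # i9-10900K to the i9-11900K, then a simple arithmetic mean. The FPS values
    # here are hypothetical placeholders, not the review's measured data.
    fps_pairs = {
        "Game A": (42.0, 61.0),   # (10900K fps, 11900K fps), placeholders
        "Game B": (28.0, 39.0),
        "Game C": (55.0, 71.0),
    }

    gains = [(new - old) / old * 100.0 for old, new in fps_pairs.values()]
    average_gain = sum(gains) / len(gains)
    print(f"Average generational gain: {average_gain:.0f}%")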

The Future of Integrated Graphics

A key talking point about integrated graphics is whether a company should leverage a strong CPU product at the expense of graphics, or aim for strong integrated graphics as a more complete chip at the expense of the mid-range graphics market. The console market, for example, relies fully on integrated graphics designs, especially as they keep the manufacturing simpler and the number of chips lower. But in the desktop space, because discrete graphics are an option (well, when we're not in a mining craze or semiconductor shortage), there seems to be no impetus for companies to build a full-fat integrated graphics solution that competes on the same stage as a mid-range graphics card. AMD could do it, but it might overlap with its console agreements, and Intel hasn't done anything serious since Broadwell.

A nod here to Broadwell, Intel's 5th Gen processor: its integrated graphics were so powerful at the time that we still use it today as a comparison point against other Intel solutions. Broadwell dedicated 48% of the die area of its top processor to graphics, and that product also added some very fast eDRAM cache. Intel's focus on integrated graphics as a function of die size has decreased over time, and with Rocket Lake it now sits at around 20% of the silicon. It hasn't been this low since Intel first introduced its integrated graphics solutions. For that 20%, we get 32 execution units alongside eight processor cores. Tiger Lake has 96 EUs, totalling around 33% of its die, but only four cores. If Intel were focused on graphics performance in the same way it was with Broadwell, we might be looking at a 256+ EU solution.
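
To make that back-of-the-envelope scaling explicit, here is a small sketch that extrapolates EU count from the graphics share of die area quoted above. It assumes EU count scales linearly with graphics area on a given process, which ignores cache, process node, and layout differences, so treat the outputs as rough illustrations rather than projections.

    # Back-of-the-envelope: extrapolate EU count from the share of die area spent
    # on graphics. Assumes EU count scales linearly with graphics area, ignoring
    # process node, cache, and layout differences; a rough illustration only.
    def scale_eus(eus: int, current_share: float, target_share: float) -> float:
        return eus * (target_share / current_share)

    # Rocket Lake: 32 EUs in roughly 20% of the die.
    print(scale_eus(32, 0.20, 0.48))   # ~77 EUs at a Broadwell-like 48% budget

    # Tiger Lake's denser Xe-LP layout: 96 EUs in roughly 33% of its die.
    print(scale_eus(96, 0.33, 0.48))   # ~140 EUs at the same 48% budget

    # A 256+ EU figure implies headroom beyond this simple linear ratio, e.g. a
    # larger die or a newer process.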

With Intel taking a renewed approach to graphics through its Xe portfolio, spanning entry-level parts up to high-performance compute, there is room for Intel to develop solutions focused on integrated graphics. Intel has detailed that it is moving to chiplets with its future mainstream processors under its 'Client 2.0' strategy, and part of that is allowing customers to select how many IP blocks they want of cores, IO, memory, security, and graphics. In the image above, the Gamer option devotes half of the die area to graphics. This could, in the end, be the target that sees Intel make desktop integrated graphics a focus again.

Comments

  • GeoffreyA - Monday, May 10, 2021 - link

    180 or 130 nm. Yikes. No wonder.
  • mode_13h - Tuesday, May 11, 2021 - link

    Thanks for the details. I was confused about which generation he meant. If he'd have supplied even half the specifics you did, that could've been avoided.

    Also, dragging up examples from the 2000's just seems like an egregious stretch to engage in Intel-bashing that's basically irrelevant to the topic at hand.
  • mode_13h - Wednesday, May 12, 2021 - link

    There's another thing that bugs me about his post, and I figured out what it is. Everything he doesn't like seems to be the result of greed or collusion. Whatever happened to plain old incompetence?

    And even competent organizations and teams sometimes build a chip with a fatal bug or run behind schedule and miss their market window. Maybe they *planned* on having a suitable GPU, but the plan fell through and they misjudged the suitability of their fallback solution? Intel has certainly shown itself to be fallible, time and again.
  • GeoffreyA - Wednesday, May 12, 2021 - link

    Incompetence and folly have worked against these companies over and over again. The engineering has almost always been brilliant but the decisions have often been wrong.
  • yeeeeman - Friday, May 7, 2021 - link

    Silvermont was quite good in terms of power and performance. It actually competed very well with the snapdragon 835 at the time and damn the process was efficient. The vcore was 0.35v! And on the igpu you could play Rocket League at low settings 720p with 20 fps and 3W total chip power. If that isn't amazing for a 2014 chip then I don't know what is. Intel actually was very good between the 2006 Core 2 Duo and 2015 with Skylake. Their process was superior to the competition and the designs were quite good also.
  • mode_13h - Friday, May 7, 2021 - link

    Thanks for the details.
  • SarahKerrigan - Saturday, May 8, 2021 - link

    Silvermont was fine, though not great. The original Atom core family was utterly godawful - dual-issue in-order, and clocked like butt.
  • mode_13h - Sunday, May 9, 2021 - link

    The 1st gen had hyperthreading, which Intel left out of all subsequent generations.

    Gracemont is supposed to be pretty good, but then it's a lot more complex, as well.
  • Oxford Guy - Sunday, May 9, 2021 - link

    The only thing that's really noteworthy about the first generation is how much of a bait and switch the combination of the CPU and chipset + GPU was — and how the netbook hype succeeded

    It was literally paying for pain (the CPU's ideological purist design — pursuing power efficiency too much at the expense of time efficiency) and getting much more pain without (often) knowing it (the disgustingly inefficient chipset + GPU).

    As for the hyperthreading... my very hazy recollection is that it didn't do much to enhance the chip's performance. As to why it was dumped for later iterations — probably corporate greed (i.e. segmentation). Greed is the only reason why Atom was such a disgusting product. Had the CPU been paired with a chipset + GPU designed according to the same purist ideological goal it would have been vastly more acceptable.

    As it was, the CPU was the tech world's equivalent of the 'active ingredient' scam used in pesticides. For example, one fungicide's ostensible active ingredient is 27,000 times less toxic to a species of bee than one of the 'inert' ingredients used in a formulation.
  • watersb - Sunday, May 9, 2021 - link

    The first ever Atom platform did indeed use a chipset that used way more power than the CPU itself. 965G I think. I built a lab of those as tiny desktop platform for my daughter's school at the time. They were good enough for basic Google Earth at full screen, 1024x768.

    The Next Atom I had was a Something Trail tablet, the HP Stream 7 of 2014. These were crippled by using only one of the two available memory channels, which was devastating to the overall platform performance. Low end chips can push pixels at low spec 7-inch displays, if they don't have to block waiting for DRAM. The low power, small CPU cache Atom pretty much requires a decent pipe to RAM, otherwise you blow the power budget just sitting there.

    The most recent Atoms I have used are HP Stream 11, N2000 series little Windows laptops. Perfect for little kids, especially low income families who were caught short this past year, trying to provide one laptop per child as the schools went to remote-only last year.

    Currently the Atom N4000 series makes for a decent Chromebook platform for remote learning.

    So I can get stuff done on Atom laptops. Not competitive to ARM ones, performance or power efficiency, but my MacBook Pro M1 cost the same as that 10-seat school lab. Both the Mac and the lab are very good value for the money. Neither choice will get you Crysis or Tomb Raider.
