My interest in Intel's future discrete GPUs (and my likely disappointment)

August 25, 2021

As a Linux user who doesn't play games, I have modest graphics needs that in theory should be easily met by integrated GPUs. My (future) standard is that I want to be able to drive two 4K displays at 60 Hz. But in practice, trying to use integrated GPUs for this significantly limits both my choice of CPUs and motherboards. AMD's current Ryzen line has generally limited integrated GPUs to lower performance CPUs (perhaps because of thermal limits), while it can be hard to find Intel CPU motherboards that support integrated graphics, especially if you want one that uses higher end chipsets in order to get other features (for example, two x4 capable NVMe drives and a decent number of SATA ports).

(All of this is an aspect of how I want a type of desktop PC that's generally skipped.)

In recent history, there have been effectively only three choices for x86 graphics under Linux: Intel, AMD, and NVIDIA. If you care about open source driver quality, Intel is generally first (with only a few stumbles), AMD is second, and NVIDIA is a very distant third. However, Intel historically didn't make discrete GPUs, and so if you needed a discrete GPU (as I did on my work Ryzen based desktop), people who cared about open source drivers were strongly steered to AMD. And so my work desktop has a basic AMD GPU of the time.

For some time now, Intel has been lurching toward offering discrete GPUs (ie, GPU cards) in addition to their integrated GPUs. Recently we even sort of got a date for when the first ones might be available, which is theoretically the first quarter of next year (from Anandtech). This sounds great, and is just what I'd like to make building another PC easier. A solid Intel discrete GPU card that's well supported by open source drivers would open up my choice of CPUs and motherboards while hopefully having fewer issues than AMD GPUs (or at least it would create more competition to encourage both companies).

The flaw in this lovely picture of the future is that what Intel is likely to offer is probably not what I want. In a perfectly reasonable decision, Intel is apparently talking about starting with high-performance GPU cards, which are also expensive, probably hot, and unlikely to be passively cooled. This isn't really what I want; my ideal discrete GPU is inexpensive, low-power, and passively cooled, or at least basically silent. I don't need a powerful gaming or compute GPU, and I doubt I have any software that could use one.

(Well, darktable might be able to use the GPU through OpenCL, if I started taking and processing photos again.)

Even if Intel only offers mid-range and higher GPU cards, I might still end up choosing one. This isn't because I think I'll need the GPU compute power (although maybe someday), but instead because I'm not sure I fully trust low end GPU cards any more. Plus, my impression is that mid-range GPUs are paying more attention to being quiet at low usage levels, since people have realized that this is where they spend a lot of their time.


