Will (more) powerful discrete GPUs become required in practice in PCs?

June 13, 2025

One of the slow discussions I'm involved in over on the Fediverse started with someone wondering what modern GPU to get to run Linux on Wayland (the current answer is said to be an Intel Arc B580, if you have a modern distribution version). I'm a bit interested in this question but not very much, because I've traditionally considered big discrete GPU cards to be vast overkill for my needs. I use an old, text-focused type of X environment and I don't play games, so apart from needing to drive big displays at 60Hz (or maybe someday better than that), it's been a long time since I needed to care about how powerful my graphics hardware was. These days I use 'onboard' graphics whenever possible, which is to say the modest GPU that Intel and AMD now integrate on many CPU models.

(My office desktop has more or less the lowest end discrete AMD GPU with suitable dual outputs that we could find at the time because my CPU didn't have onboard graphics. My current home desktop uses what is now rather old onboard Intel graphics.)

However, graphics aren't the only thing you can do with GPUs these days (and they haven't been for some time). Increasingly, people do a lot of GPU computing (and not just for LLMs; darktable can use your GPU for image processing on digital photographs). In the old days, this GPU processing was basically not worth even trying on your typical onboard GPU (darktable basically laughed at my onboard Intel graphics), and my impression is that's still mostly the case if you want to do serious work. If you're serious, you want a lot of GPU memory, a lot of GPU processing units, and so on, and you only really get that on dedicated discrete GPUs.
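(For the curious: darktable's GPU acceleration works through OpenCL, and whether it bothers at all depends on what devices the OpenCL runtime reports. As a hedged illustration of that kind of check, here is a small sketch using the third-party pyopencl bindings; the function name and the "return an empty list on any failure" behavior are my own choices, not darktable's actual logic.)

```python
def list_opencl_gpus():
    """Return [(device_name, global_mem_bytes)] for visible OpenCL GPU
    devices, or an empty list if OpenCL isn't usable at all.

    A minimal sketch of the sort of probe an application like darktable
    performs before enabling GPU acceleration; not its real code.
    """
    try:
        import pyopencl as cl  # third-party; assumed installed
    except ImportError:
        return []  # no OpenCL bindings at all: CPU-only processing
    gpus = []
    try:
        for plat in cl.get_platforms():
            for dev in plat.get_devices(device_type=cl.device_type.GPU):
                # An application would typically also weigh memory size
                # and compute units before deciding the GPU is worth it.
                gpus.append((dev.name, dev.global_mem_size))
    except cl.Error:
        pass  # broken or deviceless OpenCL runtime: treat as no GPUs
    return gpus
```

On my old onboard Intel graphics, a probe like this would come back with either nothing or a GPU too weak to be worth using, which is exactly why darktable laughed at it.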

You'll probably always be able to use a desktop for straightforward basic things with only onboard graphics (if only because of laptop systems that have price, power, and thermal limits that don't allow for powerful, power-hungry, and hot GPUs). But that doesn't necessarily mean it will be practical to be a programmer or system administrator without a discrete GPU that can do serious computing, or at least that you'll enjoy it very much. I can imagine a future where your choices are to have a desktop with a good discrete GPU so that you can do necessary (GPU) computation, bulk analysis, and so on locally, or to remote off to some GPU-equipped machine to do the compute-intensive side of your work.

(An alternate version of this future is that CPU vendors stuff more and more GPU compute capacity into CPUs and routine GPU computation stays within what the onboard GPU compute units can deliver. After all, we're already seeing CPU vendors include dedicated computation capacity that's not intended for graphics.)

Even if discrete GPUs don't become outright required, it's possible that they'll become so useful and beneficial that I'll feel the need to get one; not having one would be workable but clearly limiting. I might feel that way about a discrete GPU today if I did certain sorts of things, such as large-scale photo or video processing.

I don't know if I believe in this general future, where a lot of important things require (good) GPU computation in order to work decently well. It seems a bit extreme. But I've been quite wrong about technology trends in the past that similarly felt extreme, so nowadays I'm not so sure of my judgment.


Comments on this page:

By nyanpasu64 at 2025-06-14 16:59:33:

I would not recommend getting a B580 for Linux right now, because it has a driver bug that drops frames on Wayland: https://gitlab.freedesktop.org/drm/xe/kernel/-/issues/4363

By Simon at 2025-06-14 17:49:47:

If you're serious, you want a lot of GPU memory, a lot of GPU processing units, and so on, and you only really get that on dedicated discrete GPUs.

The memory part of that sentence is not true since memory can be dynamically allocated between CPU and GPU for (modern-ish) integrated GPUs (see for example here). Although it's worth noting that dedicated GPUs probably have faster memory.

From 193.219.181.219 at 2025-06-16 08:47:15:

what modern GPU to get to run Linux on Wayland

I've not seen the discussion so I'm a bit surprised that the conclusion was that it requires getting a modern GPU. There are certain DRI features compositors tend to require, I seem to remember, but it's not particularly rendering-heavy, is it?

(I used to run GNOME with Wayland on a 2008 laptop with embedded Radeon HD5400 – which lasted until around 2018 – so it doesn't seem like an inherently "modern GPU" thing.)

I would've suspected a) browsers and b) widget toolkits like GTK4 going all-in with Vulkan acceleration to be where most of the GPU demand comes from, even on X11...
