The cooling advantage that CPU integrated graphics has

January 24, 2024

Once upon a time, you could readily get basic graphics cards that gave you dual output support; they were generally passively cooled and certainly single-width even when they did need a fan. This is, for example, more or less what I had in my 2011-era machines. These days these cards are mostly extinct, so when I put together my current office desktop I wound up with a dual-width, definitely fan-equipped card that wasn't dirt cheap. For some time I've been grumpy about this, and sort of wondering where they went.

The obvious answer for where these cards went is that CPUs got integrated graphics (although not all CPUs, especially higher-end ones, so you could wind up using a CPU without an IGP and needing a discrete GPU). When thinking about why integrated graphics displaced such basic cards, it recently struck me that one practical advantage integrated graphics has is cooling.

The integrated graphics circuitry is part of the CPU, or at least on the CPU die. General use CPUs have been actively cooled for well over a decade now, and for a long time they've been the focus of high performance cooling and sophisticated thermal management. The CPU is probably the best cooled thing in a typical desktop (and it needs to be). Cohabiting with this heat source constrains the IGP, but it also means that the IGP can take advantage of the CPU's cooling to cool itself, and that cooling is generally quite good.

A discrete graphics card has no such advantage. It must arrange its own cooling and its own thermal management, both of which cost money and the first of which takes up space (either for fans or for passive heatsinks). This need for its own cooling makes it less competitive against integrated graphics, probably especially so if the card is trying to be passively cooled. I wouldn't be surprised if the realistic options were a card that didn't even compare favorably to integrated graphics, or one that was too expensive for the performance you got. There's also the question of whether the discrete GPU chipsets you can get are even focused on low power usage, or whether they're designed on the assumption of full cooling so they can deliver performance that's clearly better than integrated graphics.

(Another limit, now that I look, is the amount of power available to a PCIe card, especially one that uses fewer than 16 PCIe lanes; apparently an x4 or x8 card may be limited to 25W total (with an x16 card going to 75W), per Wikipedia. However, I don't know how this compares to the amount of power an IGP is allowed to draw, especially in CPUs with more modest overall power usage.)
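
One rough way to see what the CPU package (IGP included) is allowed to draw and actually draws on Linux is the powercap/RAPL interface. Here's a minimal sketch, assuming an Intel CPU with the intel_rapl driver loaded and the usual 'intel-rapl:0' package domain; the sysfs path and constraint numbering can vary by machine:

    #!/usr/bin/env python3
    # Rough look at CPU package power (which includes the IGP) via the Linux
    # powercap / Intel RAPL interface. Assumes an Intel CPU with the intel_rapl
    # driver loaded; the "intel-rapl:0" package domain path may differ.
    import time

    DOM = "/sys/class/powercap/intel-rapl:0"

    def read_int(name):
        with open(f"{DOM}/{name}") as f:
            return int(f.read())

    # constraint_0 is usually the long-term (PL1) package power limit.
    print("package power limit: %.1f W" %
          (read_int("constraint_0_power_limit_uw") / 1e6))

    # energy_uj is a cumulative counter in microjoules; sample it twice and
    # divide the delta by the elapsed time to get average package power.
    # (The counter can wrap, but that's unlikely over a one second sample.)
    e0, t0 = read_int("energy_uj"), time.time()
    time.sleep(1.0)
    e1, t1 = read_int("energy_uj"), time.time()
    print("package power draw: %.1f W" % ((e1 - e0) / 1e6 / (t1 - t0)))

This reports the whole package's limit and draw rather than the IGP's own share, but it gives a rough point of comparison against the 25W and 75W slot budgets.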

The more I look at this, the more uncertainties I have about the thermal and power constraints that may or may not face discrete GPU cards that are aiming for low cost while still offering, say, multi-monitor support. I imagine that the readily available and more or less free cooling that integrated graphics gets doesn't help the discrete GPUs, but I'm not sure how much of a difference it really makes.


Comments on this page:

Another advantage is that those IGPs, when integrated with a manageability platform like Intel AMT, offer remote KVM functionality as a poor man's IPMI (works better if you use MeshCommander).

By Ian Z aka nobrowser at 2024-01-25 14:38:21:

Also: in the case of active cooling, the extra noise.

By Barry at 2024-01-25 16:15:23:

For their current desktop graphics cards, Nvidia and AMD haven't released entry-level GPUs: everything has triple-digit watt consumption, so passive cooling isn't even an option. Palit is rumoured to have a KalmX GeForce RTX 3050 in the works, which will be midrange and have an enormous heatsink, but it's something. Any other fanless card will have an even older GPU.

By Anonymous at 2024-01-26 07:01:20:

Another advantage is that those IGPs, when integrated with a manageability platform like Intel AMT, offer remote KVM functionality as a poor man's IPMI (works better if you use MeshCommander).

I thought Intel had discontinued support for MeshCommander?

By Edward R at 2024-01-26 17:17:12:

Another limit, now that I look, is the amount of power available to a PCIe card

That's not an important limit. Any card that wants more power will just have you plug in a "PCIe power" connector directly from the power supply. Evidently, a 6-pin connector can supply an extra 75 W, an 8-pin adds 150 W, and some high-end cards have two 8-pin connectors (so: 150, 225, or 375 W total).
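
Redoing that arithmetic as a quick check (assuming the 75 W x16 slot budget plus 75 W per 6-pin and 150 W per 8-pin auxiliary connector):

    # Power budget totals for the configurations described above, assuming
    # 75 W from an x16 slot, 75 W per 6-pin, and 150 W per 8-pin connector.
    SLOT_X16, SIX_PIN, EIGHT_PIN = 75, 75, 150

    for name, watts in {
        "slot only":        SLOT_X16,
        "slot + 6-pin":     SLOT_X16 + SIX_PIN,        # 150 W
        "slot + 8-pin":     SLOT_X16 + EIGHT_PIN,      # 225 W
        "slot + two 8-pin": SLOT_X16 + 2 * EIGHT_PIN,  # 375 W
    }.items():
        print(f"{name}: {watts} W")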

For what it's worth, I've got a Gigabyte Eagle RX 6600, which was the cheapest decent non-Nvidia card I could find at the time (I'm also annoyed at the lack of low-end options). It's a double-width thing that can draw up to 132 watts and has multiple fans, but that's turned out not to be a problem in practice. Most motherboards leave a blank PCIe slot near the main ×16 connector, maybe with an M.2 slot lying flat, and I don't think I've ever heard the fans, which don't even come on if I'm not gaming. With motherboards having integrated network, sound, and storage controllers, it's rare that anyone needs the seven expansion cards an ATX case allows for.

If I could've bought a CPU with integrated graphics and ECC RAM support, I'd have done that. Though people with high-end monitors need to consider the constant display-scanout bandwidth: three "4K" monitors at high refresh rates could degrade RAM performance by perhaps 25%.
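
As a rough sanity check of that figure, here's a back-of-the-envelope calculation, assuming three 3840x2160 monitors at 144 Hz, 4 bytes per pixel, and dual-channel DDR4-3200 (about 51 GB/s of theoretical peak bandwidth); the real numbers depend on your monitors and RAM:

    # Back-of-the-envelope scanout bandwidth versus main memory bandwidth.
    # Assumed numbers: three 3840x2160 monitors at 144 Hz, 4 bytes per pixel,
    # and dual-channel DDR4-3200 (2 channels x 3200 MT/s x 8 bytes/transfer).
    width, height, hz, bytes_per_px, monitors = 3840, 2160, 144, 4, 3

    scanout = width * height * bytes_per_px * hz * monitors   # bytes/second
    ddr4_3200_dual = 2 * 3200e6 * 8                           # ~51.2 GB/s peak

    print(f"scanout: {scanout / 1e9:.1f} GB/s")               # ~14.3 GB/s
    print(f"fraction of peak RAM bandwidth: {scanout / ddr4_3200_dual:.0%}")

Under those assumptions the displays eat roughly 28% of the theoretical peak, which is in the same ballpark as the "perhaps 25%" above.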

