The cooling advantage that CPU integrated graphics has
Once upon a time, you could readily get basic graphics cards: generally passively cooled, and certainly single-width even if they had to have a fan in order to give you dual output support. This is, for example, more or less what I had in my 2011-era machines. These days such cards are mostly extinct, so when I put together my current office desktop I wound up with a dual-width, definitely fan-equipped card that wasn't dirt cheap. For some time I've been grumpy about this, and have sort of wondered where those basic cards went.
The obvious answer for where these cards went is that CPUs got integrated graphics (although not all CPUs, especially higher-end ones, so you could still wind up with a CPU without an IGP and needing a discrete GPU). When thinking about why integrated graphics displaced such basic cards, it recently struck me that one practical advantage integrated graphics has is cooling.
The integrated graphics circuitry is part of the CPU, or at least on the CPU die. General-purpose CPUs have been actively cooled for well over a decade now, and for a long time they've been the focus of high-performance cooling and sophisticated thermal management. The CPU is probably the best cooled thing in a typical desktop (and it needs to be). Cohabiting with this heat source constrains the IGP, but it also means that the IGP can take advantage of the CPU's cooling to cool itself, and that cooling is generally quite good.
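(As an aside, on Linux you can see this cohabitation fairly directly: the IGP generally doesn't show up with its own cooling story, and its heat appears under the CPU's own sensors. Here's a minimal Python sketch that just dumps hwmon temperature sensors so you can look for yourself; what sensors exist and what they're called varies a lot from machine to machine, so treat it as illustrative.)

    #!/usr/bin/env python3
    # Minimal sketch: dump Linux hwmon temperature sensors. On a
    # machine with an IGP, the point of interest is that the IGP's
    # heat mostly shows up under the CPU's own sensors (eg the
    # coretemp 'Package' reading), not as a separate device.
    from pathlib import Path

    for hw in sorted(Path("/sys/class/hwmon").glob("hwmon*")):
        name = (hw / "name").read_text().strip()
        for temp in sorted(hw.glob("temp*_input")):
            # Readings are in millidegrees C; labels are optional.
            label = temp.with_name(temp.name.replace("_input", "_label"))
            what = label.read_text().strip() if label.exists() else temp.name
            print(f"{name} {what}: {int(temp.read_text()) / 1000:.1f} C")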
A discrete graphics card has no such advantage. It must arrange its own cooling and its own thermal management, both of which cost money and the first of which takes up space (either for fans or for passive heatsinks). This need for its own cooling makes it less competitive against integrated graphics, probably especially so if the card is trying to be passively cooled. I wouldn't be surprised if the realistic options were either a card that didn't even compare favorably to integrated graphics, or one that was too expensive for the performance you got. There's also the question of whether the discrete GPU chipsets you can get are even focused on low power usage, or whether they're designed on the assumption of full cooling so they can deliver performance that's clearly better than integrated graphics.
(Another limit, now that I look, is the amount of power available to a PCIe card, especially one that uses fewer than 16 PCIe lanes; apparently an x4 or x8 card may be limited to 25W total (with an x16 card going up to 75W), per Wikipedia. However, I don't know how this compares to the amount of power an IGP is allowed to draw, especially in CPUs with more modest overall power usage.)
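(On Linux with an Intel CPU, you can get a rough view of what the CPU package, IGP included, actually draws through the RAPL powercap interface; on client CPUs the graphics usually appears as the 'uncore' subdomain, and the powercap framework also exposes the configured power limits. Here's a rough Python sketch of sampling it. The sysfs paths and domain names vary by CPU generation, and reading energy_uj may require root on modern kernels, so don't take this as more than a starting point.)

    #!/usr/bin/env python3
    # Rough sketch: sample CPU package power via Intel RAPL on Linux.
    # The package domain's power includes the integrated GPU; on
    # client CPUs the IGP usually shows up as the 'uncore' subdomain.
    # Paths and domain names vary by CPU, and reading energy_uj may
    # need root.
    import time
    from pathlib import Path

    # Package domain; this path is an assumption that holds on many
    # but not all systems.
    PKG = Path("/sys/class/powercap/intel-rapl:0")

    def watts(domain, interval=1.0):
        # energy_uj is a cumulative counter in microjoules, so we
        # sample it twice and divide by the elapsed time.
        before = int((domain / "energy_uj").read_text())
        time.sleep(interval)
        after = int((domain / "energy_uj").read_text())
        return (after - before) / 1e6 / interval

    def report(domain):
        name = (domain / "name").read_text().strip()
        # The powercap framework exposes the configured power limit
        # too, where there is one.
        limit = domain / "constraint_0_power_limit_uw"
        lim = ""
        if limit.exists():
            lim = f", limit {int(limit.read_text()) / 1e6:.0f}W"
        print(f"{name}: {watts(domain):.1f}W{lim}")

    report(PKG)
    for sub in sorted(PKG.glob("intel-rapl:0:*")):
        report(sub)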
The more I look at this, the more uncertainties I have about the thermal and power constraints that may or may not face discrete GPU cards that are aiming for low cost while still offering, say, multi-monitor support. I imagine that the readily available and more or less free cooling that integrated graphics gets doesn't make life any easier for discrete GPUs, but I'm not sure how much of a difference it really makes.