Wandering Thoughts archives

2018-04-29

My new 4K HiDPI display really does make a visible difference

As peculiar as it sounds, I had no idea whether I'd see a clear visible difference from moving up to my new 4K HiDPI display before I got it. In fact I more or less expected it to not be a big change from my previous monitor (a 24" 16:10 Dell U2412M). I'd read people raving about the Retina displays on Mac laptops, but people have different standards for what they consider rave-worthy (and what they notice about displays). And it wasn't as if my existing display was clearly pixelated or low-resolution, the way old computer displays definitely were. I thought that perhaps I was already in the zone of diminishing returns for increased display density.

A little bit to my surprise, it turns out that my new display does make a clearly visible difference. I didn't see this right away, when I first set it up, and even after I'd used it for an evening or two. What it took to make this really apparent was having my environment settle down and then going back and forth between work (where I'm using Dell U2412Ms) and home. I use the same fonts at almost the same physical size at home and work, and after things settled down it was clear that my home display was visibly crisper and clearer.

Interestingly, the really clear example for me was the / character in terminal windows (and thus in a monospaced font). At work on a 94 PPI screen, I could notice that it had a bit of 'staircase' pixelization; not glaring or obtrusive, but the kind of thing I could see if I was paying attention. At home, on a 163 PPI screen, there are no visible pixels even if I peer closely (far more closely than I normally sit); it's a smooth slanted line.

(Now that I've noticed this effect at work, I can't unsee it, which is a tiny bit annoying. Fortunately the monospaced / is the only character where I'm really conscious of the difference, although other characters are also better and crisper.)
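For what it's worth, those PPI figures are just arithmetic on a panel's pixel dimensions and its physical diagonal. Here is a minimal Python sketch, assuming a 24" 1920x1200 panel and a 27" 3840x2160 panel (sizes I'm assuming because they work out to roughly 94 and 163 PPI):

    import math

    def ppi(width_px, height_px, diagonal_inches):
        # Pixels per inch: length of the pixel diagonal divided by the
        # physical diagonal.
        return math.hypot(width_px, height_px) / diagonal_inches

    # Assumed panels: a 24" 16:10 1920x1200 display (the Dell U2412M's
    # resolution) and a 27" 4K display.
    print(f"{ppi(1920, 1200, 24):.0f} PPI")   # ~94
    print(f"{ppi(3840, 2160, 27):.0f} PPI")   # ~163
    # Keeping text at the same physical size means each glyph gets roughly
    # 163/94, or about 1.7x, as many pixels across.
    print(f"{ppi(3840, 2160, 27) / ppi(1920, 1200, 24):.2f}x")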

Text isn't the only thing that looks better at the display's resolution, either. Many program icons are sharper and nicer to look at (although some of them may also be physically bigger, because of scaling issues). Pictures in things like web browsers are a somewhat more mixed bag, but a fair amount of the time it works well (sometimes they get a bit fuzzy, presumably because browser scaling can only do so much).

Following my standard views, I haven't attempted to use physically smaller fonts on the new display. Small(er) fonts might now be more readable than before because they're crisper and smoother, but I still feel no particular urge to find out or to squeeze more text into the available space.

PS: My new display may also be somewhat brighter, which may have its own effects on apparent crispness and readability. I haven't gotten around to even checking the brightness and contrast settings at work, much less planning out an experiment to give me some sort of light level value (although smartphone cameras and the right apps probably make it possible).

Sidebar: Why I bought a 4K HiDPI display despite uncertainties

I've been looking forward to high-DPI displays for some time for their advantages in crisp, high resolution display of text and other things. I've been around computers long enough that I've seen screen text go from being clearly low-resolution pixelated fonts to being much better, so I was all in favour of taking another step forward toward screens that would look as good as print does. That was my logic for persuading myself to get a high-DPI 4K display; I'd theoretically been looking forward to it for years and ones I considered reasonably affordable were finally available, so it was time to step up to the plate.

(I'm very good at sitting on my hands and not buying things, even things I'll enjoy. It's not logical, but people aren't logical.)

4KHiDPIIsVisible written at 01:08:05

2018-04-18

A CPU's TDP is a misleading headline number

The AMD Ryzen 1800X in my work machine and the Intel Core i7-8700K in my home machine are both 95 watt TDP processors. Before I started measuring things with the actual hardware, I would have confidently guessed that they would have almost the same thermal load and power draw, and that the impact of a 95W TDP CPU over a 65W TDP CPU would be clearly obvious (you can see traces of this in my earlier entry on my hardware plans). Since it's commonly said that AMD CPUs run hotter than Intel ones, I'd expect the Ryzen to be somewhat higher than the Intel, but how much difference would I really expect from two CPUs with the same TDP?

Then I actually measured the power draws of the two machines, both at idle and under various different sorts of load. The result is not even close; the Intel is clearly using less power even after accounting for the 10 watts of extra power the AMD's Radeon RX 550 graphics card draws when it's lit up. It's ahead at idle, and it's also ahead under full load when the CPU should be at maximum power draw. Two processors that I would have expected to be fundamentally the same at full CPU usage are roughly 8% different in measured power draw; at idle they're even further apart on a proportional basis.

(Another way that TDP is misleading to the innocent is that it's not actually a measure of CPU power draw, it's a measure of CPU heat generation; see this informative reddit comment. Generally I'd expect the two to be strongly correlated (that heat has to come from somewhere), but it's possible that something that I don't understand is going on.)
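As a side note, my power numbers here are for the whole machines, not just the CPUs. For a rough look at only the CPU package's power draw on Linux, the kernel exposes RAPL energy counters under /sys/class/powercap. A minimal Python sketch, assuming an 'intel-rapl:0' package domain is present and readable (on current kernels that usually means running as root):

    import time

    # Sample the package energy counter twice and report average power.
    # energy_uj is in microjoules; this sketch ignores the counter's
    # wraparound (max_energy_range_uj) for simplicity.
    ENERGY = "/sys/class/powercap/intel-rapl:0/energy_uj"

    def read_energy_uj():
        with open(ENERGY) as f:
            return int(f.read().strip())

    def package_watts(interval=1.0):
        before = read_energy_uj()
        time.sleep(interval)
        after = read_energy_uj()
        return (after - before) / 1e6 / interval

    print(f"package power: {package_watts():.1f} W")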

Intellectually, I may have known that a processor's rated TDP was merely a measure of how much heat it could generate at maximum and didn't predict either its power draw when idle or its power draw under load. But in practice I thought that TDP was roughly TDP, and every 95 watt TDP (or 65 watt TDP) processor would be about the same as every other one. My experience with these two machines has usefully smacked me in the face with how this is very much not so. In practice, TDP apparently tells you how big a heatsink you need to be safe and that's it.

(There are all sorts of odd things about the relative power draws of the Ryzen and the Intel under various different sorts of CPU load, but that's going to be for another entry. My capsule summary is that modern CPUs are clearly weird and unpredictable beasts, and AMD and Intel must be designing their power-related internals fairly differently.)

PS: TDP also doesn't necessarily predict your actual observed CPU temperature under various conditions. Some of the difference will be due to BIOS decisions about fan control; for example, my Ryzen work machine appears to be more aggressive about speeding up the CPU fan, and possibly as a result it seems to report lower CPU temperatures under high load and power draw.

(Really, modern PCs are weird beasts. I'm not sure you can do more than putting in good cooling and hoping for the best.)

TDPMisleading written at 02:04:17

2018-04-13

For the first time, my home PC has no expansion cards

When I started out with PCs, you needed a bunch of expansion cards to make them do anything particularly useful. In the era of my first home PC, almost all I used on the motherboard was the CPU and the memory; graphics, sound, Ethernet (if applicable to you), and even a good disk controller were add-on cards. As a result, selecting a motherboard often involved carefully counting how many slots you got and what types they were, to make sure you had enough for what you needed to add.

(Yes, in my first PC I was determined enough to use SCSI instead of IDE. It ran BSDi, and that was one of the recommendations for well supported hardware that would work nicely.)

Bit by bit, that's changed. In the early 00s, things started moving onto the motherboard, starting (I believe) with basic sound (although that didn't always work out for Linux people like me; as late as 2011 I was having to use a separate sound card to get things working). When decent SATA appeared on motherboards it stopped being worth having a separate disk controller card, and eventually the motherboard makers started including not just Ethernet but even decent Ethernet chipsets. Still, in my 2011 home machine I turned to a separate graphics card for various reasons.

With my new home machine, I've taken the final step on this path. Since I'm using the Intel onboard graphics, I no longer need even a separate graphics card and now have absolutely no cards in the machine; everything is on the motherboard. It's sometimes an odd feeling to look at the back of my case and see all of the case's slot covers still in place.

(My new work machine still needs a graphics card and that somehow feels much more normal and proper, especially as I've also added an Ethernet card to it so that I have a second Ethernet port for sysadmin reasons.)

I think one of the reasons that having no expansion cards feels odd to me is that for a long time an all-onboard machine was a sign that you'd bought a closed-box prebuilt PC from a vendor like Dell or HP (and were stuck with whatever options they'd bundled into the box). These prebuilt PCs have historically not been a great choice for people who wanted to run Linux, especially picky people like me who want unusual things, and I've had the folkloric impression that they were somewhat cheaply put together and not up to the quality standards of a (more expensive) machine you'd select yourself.

As a side note, I do wonder about the business side of how all of this came about. Integrating sound, Ethernet, and so on onto motherboards isn't completely free (if nothing else, the extra physical connectors cost something), so the motherboard vendors had to have a motivation. Perhaps it was just cut-throat competition that pushed them to offer more things on the board in order to make themselves more attractive.

(I also wonder what will be the next thing to become pervasive on motherboards. Wireless networking is one possibility, since it's already on higher-end motherboards, and perhaps Bluetooth. But it also feels like we're hitting the limits of what can be pulled onto motherboards.)

PCAllOnboard written at 22:00:40

2018-04-08

A learning experience with iOS's fingerprint recognition

I have both an iPhone and an iPad, both of which have fingerprint based unlocking, which I use. I interact with the iPhone sufficiently often that I generally unlock it multiple times a day, but for various reasons I use the iPad much less frequently and can even go for a couple of days before I dig it out and poke at it.

It's been winter around here for the past while, and Toronto's winter is dry. These days that dryness is hard on my fingers, especially the fingers of my right hand (I'm right handed, which may contribute to this); my fingertips get chapped and cracked and generally a bit beaten up despite some effort to take care of them by slathering moisturizer on and so on.

(The problem with using moisturizer, especially on my fingertips, is that I generally want to do something with my hands and don't want to get moisturizer all over whatever I'll be typing on or holding.)

Over the course of this winter, I gradually noticed that my iPad was getting harder and harder to unlock. I'd have to wiggle my right thumb around to get the sensor to accept it, and sometimes it just wouldn't and I'd wind up typing my unlock password. If I remembered to try my left thumb, that would often work, and my iPhone had no problems at all; I'd tap it and it would pretty much always unlock. For most of the winter, when this happened I'd wipe the iPad's sensor clean, mutter to myself, and just live with it. It had to be a little glitch on the iPad, right? But every so often I'd stare at my roughed-up and increasingly hard to make out right thumb fingerprint and wonder.

When I couldn't unlock the iPad recently, I gave in to frustration and tried something just to see if it would help: I enrolled my right thumb's fingerprint again (as a new fingerprint). The difference was night and day. Suddenly the iPad was unlocking just like my iPhone, like it was supposed to and as I remembered it doing in the beginning; tap the sensor and it unlocked instantly without fuss or problems.

My best guess is the obvious guess; not only does the iOS fingerprint system have some margin for error, but it updates its recognition model over time. Because I unlocked my iPhone often enough, its recognition model could follow along as my right thumb's fingerprint got more and more roughed up over the course of the winter. However I didn't unlock my iPad often enough for these updates to kick in (or they couldn't or didn't move the model fast enough), so as the model and my fingerprint drifted further and further apart it got harder and harder to get it to match up with my cracked-skin fingerprint. Re-enrolling my thumb again added a new recognition model that worked on the current, beaten up state.

(This time around I've actually named that fingerprint, so I can easily remove it later. I may try the experiment of removing it in the summer when my right thumb's fingerprint is all recovered and has good skin condition again. In theory the original enrollment should be good enough at that point.)

Next winter I'm going to try to use my iPad more often or at least unlock it more regularly. Probably I'll aim to unlock it a couple of times every day, even if I'm not going to do anything more than tell it to check for various sorts of updates.

(Or I could start reading books on it. I did recently get pulled into reading a great SF novella on it, which was a pretty good experience, and I certainly have more books I could read there.)

IOSFingerprintSurprise written at 02:40:22

