What I think OpenSSH 8.2+'s work toward deprecating 'ssh-rsa' means
Today I discovered what was, to me, a confusing and alarming note in the OpenSSH 8.3 release notes (via), which has actually been there since OpenSSH 8.2. Here is the text (or the start of it):
Future deprecation notice
It is now possible to perform chosen-prefix attacks against the SHA-1 algorithm for less than USD$50K. For this reason, we will be disabling the "ssh-rsa" public key signature algorithm by default in a near-future release.
For a sysadmin this is somewhat hair-raising on first reading. We have a lot of users with ssh-rsa keys in their authorized keys files, and it would be very disruptive if they someday suddenly had to update those files, either to have their current keys accepted or to change over to new ones. However, there seemed to be a lot of confusion among people discussing this about what it affected (with some people saying that it only affected host keys and that personal keypairs should be fine, for example). So I did my best to dig into this, and the summary is that I don't think this requires most people to change host key or personal key configurations. I have two reasons for coming to believe this.
On a practical level, the announcement specifically says that one of your alternatives is to continue to use 'the same key type' (ie RSA keys) but with the RFC 8332 RSA SHA-2 signature algorithms. If we then look in the very latest OpenSSH sshd manpage, its section on the authorized_keys file format doesn't have a special key type for 'RSA with RFC 8332 RSA SHA-2 signature algorithms'; the only RSA key type is our old friend 'ssh-rsa'. Nor does ssh-keygen have an option to generate anything other than a generic RSA key. Since there's no particular way to specify or generate RSA keys in a key format other than what we already have, existing ssh-rsa authorized_keys lines pretty much have to keep working.
On a general level, there are two things involved in public key authentication: the keypair itself and then a signature algorithm that's used to produce a proof that you have access to the private key. While each type of keypair needs at least one signature algorithm in order to be useful, it's possible to have multiple algorithms for a single type of key. What OpenSSH is moving toward deprecating is (as they say) the signature algorithm for RSA keys that uses SHA-1; they continue to support two further SHA-2 based signature algorithms for RSA keys. Where the confusion comes in is that OpenSSH uses the label 'ssh-rsa' both as the name of the RSA keytype in authorized_keys files and similar places and as the name of the original signature algorithm that uses RSA keys. In the beginning this was fine (there was only a single signature algorithm for RSA keys), but now it's confusing if you don't read carefully and notice the difference.
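You can see this naming overlap on any machine with a reasonably modern OpenSSH client; this is just a sketch, and the exact lists depend on your OpenSSH version:

```shell
# 'ssh -Q key' lists the key and signature algorithm names the client knows.
# 'ssh-rsa' appears as the old (SHA-1) RSA name; on sufficiently recent
# versions the SHA-2 based rsa-sha2-256 and rsa-sha2-512 names show up too.
ssh -Q key | grep rsa
```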
For OpenSSH host keys specifically, a server with RSA host keys is fine if one of two things is true: if it supports the additional signature algorithms for RSA keys, or if it has additional host key types. A server is only in trouble (in the future) if it has only an RSA host key and only supports the SHA-1 based 'ssh-rsa' signature algorithm. For personal RSA keypairs the basic logic is the same; you're fine if the server (and your client) support the additional signature algorithms for RSA keys, or if you have non-RSA keys of a keytype that both ends support (sadly, not everyone supports Ed25519 keys).
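The release notes themselves give a way to check a server in advance: remove 'ssh-rsa' from the client's allowed host key algorithms and see if you can still connect ('user@host' below is a placeholder). Adding 'ssh -G', which expands the configuration without connecting, lets you inspect the resulting algorithm list first:

```shell
# From the OpenSSH 8.3 release notes: if this connects and verifies the host
# key, the server doesn't depend on the SHA-1 'ssh-rsa' signature algorithm.
#   ssh -oHostKeyAlgorithms=-ssh-rsa user@host
# A dry run with -G shows the algorithm list that would actually be offered:
ssh -G -oHostKeyAlgorithms=-ssh-rsa example.invalid | grep -i hostkeyalgorithms
```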
My various settings in X to get programs working on my HiDPI display
Back when I got my HiDPI display (a 27" Dell P2715Q), I wrote an entry about what the core practical problems with HiDPI seemed to be on Linux and talked in general terms about what HiDPI related settings were available but I never wrote about what specific things I was setting and where. Today I'm going to remedy this, partly for my own future use for the hopeful future day when I need to duplicate this at work. Since I'm doing this two years after the fact, there will be an exciting element of software archaeology involved, because now I have to find all of those settings from the clues I left behind in earlier entries.
As mentioned in my old entry, the Dell P2715Q is a 163 DPI display. To make the X server itself know the correct DPI, I run it with a '-dpi 163' command line argument. I don't use XDM or any other graphical login manager; I start the X server from a text console with a nest of shell scripts, so I can supply custom arguments this way. I don't do anything with xrandr, which came up with plausible reported screen dimensions of 597mm x 336mm and didn't appear to need any changes.
I use xsettingsd as my XSettings daemon, and set two DPI related settings in its configuration:
Gdk/UnscaledDPI 166912
Xft/DPI 166912
Both of these values are my 163 DPI multiplied by 1024. For Xft/DPI, this is documented in the Xsettings registry. I'm not sure if I found documentation for Gdk/UnscaledDPI or just assumed it would be in the same units as Xft/DPI.
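As a quick sanity check of the fixed-point arithmetic (1024 units per DPI):

```shell
# The XSettings integer DPI values are the real DPI multiplied by 1024.
dpi=163
echo $((dpi * 1024))    # prints 166912
```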
There is also an X resource setting:
Xft.dpi: 163
As we can see, this is just the DPI.
Then I set some environment variables, which (in 2018) came from Arch's HiDPI page, the Gnome wiki, and the GTK3+ X page. First there is a setting to tell Qt apps to honor the screen DPI:
Then there is a pair of GTK settings to force GTK+ applications to scale their UI elements up to HiDPI but not scale the text, as explained in more depth in my original entry:
export GDK_SCALE=2
export GDK_DPI_SCALE=0.5
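My reading of how these interact (so treat the explanation as an assumption) is that GDK_DPI_SCALE multiplies on top of GDK_SCALE for fonts, which is why this combination scales the UI but not the text:

```shell
# Widgets are scaled by GDK_SCALE (2x); font rendering is additionally
# scaled by GDK_DPI_SCALE (0.5x), so the net effect on text is 2 * 0.5 = 1
# and fonts stay at the size implied by Xft.dpi.
awk 'BEGIN { print 2 * 0.5 }'    # prints 1
```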
These three environment variables are only necessary for Qt and GTK+ applications, not basic X applications. Basic X applications seem to work fine with some combination of the Xft.dpi X resource and the XSettings system.
If you're running remote X applications from your HiDPI X session, as I am these days, they will automatically see your Xft.dpi X resource and your XSettings settings. They won't normally see your specially set environment variables. Fortunately I mostly run basic X applications that only seem to use X resources and perhaps XSettings, so they basically work the same as the local versions.
(At least after you fix any problems you have with X cursors on the remote machines.)
At the moment I'm not sure if setting the environment variables for remote X programs (for instance by logging in, setting them by hand, and then running the relevant program) works just the same as setting them locally. Some testing suggests that it probably does; while I see some visual differences, this is probably partly because I haven't adjusted the remote programs I'm testing with the way I have my regularly used local ones (after all, I normally use them on my regular DPI displays at work, and hopefully some day I'll be doing that again).
The final setting I make is in Firefox. As mentioned in passing in this entry, I manually set layout.css.devPixelsPerPx to 1.7, which is down from the '2' that would be the default based on my overall settings. I found that if I left Firefox alone with these other settings, its font sizes looked too big to me. A devPixelsPerPx setting of 1.7 is about right for what the Arch Wiki Firefox page suggests should be correct here, and it looks good to me, which is what I care about most.
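If the rule of thumb is the display DPI divided by the standard 96 (an assumption on my part about the exact formula the Arch wiki uses), the numbers line up with the hand-tuned value:

```shell
# 163 DPI divided by 96 (the CSS reference DPI) is just about 1.7:
awk 'BEGIN { printf "%.2f\n", 163 / 96 }'    # prints 1.70
```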
Sidebar: X resources tweaks to specific applications
Xterm sizes the width of the scrollbar in pixels, which isn't ideal on a HiDPI display. It is normally 14 pixels, so I increased it to:
Urxvt needs the same tweak but it's called something different:
I think I also tried to scale up XTerm's menu fonts but I'm not sure it actually worked, and I seem to have the same X resource settings (with the same comments) in my work X resource file.