The 'standard set' of Unix programs is something that evolves over time
I've recently been writing about how OmniOS's minimal set of additional programs has made it annoying to deal with. In the process of this I've casually talked about 'the standard set' of programs that I expect a Unix system to have. On yesterday's entry, David Magda left a comment that is a perfect lead-in for something that I was going to write about anyway:
Define "standard set". :) Isn't that one of the reasons POSIX was created?
There are a number of problems with leaning on POSIX here, but the one I want to focus on today is that it's outdated. At the time POSIX was created, its set of utilities was probably a decent selection (and in some ways it was forward-looking). But that was a long time ago, and the Unix world has not stood still since then (even though it may sometimes feel that way).
There hasn't been any actual standardization or updates to standards, of course, because that's not really how the Unix world works for things like this. Instead, to adapt a phrase from the IETF, it's more 'rough consensus and sufficiently popular code'. If enough Unix environments provide something and that something is popular, it evolves into a de facto standard. Pragmatically, you know when something has reached this point because if you leave it out, people are surprised and annoyed. For instance, here in 2017, shipping a non-bare-bones Unix system with networking but without SSH would get you plenty of grumpy reactions, especially if you included a telnet daemon.
(A lot of people would specifically expect OpenSSH; providing another SSH implementation that wasn't usage-compatible with it would be annoying.)
One consequence of this evolution in the 'standard set' of Unix programs is that any Unix that freezes their set of programs is going to increasingly disappoint and irritate people over time. More and more people are going to show up, try to use the system, and say 'what, it's missing obvious program <X>? This is bogus; how old and outdated is this thing?' This is inconvenient for people building and maintaining Unixes, but that's life. Unix is not a static entity and never has been; Unix has always evolved.
(Your view of what should be included in this standard set is affected both by what you do and what Unixes you do it on. As a sysadmin, I have my own biases and they include programs like sudo. But my set also includes less and more ordinary programs like gzip, and these are probably on many people's lists.)
PS: Just as people expect new programs to be in this 'standard set' on your Unix, they also expect your versions of existing standard programs to evolve and keep up with the times. One obvious sign of this is that the GNU versions of tools (or close imitations of them) are probably now expected by many people.
Multi-Unix environments are less and less common now
For a long time, the Unix environments that I existed in had a lot of diversity. There was a diversity of versions of Unix and with them a diversity of architectures (and sometimes a single vendor had multiple architectures). This was most pronounced in a number of places here that used NFS heavily, where your $HOME could be shared between several different Unixes and architectures, but even with an unshared $HOME I did things like try to keep common dotfiles. And that era left its mark on Unix itself, for example in what is now the more or less standard split between /usr/share and /usr/lib and friends. Distinguishing between 'shared between architectures' and 'specific to a single architecture' only makes sense when you might have more than one in the same large-scale environment, and this is what /usr/share is about.
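You can still see this split directly on most modern systems; as a rough illustration (the exact directory contents vary by Unix and distribution):

```shell
# Architecture-independent data lives under /usr/share, so in an
# NFS-heavy environment a single copy could be shared between
# machines of different architectures:
ls -d /usr/share/man /usr/share/zoneinfo 2>/dev/null

# Architecture-specific compiled objects live under /usr/lib (and
# friends like /usr/lib64), so each architecture needs its own copy:
ls -d /usr/lib 2>/dev/null
```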
As you may have noticed, such Unix environments are increasingly uncommon now, for a number of reasons. For a start, the number of interesting computer architectures for Unix has shrunk dramatically; almost no one cares about anything other than 64-bit x86 now (although ARM is still waiting in the wings). This spills through to Unix versions, since generally all 64-bit x86 hardware will run your choice of Unix. The days when you might have bought a fire-breathing MIPS SMP server for compute work and got SGI Irix with it are long over.
(Buying either the cheapest Unix servers or the fastest affordable ones was one of the ways that multiple Unixes tended to show up around here, at least, because which Unix vendor was on top in either category tended to keep changing over the years.)
With no hardware to force you to pick some specific Unix, there's a strong motivation to standardize on one Unix that runs on all of your general-usage hardware, whatever that is. Even if you have a shared $HOME, this means you only deal with one set of personal binaries and so on in a homogeneous environment. Different versions of the same Unix count as a 'big difference' these days.
Beyond that, the fact is that Unixes are pretty similar from a user perspective these days. There once was a day when Unixes were very different, which meant that you might need to do a lot of work to deal with those differences. These days most Unixes feel more or less the same once you have $PATH set up, partly because in many cases they're using the same shells and other programs (Bash, for example, as a user shell). The exceptions tend to make people grumpy and often cause heartburn (and people avoid heartburn). The result may technically be a multi-Unix environment, but it doesn't feel like it and you might not really notice it.
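For what it's worth, the old shared-$HOME dotfile tricks for coping with multiple Unixes looked something like this sketch (the $HOME/bin layout here is a hypothetical convention for illustration, not anything standard):

```shell
# In a shared-$HOME world, your .profile picked personal binaries
# based on which Unix and architecture you'd logged in to; uname
# reports both.
os=$(uname -s)      # e.g. Linux, SunOS, FreeBSD
arch=$(uname -m)    # e.g. x86_64, sparc64
# Shared shell scripts first, then architecture-specific binaries:
PATH="$HOME/bin/sh:$HOME/bin/$os-$arch:$PATH"
export PATH
echo "$PATH"
```

In a single-Unix environment all of this collapses to one directory of binaries, which is much of the point of the entry above.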
(With all of this said, I'm sure that there are still multi-Unix environments out there, and some of them are probably still big. There's also the somewhat tricky issue of people who work with Macs as their developer machines and deploy to non-MacOS Unix servers. My impression as a distant bystander is that MacOS takes a fair amount of work to get set up with a productive and modern set of Unix tools, and you have to resort to some third party setup to do it; the result is inevitably a different feel than you get on a non-MacOS server.)