What containers do and don't help you with
In a comment on my entry on when to use upstream versions of software, Albert suggested that containers can solve the problems of using upstream versions when you have to do so anyway:
A lot of those issues become non-issues if you run the apps in containers (for example Grafana).
Unfortunately this is not the case, because of what containers do and don't help you with.
What containers do is isolate the host and the container from each other and make the connection between them simple, legible, and generic. The practical Unix API is very big and allows software to become quite entangled in the operating system and therefore dependent on specific things in unclear ways. Containers turn this into a narrow interface between the software and the host OS and make it explicit (a container has to say clearly at least part of what it wants from the host, such as what ports it wants connected). Containers have also created a social agreement that if you violate the container API, what happens next is your own fault. For example, there is usually nothing stopping you from trying to store persistent data within your theoretically ephemeral container, but if you do and your container is restarted and you lose all the data, you get blamed, not the host operators.
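As a concrete sketch of that narrow, explicit interface, here is what running Grafana in a container might look like. The image name and data directory are real, but the flags are chosen purely for illustration:

```shell
# The container declares what it wants from the host: one published TCP
# port, and one named volume for the data it intends to keep.
docker run -d --name grafana \
  -p 3000:3000 \
  -v grafana-data:/var/lib/grafana \
  grafana/grafana

# Anything written outside the mapped volume lives only inside this
# container instance; recreate the container and it's gone, and by the
# social agreement above, that's on you.
```

Everything the container needs from the host is visible in that one command line, which is exactly the legibility being described.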
However, containers do not isolate software from itself and from its own flaws and issues. When you put software in a container, you still have to worry about choosing and building the right version of the software, keeping it secure and bug free, and whether or not to update it (and when). Putting Exim 4.93 in a container doesn't make it any better to use than if you didn't have it in a container. Putting Grafana or Prometheus Pushgateway in a container doesn't make it any easier to manage their upgrades, at least by itself. Sometimes the difficulties of doing things in a container setup drive you to solve problems in a different way, but putting software in a container doesn't generally give it any new features, so you could always have solved your problems in those different ways. Containers just gave you a push to change your other practices (or forced you to).
Containers do make it easier to deal with software in one respect, which is that they make it easier to select and change where you get software from. If someone, somewhere, is doing a good job of curating the software, you can probably take advantage of their work. Of course this is just adding a level of indirection; instead of figuring out what version of the software you want to use (and then keeping track of it), you have to figure out which curator you want to follow and keep up with whether they're doing a good job. The more curators and sources you use, the more work this will be.
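One small illustration of this indirection is image tags. The image name below is real, but the specific version tag is just an example:

```shell
# Following a curator means tracking their versions, not the software's
# source; you pick whose judgment and update cadence you follow.
docker pull grafana/grafana:10.2.0   # a specific curated release (example tag)
docker pull grafana/grafana:latest   # whatever the curator published most recently
```

Either way, the question of "which version am I running and is it any good" hasn't gone away; it has moved to the tag and the publisher behind it.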
(Containers also make it easier and less obvious to neglect or outright abandon software while still leaving it running. Partly this is because containers are deliberately opaque to limit the API and to create isolation. This does not magically cure the problems of doing so, it just sweeps them under the rug until things really explode.)
My views on when you should use the official upstream versions of software
Yesterday I wrote about how sometimes it's best to use the upstream versions, with the story of Prometheus as the example of why you can be pushed into this despite the problems inherent in it. But I didn't write anything about when you should do this versus when you should stick with whatever someone else is providing for you (usually your operating system distribution). There's no completely definite answer, partly because everyone's situation is a bit different, but I have accumulated some views here.
In general, what we really care about is not where the software comes from but how well curated what you're getting is, because curating software is work and requires expertise. Usually the best source of curation is packages provided by your OS, which typically add an extra layer of quality assurance over the upstream releases (or over people who put together low-curation OS specific packages from upstream releases). OS packages also come with automatic updating, or at a minimum central notification of updates being available so that you don't have to hunt down odd ways of keeping informed about updates.
The obvious reason to use the upstream version (building it yourself if necessary) is when there's no other option because, for example, you use Ubuntu and there are no official packages of it. Whether you want to do this depends on how much you need the package, how easy it is to build and operate, and how likely it is to have problems. We do this for some of the Prometheus exporters we use, but they have the advantage of being simple to build (Go programs usually make this easy), simple to operate, and extremely unlikely to have problems. They also aren't critical components, so if we had to drop one because of problems it wouldn't be a big deal. We also do this for Grafana, because we absolutely have to have Grafana and there is no Ubuntu package for it, so our best option left is the upstream binary releases.
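For the Go-based exporters, "simple to build" can be as little as the following sketch. The repository URL is real, but the exact steps vary; some exporters use a Makefile or a promu wrapper rather than plain 'go build':

```shell
# Building a Prometheus exporter from upstream source (illustrative steps;
# check the project's own README for the supported build procedure).
git clone https://github.com/prometheus/node_exporter
cd node_exporter
go build .                  # produces a single self-contained binary
./node_exporter --version
```

The result is one binary you can copy to the target host, which is a large part of why Go programs are easy to run from upstream.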
If your OS provides packages but the packages are outdated, that's not necessarily a reason to switch (especially if you'd have to build things yourself). Often outdated versions of packages still work fine; our Ubuntu systems run a lot of outdated versions of things like Exim, Dovecot, and Apache, because the Ubuntu versions are always behind the official releases. What drove us to switch with Prometheus was that the Ubuntu versions being outdated actively mattered. They weren't just outdated, they were buggy and limited.
(Sometimes sticking with OS packages will lead you to skip entire OS releases, because one release has an okay outdated version but a newer one has a broken outdated version. This can be perfectly okay, as it is for us in the case of Exim and Ubuntu 20.04. However, if Ubuntu 22.04 also turns out to have a version of Exim that we don't want to use, we'll have to change course and use an upstream version.)
A related reason is if the upstream strongly recommends against using the OS packages. This is the case with rspamd, where the official site specifically urges you not to use the Debian and Ubuntu packages. Like Prometheus, rspamd provides its own pre-built binaries that are officially supported, so we use those rather than take whatever risks are there with the Ubuntu version. Spam filtering is also one of those fields where the software needs to keep up with the ever-changing Internet (spam) landscape in order to be as effective as possible.
(Of course now that I've looked I've discovered that there isn't even an rspamd package for Ubuntu 18.04. But we made the decision based on that being what the upstream strongly recommended, and we're going to stick with it even for Ubuntu releases where Ubuntu does provide an official rspamd package.)
Once you start using an upstream version you have to decide how often to update it. My views here depend on how frequently the upstream does releases, how rapidly they evolve the program, and generally how much trouble you're going to have with catching up later with a whole bunch of changes at once (and how much the upstream believes in backward compatibility). A project with frequent and regular releases, a significant churn in features and options, and a low commitment to long term backward compatibility is one where you really want to keep up. Otherwise you can consider freezing your version, especially if you have to build and update things manually.