Thinking about the different models of supplying computing
It's the time of year when new graduate students show up here, so one of the things on my mind has been the various ways that computers can be supplied to people in an environment like ours. There are at least three that come to mind.
First is the 'bring your own device' model, where every incoming graduate student (or professor) is expected to bring their own computer (probably a laptop) and, as a corollary, to look after it. Perhaps we'd supply some niceties like external screens to hook up to them. The BYOD approach is popular partly because any number of people are going to do this anyway.
Then there is the 'hardware only' model, where we hand a computer to every new graduate student but make no attempt to manage or control it beyond that; the graduate student can run whatever they want in whatever configuration they want. Probably we'd preinstall some OS in a recommended configuration just for convenience (and many grad students would leave it as-is). Lots of people like this model for its freedom and similarity to the BYOD experience (at least until the OS install blows up in their face).
The final model is managed desktops, where we both supply hardware and maintain the OS installed on it. On the one hand, we guarantee that it works right; on the other hand, people lose the freedom to run whatever they want and have to generally live with our choices. 'We don't support that' will probably get said a lot.
(Note that these are not necessarily a good set of options for any environment other than our peculiar one.)
As you might suspect, in practice right now we have a mix of all three options. The historical evolution of our environment is that we started out providing fully managed computing because computing was too expensive for any other answer, but over time the decrease in computing costs (especially compared to staff costs) has caused more and more people to shift towards BYOD and 'here, have a box'.
(I will skip a discussion of trying to do managed Windows installs and just say that we were and are primarily a Unix shop without much expertise in that area. This leads to non-technical issues beyond the scope of this entry.)
I'm mulling this over partly because how computing gets supplied to people has a big impact on what services they're interested in consuming from us (and how). For one obvious example, in the days when we provided serial terminals on everyone's desk, having Unix servers for people to log in to was a big deal and they were very important. Today, an increasing number of people here have probably only used our login servers to change their password.
(Since we're a Computer Science department, you could actually argue that we should actively push people to interact with Unix because Unix is an important part of effective, practical computing and so something they should be learning. But that's another debate entirely.)