The scariness of uncertainty
One of the issues that I'm facing right now (and have been for a while) is that being uncertain can be a daunting thing. As sysadmins we deal with uncertainty all of the time, of course, and if we were paralyzed by it in general we'd never get anywhere. It's usually easy enough to overcome uncertainty and move forward in situations that are either small or important (for various reasons). Where uncertainty can dig in is in dauntingly big and complex projects that are not essential. If you don't absolutely need the thing and building anything is clearly a lot of work for an uncertain reward, it's very easy to defer action again and again in favour of various stalling measures (or other work).
All of this sounds rather hand-wavy, so let me tell you about my project of gathering OS-level performance statistics. Or rather my non-project.
If you look around, there are a lot of options for gathering, aggregating, and graphing OS performance stats (in tools, full systems, and ecologies of tools). Beyond a certain basic level it's unclear which of them are going to work best for us and which will be crawling failures, but at the same time it's also clear that any of them that look good are going to take a significant amount of work and time to set up and try out (and I'm going to have to try them in production).
As a result I have been circling around this project for literally years now. Every so often I poke and prod at the issue; I read more about some tool or another, I look at pretty pictures, I hear about something new, and so on and so forth. But I've never sat down to really do something. I've always found higher priority things to do or other excuses.
(Here in the academy this behavior in graduate students is well known and gets called 'thesis avoidance'.)
The scariness of uncertainty is not the only reason for this, of course, but it's a significant contributing factor. In a way it raises the stakes for making a choice.
(The uncertainty comes from two directions. One is simply trying to select which system to use; the other is whether or not the whole idea is going to be worthwhile. The latter is a bit stupid, since the worst case is probably just a white elephant of a system that we ignore and then quietly abandon, but the possibility gnaws at me and feeds other uncertainties and doubts.)
I don't have any answers, but maybe writing this entry has made it more likely that I do something here. And maybe I should embrace the possibility of failure as a sign that I am finally taking enough risk.
(I feel divided about that idea but I need to think about it more and then write another entry on it.)