It's important to get the real costs right
Here is an obvious yet important thing:
When you make decisions that trade off the costs of development against the costs of operation, it is quite important to get the actual costs right (on both sides); otherwise you will be balancing things based on bad data, which usually doesn't end well. One would like to think that this is easy, but in fact there is usually a lot of mythology about these costs floating around (I suspect especially when the actual costs are changing rapidly).
The classic example is the 'cost' of garbage collection. For a long time people argued that automatic garbage collection was both significantly less efficient than manual storage management and unnecessary, because it was easy enough to manage storage by hand. Actual practice has shown that both claims are false; in large-scale programs it's clear that manual storage management is too error-prone, and I believe that modern GC systems actually have a lower overhead (in both execution time and space) than manual storage allocation.
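(As a purely illustrative sketch, not anyone's real code, here is the kind of manual-management mistake that becomes very hard to avoid once a program gets large and ownership of objects gets murky; all of the names here are made up for the example.)

    #include <stdlib.h>
    #include <string.h>
    #include <stdio.h>

    /* A record that two parts of a program both hold a pointer to. */
    struct record {
        char *name;
    };

    static struct record *make_record(const char *name)
    {
        struct record *r = malloc(sizeof(*r));
        if (r == NULL)
            return NULL;
        r->name = strdup(name);
        return r;
    }

    static void free_record(struct record *r)
    {
        free(r->name);
        free(r);
    }

    int main(void)
    {
        struct record *r = make_record("example");
        struct record *alias = r;  /* another part of the code keeps a reference */

        free_record(r);            /* one 'owner' decides it is done with the record */

        /* Use-after-free: 'alias' still points at freed memory. With a
           garbage collector, the object would simply stay alive for as
           long as 'alias' (or anything else) could still reach it. */
        printf("%s\n", alias->name);
        return 0;
    }

The bug only exists because two pieces of code disagree about who owns the record; a garbage collector makes the whole question of ownership disappear.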
Another, older example is the argument between assembly programming and high-level languages (by which we mean things like C, not what is called 'high level' today). Although that debate was won long before then, I think that starting in the early 1990s the efficiency argument actually went in favour of compiled languages, as compilers became increasingly sophisticated and started doing global optimizations that were simply not feasible if you were writing assembly by hand. These days, modern Just-In-Time environments have pushed this even further, since the JIT can produce ultra-specialized code on the fly.
In a way, something similar is happening on the 'cost of operation' side now, where detailed charging for cloud computing and strict limits on what you can do are making people conscious of just how much their code is actually doing.