A side note on the cost of operations
It is popular in some quarters to characterize the growing realization of the (potential) costs of operation as sloppy developers finally having to grow up and live in the real world (where real men program in C and are proud of it), instead of getting to paper over their sloppiness with Moore's Law and endless hardware budgets.
This characterization is not merely uncharitable, it is wrong.
There is a rule in optimization: you optimize where the program spends its time, not where it doesn't. We can rephrase this as 'optimize what matters', and then observe that a significant part of development is figuring out what matters and what doesn't. You can never optimize everything, not on any real program, because you never have enough time (programs are not finished, they are released), so you must pick and choose.
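As a minimal sketch of 'optimize where the program spends its time', you can profile before you optimize and let the numbers tell you what matters. The function names here (hot_path, cold_path) are purely illustrative, not from any real program.

```python
import cProfile
import io
import pstats


def cold_path():
    # Called once; even if it is somewhat slow, it barely matters overall.
    return sum(i * i for i in range(1_000))


def hot_path():
    # Called ten thousand times; this is where optimization effort pays off.
    return sum(range(100))


def main():
    cold_path()
    for _ in range(10_000):
        hot_path()


# Profile the whole run, then print only the two functions of interest.
profiler = cProfile.Profile()
profiler.runcall(main)

out = io.StringIO()
stats = pstats.Stats(profiler, stream=out)
stats.sort_stats("cumulative").print_stats("hot_path|cold_path")
report = out.getvalue()
print(report)
```

The call counts in the profiler's report make the point directly: effort spent speeding up cold_path is effort wasted, no matter how 'inefficient' it looks in isolation.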
When Moore's Law was handing people 'free' performance increases every year, performance that was less than ideal was not something that mattered. Well, generally; there were always environments that operated at such scale or with such thin margins that the cost of operation really did matter. But they were rare (and when you have a rare need, you pay extra to have it met).
For much of the past decade or two, the truth (however unpleasant to people who hate 'wasteful' programs) was that optimizing for the cost of operation was in general a mistake and something that a good rational programmer would avoid. Writing (not too) inefficient code in a 'sloppy' high level language was the right choice; writing highly efficient code in C or assembler, just because, was the wrong one.
(And that things may be changing now does not change that; it just means that you should make different development decisions now.)