The costs of development versus the costs of operation
At one level, the whole issue of program energy efficiency is nothing new; it is yet another round of the eternal conflict between the costs of development and the costs of operation. The two have pretty much always been in tension: you can usually do more development work to lower the cost of running a program, but since development isn't free, there is always a point past which more development is not economically justifiable, because you can't lower your cost of operation by more than you'd spend on the extra development.
(Where this point is depends in part on what scale you operate on. For example, a big datacenter cares about efficiency gains that someone with one machine wouldn't even notice.)
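The scale effect can be made concrete with a little arithmetic. Here is a minimal sketch of the break-even calculation; every number in it is made up purely for illustration, not a measurement of anything:

```python
# A hypothetical break-even sketch: all dollar figures here are invented
# for illustration, not real measurements.

def payback_years(dev_cost, savings_per_machine_per_year, machines):
    """Years of operation before an optimization pays for itself."""
    yearly_savings = savings_per_machine_per_year * machines
    return dev_cost / yearly_savings

# The same week of development work (say $5,000) buying the same small
# energy saving (say $5 per machine per year) at two different scales:
print(payback_years(5000, 5, 1))      # one machine: 1000.0 years
print(payback_years(5000, 5, 10000))  # ten thousand machines: 0.1 years
```

The identical optimization is hopelessly uneconomic for one machine and pays for itself in about a month across a large fleet, which is why the break-even point sits in such different places for different operators.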
Exactly where this point sits has teetered back and forth for at least as long as high level languages have existed. Roughly speaking, I think it has usually tilted towards development (ie cheaper development but higher operational costs) in times of rapid change and rapid technology growth, which is what we've had for going on two decades now.
(Rapid change means that your more efficient code may not run for long enough to pay back the investment in development; consider the fate of, say, the world's most efficient HTTP/0.9 server for static content. Rapid growth means that your development work is in effect in a race with the reduced costs of operation that time will bring all on its own, which makes the relative return on investment lower.)
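The "race" against falling operational costs can also be sketched numerically. In this illustration (all figures invented, not measured), we assume operational costs halve on their own every few years, so the dollar value of an optimization's savings shrinks over time:

```python
# Hypothetical illustration: assume the cost of running unoptimized code
# halves every `halving_period` years on its own (cheaper hardware, cheaper
# power, etc.), so an optimization's yearly savings shrink along with it.

def total_savings(initial_yearly_saving, years, halving_period):
    """Total savings over `years` when the saving decays with costs."""
    return sum(initial_yearly_saving * 0.5 ** (y / halving_period)
               for y in range(years))

# A $1,000/year saving over ten years:
flat = 1000 * 10                        # 10,000 if costs never fell
decaying = total_savings(1000, 10, 2)   # roughly a third of that if
                                        # costs halve every two years
```

Under these made-up assumptions the optimization returns only about a third of what a naive flat projection promises, which is the sense in which rapid technology growth lowers the relative return on optimization work.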
Things like program energy efficiency may be pushing this balance back towards favouring the cost of operations, where developers will do more work in order to make their programs cost less to run. If so, it's not a revolutionary change (or an inevitable return to the way that things should be); instead, it's a natural shift, of a sort that has happened before and will likely happen again.