In a previous post, I pointed out the highly counterintuitive fact that increasing the energy efficiency of a technology leads to wider use of that technology and, eventually, to greater overall energy consumption by it. Because of this effect, attempts to foster energy conservation through more energy-efficient technologies backfire.
I think I have figured out the economics of why this happens.
When we think about conservation-through-efficiency, we usually think in terms of the energy consumption of the old technology versus that of the new. Obviously, if the new technology uses less energy than the old, then substituting the new for the old saves energy, and saving energy almost always means saving money. Yet that common-sense view doesn’t capture the true economics of the situation.
The true cost of anything has nothing to do with money. Instead, the true cost of any action is its opportunity cost. Increasing the energy efficiency of a technology lowers the opportunity cost of each additional use, which raises the technology’s net marginal utility. Raising the marginal utility leads to wider use of the technology, and that wider use eventually swamps the initial energy savings.
Take lightbulbs (remember, though, that this effect occurs with every technology). Marginal utility concerns not the opportunity cost of the first lightbulb or fixture that someone installs, but of the last: it’s the cost of installing one more fixture that matters. Say a site manager considers adding lighting for security and accident safety. He calculates that each new fixture costs $10 per year to operate but that his savings from reduced insurance will be only $8 a year. Economically, he won’t install the new lights; the improved safety won’t pay for itself. He bases this decision not on his current cost of lighting but on the benefit each new light brings. Then an energy-efficiency improvement drops the cost of operating each light to $5 per year. Now the site manager can justify installing more lights even though his total cost for lighting may well rise: each new light nets him $3 a year, the $8 saved in insurance minus the $5 in operating costs. The insurance savings, not the savings on his energy bill, drive the new installation.
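To make the arithmetic concrete, here is a toy sketch of that marginal calculation in Python. The diminishing insurance-savings schedule is my own invented assumption; only the $10 and $5 per-year operating costs come from the example.

```python
# A toy version of the site manager's calculation above. The diminishing
# insurance-savings schedule is invented for illustration; only the $10
# and $5 per-year operating costs come from the example.

def marginal_savings(n: int) -> float:
    """Hypothetical $/year of insurance saved by one more light after n
    are already installed: the first extra light saves $8, and each
    further light saves a little less."""
    return 8.0 * 0.9 ** n

def lights_installed(operating_cost: float) -> int:
    """Keep adding lights only while the next one pays for itself."""
    n = 0
    while marginal_savings(n) > operating_cost:
        n += 1
    return n

for cost_per_light in (10.0, 5.0):
    n = lights_installed(cost_per_light)
    print(f"${cost_per_light:.0f}/year per light -> {n} new lights, "
          f"total lighting bill ${n * cost_per_light:.0f}/year")
```

At $10 a year no light clears the bar, so nothing gets installed; at $5 a year five lights do, and the total lighting bill rises from $0 to $25 even though each light uses half the energy.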
The efficiency paradox occurs because increased energy efficiency makes it cheaper to apply a technology to new or wider purposes. Worse, the greater the improvement in relative efficiency, the more rapidly the technology spreads and the more energy it consumes. Only if all other factors remained constant and the new technology merely replaced the old one-for-one would increased energy efficiency actually save energy. Technological history suggests that never happens.
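When does wider use swamp the savings? A toy model makes the condition visible (the constant-elasticity demand curve and every number in it are my assumptions for illustration): if demand for the service the technology provides is price-elastic enough, each gain in efficiency lowers the cost of the service, and the extra demand more than cancels the energy saved per unit.

```python
# A toy constant-elasticity demand model of the paradox. The demand curve
# and every number here are assumptions chosen for illustration, not
# measurements.

ENERGY_PRICE = 1.0  # $ per unit of energy (arbitrary units)
ELASTICITY = 1.5    # assumed price elasticity of demand for the service

def energy_used(efficiency: float) -> float:
    """Total energy consumed when the technology delivers `efficiency`
    units of service per unit of energy."""
    service_price = ENERGY_PRICE / efficiency        # cost of one unit of service
    service_demanded = service_price ** -ELASTICITY  # constant-elasticity demand
    return service_demanded / efficiency             # energy needed to meet demand

for eff in (1.0, 2.0, 4.0):
    print(f"efficiency x{eff:.0f}: total energy used = {energy_used(eff):.2f}")
```

In this sketch, total energy use grows with efficiency whenever the assumed elasticity exceeds 1. At lower elasticities some savings survive, and only at an elasticity of zero, the one-for-one substitution case above, is the full saving realized.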
If improved energy efficiency actually saved energy, the evolution of our technology would look vastly different. Today we could build a vehicle with all the performance characteristics of a Model T that got 100+ miles to the gallon. Instead, we have used each new leap in the energy efficiency of the internal combustion engine to make the car do new things: carry more, more quickly, farther, with greater safety and less pollution. As a result, per capita energy consumption for internal-combustion-engine-based transportation has risen steadily over the technology’s entire history. Today we could build a computer comparable to a circa-1980 desktop that ran off mere body heat. Instead, we build ever more power-hungry systems and cram them into every conceivable niche. Increased energy efficiency reduces the tradeoff cost of using a technology in ways, and at scales, not attempted before.
Any attempt to conserve energy through efficiency will fail in the end unless we somehow forcibly restrict the new technology to direct, one-for-one substitution for the old. Doing so, however, would have far worse consequences in the long run.