The de-electrification of the American economy



For more than a century after the advent of commercial electrical power in the late 1800s, electricity use in the US rose and rose and rose. Sure, there were pauses during recessions, but the general trajectory was up.
The initial drop in electricity use in 2008 and 2009 could be attributed partly to the economic downturn. But the economy grew again in 2010, and every year since. Electricity use in the US, meanwhile, is still below its 2007 level, and seemingly flatlining.
Per-capita electricity use has fallen for six years in a row. We’re now back to the levels of the mid-1990s, and seemingly headed lower.
This is a really big deal! For one thing, it’s yet another explanation — along with tighter federal emissions rules, the natural gas fracking boom, and the rise of solar and wind power — for why the past few years have been so tough on coal miners. It means that even a big pro-coal policy shift from Washington may not result in higher demand for thermal coal.
For another, it seems to settle a turn-of-the-millennium debate about the electricity demands of the digital economy. Businessman and technology analyst Mark P. Mills, now a senior fellow at the right-leaning Manhattan Institute, kicked things off in 1999 with a report stating that computers and the internet were already responsible for 13 percent of US electricity demand and would be consuming 30 percent to 50 percent within two decades.
As Mills put it: “Today’s microprocessors are much more efficient than their forerunners at turning electricity into computations. But total demand for digital power is rising far faster than bit efficiencies are. We are using more chips — and bigger ones — and crunching more numbers. The bottom line: Taken all together, chips are running hotter, fans are whirring faster, and the power consumption of our disk drives and screens is rising. For the old thermoelectrical power complex, widely thought to be in senescent decline, the implications are staggering.”
A group of scientists at Lawrence Berkeley National Laboratory who studied energy use were skeptical of these claims and published a series of reports calling them into question. One 2003 paper concluded that direct power use by computers and other office and network equipment accounted for just 2 percent of electricity consumption in 1999 — 3 percent if you counted the energy used in manufacturing them.
Since then, the digital takeover of the economy has continued apace. But it hasn’t translated into an explosion in electricity demand.
Part of the reason is that a grim new economic era dawned in 2000 or 2001, one characterized by slow growth, declining labor-force participation and general malaise — all of which tend to depress energy demand. But if you measure electricity use per dollar of real gross domestic product, the decline is just as pronounced, and it began much earlier than the fall in per-capita demand.
In an article published in the Electricity Journal in 2015, former Lawrence Berkeley energy researcher Jonathan G. Koomey, now a consultant and a lecturer at Stanford, and Virginia Tech historian of science Richard F. Hirsh offered five hypotheses for why electricity demand had decoupled from economic growth.
1. State and federal efficiency standards for buildings and appliances have enabled us to get by with less electricity.
2. Increased use of information and communications technologies has also allowed people to conduct business and communicate more efficiently.
3. Higher prices for electricity in some areas have depressed its use.
4. Structural changes in the economy have reduced demand.
5. Electricity use is being underestimated because of the lack of reliable data on how much energy is being produced by rooftop solar panels.
The Energy Information Administration actually started estimating power generation from small-scale solar installations at the end of 2015, after Koomey and Hirsh’s paper came out, and found that it accounted for only about 1 percent of US electricity. That estimate could be off, and there’s surely room for more study, but mismeasurement of solar generation doesn’t seem to be the main explanation here.
Which leaves, mostly, the possibility that life in the US is changing in ways that allow us to get by with less electricity. This still isn’t necessarily good news — those “structural changes in the economy” include a shift away from manufacturing toward sectors that may not provide the kinds of jobs or competitive advantages that factories do.
Consider the shift to cloud computing. From 2000 to 2005, electricity use by data centers in the US increased 90 percent. From 2005 to 2010, the gain was 24 percent. As of 2014, data centers accounted for 1.8 percent of US electricity use, according to a 2016 Lawrence Berkeley study, but their electricity demand growth had slowed to a crawl (4 percent from 2010 to 2014). What happened? The nation outsourced its computing needs to cloud providers, for whom cutting the massive electricity costs of their data centers became a competitive imperative.

In much of the world, of course, electricity demand is still growing. In China, per-capita electricity use has more than quadrupled since 1999. Still, most other developed countries have experienced a plateauing or decline in electricity use similar to that in the US over the past decade. And while the phenomenon has been most pronounced in countries such as the UK where the economy has been especially weak, it’s also apparent in Australia, which hasn’t experienced a recession since 1991.
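The slowdown in data-center demand is even starker when those cumulative figures are expressed as annual rates. A back-of-the-envelope conversion (the period lengths and compounding arithmetic here are illustrative, not taken from the Berkeley study itself):

```python
def annualized(cumulative_pct: float, years: int) -> float:
    """Convert a cumulative percentage change over a period into a
    compound annual growth rate, in percent."""
    return ((1 + cumulative_pct / 100) ** (1 / years) - 1) * 100

# Cumulative changes in US data-center electricity use cited above.
periods = [
    ("2000-2005", 90, 5),  # up 90 percent over five years
    ("2005-2010", 24, 5),  # up 24 percent over five years
    ("2010-2014", 4, 4),   # up just 4 percent over four years
]

for label, pct, years in periods:
    print(f"{label}: {annualized(pct, years):.1f}% per year")
```

That works out to roughly 14 percent annual growth in the early 2000s, falling to about 1 percent a year by the early 2010s.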
So is electricity use in the developed world fated to decline for years to come? Well, not exactly fated. Transportation now accounts for just 0.3 percent of retail electricity use in the US. If the shift to electric vehicles ever picks up real momentum, that’s going to start growing, and fast. Dig more coal (or drill for more natural gas, or build more nuclear reactors, or put up more windmills and solar panels) — the Teslas are coming.


Justin Fox is a Bloomberg View columnist. He was the editorial director of Harvard Business Review and wrote for Time, Fortune and American Banker. He is the author of “The Myth of the Rational Market.”
