Quick background which, BTW, does not make me right about any of
my opinions. But you have to admit, I have seen a pretty long stretch of
the electric utility industry, covering some 75 years of its history. I fell into the power industry as a law firm associate, then moved into being a hybrid legal / technology consultant, working in the industry for over two decades during the nuclear build-out and the rate cases that went with it (my work covered much else, but that defines the time frame). In that time I spent over a year (with some fellow attorneys) dredging up, studying and writing analytical papers on corporate decision-making going back to the 1930s, worked on rate and anti-trust cases which involved study (and some direct contact with executives), and worked closely with (and in many cases socialized with) a generation of people now dominant in the power business here in NC, including the current CEO of Progress Energy, one of Jim Rogers's chief lieutenants at Duke Energy (also a law school classmate), the current and two past chairs of the utilities commission (the current chair was also a law school classmate), the head of the utilities public staff, etc., etc. Between the mid-1970s and 1990, I consulted for Texas Utilities, CP&L (now Progress) and Duke here in NC, SCE&G and Virginia Power, on plant and rate issues. Don't get me wrong, I was not a big shot; I was a functionary, first a law firm associate, then a technology consultant. I was Zelig. But I saw all of it up close, and I know how they think.
My apologies for the length as well, but I think you'll see why. This is obviously pretty much off the top of my head; I have cited stats and other references from memory, rather than checking as if I'd written an article. I'm pretty sure that they are correct, or close.
I believe that the coal article misses a major probable development that will greatly alter the medium-term future of power consumption, and thus the demand for central station generation, and thus coal. The article takes the central station, baseload model of power requirements, and thus generation, as a given. I believe that will not be the case.
The fundamental question is whether the central station / baseload model, where power is generated in bulk at central station plants and transported out to millions of customers over thousands of miles of lines, is the desired growth mode. Big plant generation will play a huge role for decades, no matter what direction we take, but is it where we want to go?
An anecdote and simplistic example. A neighbor was in the auto resale business and so required a simple car lot. He rented the land, set up a basic modular structure, lights, etc. The facility was open for maybe 20-25 hours / week. At some point, he became frustrated with dealing with the local power company, and did the following:
- obtained several partly damaged but usable surplus solar panels from the state
- took a pile of car batteries (which he had on hand, given his line of work)
- linked the two together.
- plugged in his junction box.
And voila, an off-the-grid power solution. It wasn't pretty. He (of course) had to turn everything off each day when he left. He had no backup, through a link to the grid or otherwise. Most people couldn't cobble such a system together, and most people don't have 25 car batteries lying around (which are ridiculously inefficient for this purpose to begin with). And yet, it worked. Tweaked with better 'stuff' at a few points? It represents at least a major part of the future, and an exemplar of switching away from sole reliance on central station power.
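As a sanity check on that anecdote, here is a rough back-of-envelope sketch of why a rig like his can actually carry a small lot. Every number in it is my own illustrative assumption (typical car battery capacity, a modest lighting load, a handful of salvaged panels), not a measurement from his setup:

```python
# Rough sizing sketch for an off-grid "car lot" setup.
# Every figure below is an illustrative assumption, not data from the actual lot.

BATTERY_KWH = 12 * 50 / 1000        # ~12 V, ~50 Ah car battery, about 0.6 kWh nominal
USABLE_FRACTION = 0.5               # lead-acid shouldn't be drawn down much past ~50%
NUM_BATTERIES = 25

PANEL_WATTS = 150                   # a modest, partly degraded panel (assumed)
NUM_PANELS = 6
SUN_HOURS_PER_DAY = 5               # rough full-sun-equivalent hours (assumed)

LOAD_WATTS = 800                    # lights plus a small office, assumed
HOURS_OPEN_PER_DAY = 4              # ~20-25 hours/week spread over the week

storage_kwh = BATTERY_KWH * USABLE_FRACTION * NUM_BATTERIES
daily_generation_kwh = PANEL_WATTS * NUM_PANELS * SUN_HOURS_PER_DAY / 1000
daily_load_kwh = LOAD_WATTS * HOURS_OPEN_PER_DAY / 1000

print(f"usable storage:   {storage_kwh:.1f} kWh")
print(f"daily generation: {daily_generation_kwh:.1f} kWh")
print(f"daily load:       {daily_load_kwh:.1f} kWh")
# With these assumptions: ~7.5 kWh of storage, ~4.5 kWh/day generated,
# ~3.2 kWh/day consumed -- crude and lossy, but enough to keep the lights on.
```

The exact figures don't matter; the point is that a small, intermittent commercial load sits comfortably within reach of junkyard-grade storage and a few panels.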
First, a comparison with the central station model. Let's look at a simplified version of the central station model, coal-fired variety. Let's also leave aside all the necessary precursors (dig giant holes, buy very expensive equipment, train a bunch of guys, go through very expensive licensing, build railroads) and get to the day-to-day.
1 - dig coal
2 - bring coal to surface
3 - put coal on train
4 - deliver coal to power plant
5 - stuff coal into generating facility
6 - burn coal
7 - do all that sequestering stuff, usually not locally
8 - transport power through lines
9 - switch in neighborhood of use
10 - divert to house or business
11 - turn on electric-powered device and use
12 - user pays utility at the end of the month
Now look at the 'eccentric man's car lot' model, again, leaving aside the startup items.
1 - turn on electric-powered device and use
Puts the car lot model in perspective, doesn't it? Who wouldn't favor that approach, all other things being equal?
Let me be clear where I'm going with this. At some point in the not ridiculously long future (certainly well before 2050), the notion that electricity generated 200 miles away is the preferred power source will be thought absurd for most usages. Sure, the burn and ship (or nuke and ship) method will be with us for a long time; there are some applications for which the critical mass of power is simply too large for current technologies other than large central station (the steel plant, say). Even those will, however, diminish on any number of fronts.
The downward slope of demand for central station will also be enabled by continued efficiency gains in the average appliance, computer, machine, etc., as well as a serious emphasis on basics like insulation (both the technology and plain old technique). That downward draw will in turn further enable the distributed, generate-it-near-where-you-use-it model, and provide a positive feedback incentive for it.
I know that, as you say, many of the computing analogies don't apply; power generation is subject to no Moore's law, turbines will only get incrementally more efficient, all that. But the basic model (central vs. distributed, mainframe vs. PC)? That analogy applies in spades. The certainty that central station baseload (certainly focusing on coal, but including nukes and large hydro) will continue to dominate is closely analogous to the attitude held by all the smart people in computing way back when. Mainframe vs. PC. Distributed vs. central. Put power out there where regular people can configure and use it vs. keep it among the priesthood. We know how that one ends.
So, given that no one would choose the nasty, expensive, complex, rickety 12-step process if the 1-step process would do, why did the central station model win out? We all know the answers, of course, relating to reliable onsite methods being far less available, but also to the key 'use it or lose it' factor relating to electricity. The growth of storage technologies (batteries, etc.) changes the whole equation.
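To put a very rough number on 'changes the whole equation': the natural way to price storage is the cost per kWh actually cycled through the battery over its life. The sketch below uses assumed, illustrative figures, not specs for any particular battery:

```python
# Rough cost of storing a kWh, under assumed battery numbers.
# Figures are illustrative assumptions, not vendor specs.

def storage_cost_per_kwh(pack_cost_per_kwh, cycles, depth_of_discharge, round_trip_eff):
    """Cost added to each delivered kWh by running it through the battery."""
    delivered_per_rated_kwh = cycles * depth_of_discharge * round_trip_eff
    return pack_cost_per_kwh / delivered_per_rated_kwh

# A pile of used lead-acid car batteries vs. a purpose-built pack (assumed numbers).
print(f"lead-acid pile:   ${storage_cost_per_kwh(100, 500, 0.5, 0.80):.2f} per kWh cycled")
print(f"purpose-built pack: ${storage_cost_per_kwh(300, 3000, 0.8, 0.90):.2f} per kWh cycled")
# With these assumptions the junkyard route costs about $0.50 per stored kWh --
# hobbyist territory -- while the better pack is closer to $0.14, and falling.
```

As pack prices and cycle life improve, that per-kWh penalty keeps shrinking, and with it the 'use it or lose it' advantage of central station.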
So the next question arises, why (other than habit, and the need to preserve a business model revolving around central station generation) would we continue to choose central station for most uses? Why wouldn't we move toward the onsite / nearby model for many of our uses? How much do we have to 'fix' about the messy car lot example above to make it the winner for many usages?
- Batteries, of course, are the key, and the better they get, the more profound the change. Batteries are the enabling technology for everything else.
- Those batteries get better every year, if only incrementally. More important, they get cheaper every year.
- Solar panels (among other technologies) get better every year. Much, much more important, they get cheaper and more reliable.
- Setup becomes more practical, and more people and businesses are available to provide reliable, inexpensive service. The service industry approaches critical mass, even as we speak.
The final link in the chain is, ironically, the grid itself, backed by what will (at first) be mostly central station power. Various technologies support, and regulations provide backing for, easy and relatively inexpensive integration with the grid. Onsite / nearby adopters are no longer forced to work without a net, and can stick their toe in, e.g., Walmart getting a pretty amazing 13% from solar panels and the like in their new stores, while still getting the rest conveniently and reliably from the grid.
The only real question is money: when are the panels, batteries and other equipment cheap enough, and how does that happen? A review of one of the article's quotes provides the answer:
"He pointed out the huge engineering achievement it has taken to raise the efficiency of solar photovoltaic cells from about 25 percent to about 30 percent; whereas “to make them useful, you would need improvements of two- or threefold in cost,” say from about 18 cents per kilowatt-hour to 6 cents."
Comparing technical efficiency to cost is apples to oranges. Assuming the panels and batteries are compact enough to fit the space (easy with a home or office), cost is all that counts. While efficiency is a factor, being able to mass produce the things cheaply is far more significant. That 3-to-1 fall in price comes from manufacturing experience and volume; think flat screen TVs (which are WAY more complicated to make than solar panels). On a more specific, household level, let's face it, if I'm in that 'post-Home Depot' market and don't have enough juice to meet my needs (whether running a workshop, the car lot or my house), I'm unlikely to say 'dang, if only I could wring some more efficiency out of that solar cell. Damn that 30% technical limit!' I'm going to say 'I need me a few more of these; I'll be back from the Home Depot in an hour.'
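A crude way to see why cost swamps efficiency in that quote (all figures below are assumptions I picked for illustration, not the article's): delivered energy scales with installed watts, installed watts per dollar is set by price, and efficiency mostly just determines how much roof you need.

```python
# Why $/watt beats cell efficiency once you have enough roof.
# All figures are illustrative assumptions.

SUN_HOURS_PER_DAY = 5      # rough full-sun-equivalent hours (assumed)
YEARS = 20                 # assumed panel life
DAYS = 365 * YEARS

def lifetime_cost_per_kwh(cost_per_watt):
    """Very crude lifetime cost per kWh: hardware cost spread over lifetime output."""
    kwh_per_watt = SUN_HOURS_PER_DAY * DAYS / 1000
    return cost_per_watt / kwh_per_watt

def area_needed_m2(load_watts, efficiency):
    """Panel area required for a given peak output (~1000 W/m^2 of sunlight)."""
    return load_watts / (1000 * efficiency)

# Two hypothetical panels: a pricier 30%-efficient cell vs. a cheap 25% one.
print(f"30% cell at $3/W: {lifetime_cost_per_kwh(3.0)*100:.1f} cents/kWh, "
      f"{area_needed_m2(5000, 0.30):.1f} m^2 for 5 kW")
print(f"25% cell at $1/W: {lifetime_cost_per_kwh(1.0)*100:.1f} cents/kWh, "
      f"{area_needed_m2(5000, 0.25):.1f} m^2 for 5 kW")
# With these assumptions the cheap panel delivers power at roughly a third of the
# cost and needs only a few extra square meters -- the 'buy a few more' calculus.
```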
So, how did Home Depot get into the equation? Friedmann's analysis also ignores the commercial tipping point factor; sure, Livermore and their peers, various startups and some big businesses have put lots of money into trying to improve efficiencies, but have spent little of it trying to mass produce cheap, reliable panels. No criticism, that's not their gig. That part comes when the relative efficiencies have crept upward, and perceived demand has gotten sufficiently high (maybe with the help of a tax break or two). Once it happens, the next steps are incremental, but of the fast-forward variety. Design firms and manufacturers start to squeeze out costs, and Home Depot starts to squeeze the supply chain. A new generation comes along at 2x 'competitive' cost, and gets snapped up by early adopters. Home Depot starts to realize they can sell these to consumers and contractors all across America. Everybody gets better at all of it, and prices fall to 1x. And fall some more.
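That 'prices fall to 1x, and fall some more' dynamic is the classic experience-curve effect. As a rough illustration, assume panel prices fall by a fixed fraction with every doubling of cumulative production (a roughly 20% learning rate is the sort of figure often quoted for PV; treat it here purely as an assumption), and back out how much volume growth it takes to get from the article's 18 cents down to 6:

```python
import math

# Experience-curve sketch: price falls by LEARNING_RATE with each doubling
# of cumulative production. The rate is an illustrative assumption.

LEARNING_RATE = 0.20          # assumed ~20% price drop per doubling of volume
START_CENTS_PER_KWH = 18.0    # figure from the quoted article
TARGET_CENTS_PER_KWH = 6.0

per_doubling = 1.0 - LEARNING_RATE
doublings = math.log(TARGET_CENTS_PER_KWH / START_CENTS_PER_KWH) / math.log(per_doubling)
volume_multiple = 2 ** doublings

print(f"doublings needed:         {doublings:.1f}")
print(f"cumulative volume growth: ~{volume_multiple:.0f}x")
# With these assumptions: roughly 5 doublings, i.e. around 30x the cumulative
# volume -- a lot of pallets through Home Depot, but no new physics required.
```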
That process is already underway here and there. You've seen the high-end housing, the solar panels running huge electric fences (why run transmission lines all the way out there, after all?) and supplying other farm and ranch needs. It's a specialty market now, but if everybody is looking to hook up a house battery and solar chargers ...
If I'm right, and the dominant role of central station is soon to be undermined for major parts of the market, how is it that all the smart guys are missing it? I have the greatest respect for the competence and dedication of most people I worked with in the industry (a few exceptions, as anywhere). But historically, their judgment deserves no more, and in many cases less, deference with respect to one key element: major changes in the structure of how the power market works. They miss it, collectively, pretty much every time.
The best example occurred mainly through the 1970s. Up to that point, power usage (and thus generation) had risen pretty much in lockstep with population and economic growth. The rise of nukes in the '60s and '70s was seen as a way to match that expected / projected growth; nuclear plant orders went through the roof. TVA ordered something like 17 units at 7 locations. These were not just some line item in a planning document; they signed on the dotted line, putting up major money in down payments. They ended up with 3 or 4. Several of the rest got partial starts and were abandoned or semi-abandoned; they dropped $600 million into the Bellefonte units before having the creditors come in and take back anything of value (against any logic, they are attempting to revive the plant). The percentage is relatively typical; in no case did any utility I'm aware of end up taking even half of its orders. Many two-unit plants ended up with only one unit built, leaving expensive holes in the ground (e.g., $100 million at Texas Utilities' Comanche Peak unit 2, now filled in with dirt).
For all these utilities, these orders and cancellations were expensive and ultimately at least somewhat embarrassing, one reason the regulators (who had, after all, approved these expensive orders and shared the blame) let them rescind them so easily (and with so little publicity).
Basically, the cancellations had little to do with the fact that the plants were mostly nukes, and everything to do with the over-ordering the utilities had engaged in. The 1970s saw a demand flattening, against all past experience; there was growth, to be sure, just not what past growth predicted. The rush to order plants ran into a fundamental (and, as it turned out, long-term) change in usage patterns and a decoupling of economic growth from the electricity usage curve. The flattening resulted from a combination of a maturing market (everybody gets that first freezer and air conditioner, returning the market to replacement), manufacturers finally putting some attention on more efficient units, and other factors. Relatively few units have been built since (and that's 30 years!). Literally no industry insiders saw it coming.
Will coal continue to play a role for some decades? Of course. Does clean coal therefore matter? Absolutely. Mainframes still matter, and they are better than ever (based on advances in PC-driven chip and application design, but who's counting). But recognizing that coal / central station is an endgame, not the wave of the future, dramatically changes everything, including where the smart money goes next and how we control it.
Whether the US or China (or both) gains the most? I'm out of ink; I'll leave that one to you.