Freeman

FEATURE

Bring Back the Gilded Age – Part One

OCTOBER 07, 2013 by NATHAN SMITH

In the decades before World War I, sometimes called “the Gilded Age,” institutions in America and Europe were more conducive to progress than they are today. Old bonds of class and custom had lost most of their power to lock people into traditional roles. Absolutist and arbitrary government had lost ground to the rule of law. The limited-liability corporation had taken shape and was in the ascendant. In short, capitalism had taken command. But socialism, communism, progressivism, fascism, the welfare state, migration control, and other bad ideas that bedeviled the twentieth century were still young and weak. The result was a mighty wave of betterment of the human condition whose momentum carried it well into the twentieth century—long after the eclipse of the nineteenth-century liberalism that had set the wave in motion.

By 1890, the world could look back on decades of economic progress. Railroads had first appeared in Britain in the 1820s, and their expansion accelerated after the 1860s, once the Bessemer process (invented in 1856) supplied the world with cheap steel. Transatlantic steamship service had begun in the 1840s, and steamboats had opened the Mississippi River basin to the new settlement and thriving commerce that Mark Twain’s novels depict. The completion of the Transcontinental Railroad in 1869 had reduced a New York-to-San Francisco journey from six months to six days. All over the world, vast new regions had been settled, brought under cultivation, and integrated into world civilization: the American West, the Argentine pampas, Siberia, much of Canada and Australia, parts of Africa. Commercial telegraphy had given the world instantaneous long-distance communication since the 1840s, and transatlantic telegraph lines were successfully laid in 1866. Bicycles, invented in 1817, enjoyed a golden age in the 1890s.

Yet capitalism’s transformation of the world was, if anything, accelerating. Key inventions were already in place. The telephone had been invented in 1876, Thomas Edison’s light bulb in 1879, the automobile in 1886, and an early airship made a few flights in 1884–85. The Wright brothers’ plane would follow in 1903, and Henry Ford’s moving assembly line in 1913. In 1900, these marvels of modern technology were still largely the preserve of the rich. But in the next few decades, cars, electricity, household appliances, radios, and TVs reached the American everyman, changing cities, entertainment, and domesticity forever.

By the 1950s, middle-class American life looked a lot like it does today. Ordinary people lived in detached suburban houses and had cars, refrigerators, washing machines, indoor plumbing, and home telephones.

And that is the puzzle. L. P. Hartley began his 1953 novel The Go-Between with the line: “The past is a foreign country: they do things differently there.” In 1953, that was true. The normalcy experienced by a middle-class 1950s suburban family in a detached home, watching TV with a car sitting in the driveway, was something new under the sun. If the TV was playing a Western, it was showing a time still within living memory for the old people of the day, yet archaic enough to be the stuff of legend, like The Iliad. But the past is not such a foreign country now. Not only do middle-class suburban families still sit around the TV, but the TV shows from the 1950s, 1960s, and 1970s that they might be watching represent a material lifestyle that is still normal for us. There has been social change: less racism, more marital instability, rap music. But with the important exceptions of communications technology and medicine, our stuff has not improved that much. Economist Tyler Cowen calls this phenomenon “the Great Stagnation.”

The IT and communications industries are the exception that proves the rule. First, they serve as a reminder of what robust capitalist progress looks like. We see rags-to-riches stories: Sergey Brin and Larry Page, bright young graduate students zooming into the super-elite with a company that changed the world; Mark Zuckerberg, a college dropout whose company became a household name and made him a billionaire; Bill Gates, Steve Jobs, and the rest; and a vast range of cheap-to-free services that people hardly dreamt of before, pouring down upon a grateful population. In the Gilded Age, most industries were like that. Where we have computers, the Internet, and smartphones, they had light bulbs, railroads, steamships, sewing machines, vaccines, pasteurization, electricity, the automobile, the airplane, the telephone . . . the list is endless. Why have IT and mobile technology broken out of the Great Stagnation? There is always a good deal of serendipity in the evolution of technology, but two policy-related reasons suggest themselves.

First, IT and mobile telephony are peculiarly immaculate industries, having little or no impact on the environment and demanding relatively little labor. In these industries, the government has had little pretext for blocking progress.

Second, the Internet in particular requires hardly any capital, since almost the only input needed to make a website is labor.

The Great Stagnation is controversial, and it would be nice to back it up with numbers. We can, sort of. According to the latest numbers from the Census Bureau, the income of the median American household is about the same as it was in 1989. Average wages have done better, but total factor productivity growth (a statistical measure often interpreted as technological progress) has slipped from 1.97 percent a year in the 1920s and 2.66 percent in the 1930s to under 1 percent from 1973 through 2000, according to economic historian Alexander J. Field’s meticulously researched book A Great Leap Forward. True, productivity growth seemed to accelerate a bit after 1996, albeit not to pre-1973 levels. But in hindsight, even that may have been an illusion—an artifact of the dot-com and housing bubbles.

Now, economists know that all such numbers are imperfect for a variety of reasons, but especially because it is difficult to measure changes in the price level when new goods are being introduced and corporate R&D is constantly improving and upgrading existing goods. Still, a tentative but increasingly influential body of opinion among economists and historians holds that the dazzling economic progress of the late nineteenth and early twentieth centuries has gradually slowed or stalled in recent decades.

It is important here to distinguish between short-run business cycles and long-run economic growth. We are not talking about current high unemployment, or how we went from the Clinton boom to the Obama slump. We are talking about much more long-term changes. Annual fluctuations in GDP are easier to measure and more likely to grab headlines. In a sense, though, they do not matter much, since it is usually a question of 1 percent or 2 percent deviations from a trend. Recessions are unpopular, but they have their uses as concentrated episodes of “creative destruction” that weed out lazy or obsolete enterprises and clear the way for the next economic upsurge. Even major crises, like the Great Depression, may just leave more room for the next boom to get a running start. The U.S. economy enjoyed healthy growth in the 1950s and 1960s in large part because it was exploiting opportunities that had been available for years, but that the 1930s economy had been too dysfunctional to exploit.

What matters most is the trend, the underlying rate at which economic potential is increasing, which economists sometimes interpret as “technology” and sometimes refer to by neutral, technical terms like “total factor productivity.” A quick way to approximate the long-term effect of annual percentage changes is the “rule of 70”: 70 divided by the growth rate, in percent per year, is roughly the number of years it takes a quantity to double. Thus, at the growth rates the United States achieved in the 1920s and 1930s, living standards would double in roughly 26 to 35 years. At the growth rates that have prevailed since 1973, it would take a lifetime for GDP per capita to double. Ten years ago, when I was in grad school, trend growth was regarded as basically steady. Total factor productivity in English-speaking North America seemed to have followed a fairly steady upward trajectory for centuries, and the productivity slowdown from 1973 through 1996 seemed exceptional and transient. With the “New Economy” of the 1990s, growth got back to normal. Since then, a couple of jobless recoveries and a financial crisis have cast their shadows over the interpretation of the long-run data. Probably few economists would go as far as Robert Gordon, who recently wrote a paper asking, “Is U.S. economic growth over?” But the Great Stagnation is becoming conventional wisdom.
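For readers who want to check the arithmetic, the sketch below shows where the rule of 70 comes from and applies it to the growth rates quoted from Field above. It is an illustrative back-of-the-envelope calculation, not a figure taken from Field’s book.

```latex
% A quantity growing at g percent per year doubles after t years, where
% (1 + g/100)^t = 2. Taking natural logarithms and solving for t:
\[
  t \;=\; \frac{\ln 2}{\ln(1 + g/100)} \;\approx\; \frac{69.3}{g} \;\approx\; \frac{70}{g}
  \qquad \text{for small } g,
\]
% which is the "rule of 70." Applied to the rates cited in the text:
%   g = 2.66 (1930s TFP growth):  t = ln 2 / ln(1.0266), about 26 years
%   g = 1.97 (1920s TFP growth):  t = ln 2 / ln(1.0197), about 36 years
%   g < 1    (post-1973):         t > 69 years, i.e., roughly a lifetime
```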

If economic growth has slowed down, what explains it?

One possibility is that no one is to blame. The slowdown simply reflects a diminution of valuable technological opportunities at the economic frontier. There is something inherently mysterious about the technological frontier, because invention is not merely hard to forecast, like many things, but inherently unpredictable, as an example taken from Alasdair MacIntyre’s After Virtue illustrates. Suppose that sometime in prehistory, one caveman had said to another, “I predict that in ten years the wheel will be invented.” His companion asks him: What’s a wheel? “A wheel,” he explains, “is a round object that can move across the ground easily, because it doesn’t slide, but rolls,” and he proceeds to speculate on a few of its uses. And then it hits him: “Wait a minute, no one will invent the wheel, for I have just invented it.” Invention is the creation of new ideas, and tomorrow’s new ideas are unthinkable today, because if anyone had thought of them yet, they would be today’s new ideas.

That said, one can track the emergence of new ideas retrospectively and say not what will be invented, but what was. Yet even in retrospect the economic value of innovation is hard to assess: partly because technologies have diverse uses and productivity gains often spill over into other industries, and partly because it is hard to separate the potential productivity gains from a new invention from the actual gains, which may fall short of that potential for non-technological reasons. If the slowdown happened because over-regulated late-twentieth-century capitalism is doing a lousy job of implementing the ideas that inventors keep producing, one would get the wrong diagnosis by simply looking at revenues and job creation at the hottest technology firms.

But the Great Stagnation hypothesis is counterintuitive, because people have the sense that technology is doing amazing things all around them: smartphones, nanotech, biotech, the shale oil revolution, etc. Watch a few TED talks, or talk to a few engineers, or read Brynjolfsson and McAfee’s Race Against the Machine, and you get the idea that there are plenty of bright new ideas around.

One of history’s lessons is that societies that are good at inventing may not be good at commercializing their inventions. Thus, China had gunpowder, paper, printing, and coal mining long before the West did, yet never made an Industrial Revolution. China could explore, but not exploit. Its failure to make thorough use of its own technology may have had something to do with the fact that China was a paternalistic civilization run by an intelligentsia, humane and cultured after its own fashion, but with little respect for business, commerce, or individual rights.

The post-WWII West, compared to the West in the Gilded Age, has taken on a few of the features of traditional China. It is increasingly dominated by a supercilious intelligentsia that regards commerce and industry with distaste. Business and commerce have few rights; they must constantly ask permission from the Mandarinate, and lobby for its noninterference, in return for which they get protected from competition by regulations that make entry difficult. Business is still profitable, but the consumer loses, and so does progress. Economists have lately gotten into a very bad habit of seeing pure ideas as the sole drivers of economic growth. Ideas are not enough. They need entrepreneurs to implement them and capitalists to finance them. They need governments to enforce contracts, property rights, and sometimes patents, while credibly promising not to confiscate profits or strangle new ventures in regulation and red tape.

It may not be that science is falling down on the job. Rather, capitalism ain’t what she used to be.

A key to understanding economic history is that there are often long lags between cause and effect. Thus, when Alexander Field, in the history of U.S. productivity cited above, finds that the 1930s saw the highest productivity growth of the twentieth century, it’s not because policy was wise in the 1930s—far from it—but because even amid disastrously high unemployment and rock-bottom business investment, the economy was still absorbing the innovations of Henry Ford, Thomas Edison, and the other tinkerer-tycoons of the heyday of capitalism. Electrification was transforming manufacturing and powering new household appliances. Road networks were being built out, and truckers were transforming logistics. The internal combustion engine was revolutionizing agriculture.

John Steinbeck’s The Grapes of Wrath is usually remembered as a poignant evocation of human suffering in the Great Depression, but it has another theme: rapid technological change. The young Joads, Tom and Al, know how to fix cars that their elders barely know how to drive. Sleazy mechanics cheat Okie farmers into selling their mules too cheap by telling them “this is the Machine Age,” and they believe it. At rest stops along the highway, destitute Okies cross paths with thriving new truckers, privileged representatives of the brave new world that Henry Ford made. In a government camp for migrant workers in California, the Joad children are frightened by a flush toilet, something they have never seen before. The Okies are on the brink of starvation, not amid famine, but amid the abundance of a thriving California that “men of science” have made possible. What Field’s research showed, Steinbeck, living through it, understood.


Filed Under: Capitalism

ABOUT

NATHAN SMITH

Nathan Smith is a professor of economics and finance at Fresno Pacific University and the author of Principles of a Free Society and Complexity, Competition, and Growth. He blogs at Open Borders: The Case (openborders.info).
