March 30, 2002

Productivity growth in the 2000s: what will it be?

I'm supposed to forecast what productivity growth will be in the United States over the next decade, and what policies could accelerate or retard it, in three different papers and for five different audiences over the course of this spring and summer. The tricky part is that it is not at all clear to me what the right way to proceed is. But it may not matter much: whatever approach I take, I keep getting optimistic answers--although I'm not sure whether this describes the set of possible worlds we might live in or my own disposition. There are natural optimists, like former IMF head Michel Camdessus, of whom his subordinate Michael Mussa once cracked, "He's so optimistic he sees the glass as half full even when there's no glass at all"...

In 1995 American productivity growth, which had been motionless and prostrate, face-down on the ground since the early-1970s productivity slowdown, suddenly picked up its mat and walked. By pre-1973 standards the pace of labor productivity growth was nothing special. But by the standards of 1973-1995 it seemed nothing less than an economic growth miracle. Between the beginning of 1995 and the semi-official NBER business cycle peak in March 2001, U.S. nonfarm-business output per person-hour worked grew at a 2.80 percent annual rate, and U.S. real GDP grew at a 4.21 percent annual rate.
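(A quick back-of-the-envelope gloss of my own, not from the official releases: since output growth is productivity growth plus hours growth,

    g_Y = g_{Y/H} + g_H,

the two figures together imply hours worked growing at roughly 4.21 - 2.80, or about 1.4 percent per year, over the period, treating the GDP and nonfarm-business growth rates as comparable, which is only approximately right.)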

As the computer revolution proceeded, nominal spending on information technology capital rose from about one percent of GDP in 1960 to about two percent by 1980, about three percent by 1990, and between five and six percent by 2000. Throughout this period, Moore's Law--the rule of thumb enunciated by Intel cofounder Gordon Moore that every twelve to eighteen months saw a doubling of the density of transistors that his and other companies could put onto a silicon wafer--meant that the real price of information technology capital was falling as well. While the nominal share of GDP spent on information technology capital grew at roughly 5 percent per year, the price of data processing--and in recent decades data communications--equipment fell at between 10 and 15 percent per year. In chain-weighted real terms with 1996 as the base year, real investment in information technology equipment and software was an amount equal to 1.7 percent of real GDP in 1987; by 2000 it was an amount equal to 6.8 percent of real GDP.
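(To see why the chain-weighted figure climbs so much faster than the nominal share, note the rough accounting identity, again my illustration rather than anything in the official statistics:

    growth of real IT investment relative to real GDP
        \approx growth of the nominal IT spending share + rate of decline of the relative price of IT goods.

With the nominal share growing at roughly 5 percent per year and relative prices falling at 10 percent or more per year, the real-terms measure grows on the order of 15 percent per year; chain-weighted "shares" measured far from the 1996 base year are not strictly additive, though, so the comparison is only suggestive.)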

The acceleration in the growth rate of labor productivity and of real GDP in the second half of the 1990s effectively wiped out all the effects of the post-1973 productivity slowdown. The U.S. economy in the second half of the 1990s was, according to official statistics and measurements, performing as well in terms of economic growth as it had routinely performed in the first post-World War II generation. It is a marker of how much expectations had been changed by the 1973 to 1995 period of slow growth that 1995-2001 growth was viewed as extraordinary and remarkable.

One approach is to use standard growth theory to model the impact of the computer revolution. Assume that the technological revolution is lowering the cost of one particular form of capital--information technology capital--at a constant rate, and that that form of capital is an imperfect substitute for capital in general. Then use the standard tools of growth theory--the calculation of steady-state growth paths, convergence analysis, and so forth--to forecast future growth. The conclusions are that the economy's long-run labor productivity growth rate jumps up as it experiences a secularly rising information technology capital-output ratio, and that the decline in the price of capital looks very much like additional total factor productivity growth as far as its effects on labor productivity are concerned. Hence standard analysis paints a bright future.
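To make the mechanism concrete, here is a minimal balanced-growth sketch, my own illustration under an assumed Cobb-Douglas technology rather than a calculation taken from any of the papers. Suppose

    Y = K_o^{\alpha} K_c^{\beta} (A L)^{1-\alpha-\beta},

where K_c is information technology capital whose price relative to output falls at rate \gamma and labor-augmenting technology A grows at rate g. On a balanced growth path with constant nominal spending shares, ordinary capital per worker grows with output per worker (g_{k_o} = g_y) while real IT capital per worker grows faster by the rate of price decline (g_{k_c} = g_y + \gamma). Log-differentiating the per-worker production function,

    g_y = \alpha g_{k_o} + \beta g_{k_c} + (1-\alpha-\beta) g
        = \alpha g_y + \beta (g_y + \gamma) + (1-\alpha-\beta) g,

so that

    g_y = g + \frac{\beta \gamma}{1-\alpha-\beta}.

The falling price of IT capital thus shows up in labor productivity growth exactly as extra total factor productivity would. With purely illustrative values alpha = 0.30, beta = 0.05, gamma = 0.10, the second term adds roughly 0.8 percentage points per year.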

A second approach starts from the observation that standard analysis assumes a number of quantities are stable--the nominal share of expenditure on information-technology goods, the "income share" of high-technology capital, and so on. All the evidence of the past two decades, however, suggests that these are rising as information technology becomes cheaper and finds more and more uses. If it is indeed the case that uses of information technology are growing faster than its prices are declining--perhaps because this is one of the few waves of innovation that constitute a true general-purpose technology--then the late 1990s are likely to substantially underestimate future productivity growth.
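Continuing the illustrative sketch above, with the same caveat that the numbers are mine and purely for illustration: if the IT income share beta is rising over time rather than constant, the falling-price term

    \frac{\beta_t \gamma}{1-\alpha-\beta_t}

rises with it. Letting beta drift from 0.05 toward 0.08, with alpha = 0.30 and gamma = 0.10 as before, pushes that term from roughly 0.8 toward roughly 1.3 percentage points per year, which is the sense in which a snapshot of the late 1990s would understate the growth still to come.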

A third approach is to take seriously Basu, Fernald, and Shapiro's point that times of rapid run-up in capital-output ratios are times of high adjustment costs, which lead measured output growth to undershoot long-run potential output growth. The same conclusion--underestimation--follows, and here, too, the forecast is bright.
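A rough way to see the point, my gloss rather than their equations: if some current output is used up installing new capital, then

    Y_{measured} = Y_{potential} - C(I),

with the adjustment cost C increasing in the rate of investment I. During a rapid run-up in the capital-output ratio, C(I) is unusually large and growing, so measured output growth falls short of the growth of potential; once investment settles onto its new, higher path, the wedge stops widening and measured productivity growth catches back up.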

A fourth and last approach would be to inquire where the unexplained productivity "residual" comes from. If--as Paul David, Nick Crafts, and others have argued--it is really the case that the long-run benefits from general-purpose technologies flow only after they have spread themselves throughout the economy, and that the true efficiencies come from learning how to reconfigure production to take proper advantage of them, then this provides yet another reason for believing that the full effect of information technology on economic growth has not yet been felt.

By contrast, I am having a very difficult time coming up with reasons why growth in the 2000s should be slower than growth in the late 1990s. Even though journalists write of the "new economy" going smash with the end of the NASDAQ bubble and the recession, it looks to me like the fundamentals of growth are still there...

Posted by DeLong at March 30, 2002 10:10 PM