Late last winter America's central bank--the Federal Reserve--was busy congratulating itself. The flow of economic data indicated that its cut of the nominal short-term Federal Funds interest rate to 1.75 percent per year had done the job: the recession of 2001 was coming to an end. Even with the new, more pessimistic consensus expectations of the payoffs from the information technology revolution, even with the consequences of the terrorist attack on the World Trade Center, money at 1.75 percent per year was cheap enough for America's businesses to once again increase their borrowings and continue to invest.
Then, in the late spring, came the revelations of the size of the corporate governance crisis: everyone learned just how unreliable the accounts reported by American telecommunications companies were, and just how much the American system of corporate surveillance and control had deteriorated during the bubble of the 1990s. The American stock market fell: it is now 15 to 20 percent below its levels of last winter. The spreads between the interest rate at which America's government could borrow and the interest rates at which America's corporations could borrow widened. Suddenly, there was less cause for self-congratulation at the Federal Reserve: a 1.75 percent interest rate might have been the right rate to fuel a business-cycle recovery when the Dow-Jones stock market index was at 10000, but it was too high a rate with the Dow-Jones index at 8500. Through the summer, news about corporate investment remained disappointing. And more and more analysts began to talk about the possibility of a "double dip"--a second American recession in 2002 to follow closely on the heels of the recession of 2001.
Yet throughout the spring and early summer, the Federal Reserve remained passive--and the short-term interest rates it controlled remained constant. Only in mid-August did the Federal Reserve give any signs that American interest rates might be cut further at some time in the future.
The informal and unofficial rationale leaking out of the Federal Reserve had two parts. First, short-term interest rates were already so low that everyone would believe that further cuts would be temporary cuts only, and so the cut in short-term rates would have little effect on the longer-term rates that were important for driving business investment. Second, short-term interest rates were already so low and so much beneath their equilibrium values that further cuts would panic the financial markets: if even the Federal Reserve thought the situation was that serious, the argument went, businesses would respond not by increasing but by reducing investment. The Federal Reserve's judgment appeared to be that it was largely if not completely powerless: that it had done nearly all that it could do, and that the levers of monetary policy were no longer strongly connected to the level of economic activity.
Thus the United States in 2002 joined Japan in what economists have for sixty-five years called a liquidity trap: a situation in which the short-term nominal interest rates the central bank controls are so low, and so loosely connected to the level of aggregate demand, that further reductions in interest rates are not effective ways of fighting recession. The situation was not unprecedented: Japan had been caught in such a trap since the mid-1990s. But there had been no other examples since the Great Depression of the 1930s.
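The arithmetic of the trap can be put in one line. Writing $i$ for the nominal short-term rate the central bank sets, $\pi^e$ for expected inflation, and $r$ for the real interest rate that matters for business investment, the textbook Fisher relation together with the floor on nominal rates gives (a standard sketch, not notation from the original post):

```latex
r = i - \pi^e, \qquad i \ge 0 \quad\Longrightarrow\quad r \ge -\pi^e
```

With $i$ already at 1.75 percent, the Fed has fewer than two percentage points of conventional ammunition left; and once $i$ reaches zero, the real rate cannot be pushed below minus expected inflation--a floor that actually rises if a weakening economy drags expected inflation down.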
Whether America in the middle of 2002 really is in a liquidity trap is uncertain. How long this state of affairs will last is unknown. Nevertheless, even if America is only on the edge of a liquidity trap and even if it moves away from the current state of affairs soon, this is a frightening situation to be in. If monetary policy is not effective, the only lever the American government has to manage its economy is fiscal policy: changes in the government's tax and spending plans to change the government's direct contribution to aggregate demand. But the lesson of the decades since World War II is that the U.S. government--with its complex, baroque, eighteenth-century organization--is incapable of changing policy fast enough to make effective use of fiscal policy as a tool for managing the economy. It simply takes too long for changes in taxes and spending to work their way through the Congress, and then to work their way through the bureaucracy. A United States caught in a liquidity trap is a country with no effective tools of macroeconomic management at all.
There have been two eras since World War II when policymakers--American policymakers, at least--have allowed themselves to believe that they have solved the riddle of the business cycle, and learned how to successfully manage a modern industrial or post-industrial economy. The first was the Keynesian high-water mark of confidence in demand management of the 1960s. It was destroyed by the erosion of confidence in price stability and by the oil-price shocks that together created the inflation of the 1970s. The second was the past decade of successful business-cycle management by the independent, apolitical, technocratic Federal Reserve. But this second era may not last much longer than the first.
It is nearly eighty years since John Maynard Keynes first argued that governments had to take responsibility for maintaining full employment and price stability--that the pre-World War I gold standard had not been as much of a golden age as people had thought, and that what success it had was the result of a lucky combination of circumstances that was unlikely to be repeated. Keynes was an optimist: confident that governments could learn to successfully manage the business cycle. He would be shocked to look at the world today--at continental Europe with its stubbornly high unemployment, at Japan still mired in a decade of near stagnation, and now at a United States that lacks the policy tools to deal with any further unexpected bad news about the state of its economy.

Posted by DeLong at August 17, 2002 11:14 AM