Nobody knows anything. It's the one constant in Hollywood, according to William Goldman's Adventures in the Screen Trade. The film studios may have more formulas than face-lifts, but nobody can confidently predict whether a picture will be Basic Instinct or Showgirls. Making movies is a crapshoot: Differences invisible to investors and executives alike are enough to produce genius - or a total dog.
Sound familiar? The titans of the silicon age continue to calculate the pace and direction of technological change at fast and furious Internet speeds. But nobody knows what works.
A century ago business forecasting and economic policymaking were in roughly the same situation: Nobody knew anything quantitative; nobody knew anything reliable. This void called forth a response: Bureaus of Labor Statistics and Economic Analysis were born, along with think tanks and national income accountants. Their data collection and analysis efforts gave us the statistics that to this day define the economy. And, for a while, it kind of worked. But then came the information age.
Past industrial revolutions - steel, for example, or the coming of mass production to the automobile - had seen explosions of technology that drove the prices of key commodities (railroad rails, the Model T) down by 5 to 10 percent a year for one, two, or three decades. The information age is not your father's Oldsmobile: The price of computation, according to Yale's Bill Nordhaus, has dropped 42 percent per year over 60 years - a trillion-fold fall since 1940.
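As a rough sanity check on that claim (taking the 42 percent and the 60-year span as the round numbers the text quotes), the compounding works out like this:

```python
# Compound a steady 42 percent annual price decline over 60 years.
annual_decline = 0.42                       # Nordhaus's figure for computation prices
years = 60
remaining = (1 - annual_decline) ** years   # fraction of the starting price left
fall = 1 / remaining                        # total fold-decrease in price
print(f"Price falls roughly {fall:.1e}-fold")  # comfortably past a trillion-fold
```

Note how sensitive the cumulative figure is to the annual rate: shifting the 42 percent by a few points either way moves the total by orders of magnitude, which is one reason such headline numbers are quoted so loosely.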
Today's technological revolution has so far lasted between two and six times as long as previous revolutions. It is between five and ten times as fast, the equivalent of a race between a cheetah and a possum. And it is a larger share of the economy, changing what people do in their work, where it is done, and even what economic activity is. The problem: We are not sure how.
Suppose you want to know how much the US as a nation is earning through the export of high-technology goods and services. The Department of Commerce publishes its estimates of international trade by commodity. But, as Paul Krugman has said, if you look at the data, what do you learn? You learn that the product category in which the US leads the world is the export of "errors and omissions": We don't even know reliably what the US is selling to the rest of the world.
Suppose instead that you want to know how fast conventionally measured production has been growing. The system of national accounts has consistency checks, one of which is that "product" - the value of stuff made - should equal "income" - the amount of money people earn by selling stuff. But it doesn't: The size of the late-1990s productivity boom changes substantially depending on whether we use the product or the income estimates. And it changes even more if you ask not whether the system of national accounts passes its own consistency checks but how large the biases in growth estimates imposed by the way we conventionally measure real GDP turn out to be.
Suppose you want to know how much Internet traffic there is. A company like TeleGeography will provide beautiful information graphics and statistics on capacity. The University of Minnesota's Andrew Odlyzko will say that Internet traffic each year is between 1.7 times and 2.5 times as much as it was the year before, and that "data" traffic will become a larger share of telecommunications than "voice" traffic sometime in the next several years. But what we really want to know is not capacity but use. And the difference between growing 1.7-fold and 2.5-fold a year (let alone the eightfold annual increase claimed by ex-WorldCom executives) is the difference between prosperity and bankruptcy for multibillion-dollar companies.
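To see why the gap between Odlyzko's 1.7 and 2.5 multipliers matters so much, compound each over a hypothetical five-year planning horizon (the horizon is illustrative, not from the text):

```python
def compound(annual_factor: float, years: int) -> float:
    """Total growth after compounding an annual multiplier for `years` years."""
    return annual_factor ** years

years = 5
low = compound(1.7, years)       # Odlyzko's low estimate:  roughly 14x total
high = compound(2.5, years)      # Odlyzko's high estimate: roughly 98x total
worldcom = compound(8.0, years)  # the ex-WorldCom claim:   32,768x total
```

A network built for the high estimate carries roughly seven times the capacity one built for the low estimate needs - the fiber glut in miniature.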
Suppose you want to know the size of the Web. Netcraft will tell you about the 35 million Web servers that it can find, and that number has fallen in the past two months after growing from 1 million to 35 million in the previous five years. But Netcraft cannot tell you whether a "Web server" is a shared slice of a single computer, or 20 different computers networked together. And Netcraft cannot tell you how busy those Web servers are.
Four years ago it did not matter as much that nobody knew anything. High-tech growth was faster than anybody could handle, and the right strategy was to throw as many resources into the sector as we could. But now the first phase of explosive growth is over, and a great deal hinges on our being able to form a coherent picture of what high-tech will be like five or ten years down the road. We don't want to invest too much, and create the equivalent of our fiber-optics glut elsewhere in the economy. But we don't want to invest too little, and fail to grasp our opportunities.
We try to tease out a partial picture of what is happening by combining individual pieces of information and extrapolating. It turns out the most reliable pieces are not those about quantities, but prices. Intel and AMD are confident that for this decade at least they will continue to deliver double the microprocessor power at the same cost every two years. Disk drive manufacturers: double the storage every 15 months. Memory manufacturers: double the memory every 18 months. As telecommunications companies flame out, go bankrupt, and get recapitalized, the price of intermediate (not last-mile) connectivity will fall rapidly and reliably as well.
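Those doubling times can be put on a common footing as annual improvement rates. The conversion 2^(12/months) is ordinary compounding arithmetic; the specific percentages below are derived here, not quoted in the text:

```python
def annual_rate(doubling_months: float) -> float:
    """Annual improvement implied by a doubling every `doubling_months` months."""
    return 2 ** (12 / doubling_months) - 1

# Doubling times from the paragraph above.
for name, months in [("microprocessors", 24), ("disk drives", 15), ("memory", 18)]:
    print(f"{name}: doubling every {months} months = "
          f"{annual_rate(months):.0%} per year")
# microprocessors: ~41% per year; disk drives: ~74%; memory: ~59%
```

On this footing, disk storage is improving fastest of the three.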
We know very well how fast our information-processing and information-producing capabilities are growing. What we don't know, however, is how they will be used - how valuable those uses will turn out to be, and how rapidly they will diffuse.
"Who the hell wants to hear actors talk?" the head of Warner Bros. famously proclaimed the same year words were first spoken on screen. Like talking pictures or the transcontinental railroad before it, the Internet changes everything. If only we could understand how.