September 12, 2003

The Pace of Information Technology Investment

An interesting graph:

One of the many interesting things to glean from this graph is that--if we trust our hedonic computer price deflators--the pace of American investment in computers and peripherals today vastly outstrips the pace of investment during the bubble-boom.

Of course, telecom investment is another story. And software investment is--again, if we trust our deflators--about where it was at the peak of the bubble-boom.
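
To make the deflator arithmetic concrete, here is a back-of-the-envelope sketch (the numbers are made up, purely for illustration) of what a hedonic deflator does to the investment series:

# Real investment = nominal spending / quality-adjusted price index.
# All figures below are invented for illustration, not BEA data.
nominal_spending = {2000: 100.0, 2001: 95.0, 2002: 98.0, 2003: 100.0}   # $bn, hypothetical
hedonic_price_index = {2000: 1.00, 2001: 0.75, 2002: 0.55, 2003: 0.40}  # 2000 = 1.00, hypothetical

for year in sorted(nominal_spending):
    real = nominal_spending[year] / hedonic_price_index[year]
    print(f"{year}: nominal ${nominal_spending[year]:.0f}bn -> real ${real:.0f}bn (2000 dollars)")

Roughly flat nominal outlays, divided by a quality-adjusted price index that is falling fast, show up as real investment rising well past its earlier pace--which is what the graph shows, if the deflator is to be trusted.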

Posted by DeLong at September 12, 2003 04:22 PM

Comments

As a computer scientist, I don't believe it.

Largely because, starting in 1998 and accelerating big time in 2000/2001, the "Mr. Bill" corollary to Moore's law ("Grove giveth and Gates taketh away") generally stopped holding: outside games, CAD, and graphic design, software stopped getting slower, which greatly reduced the pressure to upgrade.

We have a Pentium Pro 200 (circa 1996) in our office here in Soda that gets regular use and quite happily runs Windows 2K and Office. It's a little slow when web surfing, but it's generally usable. My Soda Hall desktop is a PII 450 (circa 1998/1999), and it's quite happy with Win2K.

Likewise, the market is so saturated that one has to ask how much of this is new purchases versus replacements and upgrades. With a slowing upgrade cycle, how does investment behave in a saturated market?

Posted by: Nicholas Weaver on September 12, 2003 05:03 PM

One more thing: What do the graphs look like without the deflators, just in raw "current dollars" amounts?

Is the deflator for computer hardware constant over the period measured, or has it changed between 1997 and 2003?

Posted by: Nicholas Weaver on September 12, 2003 05:24 PM

Are ubiquitous computing devices (cell phones, PDAs, automotive CPUs, etc, etc, etc) included in the blue line?

Posted by: Randolph Fritz on September 12, 2003 05:52 PM

The bump up in 2000 could be Y2K stuff, too. Once 2000 passed, that spending would have fallen back to the trend line.

Posted by: snore on September 12, 2003 07:50 PM

I disbelieve whatever deflator they use for hardware (at least desktop hardware), for two reasons.

1) Most PCs uselessly spin away most of their cycles, and it's pretty clear that most of the increase in computer speeds over the last three or four years has been more to feed marketing needs than to meet customer demand. You just can't *get* a 1 GHz PC these days that doesn't come from a slimy non-brand-name source, even though it would work just as well as a 2.4 GHz power-sucking monster for most tasks.

2) Software is really the flip side of hardware. It would be weird for huge real investment increases in one to be unaccompanied by increases in the other. Just as today's hardware is faster, it is almost certainly true that software is better and runs faster, too. But increases in software investment lag the hardware trend line by a LOT. And, no, I don't think it's caused by everybody buying Linux instead, or by Microsoft lowering their prices a whole lot. The software line looks like a pretty reasonable proxy for the tech sector. The telecom line is going to look weird since the over-investment in some things (e.g., transcontinental fiber) was so huge. The hardware line looks completely nuts.

Is there a readable treatment of the hardware deflator you would recommend?

Posted by: Jonathan King on September 12, 2003 09:33 PM

Beware of gadget bias - from everyday experience of computers, by far the largest investment in the IT sector has to be users' learning-by-doing efforts. We have all invested huge amounts of our time to develop our computer skills. That should have a larger impact than hardware and software investments.

Posted by: Mats on September 13, 2003 02:38 AM

I believe it. The key is that this is hardware purchased rather than hardware needed or hardware used. The hedonic corrections emphasize this by tracking the 1996-equivalent cost of the total capability purchased rather than the capability used. When someone buys a 2GHz machine to do a job that could be done by a 1GHz machine, you get this divergence between software and hardware.

The actual utility most likely tracks the software curve. The difference indicates the gradual buildup of a larger unused capacity. Most people have a vast unused capacity of computing cycles, which either spins idly doing nothing or sits turned off.

Assessing the economic impact is difficult. If productive uses are found for all this unused capacity, then there is a significant impact. If not, it is "wasted". But the waste is not readily recoverable. The cost of producing the 1GHz machine is almost the same as the cost of producing the 2GHz machine, so it is a good choice to purchase the 2GHz machine and "waste" the cycles: it is a very low-cost bet that at some time during the life of the machine a productive use will be found for that capacity, and an investment in extra peak-load capacity that is only occasionally needed.
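
A toy version of that arithmetic (all numbers invented purely for illustration, not taken from the actual deflator):

# Hypothetical single purchase, illustrating capability purchased vs. capability used.
price_paid = 1000.0          # dollars paid for a 2GHz machine today (made up)
price_index_vs_1996 = 0.125  # quality-adjusted price of computing relative to 1996 (made up)

real_investment_recorded = price_paid / price_index_vs_1996  # 1996-dollar capability purchased
utilization = 0.5                                            # fraction of the cycles actually used (made up)
capability_used = real_investment_recorded * utilization

print(f"recorded real investment: ${real_investment_recorded:,.0f} (1996 dollars)")
print(f"capability actually used: ${capability_used:,.0f} (1996 dollars)")
# The statistics record the full capability purchased; the unused half just
# idles, which is the divergence described above.

The recorded series tracks the first number; something like the software curve would track the second.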

Posted by: rjh on September 13, 2003 06:47 AM

So, what does this mean for technology stock prices and valuations? Even if we forget about the masked cost of options, how are we supposed to value an Intel or Cisco or Microsoft or Applied Materials? What sort of revenue growth can we expect, and what is it reasonable to pay for such growth?

Posted by: jd on September 13, 2003 08:17 AM

I find hedonic quality adjustments, which were proposed by the Boskin commission in 1996 (?), very questionable. The concept of statisticians predetermining the change in pleasure/utility of product A versus product B is hard to stomach, because the change in pleasure/utility can only be known with hindsight. Statisticians have to make judgments about quality and quality improvements, which is very subjective. Furthermore, comparing data in an international survey becomes very difficult, because not all countries use hedonic adjustment, and the countries that do seem to use different assumptions in their hedonic adjustment procedures.
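
For what it's worth, the textbook "time dummy" hedonic regression is roughly (a stylized sketch, not any particular agency's actual specification):

\ln p_{it} = \alpha + \sum_k \beta_k x_{ikt} + \sum_{\tau} \delta_{\tau} D_{i\tau} + \varepsilon_{it}

where p_{it} is the price of computer model i in period t, the x_{ikt} are measured characteristics (clock speed, memory, disk size, and so on), the D_{i\tau} are period dummies, and exp(\delta_{\tau}) traces out the quality-adjusted price index. The subjective part is deciding which characteristics to include and how to weight them.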

Information technology investment is also hard to measure because a lot of good software (open-source software like Linux) can be downloaded freely from FTP servers; that can be seen as an "investment" but is not counted as such.

Furthermore, US agencies count software purchases as investment, whereas agencies in European countries count software purchases as current expenses. So it's all very confusing, isn't it?

It's what I call a statistical quagmire, Professor.

Posted by: Nescio on September 13, 2003 08:47 AM

All I'd want to see is the "adjusted for inflation" numbers.

Posted by: Ian Welsh on September 13, 2003 09:05 AM

Does the graph just chart business investment, or does it also include consumer purchases? If it's the latter, then Americans have been on a gadget spending spree since late 2001.

Posted by: Sean Hackbarth on September 13, 2003 01:22 PM

I believe it. Every major American corporation (and the majority of the important global ones) has invested a tremendous amount of money to understand and track every process it can think of. Computing hardware and software are almost everywhere. Think about the relative processing power of your new cellphone versus the first home computers.

Yes, your new car too.

Grocery mart inventories.

Wal-Mart.

Email.

The fact that you are discussing this on a website.

Posted by: jjj on September 13, 2003 07:11 PM