Steve Lohr of the New York Times makes a point that too many people still ignore: lower prices and "normal" profits (or losses!) for high-tech firms are an enormous benefit to consumers, and greatly encourage the spread of technology.
If you believe--as Stanford's Paul David does--that the big economic and social payoffs will come from serendipitous uses that are only discovered after lots of experience, then low prices are necessary for this process of learning-by-using to commence.
I think Paul David has a very good chance of being right on this.
Given the free fall of technology stocks and the waves of layoffs and bankruptcies lately, now might seem an odd time to say this, but it's true: the digital revolution rolls on, and it may have only just begun.
Yes, the technology industry appears to be a place of unrelieved gloom these days. Hopes for a modest recovery later this year were probably buried for good last Thursday when the Taiwan Semiconductor Manufacturing Company, the world's largest supplier of computer chips to other companies, delivered a surprisingly bleak outlook of weak demand for the second half of 2002.
Announcements come daily of companies cutting back on spending and payrolls, from struggling start-ups to industry stalwarts like Intel and I.B.M. By some measures, the slide in technology stocks since their peak in March 2000 now approaches the magnitude, in percentage terms, of the market collapse in the worst years of the Depression.
Clearly, a lot of the money, glamour and people have left the technology business. And yet, the technology itself continues to spread, in every way.
That fact does not spell success for any particular company or justify high stock prices. But it does mean that over the long haul, the technology sector will generate more than its share of business opportunity and economic growth — as it has since the 1960's, when computers appeared in big companies, universities and government agencies. Sure, excess capacity and price-cutting make the current business environment brutal for the producers. But not for the consumers. The numbers for the technology in use worldwide — personal computers, cellphones, handhelds, digital cameras, DVD players, MP3 music players and households online — all continue to grow apace.
And the expansion of the technology goes beyond digital devices. Increasingly, computer science has spilled over into other fields. Computing has helped transform everything from the way scientists plumb the mysteries of biology, chemistry and physics to the way Detroit designs cars and Hollywood makes movies. As it moves beyond traditional calculation, computer science is becoming more and more interdisciplinary, introducing the computing arts to a wider circle of people, including more women.
And in good times or bad, the scientists, inventors, engineers and entrepreneurs — the beating heart of the technology industry for decades — are still in the game. It is such people who push the technology ahead on its uncertain but inexorable journey toward a few, often unanticipated, markets that prove to be lucrative — despite the many inevitable failures along the way.
Dan Bricklin, 51, the co-inventor of the electronic spreadsheet nearly a quarter-century ago and still an active entrepreneur, observed last week, "The people who really have a love for this stuff — the technology and what it can do — never stop."
Mr. Bricklin and Bob Frankston developed VisiCalc, a spreadsheet program, and began marketing it in 1979. VisiCalc was the first "killer app," or killer application, in the personal computer business — an industry begun, notably, in the deep economic recession of the mid-1970's. The spreadsheet showed a practical, business use for the desktop machines.
Over the years, Mr. Bricklin, a programmer and a graduate of the Harvard Business School, has followed his muse through a series of start-ups, mostly modest successes but some disappointments as well. In 1990, he was a founder of the Slate Corporation, which was to make software applications for pen-based computers, like those produced by Go and Apple's Newton. "The whole idea," Mr. Bricklin recalled, "was that pen computers would sell well," which they did not.
His current company, the Trellix Corporation, founded in 1995, produces and sells software that allows small businesses to build and operate their own Web sites. Its customers have included Cnet, Bizland, Terra Lycos and Thomas Register. In the last two months, Mr. Bricklin said, Trellix has seen a strong increase in business. "The Internet is a fantastic success, a force that has entered the lives of regular people, and it is not going away," he said.
Venture capital investments, to be sure, are less than in the boom times of 2000. But the financing spigot is far from shut off. In the first two quarters of this year, venture investments — mostly in information technology start-ups — were running at a quarterly rate of about $6 billion, according to VentureWire Research, which tracks venture capital.
That level is way below the peak of $27.4 billion invested in the second quarter of 2000. But the recent rate is also 40 percent higher than at any time before the second quarter of 1999, going back more than a decade.
Today, the fledgling companies that receive financing tend to be ones that sell useful technology that can save corporations money in basic data-processing chores like storage, security and network management. "Those companies are still doing well and getting money," said Richard Shaffer, editor in chief of VentureWire.
Softricity, a software start-up in Boston, has just received an additional $15 million in venture financing — even though the company has adapted its technology, changed its name and overhauled its business model since it was founded in 1999. The technology grew out of software written for the Computer Museum in Boston, which distributed educational game software for children to many personal computers over a network. At the time, David Greschler, a founder of the company, was the director of exhibits at the museum.
The company retooled its technology into a service for downloading software from big Web portals, like ZD Net, raised $22 million at the height of the dot-com bubble and called itself SoftwareWow.com. That idea soon looked doubtful, and the founders went back to the drawing board. They fine-tuned their software to make it a tool for running and managing software applications on PC's in corporate networks, and renamed the company Softricity, a utilitarian moniker combining software and electricity.
For Mr. Greschler, the start-up experience has been an exhilarating education in the patience, persistence and flexibility required to be a technology entrepreneur. The appeal is to "take an idea you've worked on for years and try to get it over the line," he said. "We're not there yet, but we're moving in the right direction."
The Siggraph convention in San Antonio last week provided a snapshot of both the slump and the long-term trend in technology. Attendance at the show, an annual gathering of the computer graphics community, was estimated at 20,000 to 25,000, down sharply from last year. "But the hard core is here and just as excited as it has always been," said Andy van Dam, a professor at Brown and a computer graphics pioneer.
Mr. van Dam recalled the trouble he had in the 1960's when he set up a special interest committee for graphics within the Association for Computing Machinery — a move that led eventually to the huge Siggraph annual gatherings. Back then, he had a hard time finding 30 people to sign a petition, and the computer association was skeptical about granting committee status to computer graphics.
Today, Mr. van Dam says, a standard desktop PC, with a $200 graphics card, can render images of cinematic quality that five years ago could have been achieved only with a professional work station that might have cost hundreds of thousands of dollars. "We're talking about revolutionary progress, better than Moore's Law," Mr. van Dam said, referring to the observation years ago by Gordon Moore, a co-founder of Intel, that computer processing power doubles every 18 months. "And we're still at the beginning."
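Van Dam's "better than Moore's Law" claim is easy to check on the back of an envelope: doubling every 18 months compounds to only about a tenfold gain over five years, while the price collapse he describes is on the order of a thousandfold. A minimal sketch of the arithmetic (the $200,000 workstation figure is my stand-in for "hundreds of thousands of dollars," not a number from the article):

```python
# Back-of-the-envelope check on "better than Moore's Law" -- my illustration,
# not the article's. Moore's observation, as paraphrased here: processing
# power doubles every 18 months.

months = 5 * 12                      # the five-year span van Dam describes
moore_factor = 2 ** (months / 18)    # expected gain under Moore's Law

# Rough price drop in the article: a ~$200 graphics card vs. a workstation
# costing "hundreds of thousands of dollars" -- call it $200,000.
price_factor = 200_000 / 200

print(f"Moore's Law over 5 years: ~{moore_factor:.0f}x")          # ~10x
print(f"Price/performance jump described: ~{price_factor:.0f}x")  # ~1000x
```

If those rough figures are anywhere close, the graphics hardware curve has been running about two orders of magnitude ahead of what Moore's Law alone would predict over that span.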
Computer graphics is only one facet of computing, but it has become crucial to fields as diverse as crash-testing in the auto industry, molecular simulations in drug design and special effects in Hollywood. "Even the movies that are not science fiction are half digital," noted Rick Rashid, director of research at Microsoft. "Look at 'Gladiator.' You need the Roman Coliseum? You just build it digitally."
In one field after another, computing has made ever deeper and broader inroads, not just as a tool but as a different approach to problem solving. Biology is perhaps the most obvious example, a discipline that moved out of the petri dish and onto the hard disk. The metaphor of human genome research is computational: humans are information processing machines, and DNA is the code that humans execute. "It's not data processing, it's a new view of biology," said Edward Lazowska, a professor at the University of Washington. "And that's happening all over. Computer science is pervading all other disciplines."
That has led to a lot more interdisciplinary work, so that these days much of what people do in various fields involves computer science, even though its practitioners do not think of themselves as computer geeks — particularly among the generation that takes PC's and the Internet as much for granted as it does light bulbs and telephones.
A case in point is Jennifer Walk, 20, who is entering her junior year as an economics and math major at Yale this fall. Ms. Walk has programmed in C, C++ and some statistical-analysis programming languages. She likes the practical problem-solving of computing — so different, she says, from the abstract theory of math. In fact, she took a summer job at Microsoft.
But Ms. Walk says she has no interest in becoming a computer scientist. "I can't picture myself programming 8 or 10 hours a day as a career," Ms. Walk said. "And that's how I think of it."
Some university computer science departments have begun offering interdisciplinary programs. Partly as a result, the number of women computer science graduates is inching up, although women still represent only 19 percent of those receiving bachelor's degrees in the field.
For many of this generation, computing often shapes the way they see the world and solve problems. "They think about the world in terms of information and information processing," said Seth Lloyd, a professor at M.I.T.
Seema Ramchandani, for example, entered Brown in 1998 and pursued a traditional pre-med program. "I wanted to be a doctor all my life," she explained. But after taking one of Mr. van Dam's courses as a sophomore, she changed her mind. Ms. Ramchandani, 20, will begin a master's program in the fall that combines computer science and neuroscience, which she describes as "more a computational view of what goes on in the brain."
Her research, she explained, is endlessly fascinating. "How can something so random work so well?" she observed. "The brain is just a hack."
"You and I are quatranary," Ms. Ramchandani said, referring to the four letters, designated A, G, C and T, after the chemical units in DNA.
Pointing to a notebook computer in the room, she said, "And that's binary," a reference to the 1's and 0's of computer code.
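Ms. Ramchandani's quaternary-versus-binary point maps neatly onto code: with exactly four letters, each DNA base carries two bits of information. A minimal sketch of that correspondence (the encoding scheme here is my own illustration, not anything from the article):

```python
# Illustrating the quaternary-vs-binary point (my sketch, not the article's):
# a four-letter alphabet means each DNA base packs into exactly 2 bits.

BASE_TO_BITS = {"A": 0b00, "C": 0b01, "G": 0b10, "T": 0b11}

def encode(dna: str) -> int:
    """Pack a DNA string into an integer, 2 bits per base."""
    value = 0
    for base in dna:
        value = (value << 2) | BASE_TO_BITS[base]
    return value

snippet = "GATTACA"
packed = encode(snippet)
print(f"{snippet} -> {packed:0{2 * len(snippet)}b}")  # 14 bits for 7 bases
```

Two bits per base is the information-theoretic floor for a four-letter alphabet, which is exactly her point: the substrates differ, but both she and the notebook computer are running digital code.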