Monday, Oct. 23, 1989

Who's

By Philip Elmer-DeWitt

America's high-tech companies do not have to look back: they know the Japanese are coming. U.S. computer-chip manufacturers, concerned that their survival is threatened, have gone to Congress for protection. And fear is rising that if the chipmakers go down, it will be only a matter of time before Japan overtakes the U.S. in the computer business. That would put an end to America's high-tech supremacy.

But are such apocalyptic visions justified? Not at all, argues conservative pundit George Gilder in his new book, Microcosm: The Quantum Revolution in Economics and Technology (Simon & Schuster; $19.95), a lively look at the history and prospects of the U.S. microelectronics industry. Gilder, author of the best-selling Wealth and Poverty, thinks that as computer-chip technology advances, America will widen its lead.

At the heart of Gilder's argument is the notion that the breakthroughs in quantum physics in the early 20th century, which provided the theoretical basis for microelectronics, also laid the groundwork for sweeping changes in the world's economy. In the past, a nation's wealth sprang from its natural resources and its ability to fashion raw materials into manufactured products. But the computer has put a premium on information, not raw materials or manufacturing prowess.

Using the new knowledge of the microcosm -- the invisible region populated by protons, electrons and other subatomic particles -- computer-chip manufacturers have been able to pack more and more information (and value) onto slivers of silicon whose material content represents less than 1% of their total expense. As chips are incorporated into everything from furnaces to cars, the value of these products resides increasingly in the "intelligence" stored in their electronic components. In the future, industrial might will depend less on mass production and more on the creative use of information technology. Gilder calls this phenomenon the "overthrow of matter" by ideas.

The book uses this theoretical framework to focus on what has happened in the semiconductor industry. In particular, Gilder's analysis attacks the conventional view that the U.S. blundered in letting Japan take over the market for mass-produced memory chips. As he points out, the key component for a computer is not hardware but software, the instructions that make the machine work. When programs like Lotus 1-2-3 turned the personal computer into a runaway success in the early 1980s, IBM and other firms made a strategic decision to let Japan supply the demand for memory chips that U.S. chipmakers could not meet. The Japanese built costly factories to fabricate an enormous supply of chips. But then the price of those chips plummeted well below the cost of production, saddling Japan's conglomerates with huge losses.

Meanwhile, Americans were working on far more valuable computer parts. Using systems called silicon compilers, U.S. engineers have been able to design a vast array of custom chips to suit almost any purpose. These specialized chips can be much more profitable than the commodity chips mass-produced by the Japanese. As more and more instructions are etched onto chips, the balance of power in electronics is shifting from manufacturing prowess, Japan's strength, toward software and design, in which the U.S. excels.

Gilder's arguments, while forceful, are not always persuasive. He seems to forget that Japan, an island nation rich in know-how and poor in resources, is itself a prime beneficiary of the triumph of ideas over matter. The Japanese may not be also-rans in software and custom chips forever. But at a time when so many books talk only about what is wrong with the U.S., Gilder's optimism about the future of American high-tech is refreshing.