Monday, Jun. 10, 2002
High Tech Evolves
By Eric Roston
Software engineers will tell you that the longer they labor to solve complex problems by manually writing code, the more they respect the reasoning powers of the human brain. For years, artificial-intelligence researchers have gained some of their most useful insights from experts in brain function. And today the biological sciences are making similar contributions to all sorts of technologies useful to business, from software that "grows," "heals" and "reproduces" to tiny carbon tubes that will allow computer transistors to shrink to atomic dimensions even as they grow more powerful.
Last month TIME convened a five-member Board of Technologists to discuss how evolutionary biology--think of it as Earth's R. and D. department--is influencing the way we build computers, write software and organize companies. One member of our panel, Ray Kurzweil, an inventor, technology futurist and entrepreneur, observes that the human brain has no single "chief executive officer neuron." What gives the brain its power is not one boss but the ability of billions of neurons to conduct trillions of operations simultaneously. In computer lingo, that's called parallel processing, and it is something that today's man-made computers can accomplish only crudely. In everything from biology to business, this principle--a complex whole created by simple parts--is called emergence.
Emergence has become a common way of thinking about the world: zillions of little units--in this case neurons--do their own thing but together make up an entity of almost unfathomable complexity. Change the word neurons to "six-legged workers," and you'd have an ant colony. Change "ants" to "online buyers and sellers," and you'd have eBay. Change that to "stock traders," and you'd have NASDAQ.
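A toy illustration of the idea (ours, not the panel's) is Conway's Game of Life: every cell follows the same two local rules, yet gliders, oscillators and other large-scale structures emerge that no individual cell encodes. The grid size and seeding density below are arbitrary.

```python
import random

# Conway's Game of Life: each cell lives or dies by two local rules,
# yet complex structure emerges at the level of the whole grid.
SIZE, STEPS = 20, 50
grid = {(x, y) for x in range(SIZE) for y in range(SIZE)
        if random.random() < 0.3}          # ~30% of cells start alive

def neighbors(cell):
    x, y = cell
    return {(x + dx, y + dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)
            if (dx, dy) != (0, 0)}

def step(live):
    # count live neighbors for every cell adjacent to a live one
    counts = {}
    for cell in live:
        for n in neighbors(cell):
            counts[n] = counts.get(n, 0) + 1
    # a cell is alive next tick with exactly 3 live neighbors,
    # or with 2 if it is already alive
    return {c for c, k in counts.items() if k == 3 or (k == 2 and c in live)}

for _ in range(STEPS):
    grid = step(grid)
print(f"{len(grid)} live cells after {STEPS} steps")
```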
The five visionaries on our panel are--in addition to Kurzweil--Paul Horn, IBM senior vice president for research; Sandeep Malhotra, vice president for nanotechnology at Ardesta, an Ann Arbor, Mich., venture-capital firm and industry incubator; Chris Meyer, director of Cap Gemini Ernst & Young's Center for Business Innovation in Cambridge, Mass.; and Melanie Mitchell, a research professor at the Santa Fe Institute in New Mexico. They offer a glimpse of technologies--most of them already in use--that will reshape the way businesses are run and profits are made in the years ahead.
MAKE 'EM SWEAT
"It has become almost impossible for human beings to manage the kinds of information-technology systems that we have created and we have foisted on the rest of the world." --Paul Horn
Here's a head-scratcher. New products for enterprise computing come online daily, automating more and more human activities, yet the networks are so complicated that companies must hire an increasing number of IT specialists to keep them running. Why not automate the automation?
It's a tough job and one that will take much more effort than Horn and his 3,200 IBM researchers can muster: it will take an entire global industry. Horn has begun a crusade to make a reality of what he calls "autonomic computing," a network equivalent to the body's autonomic nervous system. That's what tells your heart to beat faster when you run to catch a bus or tells you to sweat when you are hot.
IBM's idea is more metaphor than actual nervous system, but it is a metaphor wired into the "biology of business" Zeitgeist. Other industry thinkers have diagnosed the same problem and are competing for leadership with models of their own. Companies spend hundreds of millions of dollars each year paying technicians to regulate systems that theoretically could regulate themselves. Research at Big Blue has already led to products such as its Intelligent Resource Director software, which helps high-end zSeries mainframes allocate processing power where it is needed most. Such products promise big savings: it costs twice as much to manage storage systems as it does to buy them.
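A minimal sketch of what such a self-regulating loop looks like--the workload names, thresholds and random "telemetry" are all invented here; this is the shape of an autonomic monitor-and-rebalance cycle, not IBM's actual product:

```python
import random
import time

capacity = {"web": 4, "batch": 4}      # hypothetical CPU shares per workload

def observe():
    # stand-in for real telemetry: current utilization of each workload, 0..1
    return {w: random.random() for w in capacity}

def rebalance(util):
    busy = max(util, key=util.get)
    idle = min(util, key=util.get)
    # shift one share from the idlest workload to the busiest one,
    # roughly the way a resource director shifts processor weight
    if util[busy] > 0.8 and util[idle] < 0.3 and capacity[idle] > 1:
        capacity[idle] -= 1
        capacity[busy] += 1
        print(f"shifted 1 share {idle} -> {busy}: {capacity}")

for _ in range(20):                    # a real loop would run continuously
    rebalance(observe())
    time.sleep(0.1)
```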
But until now there has been no holistic approach to networks--just efforts to make storage or servers more efficient on their own, Horn says. And though the recession has shrunk technology budgets, financial constraints tend to encourage exactly this kind of efficiency drive. "The biggest demand for automation often occurs in economic downturns," he says. "I can't go to a company that doesn't say, 'I need to automate. I've got to get my costs down.'"
To some extent, computers and other machines already "sweat," after two generations of automating blue-collar jobs. And technology keeps climbing the occupational ladder. Asked how firms are making money by implementing new technology, Chris Meyer says, "There is a simple answer: the automation of white-collar work." Already, travel agents and stockbrokers have seen their business eroded by online travel and trading sites. Meyer adds that as the professional-services technologies improve, other occupations--including doctors and lawyers--may join automation's hit parade.
The hope is that those displaced won't be unemployed for long. Horn argues that innovation has historically created far more opportunities than the jobs it destroys. Says Kurzweil: "Jobs are going to be more fulfilling, because in order to contribute meaningfully, we will have to operate at a really human level, which still today most human jobs do not require." That could bring a new luster to the teaching and nursing professions.
LOSE CONTROL
"Evolution produces adaptation to changing environments. That is very much lacking in the world of computer science." --Melanie Mitchell
Microsoft Word 2002 takes up 96 million bytes on a user's hard drive. By Kurzweil's estimate, the DNA that controls human brain development amounts to the equivalent of just 12 million bytes of compressed data. Clearly, something other than size matters in the design of instructions.
And that, of course, is all that software amounts to: instructions that tell the hardware what to do. But given the growing complexity of our computing needs, for tasks like designing a way to fit more transistors onto a chip, even teams of genius programmers may no longer be up to the job.
A rapidly growing branch of software development frees programmers from the limitations of the old command-and-control style of writing code. Mitchell specializes in evolutionary computing, a category of programming that simulates the process by which single cells morphed into a world of critters over the course of 3 1/2 billion years. Just as the DNA of your parents combined, allowing certain of their qualities to survive in you, in a computer two stretches of software code can "mate" and produce an offspring that is fitter to perform a task than either of its parents. After many generations of testing and recombining features, the fittest program should emerge or, as Darwin would say, survive. "Evolution produces innovation," Mitchell says. "But we can't predict what it will do next very easily."
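In code, a genetic algorithm fits on a page. The sketch below is a standard textbook form, not any of the industrial systems mentioned here: it evolves 40-bit strings toward a trivial objective (maximize the number of 1s), where a real application would score engine or frame designs instead, and every parameter is an arbitrary choice for illustration.

```python
import random

random.seed(0)
GENOME_LEN, POP_SIZE = 40, 60
MUTATION_RATE, GENERATIONS = 0.01, 100

def fitness(genome):
    # toy objective ("OneMax"): count the 1-bits; an industrial GA would
    # score a simulated design, e.g. engine efficiency or crash energy
    return sum(genome)

def crossover(mom, dad):
    # single-point crossover: the child takes a prefix from one parent
    # and a suffix from the other, like recombining strands of DNA
    point = random.randrange(1, GENOME_LEN)
    return mom[:point] + dad[point:]

def mutate(genome):
    # flip each bit with small probability
    return [b ^ 1 if random.random() < MUTATION_RATE else b for b in genome]

population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(POP_SIZE)]

for _ in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    parents = population[: POP_SIZE // 2]        # the fitter half breeds
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP_SIZE - len(parents))]
    population = parents + children

print("best fitness:", fitness(max(population, key=fitness)), "of", GENOME_LEN)
```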
When General Electric used genetic algorithms to help design the Boeing 777 engine in the late 1980s, the technique was so novel that people still talk about it, even though academics developed "GAs" in the previous decade. Since then, applications have proliferated. For example, ACD Associates in Okemos, Mich., has used genetic algorithms to help General Motors design new frames that both weigh less and absorb more energy in crashes.
NOT SO SECRET AGENTS
"The core idea to take away is self-assembly. It applies to crowds of people, to eBay, to epinions.com.. We're going from having one person in charge to having the parts figure out on their own where their highest and best uses are. That is a big change." --Chris Meyer
Another type of software program models the way a community of individuals--whether honeybees, supermarket shoppers, stock traders or U.S. Marines--develops from the diverse decisions of its members. Meyer and his colleagues have produced software that allows organizations to estimate how they and their customers and competitors will behave under various conditions. When Southwest Airlines needed help unclogging its cargo service, Meyer's group created a simulation that revealed the company was wasting a lot of time moving boxes from one plane to another to minimize the layover time for packages. "They were treating packages as if they were people," Meyer says. The software searched other possibilities and found a way to move fewer boxes and still get them delivered on time. Overnight cargo storage dropped 75%, and the company saved $10 million. For British Telecommunications, Meyer and Co. showed how software models can be used to better plan the routes of its repair trucks so that technicians maximize income and minimize overlapping territory.
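A toy version of that kind of what-if simulation--every rule and number below is invented for illustration, and this is not Meyer's Southwest model: boxes reach a hub, and an "eager" policy that keeps re-staging them to minimize layover is compared with a "relaxed" policy that handles each box once and still meets the delivery deadline.

```python
import math
import random

random.seed(1)
FLIGHT_INTERVAL = 2     # hypothetical: a flight toward each destination every 2 h
FLIGHT_TIME = 4         # hours from hub to destination
DEADLINE = 24           # every box must land by hour 24
N_BOXES = 10_000

def next_flight(after):
    return math.ceil(after / FLIGHT_INTERVAL) * FLIGHT_INTERVAL

def last_feasible_flight(after):
    # latest departure that still lands by the deadline
    dep = (DEADLINE - FLIGHT_TIME) // FLIGHT_INTERVAL * FLIGHT_INTERVAL
    return dep if dep >= after else next_flight(after)

def simulate(policy):
    handlings = on_time = 0
    for _ in range(N_BOXES):
        arrive = random.uniform(0, 12)           # box reaches the hub
        if policy == "eager":
            dep = next_flight(arrive)
            # staff re-stage waiting boxes every hour to minimize layover
            handlings += 1 + int(dep - arrive)
        else:                                    # "relaxed": one move only
            dep = last_feasible_flight(arrive)
            handlings += 1
        on_time += dep + FLIGHT_TIME <= DEADLINE
    return handlings / N_BOXES, on_time / N_BOXES

for policy in ("eager", "relaxed"):
    h, o = simulate(policy)
    print(f"{policy:8s} handlings/box = {h:.2f}   on-time = {o:.0%}")
```

Even this crude model shows the pattern the real simulation found: the relaxed policy delivers every box on time with fewer moves per box.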
Many organizations are trying to strengthen themselves by loosening their grip--by becoming less hierarchical and more adaptable. A useful metaphor, again, is the brain. "If you cut a wire in a machine, it will break," Kurzweil says. "In the brain, there are a few wires you don't want to cut, but many you can dispense with"--as stroke victims demonstrate when, with therapy, they "wire around" damaged brain regions. The parts are sufficiently autonomous to keep the whole running.
A central feature of natural systems is self-organization: the interaction of parts shapes and propels the whole. The textbook example of the self-organizing business is eBay. Executives there have limited control over who sells what, but have become skilled at encouraging growth by giving their customers good tools for shopping, communication and payment.
The distribution business Li & Fung, based in Hong Kong, has applied self-organizing models from the natural world to stitch together a network of thousands of microsuppliers that keeps its complicated business--in a generally low-margin industry--running at a 30% profit margin.
The U.S. military is studying similar ideas to prevent a catastrophe at headquarters from disabling the whole organization. This is, of course, not an entirely new idea: as Lenin plotted the Russian Revolution, he mastered the organization of independent cells. What is new are the tools that allow strategists to explore the possibilities open to a modern cell-based group such as al-Qaeda.
ON THE HEAD OF A PIN
"Living or nonliving, we are all nanosystems. We are all composed of the same matter." --Sandeep Malhotra
Technology isn't just stealing evolution's methods and embedding them in software. Companies are poking and prodding the fundamentals of the physical world. They are shrinking down from the micron scale (one-millionth of a meter) to what is known as the nano scale, inhabited by devices that are less than 100 nanometers wide, or less than one-thousandth the width of a human hair.
Venture capitalists are investing heavily in nanotechnology, which promises industrial advancements, such as nanoscale powders that strengthen steel and materials that allow for smaller chips. Samsung is using carbon nanotubes, fibers just 2 nm wide, to develop high-resolution TV screens; dozens of newcomers will use them to build everything from next-generation transistors to stronger outdoor lights.
Eric Drexler, who popularized the term nanotechnology in 1986, predicts that within a generation or two we will have nanobots coursing through our bloodstreams, destroying cancer cells. Few share his vision, but Kurzweil defends conceptual engineers like Drexler and points out that a University of Illinois at Chicago bioengineer is developing a capsule that secretes insulin through pores as small as 7 nm.
Ardesta, the Ann Arbor venture-capital firm and self-styled "accelerator" of small technologies, has raised about $100 million in capital to nurture companies such as Discera, which is trying to shrink key cell-phone components onto a square-centimeter microchip, and Sensicore, which develops products that analyze water and blood. "I would tell you we are talking about this as a revolution," says Malhotra, "but I view nanotechnology as an evolution."
After the dotcom crash, the appetite for technological upheaval is slim. And that's just as well: most really big technologies, from the railroad to electricity, have made their impact gradually. "I don't think you ever sell a big paradigm shift," Meyer says. "It happens one application at a time."
TECH-UTOPIA?
"People will beam their flow of sensory experience on the Web, so you can plug in and be someone else, a la Being John Malkovich. We will be able to expand human intelligence." --Ray Kurzweil
When skeptical listeners scoff at Kurzweil's sci-fi predictions, saying, "Oh, we won't see that for 100 years," he points out that when it comes to "innovation time," 100 years melts down to about 25. That's because, as he says, "our rate of exponential growth is growing exponentially." Evolution accelerates: it took 100 million years for the human brain to develop, but computing power is expected to surpass it within a generation. "By 2040 or 2050 nonbiological intelligence will be trillions of times more powerful than biological intelligence," he says.
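One way to make that arithmetic concrete--the doubling time below is our invented assumption, not a figure from Kurzweil: if the rate of progress itself doubles every D years, then T calendar years deliver D/ln 2 x (2^(T/D) - 1) years of progress at today's rate, and a doubling time of roughly 7.5 years is what melts 100 "today" years down to 25 calendar years.

```python
from math import log

def progress_years(T, D):
    # integral of 2**(t/D) from t=0 to T: progress over T calendar years,
    # measured in years of progress at today's rate
    return D / log(2) * (2 ** (T / D) - 1)

for D in (5, 7.5, 10):
    print(f"rate doubles every {D:>4} yrs: "
          f"25 calendar yrs ~ {progress_years(25, D):5.1f} 'today' yrs")
```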
Kurzweil delivers bold predictions with confidence and the persuasive calm of someone whose work has revolutionized several fields. With a lifetime of inventions and entrepreneurship behind him--he sold his first company as an M.I.T. sophomore--Kurzweil has pursued a side career charting technological trends. His critics include such luminaries as Bill Joy, a founder of Sun Microsystems and author of the 2000 dystopian polemic "Why the Future Doesn't Need Us." Joy wrote that piece for Wired partly in response to some of Kurzweil's ideas and warned against such horrors as a plague of self-replicating nanobots.
Kurzweil shares some of Joy's concerns but takes a more optimistic view of technology and man's ability to control it. Some critics challenge Kurzweil's claim that software advances can keep up with such trends as rising processor speed, or that the brain is not too complex to reverse-engineer. And pragmatists might see many predictions--not just Kurzweil's--as divorced from larger social issues. What good are eye computers when we aren't sure where much of the world's freshwater will come from? In his next book, The Singularity Is Near, coming in early 2003, Kurzweil rejoins the debate with his critics.
In the meantime, Kurzweil juggles several lucrative projects under the banner Kurzweil Technologies, descendants of his original work in pattern-recognition programs, which simulate human cognition: Medical Learning Co. provides doctors with broad health-care information and a "virtual patient" educational tool. At FatKat, Kurzweil runs both simulated trades and an actual experimental investment fund that uses pattern-recognition software to spot trends in financial markets; a hedge fund is in the works.
BRAVE NEW WORLD
As the merger of biology and digital technology gathers pace, it will generate debates as heated as the one now under way over human cloning. Mitchell predicts, in a future much farther off than Kurzweil's, a change in "what is meant by 'we'? As we become more mechanized and machines become more lifelike, the difference between living systems and machines will narrow. 'We' will eventually include machines."
There is as yet no common word to describe this stuff. "Biotechnology" describes the advances in biology--especially the creation of new medicines--made possible by computers. "Technobiology" might seem a good candidate, but it already describes the engineering of new plants. Peter Bentley, a British computer scientist, recently published a book called Digital Biology. But the phrase "biology of business" is heard as often as any other, perhaps because now it's investment by business, rather than government, that's driving most of the advances. Money is already being made by the technology's developers--and there's much more to be made and saved by its business users.