DN Staff

November 4, 1996

Man-machine barriers begin to crumble

At a time when "computer" meant a 16-ton behemoth with 15,000 vacuum tubes, could anyone have foreseen a Pentium laptop? Desktop analysis? 'Round-the-clock information via the Internet?

After the dizzying advances of the last five decades, little seems out of reach in the next:

  • Computers made of biological, not silicon, components.

  • Disks the size of a coin that can store the entire Library of Congress.

  • Systems that can respond to where you look--or even what you think.

"It is not far-fetched to say, 'Wait, and almost everything will be possible,'" believes Chip Holt, vice president of Xerox's Joseph C. Wilson Center for Research and Technology.

That's progress. Computer power has been marching forward at an almost eerily predictable rate for a quarter-century, doubling roughly every 18 months. In the future, "capabilities of the machines will outstrip anything we can think of today," says Jack Brown at Motorola.
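
As a quick back-of-envelope sketch (the arithmetic here is illustrative, not a figure quoted by the researchers), an 18-month doubling sustained over a quarter-century compounds to roughly a hundred-thousand-fold improvement:

```python
# Back-of-envelope Moore's-law arithmetic: doubling every 18 months.
years = 25                    # "a quarter-century" of progress
doublings = years * 12 / 18   # number of 18-month periods
growth = 2 ** doublings       # overall improvement factor

print(f"{doublings:.1f} doublings in {years} years")
print(f"overall improvement: roughly {growth:,.0f}x")
# ~16.7 doublings -> on the order of a 100,000-fold increase
```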

How long can these rates of improvement keep up? Many in the industry expect silicon technology to "hit the wall" around 2010 to 2015, when submicron semiconductor features become so tiny as to be just a few molecules thick. Then, a major shift in computing fundamentals appears likely.

A possible first step: augmenting the conventional. "Silicon as a technology will be very hard to replace," argues Greg Papadopoulos, vice president and chief technology officer at Sun Microsystems Computer Co. "The capital investments are unparalleled in industry. The question then becomes: What can you do with that fabrication technology?"

Adding iron to silicon, for example, could increase densities two- to four-fold, and such experiments have been going on for a decade. One problem to overcome, though: iron rusts. IBM and Hughes Electronics have added germanium to conventional silicon, offering substantial reductions in power consumption, weight, and size for comparable performance.

Another alternative is optical computing, using fiber optics to transmit and process data at the speed of light. Forest Baskett, Silicon Graphics' senior vice president and chief technology officer, foresees "smart optic communications." However, latency problems--the lag in converting back and forth between electronic and optical systems--remain to be solved.

Some scientists believe optics have a bright future in computer storage. IBM researchers are using technology initially developed for a new type of optical microscope to develop CD-like storage techniques that would pack a hundred times more data on a disc than the next-generation CDs known as DVDs. A disc-like surface with silicon-dioxide layers is finely marked with an electron beam, then covered with a metal film. Sensors would then read the light scattered from the metal strips. Theoretically, features of just 20 angstroms would be possible at access speeds of 100 Mbits/sec, versus 5,000 angstroms for Digital Video Discs.
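
For a sense of how aggressive those feature sizes are, here is a back-of-envelope scaling sketch. It assumes stored density grows with the square of the minimum feature size; the hundredfold figure cited above is far more conservative, presumably reflecting practical readout and manufacturing limits rather than raw geometry.

```python
# Rough feature-size scaling for the optical-storage work described above.
# Assumes areal density scales with the square of the minimum feature size;
# real gains depend on tracking, encoding, and readout limits.
dvd_feature_angstroms = 5000
experimental_feature_angstroms = 20

linear_ratio = dvd_feature_angstroms / experimental_feature_angstroms
areal_ratio = linear_ratio ** 2

print(f"linear shrink: {linear_ratio:.0f}x")              # 250x
print(f"ideal areal density gain: {areal_ratio:,.0f}x")   # 62,500x in theory
```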

"The pieces exist, and experiments agree with the theory," says H. K. Wickramasinghe, manager, physical measurements, at IBM's Thomas J. Watson Research Center. What is needed is a practical manufacturing process to finely stamp the discs, he says.

Merger of man and machine? In the long term, researchers are looking at molecular-based systems, and the sci-fi-sounding field of biocomputing. A molecular biocomputer might use the presence or absence of a particular molecule as a memory system, and even be self-repairing.

"Over time, biocomputing devices will be created," Frank Casanova, director of Apple Computer's Advanced Prototype Research Lab, predicts confidently. "Many will actually be grown, harvested out of labs instead of carved out of silicon."

Biocomputing could someday lead to the merger of man and machine--allowing the brain to continue sending impulses to paralyzed muscles, or helping the blind to see.

"There are a lot of experiments going on with artificial retinas," according to Robert Parker, deputy director of the information technology office at ARPA, the U.S. government's Advanced Research Projects Agency. Earlier this year, scientists in the U.S. and Hungary jointly announced a "single-chip supercomputer" they claim is robust enough--1 trillion operations per second--to power an artificial eye.

More than horsepower will be needed, however, for a computer to produce human vision. "It would probably take another 50 years before neurologists discover how artificial signals could be interfaced with the human brain," Leon Chua, professor of engineering and computer science at the University of California at Berkeley, told the Reuters news service. Other long-term potential applications: repairing nerves damaged by an accident or stroke.

"So far, it's at the lab-curiosity phase," Baskett says of biocomputing. "It doesn't scale at all. Nevertheless, the ideas are worth looking at."

If scientists ever develop a practical room-temperature superconductor, "there's nothing that can't be done," says Vladimir Alkalaj, head of Slovenia's National Supercomputer Center. "You could do a whole Cray in a chip, memory and all: multilayered wafers with vertical connections, any density you can manufacture--right down to the picometer level. With superconductive elements, there's virtually no heat generation. Then the only limit would be the speed of light."

Feedback. Even before silicon runs out of steam, many systems engineers expect a change in the way circuitry is designed. Now, circuits are tested and re-tested to ensure they work 100% as designed. However, ever-smaller features mean that devices are increasingly subject to noise; in addition, it becomes more difficult to test the massive numbers of circuit features as they interact with one another.

"The technology is going way past our ability to handle it," says Douglas D. Wood, director of EDA solutions at Digital Equipment Corp. "The technology is outstripping the design tools."

The answer may be a whole new generation of tools that can test and verify such complex systems and the noise they create. Or it may mean moving from today's "yes" and "no" gate designs to layouts that can also register "maybes"--"majority voting," as ARPA's Robert Parker puts it.
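
A minimal sketch of that majority-voting idea: run redundant copies of a gate and let the most common output mask an occasional noise-induced flip. The three-copy layout below is purely illustrative, not any vendor's design.

```python
# Illustrative majority vote across redundant gate outputs: one noisy copy
# can flip without corrupting the voted result.
def majority(bits):
    """Return the value reported by most of the redundant copies."""
    return 1 if sum(bits) > len(bits) / 2 else 0

# Three copies of the same AND gate; the middle copy is hit by noise.
copies = [1 & 1, 0, 1 & 1]   # correct, flipped by noise, correct
print(majority(copies))      # -> 1, the intended output
```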

"Feedback is a powerful concept," Papadopoulos adds. "Why aren't the principles of feedback used in computer design?" A one-time mechanical engineer, he laments that he's "moved into a world that doesn't understand how to measure itself and correct its behavior." But as noise problems intensify, "it can't be that we insist it all will work," he says. "That's doomed."

As the pace of technological change picks up, researchers are also investigating how to deal with hardware obsolescence. Parker sees a new wave of "adaptive computing," which would allow a user to adjust the internal hardware configuration of a system. It's the same concept as a field-programmable gate array--but for an entire system. "You could buy your computing by the square inch," he says. Users might download various configurations for a chip's gates and memory over the Internet, depending on the specific task at hand. This would make it simple to update hardware as semiconductor technology improved, eliminating the problem of out-of-date chips.
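
One way to picture such task-driven reconfiguration is sketched below. The device class, task names, and configuration labels are hypothetical, invented purely for illustration.

```python
# Hypothetical sketch of adaptive computing: pick a gate/memory layout for
# the job at hand and "load" it into a configurable device.
CONFIGURATIONS = {
    "signal_processing": "dsp_heavy.cfg",    # invented configuration names
    "database_search":   "wide_compare.cfg",
    "3d_rendering":      "deep_pipeline.cfg",
}

class ConfigurableDevice:
    """Stand-in for an FPGA-like chip whose internals can be rewritten."""
    def __init__(self):
        self.active = None

    def load(self, config):
        self.active = config   # a real device would rewrite its gate array
        print(f"device reconfigured with {config}")

def configure_for(task, device):
    device.load(CONFIGURATIONS.get(task, "general_purpose.cfg"))

configure_for("3d_rendering", ConfigurableDevice())
```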

Configurable computing is beyond today's compiler technology, and would require a whole new class of chip. But "it looks like there will be some significant progress there," Parker says. "Maybe in 10 or 15 years, the Intel x86 will have a configurable device."

Future generations of computers might also someday free both users and system designers from what Papadopoulos considers the "tyranny" of specific architectures such as SPARC, Intel, Alpha, and PA-RISC. "I'm betting we are right at the cusp of a revolution," he says. If the industry could somehow eliminate the need for "backward compatibility" with older systems, he says, "I think that brings a whole new renaissance to computer design."

Something dramatic is about to happen, he argues; just look at history. When the mainframe was at its prime, the minicomputer knocked it off its perch. During the heyday of minis, along came the desktop computer. Now, with desktop computers everywhere, he believes it's "time to hit the reset button again" and develop yet another way to compute.

A more natural experience. Will users of the future be able to forsake the keyboard, mouse, and trackball for easier interaction with their machines? Perhaps not entirely, but computers of the future are still likely to recognize speech, track eye movements, and perhaps even read your mind.

But all of this will take vast amounts of computing power. "Performance makes a computer truly personal," Casanova at Apple says.

For those working on speech recognition, "computer power has been a blessing," according to Salim Roukos, manager, Language Modeling for Speech at IBM's Thomas J. Watson Research Center. Already, the center has an 80%-accurate prototype that can understand someone saying, "Show me all the available flights tomorrow from Boston to New York." Now, such natural-language recognition takes the full power of a PC; future computers will need only part of their capabilities. "Ten years from now, you should be able to talk to your computer, tell it what you want," he says.

He imagines the family of the future sitting in front of a screen together, planning a vacation by talking with the machine. A merged computer-TV could be instructed to monitor the day's newscasts, storing reports on a certain topic. And engineers may be interacting with their CAD programs by speech as well, eliminating the need to click through layers of menus.

Future computers might also respond to a directed glance. "If a computer knew what you were looking at, you could do things that you can't imagine today," according to Baskett. As computers get more powerful, a "brute-force" approach impractical now could provide such capabilities by using a camera to watch the eye and a computer to figure out what's being looked at. Several small companies, meanwhile, say early work is going on to develop computers that respond to brain waves.

Computers of the next century are almost certain to offer vastly more realistic experiences than we have today. "Three-D virtual reality is definitely something that will be commonplace," predicts Holt at Xerox.

While many envision realistic computing of the future to involve immersive headgear with VR goggles, Baskett at Silicon Graphics advises engineers not to dismiss the possibilities of 2-D displays creating 3-D experiences.

"We haven't seen how well we can do yet," he says. "What we have on the drawing board is way beyond where we are now." He predicts immersive experiences with apparently conventional technology. How? Faster, better-designed systems, much higher image quality, and schemes to incorporate peripheral vision in the experience. The SGI Visionarium, for example, uses three curved, 2-D screens to create a realistic, computer-generated experience.

Tying it all together. The problem with electronic devices of the '90s is they're too big and too isolated, says Casanova at Apple. "They live in a space that's totally unrelated," he argues. "They've evolved on technical islands." PCs, cellular phones, desk phones, pagers, and the Newton all store telephone numbers, for example, but don't share the information very well. "The worst thing you can do to me is change your telephone number," Casanova laughs.

A Newton-like device could be attached to a refrigerator, he speculates. Write "need milk and eggs," and the message would be relayed to another pocket-sized device equipped with a GPS receiver. That device would "know" when you were approaching the store--and beep you to detour on your way home.
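
A rough sketch of the proximity trigger in that scenario; the store coordinates, alert radius, and reminder text below are invented for illustration.

```python
# Illustrative geofence check: beep when the GPS position comes within half
# a kilometer of the store. All coordinates are made up.
import math

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points (haversine formula)."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

STORE = (40.7580, -73.9855)        # hypothetical store location
REMINDER = "need milk and eggs"

def check_position(lat, lon):
    if distance_km(lat, lon, *STORE) < 0.5:
        print(f"BEEP: {REMINDER}")

check_position(40.7591, -73.9845)  # approaching the store -> beep
```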

"The goal is to make 'computers' disappear," Casanova says. "The large, loud, gray, boxy devices will fade away." Alternatives range from nice flat-panel displays to pocket devices to "wearable computers."

Microsoft CEO Bill Gates has said he expects the "Wallet PC"--which could take the place of everything from credit cards to theater tickets, and include GPS maps to point you where you want to go--might be available within five years.

"Ten or 20 years from now, I think we'll all carry a common version of a pager," Peter Daniel Kirchner at IBM's Watson Center says. They will be much "smarter" than pagers of today, and interact via infrared signals. "The limit is communications infrastructure, not our ability to develop devices."

Apple is working on an architecture that would combine such disparate technologies and allow data in digital, image, voice, or e-mail form to easily interact. "The trick is to extract stuff that's important to you," Casanova says. The vast increase in future computer power could be harnessed to "extract knowledge from information," he says, such as an "army of information agents scouring the world's information network for you."

At IBM, researchers have developed "data-mining" software to extract useful patterns from mountains of numbers. Among the early test users: National Basketball Association coaches, to find out, say, that player X has vastly improved shooting percentages when driving from the left side against opponent Y.
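
A toy version of that kind of pattern-finding appears below: group a shot log by situation and compare the percentages. The data and field names are invented for illustration, not drawn from the IBM software.

```python
# Toy "data mining" pass: group shot attempts by drive direction and
# opponent, then report shooting percentage for each situation.
from collections import defaultdict

# Invented shot log: (drive_side, opponent, made_shot)
shots = [
    ("left", "Y", 1), ("left", "Y", 1), ("left", "Y", 0), ("left", "Y", 1),
    ("right", "Y", 0), ("right", "Y", 0), ("right", "Y", 1),
    ("left", "Z", 0), ("left", "Z", 1),
]

totals = defaultdict(lambda: [0, 0])   # (made, attempts) per situation
for side, opponent, made in shots:
    totals[(side, opponent)][0] += made
    totals[(side, opponent)][1] += 1

for (side, opponent), (made, attempts) in sorted(totals.items()):
    print(f"driving {side} vs {opponent}: {made / attempts:.0%} on {attempts} shots")
```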

For engineers, such software could be used in defect analysis, to discover which contributing factors matter most, or to discern important patterns in thermodynamic test data. This type of program could also sift through all the Internet data pouring into your PC, picking out the useful nuggets of information.

"We'll have new ways of using computers to solve problems we don't even think about today," says Baskett at SGI. "We have only scratched the surface."


Coming Attractions

  • 3-D virtual reality

  • Optical computing

  • Biocomputing

  • Natural-language recognition

  • Eye-tracking

  • Ubiquitous wireless networks


"Over time, biocomputing devices will be created. Many will actually be grown, harvested out of labs instead of carved out of silicon." --Frank Casanova, director of Apple Computer's Advanced Prototype Research Lab


Components to Watch

  • Silicon compounds

  • Super-capacity disks

  • Higher-resolution displays


What You See....

Processing power keeps racing forward and memory price-performance has plunged. But when it comes to what you look at, the technology has been comparatively static.

"Display technology has progressed very slowly compared to anything else," sighs Forest Baskett, Silicon Graphics senior vice president and chief technology officer. "It sure would be nice if display technology were radically improved."

In some futurist visions of computing, the monitor is replaced by goggles, immersive projections, or holograms. Others, though, believe the big desktop machine will be replaced in part by flat panels on the wall. But before that can happen, manufacturers must come up with a way to cram millions of devices into a relatively small area. Defect rates are unforgiving: If a couple of pixels burn out next to each other, the display will be noticeably degraded.

Today's 72-dot-per-inch standard for computer-monitor displays is also woefully inadequate for serious extended reading, notes the director of Apple's advanced prototyping lab, Frank Casanova. Most good computer printers output at 300 dpi; meanwhile, current display experiments target just 150 dpi. "I fully expect 300-dpi displays in the future," he says confidently. But engineers must first develop ways to shoot light at higher speeds with pinpoint definition and reliability--and without flickering.
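
The arithmetic behind that jump is steep, because pixel count grows with the square of the resolution. A quick sketch, assuming a display area of roughly 10 by 7.5 inches (an assumed size, not a figure from the article):

```python
# Pixel-count arithmetic for the resolution jump described above, assuming
# a roughly 10 x 7.5 inch display area.
width_in, height_in = 10.0, 7.5

for dpi in (72, 150, 300):
    pixels = (width_in * dpi) * (height_in * dpi)
    print(f"{dpi:>3} dpi -> {pixels / 1e6:.1f} million pixels")
# 72 dpi is about 0.4 million pixels; 300 dpi is about 6.8 million,
# roughly 17 times as many.
```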

"There just haven't been many ways to generate light," Baskett says. The limit on CRTs is how well and fast beams can be focused, as well as their quality. And the difficulty in manufacturing defect-free flat panels has made them expensive.

"Lasers might be something that comes into play here," he says. "So far that hasn't quite happened." But with the potential for very fine spot size and high-speed switching, in half a century lasers might be used not only for computer displays, Baskett predicts, but replacements for the TV tube as well.


The Engineer's Role

"We are on the verge of huge changes in work patterns," predicts Ty Rabe at Digital Equipment Corp. Computer networks make it feasible to collaborate with engineers in other offices; why not in the home? "The technology is very close," Rabe argues. In 10 or 15 years high-speed fiber optics may come to the home--and with it, lightning-fast access to large design files and at-home video conferencing.

Increased power will mean the ability to simulate larger products in greater detail, eliminating the need for physical prototyping and cutting time to market. And virtual reality will allow engineers to understand what the products they design on computer will be like in the real world.

"Designers are going to be in heaven," says Apple Computer's Frank Casanova about 3-D virtual worlds of the future. "We will lose them in there. They'll never come out."


Predictions from the past

"I think there's a world market for about five computers." --Top industry executive at the dawn of the computer age

"Computers in the future may weigh no more than 1.5 tons." --Popular Mechanics, 1949

"I have traveled the length and breadth of this country and talked with the best people, and I can assure you that data processing is a fad that won't last out the year." --editor at Prentice Hall, 1957

"There is no reason anyone would want a computer in their home." --founder of a major minicomputer company, 1977


Technical Hurdles

  • Deal with the physical limits of submicron technology, when noise jumps sharply and features will eventually be just a few molecules wide.

  • Learn to test and debug very complex circuitry.

  • Develop new programming and compiler technology to take advantage of more flexible hardware.

