"Imagination is more important than knowledge." ~ Albert Einstein
Science fiction has long been an inspiration for science and especially technology. The early Apple Newton was in some ways a first attempt at the tricorder from the 1960s Star Trek TV series. What effect does science fiction really have on technology? Are we seeing the impact of such inspirations in the biomedical space? These are a few of the questions I asked Lou Anders, a long-time and well-known editor and author of science fiction and fantasy novels.
Design News: What effect does science fiction have on technology?
Lou Anders: There is a wonderful website called "Technovelgy.com" – where science meets fiction – which lists every sci-fi idea that has become reality. The last time I visited, the site had something like 2,500 entries, listing both the fictional device and its real-world expression. A great many of those devices exist because someone read about them in a sci-fi story.
Design News: How about the other way around, i.e., what effect does technology have on sci-fi?
Lou Anders: William Gibson and Bruce Sterling created the cyberpunk movement in science fiction. Gibson first wrote about cyberspace on a manual typewriter. Later, he talked about getting his first computer, sent to him by a company that wanted his endorsement. He took the computer apart and was absolutely depressed to find a disk inside. He said, "Well, this is just a record player." He had expected to see some kind of crystalline thing with red lasers shooting out of it. Instead, he found a record player. He said he never would have written cyberspace into "Neuromancer" if he had known that it was implemented on little more than a record player.
Design News: Record player? I assume what he found was the computer's hard drive or perhaps an early floppy disk. Both systems do look like record players. That brings up an important difference between science fiction and technology innovation. Most technology improvement, as brought forth by engineers, is accomplished through incremental changes. That's because most designs are constrained by cost and time-to-market pressures to use existing technology.
Lou Anders: Do you recall Microsoft's Project Natal demonstrations? It was the Nintendo Wii minus any kind of physical controller. A camera sitting on top of the Xbox monitor simply tracks what you're doing. I saw the demo that they showed at their game developer partners event. Microsoft was showing their partners what was coming so the partners could start thinking about what games to put on it. Here's one example: A kid walks into the living room. On the screen is a monk, who sees him walk in. The monk spontaneously says, "I see you have returned for another lesson." Then the kid and the monk battle each other. The kid has no hardware on him at all, no controller or anything. But his image suddenly appears on the screen and his motions are copied in real time into the game. It blew my mind.
Design News: I knew that Intel and others have been developing commercial-grade facial recognition systems, but this application is amazing. It is far more interesting than the digital signature application that I've written about. Variations on that theme include headbands that respond to thoughts in the brain, as well as recent developments in chip implants.
Lou Anders: I wouldn't mind wearing a chip, as soon as I was sure they couldn't spam it. Nothing frustrates me more than having my computer's browser stop working when it can't make a connection. I'd hate to not be able to access my own brain.
We have an author named David Louis Edelman who wrote a trilogy called The Jump 225 Trilogy. It's a world where, at some point, there was a robot revolution which caused a backlash against technology. Now the society is rebuilding. The way the people deal with their fears of external technology is to restrict all tech to internal systems. Everybody has nanite threads throughout their bodies, and software companies compete for the rights to build the software that runs on them. In this society, you have small four- and five-person companies competing to write this software. One program is called Poker Face 3.5, which you run during a business meeting so you don't give anything away during negotiations. All of these software programs are loaded into your body. Whenever a new program comes out, it's ranked based upon popularity and performance.
But remember: this author wrote this book in 2000. Again, the model is not huge corporations, but small five-person teams writing quick software that is dumped into a data sea and then ranked instantly. It mirrors what apps on the iPhone have become!
The crux of the story, though, is the creation of a program called "multi-real," which allows instantaneous parallel processing of anything you might want to do. So the nanites in a person's body that run multi-real can do anything. It's a real game changer for that society.
Design News: Even in this example, science fiction touches upon reality. Embedded multicore systems are everywhere, although not yet in our bodies. But few of these multicore devices are true parallel processors. I recently interviewed a multicore software expert at Intel – Max Domeika – who reiterated that the software challenges in true multicore processing are significant. Here, too, we find that technology moves by increments. Although multicore processors are now readily available, software technology is lagging. Most programs are still written in non-parallel languages like C/C++ and run on multicore hardware anyway. We must use legacy systems for economic and other reasons. That is the inertia. There are "game changing" technologies, like superconductors, nanotech, and others. But they take a while to be realized. Still, the direction we select may be greatly influenced by our imagination – not the engineers, but the writers of sci-fi.
Lou Anders: I remember a quote from Robert Anton Wilson: "The future begins first in imagination, then in will, then in reality."
|Image Source: Photo by Stefan Cosma on Unsplash|