We’ve been taught by movies since HAL refused to open the pod bay doors to be wary of artificial intelligence.
Although the movie 2001: A Space Odyssey premiered back in 1968, AI is fast becoming reality: research firm Gartner predicts that 30% of large companies will adopt smart machines – those that utilize AI, cognitive computing, machine learning, or deep learning – by 2021.
That does not have to mean, however, that the smart machines AI brings will negatively interfere with, or impact, the human way of life.
|Woz, co-founder of Apple, will share his thoughts on AI, robotics, IoT, and more at Atlantic Design & Manufacturing on June 13. Register for the event here!|
“You can’t really stop progress,” said Steve Wozniak, co-founder of Apple and the engineer behind the Apple II, the world-changing first mainstream personal computer, in a conversation with Design News. “Learning, science, being able to make things that never existed before—you can never stop that. Those things can turn out to have bad aspects. Study the atom and you get the atomic bomb. Learn how to build machines that can make clothing and you could have a lot of people out of work and people have to do other things.”
Even with progress coming seemingly faster each day, we are years away from any AI capable of HAL-like learning and dedication, let alone robots stealing everyday jobs.
“There’s sort of a fear with artificial intelligence that machines could become so intelligent and versatile that they could totally replace a person so there wouldn’t be other jobs to go to, but that is so far off it’s an unrealistic fear at this stage. It would take decades and decades,” Wozniak said.
Even a machine like IBM’s Watson – which has not only bested human players at Jeopardy! but shown promise in outperforming doctors at making and managing medical diagnoses, and has proven useful in helping legal firms quickly and accurately extract relevant details from dense legal briefs – had to be programmed in how to approach its cognitive computing.
“We have machines that can learn to play a game faster and better than a human,” Wozniak noted. “For 200 years, we’ve had those machines that can make clothing better than a human. It seems like they are thinking better and faster than us, but we told them what to think about, what to work on, what to learn and the method to learn it by—and then it learned very well.”
Wozniak points out, too, that the machines’ intelligence is not of their own making. It’s still humans who tell the machines what to do and how to learn.
“We do not have a machine yet that says, ‘What should I learn? What should I tell myself to go learn? What are the important things to go do?’ And the ethical fear is just a little bit that those machines will want the things we [humans] want,” he said.
So, for now at least, the machines still learn only what we tell them to learn.