Should AI Be Able to Feel as Well as Think?

An ESC Minneapolis panel discussed the merits and challenges of creating artificial intelligence with an emotional component.
(Image source: Gen Dalton on Unsplash)

At their core, our attempts to create artificial intelligence are really attempts to mimic and re-create the workings of the human brain. While neural network and deep learning architectures, inspired by our understanding of the brain's thought processes, are showing great progress, another element gets far less attention: emotion.

There is a strong argument for AI to understand our emotions, with companies like Affectiva applying AI to recognize and respond to human emotions for customer service and even autonomous driving applications. But what if the AI itself exhibits emotion?

A panel at the 2018 Embedded Systems Conference (ESC) in Minneapolis discussed this very question. Design News Senior Editor Chris Wiltz spoke with a group of experts: Phil Magney, founder and principal advisor at VSI Labs; Dr. Gunnar Newquist, founder and CEO of Brain2Bot; and Justin Grammens, co-founder of Lab 651.

Famed futurist Ray Kurzweil once said that any neural process can be reproduced digitally in a computer. Is that true? Should it be? Can we make AI smarter, and more emotional, by understanding our own brains? And will emotional AI lead us down the path to a robot rebellion?

Watch the full ESC Minneapolis panel, “Creating AI That Thinks and Feels,” below—and share your thoughts in the comments!
