|(Image source: Microsoft)|
The next wave of automotive innovation depends on artificial intelligence (AI).
At his DesignCon 2017 keynote, Doug Seven, Microsoft's Group Program Manager – Things That Move, told the audience that in his meetings with automotive executives, cars are no longer talked about as machines. “Cars are essentially data centers on wheels,” Seven said.
Seven said there's enough pressure on the automotive industry right now to push it toward autonomous cars, but key innovations still need to happen. Paramount among these, for Microsoft, is that cars need to become intelligent.
“When you think about what it takes from a software and hardware standpoint to make a car autonomous it's daunting,” Seven said. “It's not just about sensors and connectivity, it's about intelligence and low latency.”
Machine learning is already being applied to vehicles and holds promise in areas such as predictive maintenance and real-time analytics. At CES 2017, Microsoft unveiled the Microsoft Connected Vehicle Platform, a cloud-based platform that serves as a reference for automakers and design engineers to build solutions in five key areas: telematics and predictive maintenance; productivity and digital life; connected advanced driver assistance systems (ADAS); advanced navigation; and customer insight and engagement.
In addition to this, Microsoft has already released its Microsoft Cognitive Toolkit, an open-source toolkit for creating applications capable of deep learning across clustered environments, as well as the Microsoft Cognitive Services API.
Seven talked about how Microsoft's Cognitive Services API already includes an Emotion API capable of understanding the emotions of someone's face, including anger, contempt, disgust, fear, happiness, neutral, sadness, and surprise. He asked the audience to imagine applying this same capability to autonomous vehicles. “If we could detect things like road rage or stress, we could [have a vehicle] do things to alter the environment for the driver or passengers.”
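The Emotion API returns a confidence score for each of the eight categories above, and a vehicle could act on the dominant one. Here is a minimal sketch of that decision step, assuming the scores have already been retrieved from the service; the function name, threshold, and cabin actions are illustrative assumptions, not part of Microsoft's API:

```python
# Illustrative sketch: act on the dominant emotion from a set of confidence
# scores like those the Emotion API returns. The threshold and the actions
# below are assumptions for illustration, not Microsoft's implementation.

def respond_to_emotion(scores, threshold=0.6):
    """Pick the highest-confidence emotion and map it to a cabin action."""
    emotion = max(scores, key=scores.get)
    if scores[emotion] < threshold:
        return "no action"  # not confident enough to intervene
    actions = {
        "anger": "play calming music, increase following distance",
        "fear": "reduce speed, offer to take over driving",
        "sadness": "soften cabin lighting",
    }
    return actions.get(emotion, "no action")

# Example scores across the eight categories the API recognizes
scores = {"anger": 0.82, "contempt": 0.03, "disgust": 0.01, "fear": 0.02,
          "happiness": 0.01, "neutral": 0.08, "sadness": 0.02, "surprise": 0.01}
print(respond_to_emotion(scores))  # -> play calming music, increase following distance
```

A real deployment would smooth these scores over time rather than react to a single frame, since momentary expressions are noisy.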
But that term “cloud-based” is the obstacle. Right now the Cognitive Services API and Cognitive Toolkit are cloud-based, which won't suffice for self-driving cars.
“What we can do in the cloud with our AI capabilities is to build algorithms and models to let cars become intelligent and make decisions,” Seven told the DesignCon audience. “But we can't rely on the cloud because we might lose connectivity or there might be latency issues.”
The next step is to create AI within the cars themselves that will have meaningful interactions with drivers, passengers, and the car's environment. "We're talking about putting hardware in a car that is capable of processing data for the purposes of AI," Seven said.
“We have to move the workload from the cloud to the edge,” Seven said. “We need hardware in the car capable of processing that data in real time.” According to Seven, self-driving cars need latencies in the sub-millisecond range to be fully effective on the road. “The question is how do we move that tech from big data centers to data centers on wheels?”
Consider all the systems needed inside an autonomous car: a control unit, infotainment, navigation, cameras, radar, lidar, and some sort of sensor fusion system to bring it all together so the car can make actionable decisions. The need for robust, on-board hardware becomes obvious.
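Sensor fusion is the step that turns those separate data streams into one estimate the car can act on. As a minimal sketch, here is the simplest common approach, inverse-variance weighting of two range measurements (say, radar and lidar); the noise figures are assumptions, and a production system would use something like a Kalman filter over many sensors:

```python
# Minimal sensor-fusion sketch: combine radar and lidar range estimates by
# inverse-variance weighting (more precise sensors get more weight).
# Noise figures are illustrative; real ADAS stacks use Kalman filters.

def fuse(measurements):
    """measurements: list of (value, variance) pairs.
    Returns (fused value, fused variance)."""
    weights = [1.0 / var for _, var in measurements]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, measurements)) / total
    return value, 1.0 / total

# Radar reads 42.0 m (variance 0.25); lidar reads 41.6 m (variance 0.04)
dist, var = fuse([(42.0, 0.25), (41.6, 0.04)])
print(f"fused distance: {dist:.2f} m, variance: {var:.4f}")
```

Note that the fused variance is smaller than either sensor's alone, which is the whole point of fusing: the combined estimate is more trustworthy than any single input.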
In its own efforts at CES 2017, Nissan released a video as part of a new partnership with Microsoft to bring Microsoft's Cortana virtual AI assistant (essentially Microsoft's answer to Siri) to Nissan vehicles. The video features a driver interacting with his car in much the same way he would with a human assistant, with both making suggestions and decisions on the fly about his driving route and personal schedule.
Referring to the video, Seven told the DesignCon audience, “The important thing to look at is not just the voice recognition, but that the AI is responding to the human, being proactive and able to control the vehicle in an important context. It's not as simple as telling the car to do something. It's an intelligent interaction that includes proactive capabilities. That's how the auto industry is looking to evolve.” And that's the functionality that needs to happen inside of the car itself, and not via cloud computing.
To be clear, Microsoft has released a statement saying that it doesn't want to build its own car: “Microsoft is not building its own connected car. Instead, we want to help automakers create connected car solutions that fit seamlessly with their brands, address their customers’ unique needs, competitively differentiate their products and generate new and sustainable revenue streams.”
But AI for cars is only a step in what Seven said Microsoft believes will be an even larger ecosystem of “intelligent things that move.”
“As we move into new spaces like drones and what role they will play in our world, we can look at automotive as a use case and start using that to open up into other spaces like logistics, retail, and even smart cities,” Seven concluded.
Chris Wiltz is the Managing Editor of Design News.