
Meet Misty II: The Robot as a Platform, Not a Tool

Misty II is a development platform for engineers and makers that was created to change how we think about robots.

Misty Robotics says Misty II is a development platform, not just another personal robot. (Image source: Misty Robotics)

Developers may remember a time when you'd boot up your computer and all you'd get was a blank screen and blinking cursor. It was up to engineers and coders to build the content; the computer was just a platform. Ian Bernstein, founder and head of product at Misty Robotics, believes robots today are in that same place that computers were decades ago. “We're at that same point with robots today, where people are just building robots over and over with Raspberry Pis and Arduinos,” Bernstein told Design News.

Bernstein is calling for a departure from thinking of robots as tools and machines to thinking of them more as platforms. Misty Robotics has designed its flagship robot of the same name, Misty, with that idea in mind. “It's about giving people enough functionality to start to do useful things—but not too much, where it becomes too expensive or complicated,” Bernstein said. “It's also about complexity. For developers, it is not approachable if you don't know where to start.”

Boulder, Colorado-based Misty Robotics' upcoming product, Misty II, is a 2-ft-tall, 6-lb robot. It is designed to do what the smartphone has done for mobile app developers, but for robotics engineers and makers—provide access to powerful features to open up the robot for a variety of applications. At its core, Misty II is driven by a deep learning processor capable of a variety of machine learning tasks, such as facial and object recognition, distance detection, spatial mapping, and sound and touch sensing. Developers can also 3D print (or even laser cut or CNC machine) custom parts to attach to Misty to expand its functionality for moving and manipulating objects. Misty II will also feature USB and serial connectors as well as an optional Arduino attachment to allow for hardware expansion with additional sensors and other peripherals. (One planned for release by the company is a thermal imaging camera.)

There are already several single-purpose robots available to consumers to use in the home. People will be most familiar with the Roomba robotic vacuum, but there are also robotic window washers, lawnmowers, security guards, and even pool cleaners currently available.

Speaking with Design News ahead of CES 2019, where Misty II was available for hands-on demonstrations, Bernstein said that, while the idea of a smart home full of connected robots all going about their various tasks sounds like the wave of the future, he doesn't find this vision particularly feasible. “It's not going to be economical to have single-purpose robots or eight different robots in your home,” he said. “A big part of that is cost. Robots require movements and motors, and you can't bring that raw materials cost down.”

Rather than moving toward a world with a collaborative robot for every job, he said, we should be heading toward a single cobot that can be configured for a plethora of tasks.

In a Galaxy Far, Far Away....

Misty Robotics is an offshoot of the same company that created Sphero. (Image source: Sphero)

The journey toward Misty II begins with Star Wars. But not just because of the inspiration that can be derived from characters like R2-D2 and C-3PO.

In 2014, Bernstein and his team were part of the Disney Accelerator program focused on supporting tech startups. While at Disney, Bernstein and his company (then called Orbotix) were working on a robot in a very simple form factor—a ball.

It was around this time that production was gearing up for Star Wars: The Force Awakens, the series entry that would introduce a new fan-favorite robot character, BB-8. While BB-8 was brought to life on-screen using puppeteering and other special effects, the team at Disney also wanted to create a real-life working model of BB-8. Orbotix's work caught the attention of Disney CEO Bob Iger, who instantly recognized the team's work as the solution to bringing BB-8 into the real world.

The time at Disney would allow Bernstein and his team to develop and release their first commercial product. They changed their company name to Sphero and released a spherical robot of the same name. Since its release, Sphero has found success as a consumer product and can be found in many stores. It has also found a home as an education product and has spawned a vibrant community of schools and educators that use it to teach STEM. Today, Sphero is used in over 10,000 schools worldwide, according to Bernstein.

There is also, of course, a toy model of BB-8 that is essentially a Sphero with a BB-8 skin on top.

There Is No Killer App

But you can't spend any amount of time with the team making the next Star Wars without picking up some new ideas. “Disney got us thinking about adding personality and story elements to our robots with Sphero,” Bernstein said. “We started thinking about what could really be done with robots. We had prototyped some more advanced robots at Sphero—telepresence robots and things like that—but they didn't feel quite right.”

What the Sphero team was searching for was a way to create a product to which users would feel a deep, personal connection. R2-D2 and BB-8 are entertainment characters, but why couldn't someone build a real version? “It was thinking about this idea of a robot in every home and office, and why couldn't a robot do useful things and have a personality and character?” Bernstein said.

They decided to dedicate an entire team at Sphero to developing this home robot product. In 2017, that team was spun off into its own separate company—Misty Robotics.

According to Bernstein, what tripped the company up in those early stages on the road to Misty was the search for a killer application. “We wanted to find the robot with the killer app,” he said. “But every six months, every week even, we were driving toward something different.”

Bernstein, a lifelong robotics enthusiast who was building his own robots at home by age 12, and his team realized that what was exciting about those early years working with robots wasn't the end result of having the robot perform some programmed function. It was the possibility—the sense that the robot could do anything. There was no killer app, the Misty team realized. The robot itself was the killer app.

Aesthetically, Misty is a far cry from the simplicity offered by Sphero, but it carries a lot of the same design philosophy. Bernstein said Sphero opted for a ball-shaped robot because of the inherent openness of the design. As soon as you give a robot a shape, you define its purpose. The goal with Sphero was to keep its potential as open as possible.

Even by collaborative robot standards, Misty's design is comparatively simple. The robot forgoes the more humanoid designs seen in robots like SoftBank's Pepper and instead opts for treads for movement; simple, detachable arms; a head with 3-degree-of-freedom articulation; and a simple LCD display with two animated eyes. Overall, the look and feel of the robot is definitely more akin to a character than a tool. True to the company's history, Misty's design owes more to Disney animation than industrial design.

A featured demo from CES 2019 from Misty Robotics demonstrates Misty II's security capabilities. (Image source: Misty Robotics)

Bernstein said the design for Misty is the result of collaboration with human-robot interaction experts as well as character artists. “We went through 150 different versions of the physical appearance,” he said. “We wanted to create something that felt really approachable. And to do that, we're looking at things like what axis do we want to do head movement on. We started with up and down, then added roll for more expression.”

He added, “We talked to a lot of character artists and asked, 'If you could pick one feature of a person to help you identify and relate with them, what would you pick?' Eyes and eyebrows naturally came up, but many people also said posture—things like leaning forward and back when being talked to. So those were all things we took into consideration.”

Building a Platform

Misty I, an early version of the forthcoming Misty II. (Image source: Misty Robotics) 

Working with Sphero eventually led to a relationship with Qualcomm, the supplier of the chip hardware behind Misty II. The main applications on Misty II are handled by a Snapdragon 410 processor, while perception, navigation, and mapping tasks are handled by a Snapdragon 820 running Qualcomm's Snapdragon Neural Processing Engine for artificial intelligence.

On the software end, Misty II runs Windows IoT Core for its main processing and uses Android 8 for navigation and computer vision. Upon Misty II's release later this year, software developers will have access to a JavaScript SDK (for skills such as facial recognition and mapping) and a C++ REST SDK (for supporting cloud-based server communication). There are also plans to release C# and Python SDKs, as well as a Perception Engine SDK for accessing Misty's vision, touch, and sound capabilities.
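To make the REST-style interaction concrete, here is a minimal JavaScript sketch of issuing a drive command to a robot like Misty over HTTP. The endpoint path (`/api/drive`) and parameter names are illustrative assumptions, not confirmed details of Misty's published API; the real interface is defined in Misty's SDK documentation.

```javascript
// Sketch: building an HTTP request for a hypothetical robot "drive" command.
// Endpoint path and field names are assumptions for illustration only.
function buildDriveRequest(robotIp, linearVelocity, angularVelocity) {
  return {
    url: `http://${robotIp}/api/drive`, // assumed endpoint path
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        LinearVelocity: linearVelocity,   // e.g., percent of max forward speed
        AngularVelocity: angularVelocity, // e.g., percent of max turn rate
      }),
    },
  };
}

// Usage: drive forward at half speed with no turning.
const req = buildDriveRequest("192.168.1.50", 50, 0);
console.log(req.url); // http://192.168.1.50/api/drive
// fetch(req.url, req.options); // send it when a robot is on the network
```

Separating request construction from the network call keeps the command logic testable without a robot present.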

While more enthusiastic makers may bemoan the lack of open-source support for Misty, Bernstein said the decision to shy away from open source had to do with maintaining a level of customer satisfaction. “While Sphero isn't open source, it's as open as possible and we liked that model,” he said, adding that Misty's SDK allows for adjustment as deep as tuning the proportional, integral, and derivative (PID) constants for the robot's instruments. “But we do make sure you can't tune something so that you would break the unit.”
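For readers unfamiliar with the PID constants Bernstein mentions, the sketch below shows a generic textbook discrete PID step in JavaScript. It is not Misty's internal controller; it only illustrates what the kp, ki, and kd gains each contribute to a motor control output.

```javascript
// Generic discrete PID controller sketch (not Misty's actual implementation).
// kp, ki, kd are the proportional, integral, and derivative gains.
function createPid(kp, ki, kd) {
  let integral = 0;
  let prevError = 0;
  // Returns the control output for a given error and timestep dt (seconds).
  return function step(error, dt) {
    integral += error * dt;                      // accumulates steady-state error
    const derivative = (error - prevError) / dt; // reacts to rate of change
    prevError = error;
    return kp * error + ki * integral + kd * derivative;
  };
}

// Usage: a proportional-only controller scales output with error.
const pid = createPid(0.5, 0, 0);
console.log(pid(10, 0.1)); // 5
console.log(pid(4, 0.1));  // 2
```

Badly chosen gains can drive actuators into oscillation or saturation, which is exactly the kind of misuse Bernstein says the SDK guards against.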

Let the Robots Handle It

Thus far, Bernstein says Misty Robotics has seen a lot of early interest in its robot as a foundational tool. “There's a lot of interest around elder care and creating a robot companion with a strong personality. There's also security, lots of entertainment types of things, and of course education and teaching...

“The reason robots work so well for education is they stimulate all your senses. You're touching it, it moves around, there's sound as it's crashing into something...I think you just train the information so much better. Going forward, I think we're going to see robots being able to help with teaching lots of different subjects. We have huge teacher shortages. Maybe there will be robot assistant teachers?”

The world is certainly many advancements away from having robots like The Jetsons' Rosie around our homes. But Bernstein thinks there's plenty to be excited for moving forward. “I think we'll see people really start to dig in with Misty II and other people will jump into more end applications,” he said. “There's a phase, whenever anything comes out, where it's a little bit novel. But at some point, the single, multipurpose robot will be able to do a bunch of things very well.

“I'm excited about robots doing more interesting, useful stuff for us. And that's along with taking a lot of the mundane things out of our lives. Let's do more meaningful things as people...It's robots that should be doing laundry.”


Chris Wiltz is a Senior Editor at  Design News covering emerging technologies including AI, VR/AR, and robotics.
