Great Interface Design Is the Key to Human-Robot Performance

UX designers have tremendous influence on how well we can coexist with, and benefit from, robots. Here are some key points to consider.

Stephanie Van Ness

February 26, 2020



As robots take on more and more tasks, the need for great UX increases.  (Image source: Adobe Stock)

Welcome to the age of IoRT — the Internet of Robotic Things.

The robotics market, valued at $39.72 billion in 2019, according to Mordor Intelligence, is experiencing a sea change as robots move beyond the military and industry, where they’ve long built widgets and assembled heavy equipment, like the shiny new SUV in your driveway.

Robots are perfectly suited to tasks that are repetitive, physically demanding, or potentially hazardous to humans. But they’re capable of a lot more. Today’s robots can access big data, cloud computing services, and distributed intelligence in sensor-enabled environments. They can interpret information from myriad sources and can activate other robotic systems. They’re capable of making decisions without human guidance.

These days, robots are doing everything from assisting surgeons in the operating room to driving for Uber to delivering packages for Amazon. They’ll even help carry your groceries home in style. In the mining industry drones serve as “eyes in the sky” for ground-based robotic vehicles. Boston Dynamics' Spot robot is sniffing out explosives for the Massachusetts State Police. Modern robots are handling more tasks than ever and also working collaboratively.

In the retail industry, a growing legion of stores has ordered fleets of collaborative robots (cobots) to ease labor shortages during the busy holiday shopping season. For instance, in 2019 Japanese mega-retailer Rakuten ordered 200 cobots for its Super Logistics division, up from the 40 the company used in 2018.

Robots are also getting smarter. Alphabet X announced it’s working on an R&D project called The Everyday Robot. The intent is to develop a “general-purpose learning robot.” Right now, robots can only successfully perform very specific, specialized tasks. Alphabet X is pairing complex machine-learning software with cameras so that robots can learn from observing the world around them. This way, they won’t need to be taught how to respond in every potential situation they may encounter.

While these technological advances are exciting, they’ve fundamentally changed how we interact with robots – some of which we may no longer have direct control over. For this reason, we need to create intuitive human-machine interfaces (HMIs) and well-balanced user experiences that allow us to interact effectively with these machines.

Designing User Experiences for Robots

Yes, The Brookings Institution predicts that 36 million Americans will face the possibility that more than 70 percent of their role may be taken over by artificial intelligence. But fortunately, there are still a lot of areas where robots are primarily intended to support, rather than replace, humans. Robotic surgery, for instance, still requires the hand of a skilled human surgeon.

These types of robots are there to increase users’ safety and efficiency (and often precision) and shrink the time it takes to execute certain tasks. Determining the best, most natural ways to interact with these machines requires a hefty dose of design thinking, including deeply exploring human behaviors.

“As these are autonomous entities, the context of a user’s interaction with them will change depending on the situation,” explained Boris Savic, Associate Director of User Experience (UX) at Boston UX. “Those variations go beyond simply different robot behaviors in different settings; they extend to the possible array of human responses to that robot behavior. We should not underestimate the impact on the user of what is essentially a foreign body/object, even if the robot is performing a benign or familiar task.”

To that point, sometimes it’s necessary to add actions that appear superfluous to the robot’s normal, required activity simply to put the end user at ease. “The more familiar the task, the more ingrained the user’s expectation is of how the task should be performed,” Savic said. “Users can have pretty sophisticated expectations that may not always be in line with the capabilities of the robot. Good UX design can help bridge that gap.”

Best Practices

A seamless user experience and frictionless user interface (UI) are essential for fostering collaboration between humans and robots. Because there is not yet a unifying platform, the best UX practices in robotics are still being developed. “At this point, every robotics project is a whole new experience and a lot of work is based on assumptions,” Savic said. “For this reason, it is essential to have a solid UX process in place; one that is built upon best practices and known patterns gleaned from adjacent areas of design. This process can provide the problem-solving framework designers need to perform well in this new arena.”

While designing UX for robots is more complex than designing for two-dimensional touch devices, the core process is quite similar. It begins by applying the same design thinking used to solve problems outside of robotics. And it hinges on a designer’s ability to gain a solid understanding of the human who will interact with the robot, including identifying their needs and use scenarios.

Once there’s an understanding, the rest of the process is the same as for other types of UX projects: Define the problem the bot interaction must solve; determine how to measure success; craft solutions; create prototypes; test prototypes with real users; analyze the feedback; and iterate the solution.

As there are some aspects of robot UX that require more exploration than for flat-screen design, here are four key things to pay particular attention to in order to improve human-robot interaction:

1.) Hardware

UX designers must clearly understand the capabilities of the robot’s hardware -- know what’s possible and what’s not -- before creating the UX.

2.) User Journeys

How do people actually use the robot? To answer this most basic question, designers need to understand and appreciate the user’s perspective. This includes exploring the needs of all user constituencies, each with its own context of usage. And don’t forget "passive" users -- people who may have casual interaction with the robot without controlling it or knowing its expected pattern. (For instance, if Boston’s MBTA used robots on its trains, commuters would be the passive users.) This is where user journeys and mapping scenarios shed light -- and they are of even more value when designing for robots than, say, designing a website.

3.) Voice Control

Voice controls, and how they combine with touch to direct the actions of the robot, are a big consideration. Are the controls remote or on-robot? A combination of the two? Choosing well starts with understanding the desired action and its context, then designing for the optimal experience.

“Controlling something that by definition interacts with the environment means there is an art to designing touch or voice interactions for robots,” Savic said. “As UX designers, we must understand that the impact of any action by the robot plays out in the physical world, not just the virtual one. That’s why greater emphasis on elements like pace, spatial context, and human factors is essential, as is guarding against unwanted or exaggerated gestures that could cause an accident or create a perceived threat to either an end-user or someone in close proximity.”

4.) User Testing

Certainly, user testing is crucial for any type of design project to confirm whether the UX is implemented properly. But whereas you might be able to get away with some cursory testing with a small number of users if you’re building a new corporate website, that’s not the case if you’re creating human-robot interactions for devices that can learn and operate autonomously. In this case, extensive user testing is the rule. Discovering pain points early and then iterating improved UX solutions is the best way to make certain the user journey is seamless, comfortable, and safe.

So, here’s the takeaway.

Robots are here to stay. UX designers can have tremendous influence on how well robots are embraced by humans and integrated into our daily lives. By designing intuitive HMIs and approachable, smooth user experiences, UX designers can help us coexist with -- and benefit from -- these intelligent machines.


Stephanie Van Ness is head of content development, Sr. Marketing Communications Manager, and Chief Storyteller at ICS & Boston UX. An experienced copywriter with a Boston University J-school degree, she writes about user experience (UX) design and innovations in technology, from self-driving vehicles to gesture-controlled medical devices. Her work has appeared in a number of industry publications, including Medical Design & Outsourcing, Mass Device, Connected World, Medical Device + Diagnostics Industry, UX Collective and Prototypr.

