NI's Kamran Shah explains why the right development platform is essential for using the right sensors and actuators in robotics.
How does NI view the robotics market?
Everyone is used to the idea of robotic arms automating the production of things like cars, and more recently we've seen robots in our homes in the form of robotic vacuum cleaners and toys like the LEGO MINDSTORMS NXT. There has also been significant military investment in autonomous vehicles, and soon we should see some of those technologies showing up in everyday life. Robots, or elements of robotics, should surface in our lives more and more over the next decade.
What are the key things people creating robots should consider?
Fundamentally, engineers have to interface with the right sensors and actuators. These can include analog inputs and outputs, digital lines, GPS receivers, lidar, cameras, motors and CAN interfaces for vehicles. Combining such a range of sensors in one robotics system makes software a key component, so developers need to make sure they choose a development platform that supports all of them. Developers also need to combine algorithms ranging from basic filters to more complex image processing. The brains of a robot can range from a PC to an embedded controller, and it is very convenient to design and develop initial prototypes on a PC and then deploy functional prototypes on real-time embedded controllers. This has been a focus for NI with LabVIEW, where the same LabVIEW code can run on a PC as well as on real-time embedded controllers that ensure deterministic execution of a robot's control systems.
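To make the "basic filters" point concrete, a classic example of combining two sensors in software is a complementary filter that fuses a drifting gyroscope rate with an accelerometer-derived tilt angle. The sketch below is illustrative only (plain Python, not LabVIEW or any NI API; the function name and gains are our own):

```python
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Fuse a gyro rate with an accelerometer-derived angle (degrees).

    The gyro is integrated for short-term accuracy; a small fraction of
    the accelerometer angle is blended in to correct long-term drift.
    """
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle

# Simulated loop: a stationary robot whose gyro has a constant drift bias.
angle = 0.0
dt = 0.01                     # 100 Hz loop
for _ in range(1000):         # 10 seconds of samples
    gyro_rate = 0.5           # deg/s of pure drift (robot is not moving)
    accel_angle = 0.0         # accelerometer reports the true tilt: 0 deg
    angle = complementary_filter(angle, gyro_rate, accel_angle, dt)

print(round(angle, 3))
```

Integrating the gyro alone would accumulate 5 degrees of error over this run; the accelerometer term bounds the estimate to a small fraction of that.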
What are some key enabling technologies for robotics?
Robots are inherently parallel. On traditional single-core systems, multitasking operating systems are really just time slicing to give the appearance that different parts of the application run at the same time. That can be sufficient for some applications, but when many operations must be performed at once or a high-speed response is needed, parallel architectures become very important. Multi-core processors and FPGAs are two technologies that can greatly benefit robots. With a multi-core processor running LabVIEW Real-Time, developers can isolate the robot's control algorithm on one core to ensure it runs at the desired loop rate, and use the remaining cores for lower-priority tasks or specific signal processing. With FPGAs, developers can define as many parallel portions of their application as the FPGA fabric allows. The challenge with FPGAs, however, is the specialized VHDL knowledge traditionally required to program them. A focus of LabVIEW FPGA, which supports graphical programming of FPGA-based systems such as NI CompactRIO, has been to let domain experts take advantage of the parallel execution of FPGAs without needing to be experts in VHDL.
What are some existing examples of robots you've been impressed by?
Virginia Tech's RoMeLa, the Robotics and Mechanisms Laboratory, has created an autonomous humanoid soccer-playing robot called DARwIn, which was the first U.S. entry in the humanoid division of RoboCup. Virginia Tech also worked with TORC Technologies to compete successfully in the DARPA Urban Challenge, finishing in third place with their autonomous vehicle. What's exciting about both applications is that they were developed mainly by mechanical engineers, not computer scientists. It has been a goal of NI with LabVIEW to empower domain experts, and seeing what the students were able to accomplish is very gratifying.