Listen to Guy Hoffman, creator of the AUR robotic lamp.
Guy Hoffman, a PhD candidate in the Robotic Life group at the MIT Media Lab, developed a robotic lamp that automatically follows its user's movements and adjusts its light intensity and color using motion sensors and voice-activated commands. Named AUR, the lamp is made from aluminum, steel, plastic and acrylic.
The lamp head resembles an eye, and the movement of its arm gives it an organic look and feel. The lighting fixture, from ColorKinetics (a company recently purchased by Philips), combines red, green and blue LEDs to produce a full spectrum of colors, and has an aperture that opens and closes to increase or decrease the light output. The lights are controlled with open-source drivers and custom code that generates DMX commands; DMX is a theatrical lighting control protocol.
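To make the RGB-over-DMX idea concrete, here is a minimal sketch of composing a DMX-style frame for an RGB fixture. The channel assignments (1 = red, 2 = green, 3 = blue) and the 512-channel universe size are generic DMX conventions, not the actual mapping Hoffman's code uses.

```python
def make_dmx_frame(red: int, green: int, blue: int, channels: int = 512) -> bytes:
    """Build one DMX universe: a start-code byte followed by one
    8-bit level (0-255) per channel. Channels 1-3 are assumed to be
    the fixture's red, green and blue intensities."""
    levels = bytearray(channels)
    for channel, value in ((1, red), (2, green), (3, blue)):
        if not 0 <= value <= 255:
            raise ValueError("DMX levels are 8-bit (0-255)")
        levels[channel - 1] = value
    return bytes([0]) + bytes(levels)  # start code 0 = standard dimmer data

frame = make_dmx_frame(255, 128, 0)  # a warm orange mix
```

A real driver would then stream such frames continuously over an RS-485 serial link, which is how DMX fixtures expect to be refreshed.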
To operate the lamp, the user wears a glove with motion sensors that are illuminated by red lights on a rig set up in the room. The lamp follows the movement of the gloved hand using a built-in motion model based on relative movement, so it reacts only to the user's broader movements. “It’s not going to start jittering just because I’m writing,” says Hoffman, “so only when I go to another place in a settled position it will come and illuminate another position.”
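That “ignore the jitter, follow the settled move” behavior can be sketched as a simple filter: small motions around the current target are discarded, and the lamp only retargets once the hand has moved far away and then held still for a while. All thresholds here are illustrative assumptions, not values from AUR.

```python
import math

class SettledTargetFilter:
    """Sketch of jitter-rejecting hand tracking: retarget only after a
    large move that then settles. Thresholds are assumed, not AUR's."""

    def __init__(self, move_threshold=0.15, settle_radius=0.02, settle_samples=10):
        self.move_threshold = move_threshold  # metres before retargeting is considered
        self.settle_radius = settle_radius    # jitter allowance while settling
        self.settle_samples = settle_samples  # consecutive still samples required
        self.target = None                    # where the lamp currently points
        self._candidate = None
        self._still = 0

    def update(self, pos):
        """Feed one (x, y) hand sample; return a new target, or None."""
        if self.target is None:
            self.target = pos
            return pos
        if math.dist(pos, self.target) < self.move_threshold:
            self._candidate, self._still = None, 0
            return None  # small motion (e.g. writing): ignore it
        if self._candidate and math.dist(pos, self._candidate) < self.settle_radius:
            self._still += 1
        else:
            self._candidate, self._still = pos, 1
        if self._still >= self.settle_samples:
            self.target = self._candidate
            self._candidate, self._still = None, 0
            return self.target
        return None
```

Feeding it a stream of hand positions, wiggles near the current target produce no output, while a move to a new spot that holds still for ten samples yields a single retarget event.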
The lamp’s robotic arm stems from another Media Lab project called RoCo (robotic computer), in which a computer and monitor are built around a robotic arm that adjusts the screen to the user’s position and movement. The arm uses DC motors with optical encoders and harmonic-drive gears.
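The encoder-plus-geared-motor arrangement implies a standard closed-loop joint: encoder counts on the motor shaft are scaled through the gear ratio to a joint angle, and a feedback loop drives the motor toward a commanded position. The sketch below shows that idea with a bare proportional controller; the encoder resolution, gear ratio and gain are assumed numbers, not RoCo's specifications.

```python
# Joint-control sketch: motor-side encoder counts -> joint angle,
# then a proportional loop toward a target angle. All constants are
# illustrative assumptions.

COUNTS_PER_REV = 4096   # quadrature encoder counts per motor revolution (assumed)
GEAR_RATIO = 100        # harmonic drives provide high single-stage ratios
KP = 2.0                # proportional gain (assumed)

def counts_to_joint_angle(counts: int) -> float:
    """Convert raw motor-side encoder counts to joint angle in degrees."""
    motor_revs = counts / COUNTS_PER_REV
    return motor_revs / GEAR_RATIO * 360.0

def motor_command(target_deg: float, counts: int) -> float:
    """Proportional position control: command is gain times angle
    error, clamped to the normalized range [-1, 1]."""
    error = target_deg - counts_to_joint_angle(counts)
    return max(-1.0, min(1.0, KP * error))
```

A real controller would add integral/derivative terms and velocity limits, but the structure — sense counts, compute angle error, command the motor — is the same.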
Hoffman’s primary focus with this project is to better understand the relationship that people have with technology, specifically technological objects. “The one thing that I was interested in was thinking about this idea of robotic objects, to have robots that are a sort of evolution of what we have at home anyway,” says Hoffman. “What if we just take objects and make them more alive, how would that make these robots behave, and how would that make our relationship with these robots different?”