Manohar Srikanth, Kavita Bala, and Fredo Durand have developed a drone that dynamically lights moving subjects with a rim-lighting effect specified by the photographer. While the drones are still prototypes, they are a testament to what can be done with drones in photo and film studios.
The Litrobot combines lidar with imagery from the main camera to keep a moving subject lit exactly as the desired effect requires. The lidar unit uses lasers to measure the drone's distance and position relative to the subject and the photographer. Lidar is more commonly used to render high-resolution topographic maps of geographical locations, and that same precision is what lets the Litrobot fly to exactly the right spot.
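To give a sense of how a lidar reading becomes a position, here is a minimal sketch: a single return (range plus two angles) converted into Cartesian offsets from the drone to whatever the laser hit. The function name and the spherical-coordinate convention are my own illustrative assumptions, not the team's code.

```python
import math

def lidar_to_position(range_m, azimuth_rad, elevation_rad):
    """Convert one lidar return (range + azimuth + elevation)
    into x/y/z offsets from the sensor to the measured point."""
    horizontal = range_m * math.cos(elevation_rad)  # range projected onto the ground plane
    x = horizontal * math.cos(azimuth_rad)          # forward offset
    y = horizontal * math.sin(azimuth_rad)          # sideways offset
    z = range_m * math.sin(elevation_rad)           # vertical offset
    return x, y, z
```

For example, a subject measured 3 m dead ahead (zero azimuth and elevation) comes out as an offset of (3, 0, 0); the drone can compare such offsets against its target position each time the sensor updates.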
The MIT Litrobot photographs a dynamic subject with two light sources.
On the main camera, the photographer specifies the lighting effect and customizes its parameters. For now, the Litrobots are programmed only to create rim lighting, which the team chose because it is the hardest effect to produce: a system that can hold a rim light on a moving subject can likely handle simpler setups. The width of the rim can be customized and reproduced accurately as well, which is more than most professional lighting studios can manage.
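The photographer-facing settings might look something like the sketch below. The parameter names and the validation rule are purely illustrative assumptions about the kind of interface described, not the team's actual one.

```python
# Hypothetical photographer-specified settings (names are assumptions).
rim_light_params = {
    "effect": "rim",     # the only effect the prototype supports
    "rim_width_px": 12,  # desired width of the lit border in the frame
}

def validate(params):
    """Reject settings the prototype could not reproduce."""
    assert params["effect"] == "rim", "only rim lighting is implemented"
    assert params["rim_width_px"] > 0, "rim width must be positive"
    return params
```

The point is simply that the effect is parameterized up front, so the system has a concrete target to reproduce frame after frame.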
The main camera captures images 20 times per second. Each frame is sent to a central computer, where an algorithm measures the width of the rim lighting and relays corrections to the drones, which adjust their positions to keep the rim accurate and consistent. The Litrobot currently holds its position to within millimeters, which is more accurate than a human lighting assistant could manage, particularly with a moving subject.
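The analyze-and-correct cycle described above can be sketched as a simple proportional feedback loop: compare the measured rim width against the requested width and nudge the drone by an amount proportional to the error. The gain value, the units, and the helper names are illustrative assumptions; the actual control scheme is certainly more sophisticated.

```python
TARGET_WIDTH_PX = 12  # rim width the photographer requested (assumed units)
GAIN = 0.4            # how aggressively to correct, in cm per pixel of error (illustrative)

def correction(measured_width_px, target_width_px=TARGET_WIDTH_PX, gain=GAIN):
    """Return a position offset proportional to the rim-width error
    seen in the current frame; zero error means stay put."""
    error = target_width_px - measured_width_px
    return gain * error

# Run 20 times per second, the loop would look roughly like:
#   width = measure_rim_width(frame)        # image analysis on the camera feed
#   send_offset(drone, correction(width))   # relay the adjustment to the drone
# (measure_rim_width and send_offset are hypothetical names.)
```

A rim measured exactly at the target produces a zero correction, while a rim that is too narrow or too wide produces a proportional nudge in the corresponding direction; repeated at 20 Hz, small corrections keep the effect consistent on a moving subject.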
The team plans to continue refining its system for practical use in the real world. Considering these impressive results, I have no doubt that the photography and film industries will change dramatically in the next 10 or 20 years with drones doing lighting, camera, and sound work in tandem.
The Litrobots can already work in tandem with each other, a moving subject, and the photographer. It is easy to imagine how this technology could scale, since existing drone hardware already accommodates these possibilities.
Drones carrying a RED Epic (or Scarlet) camera already deliver professional film quality from anywhere, and a few companies now specialize in flying RED-equipped drones. Pair those camera drones with the Litrobot, and you could have professionally lit, film-quality shots in real time. Audio could also be captured by drones flying boom mics over the actors, though the whizzing of the rotors would have to be removed in post-production.
The possibilities continue to open up for drone technology as more than just a hobby. It looks entirely plausible that film companies will soon be able to hire a team of camera, audio, and lighting drones for their latest productions. The Litrobots are an impressive step toward the future of human-computer collaboration in art and film.