The Raven II software that runs the robots is built on the Robot Operating System (ROS), an open-source robotics framework. ROS provides a common set of libraries and tools used across many different kinds of robots, including service and research configurations.
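To make the ROS idea concrete, here is a minimal in-process sketch of the publish/subscribe messaging pattern that ROS is organized around: nodes publish messages to named topics, and any subscribed callbacks receive them. Real ROS uses rospy/rclpy nodes and a networked transport; the `TopicBus` class and the topic name below are invented for illustration only.

```python
from collections import defaultdict
from typing import Any, Callable


class TopicBus:
    """Toy stand-in for ROS-style topic messaging (not the real ROS API)."""

    def __init__(self) -> None:
        # Map each topic name to the list of callbacks subscribed to it.
        self._subs: dict[str, list[Callable[[Any], None]]] = defaultdict(list)

    def subscribe(self, topic: str, callback: Callable[[Any], None]) -> None:
        self._subs[topic].append(callback)

    def publish(self, topic: str, msg: Any) -> None:
        # Deliver the message to every subscriber of this topic.
        for cb in self._subs[topic]:
            cb(msg)


bus = TopicBus()
positions = []
# Hypothetical topic name, chosen only for this example.
bus.subscribe("/raven/arm1/joint_positions", positions.append)
bus.publish("/raven/arm1/joint_positions", [0.1, 0.2, 0.3])
print(positions)  # → [[0.1, 0.2, 0.3]]
```

The decoupling shown here is what makes a shared platform attractive: a lab can swap in its own vision or control node without touching the rest of the system, as long as it speaks the same topics.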
Once they are installed at each campus, the robots will be networked together to allow data sharing and collaboration experiments. This will make it easier for researchers to collaborate in various ways, including sharing software and replicating experiments.
Closeup of open platform surgical robot Raven II, showing mechanical wrists with tiny pincers. (Source: University of Washington)
Each Raven II system includes a surgical robot with two robotic arms, a camera for viewing the operational field, and a surgeon-interface system for remote operation. The system is precise enough for use in research on advanced robotic surgery techniques, such as online telesurgery. The earlier Raven I, completed in 2005, was also used for this type of research at UW. The Raven II has more compact electronics and dexterous hands that can hold wristed surgical tools, similar to the newest commercial surgical robots. As with those models, a surgeon viewing a screen fed by the Raven II's cameras can guide the instruments to perform tasks such as suturing.
Rosen and UCSC postdoctoral researcher Daniel Glozman have developed a Raven IV surgical robotics system with four robotic arms and two cameras. It is designed for collaboration between two surgeons working in separate locations, connected over the Internet.
If this effort follows the trajectory of other open-source projects, it should accelerate progress in surgical robotics. Giving research organizations a standard platform to build upon and fostering a more open exchange of ideas and designs can only help surface more compelling procedures and applications for these robotic systems.
This is an interesting-looking device for something that is going to cut you. I saw an ad this morning for prostate cancer treatment, and there was a picture of a robotic surgeon (a da Vinci, I think). It is all a bit intimidating. I guess that some surgeons can be as well.
It is interesting to see open source applied in this realm, though. The vision system and its interaction with the actuators must be interesting. This would make it easier to develop specific lighting and processing applications.
Thanks for your comments. Beth, I agree, the open source development platform is exciting. naperlou, I don't know what tools human surgeons use and don't want to. You're right, this is a bit scary looking.
Your verbal imagery makes me double over in anticipatory pain, Naperlou. However, I think that as robotics R&D applied to automated surgery and telemedicine increases, the (perhaps irrational) thought that I for one have, namely, what the heck happens if these things make a mistake, will disappear, and they'll become a valuable part of the spectrum of options available to physicians and surgeons.
A robot making a mistake is one thing. My fear is mechanical failure during the operation. I try not to think about what would happen if, mid-operation, a servo goes out or a bearing seizes up. Would you call a maintenance tech to work on it? It truly brings to light the need for a top-notch PM program.
I'm a consultant engineer and I work from my home.
Someday surgeons will be able to work from their homes. In fact, they will be able to perform a surgery any place from any place. I saw a city-to-city test of this; I can't remember the details offhand. Now this concept is WIDE.
Just don't give me a robotic nurse in the recovery room! If all I saw was machines when I came to after my surgery, I wouldn't have made it. You need to see those recovering angels at this low point.
Surgical robots like the da Vinci are of course not autonomous -- they're under direct control of the surgeon(s), usually in the same room as the patient, and act as "enhanced" versions of tools surgeons have used for decades or centuries: scalpels, cauterizers, retractors, etc. They have to undergo the same VERY rigorous validation processes that any other medical device does, including failure-mode analysis and risk management. They are designed to fail safely -- like surgeons themselves, the operating principle is "do no harm." In the event the device stops working, the surgical team can quickly remove the robot and perform the surgery manually.
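The "fail safely" principle described above can be sketched as a watchdog: if the controller stops reporting in, the system transitions to a known-safe stopped state rather than acting on stale commands. This is only an illustrative toy; the class name, states, and timeout here are invented, and real medical devices implement this in validated firmware, not application Python.

```python
import time


class Watchdog:
    """Toy fail-safe watchdog: stop if the control link goes quiet."""

    def __init__(self, timeout_s: float) -> None:
        self.timeout_s = timeout_s
        self.last_heartbeat = time.monotonic()
        self.state = "RUNNING"

    def heartbeat(self) -> None:
        # Called by the controller each time it sends a fresh command.
        self.last_heartbeat = time.monotonic()

    def check(self) -> str:
        # If no heartbeat within the timeout, freeze in a safe state.
        if time.monotonic() - self.last_heartbeat > self.timeout_s:
            self.state = "SAFE_STOP"
        return self.state


wd = Watchdog(timeout_s=0.05)
wd.heartbeat()
print(wd.check())  # still within the timeout → RUNNING
time.sleep(0.1)    # simulate a lost control link
print(wd.check())  # timeout exceeded → SAFE_STOP
```

The key design choice is that the safe state is the default outcome of silence: the robot never needs a working link to *stop*, only to keep moving.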
Intuitive Surgical's da Vinci is a brilliantly designed and built tool, the first successful second-generation surgical robot (previous ones were primarily surgical assistants). Its very high price tag reflects the research and care the company put into making it so good. Applying the open-source idea will almost certainly allow the next generation to be developed faster and produced more cheaply.
Tim, I think you've got a really good point. I'll bet that the makers of da Vinci, the leading commercial surgical robot, have some extremely high QA standards. Think of the lawsuits! OTOH, whether it's a bearing failing in a robot, or your surgeon arriving drunk, without enough sleep, or having just had a fight with his/her spouse, seems to me like a tossup.
For open-source development of medical devices, whether surgical robots or simpler devices, the most difficult hurdle is integrating an open development process with the requirements of the FDA. The tight controls and meticulous documentation and process standards imposed on PMA (premarket approval) devices may render truly open-source development, as practiced in the software world, an unreachable star.
I agree, Kevin. I think the area where OS techniques will affect medical devices is in the research and development arena, rather than FDA-qualified products. Anything that can speed up and simplify the realization of such complex and critical systems is a Good Thing.