Chuck, between your articles and Ann's, there definitely seems to be a strong tailwind behind the use of 3D machine vision technology on the factory floor, and, based on your most recent piece, a lot of ground being broken in the automotive industry specifically. I'm curious whether all the data they are collecting with these 3D cameras can find its way back into other 3D design tools, such as simulation and CAD. I would think that, particularly for inspection and quality applications, this is invaluable information for engineers fine-tuning their designs or fixing design flaws.
Beth, I think you are right on with your comment. We are now moving into a design world where 3D design, printing, and machine vision could streamline design and manufacturing and the flow between them. Just imagine the possibilities for continuous improvement when you can combine the three.
Chuck, thanks for a clear, comprehensive article on this subject. I think Alex's point is well taken: the increased use of 3D techniques in MV is all about quality control, and good evidence for the revival of US manufacturing skills, ability, and quality. Beth's and naperlou's comments are also intriguing: the implications of such convergence could be mind-boggling.
I like your way of thinking, Naperlou. Combining all of those disparate, but related 3D technologies can certainly play a huge role in helping manufacturers address quality issues in an expedient, cost-effective, and continuous manner. Armed with that kind of real-time ability to respond and improve, perhaps American manufacturing can move forward in a positive way as Alex's comments suggest.
Chuck, excellent article. This should be a very interesting area for automation and control over the next decade. As computing platforms and operating systems continue to move to the next level, vision should be an area ripe for innovation. Any ideas about how quickly we'll start to see wider adoption?
This is the cover story in the February print issue of Design News. My comment would be, the increased quality control enabled by wider deployment of 3D vision plays into the whole meme of the resurgence of U.S. manufacturing. Not that this technology won't be used worldwide -- it will. But its deployment Stateside, particularly in the resurgent domestic automotive industry, will go a long way towards keeping U.S. manufacturing on par with its tough worldwide competition.
I think wider adoption will still be slow, Al. The wild card, however, might be the economy. If auto sales jump, we could see wider adoption of 3D for a lot of different automotive applications, such as inspection of braking components, fuel lines and tire treads.
One question for 3D vision in automation applications is how much this technology expands the processing requirements of these more sophisticated vision applications. With the move toward one controller, one network, and PC-based systems, where users are gaining confidence in the reliability of real-time operating systems, a sharp increase in processing demands for 3D could inhibit adoption at some level. Chuck, any ideas on the impact this might have, based on your research?
Nice article, Chuck. I look forward to the next one.
What was the tipping point in the adoption of 3D vision? Was it the cost of cameras coming down? Was it software developments that made it easier to deploy (i.e., users don't have to do complex original programming)? Or something else?
There's a convergence of trends that are making 3D more viable, Rob. Availability of software libraries has been important, but equally important is the availability of multicore processors to run the applications. We're hearing of users who are employing 8, 10, and even 12 cores to do the processing. 3D is also getting a bit simpler. In-house engineers without 3D experience can do simple applications, such as de-palletizing. And now there are more system integrators who can handle robotics applications. It's still complex, but 3D no longer requires the expertise of Ph.D.-level specialists.
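To make the multicore point concrete, here is a minimal sketch of how a 3D inspection workload can be spread across CPU cores with a process pool. This is purely illustrative, not any vendor's actual pipeline; the function and data names (inspect_region, regions) are hypothetical, and the "inspection" is a toy height-deviation check on (x, y, z) point samples.

```python
# Hypothetical sketch: parallelizing per-region 3D inspection across cores.
from multiprocessing import Pool

def inspect_region(points):
    """Toy stand-in for a per-region 3D inspection step:
    returns the height (z) deviation across a list of (x, y, z) points."""
    zs = [p[2] for p in points]
    return max(zs) - min(zs)

if __name__ == "__main__":
    # Fake scan data: four surface regions of (x, y, z) samples.
    regions = [
        [(0, 0, 1.0), (1, 0, 1.2), (0, 1, 1.1)],
        [(2, 0, 0.9), (3, 0, 1.0), (2, 1, 0.95)],
        [(0, 2, 1.5), (1, 2, 1.4), (0, 3, 1.45)],
        [(2, 2, 1.0), (3, 2, 1.0), (2, 3, 1.0)],
    ]
    # One worker process per region; on a multicore controller these
    # run concurrently, which is the scaling behavior described above.
    with Pool(processes=4) as pool:
        deviations = pool.map(inspect_region, regions)
    print([round(d, 2) for d in deviations])  # → [0.2, 0.1, 0.1, 0.0]
```

The same fan-out pattern applies whether the per-region work is a toy calculation like this or a real point-cloud matching step; the pool simply keeps all available cores busy.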
Thanks, Chuck. I'm sure the processing requirements are high but multi-core processors are perfect for this type of application. As Moore's Law marches on in automation controllers, vision is definitely one of the apps that should be able to expand by using the extra available processing horsepower.