I would think that supporting general-purpose protocols is really a must if these new multi-camera vision systems are to gain traction in all of the interesting applications you mentioned, Ann. With more and more cameras deployed on the factory floor or in medical settings, there has to be a way to integrate the resulting flood of images with mainstream systems in real time in order to truly leverage the capabilities and realize any benefits. Beyond bus interfaces, are there any other efforts underway to leverage standards and mainstream computing protocols to address this integration challenge?
Ann, one of the referenced articles talks about 10-gigabit Ethernet. The question is, do vision systems need that much bandwidth? Do you see support for 10-gigabit Ethernet in vision systems like these in the near future?
Beth, can you clarify your question? What kind of integration are you thinking of?
Regarding 10-Gbit Ethernet, I think that article and the comments attached to it cover those applications in quite some detail. In general, they are medical, military, and high-value quality-inspection applications in multi-camera systems, plus any app that can take advantage of the higher speed.
naperlou, I would suggest that in the embedded vision systems mentioned in the article, the main goal of the vision system is to collect lots and lots of data, spend some time analyzing the images with the intent of making a decision, and then dump the images after the decision has been made. Perhaps for archival purposes, a modest-resolution consumer video camera can be used in the loop for forensic logging. As the decisions become higher level (for example, reject or accept, routing, or even multi-sensor target recognition or autonomous navigation), the analysis algorithms would love to have as much time as possible to make a correct decision. High-speed networks, such as 10-Gbit Ethernet, deliver the data quickly so the algorithms can start their processing sooner. Now, if the system calls for logging all of those images for offline analysis, I don't think the write heads can keep up.
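To put some rough numbers behind the bandwidth argument, here's a back-of-envelope link budget. The camera parameters (a 5-megapixel monochrome sensor at 8 bits per pixel, 100 frames/s) are purely illustrative assumptions, not figures from the article:

```python
# Back-of-envelope bandwidth check for a hypothetical multi-camera setup.
# All camera parameters below are illustrative assumptions.

def raw_bandwidth_gbps(width, height, bits_per_pixel, fps):
    """Raw (uncompressed) image data rate in gigabits per second."""
    return width * height * bits_per_pixel * fps / 1e9

# One 5 MP monochrome camera, 8 bits/pixel, 100 frames/s:
one_cam = raw_bandwidth_gbps(2448, 2048, 8, 100)

# Three such cameras aggregated onto a single link:
three_cams = 3 * one_cam

for label, rate in [("1 camera", one_cam), ("3 cameras", three_cams)]:
    print(f"{label}: {rate:.2f} Gbit/s  "
          f"fits 1 GbE: {rate < 1.0}  fits 10 GbE: {rate < 10.0}")
```

Even a single camera at these (assumed) settings generates roughly 4 Gbit/s of raw data, well past ordinary Gigabit Ethernet, and a handful of cameras on one link can saturate even 10 GbE, which is why the analysis-then-discard pattern William describes matters.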
Thanks, William, for that description. That's the basic MV app in a nutshell, and a good succinct summary. And it applies, of course, to several different industries. Depending on the type of decision being made, you need a higher- or lower-res sensor, a color or B/W one, perhaps some visible-light and some NIR cameras, various lens types, etc.
The 100-percent solar-powered Solar Impulse plane flies a piloted, cross-country flight over the US this summer as a prelude to the longer, round-the-world flight planned for 2015 by its successor aircraft.
GE Aviation expects to cut the total 3D-printing time of metallic production components for its LEAP turbofan engine by about 25 percent using in-process inspection. That's pretty amazing, considering how slow additive manufacturing (AM) build times usually are.
For industrial control applications, or even a simple assembly line, such a machine can run almost 24/7 without a break. But what happens when the task is a little more complex? That's where the "smart" machine comes in: a machine with some simple (or, in some cases, complex) processing capability that allows it to adapt to changing conditions. Such machines are suited to a host of applications, including automotive, aerospace, defense, medical, computers and electronics, telecommunications, consumer goods, and so on. This radio show will explore what's possible with smart machines, and what tradeoffs must be made to implement such a solution.
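The "adapt to changing conditions" idea can be sketched in a few lines. This is a toy illustration with made-up numbers, not any vendor's control scheme: a conventional machine runs at a fixed setting, while a "smart" one backs off its feed rate as a measured defect rate climbs.

```python
# Toy contrast between a fixed machine and a "smart" adaptive one.
# All rates and thresholds here are hypothetical.

def fixed_feed_rate(_defect_rate):
    """A conventional machine: the same setting regardless of conditions."""
    return 100.0  # parts per minute, always

def smart_feed_rate(defect_rate, nominal=100.0, sensitivity=400.0, floor=20.0):
    """Slow down proportionally as defects rise, never below a safe floor."""
    adapted = nominal - sensitivity * defect_rate
    return max(floor, adapted)

# As the measured defect rate climbs, the smart machine throttles back;
# the fixed machine keeps producing scrap at full speed.
for defects in (0.00, 0.05, 0.20):
    print(defects, fixed_feed_rate(defects), smart_feed_rate(defects))
```

In a real system the "sensor" would be the vision system's reject decisions discussed above, and the adaptation logic would live in the machine's embedded controller; the tradeoff is exactly the processing capability versus cost question the show raises.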