Thanks, William, for that description. That's the basic MV app in a nutshell, and a good succinct summary. And it applies, of course, to several different industries. Depending on the type of decision being made, you need a higher- or lower-resolution sensor, a color or B/W one, perhaps some visible-light and some NIR cameras, various lens types, etc.
naperlou, I would suggest that in the embedded vision systems mentioned in the article, the main goal of the vision system is to collect lots and lots of data, spend some time analyzing the images with the intent of making a decision, and then dump the images after the decision has been made. For archival purposes, perhaps a modest-resolution consumer video camera can be kept in the loop for forensic logging. As the decisions become higher level (for example, accept/reject, routing, or even multi-sensor target recognition or autonomous navigation), the analysis algorithms would love to have as much time as possible to make a correct decision. High-speed networks, such as 10-Gbit Ethernet, present the data quickly so the algorithms can start their processing sooner. But if the system calls for logging all of those images for offline analysis, I don't think the disk write heads can keep up.
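To put rough numbers behind that last point, here is a back-of-the-envelope check in Python. The camera parameters and disk write rate are illustrative assumptions, not figures from the article; the point is only that a stream a 10-Gbit link can carry comfortably can still swamp a single spinning disk.

# Back-of-the-envelope check: a multi-camera stream that fits on a
# 10-GbE link can still exceed what a hard disk can sustain.
# All camera and disk figures below are illustrative assumptions.

BYTES_PER_PIXEL = 1          # 8-bit monochrome (assumed)
WIDTH, HEIGHT = 2448, 2048   # ~5-megapixel sensor (assumed)
FPS = 100                    # frames per second (assumed)
NUM_CAMERAS = 2

frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL
stream_mb_s = NUM_CAMERAS * frame_bytes * FPS / 1e6

TEN_GBE_MB_S = 10e9 / 8 / 1e6   # ~1250 MB/s raw wire rate, before protocol overhead
HDD_WRITE_MB_S = 180            # typical sustained HDD write rate (assumed)

print(f"Aggregate stream: {stream_mb_s:7.0f} MB/s")   # ~1003 MB/s
print(f"10-GbE capacity:  {TEN_GBE_MB_S:7.0f} MB/s")  # 1250 MB/s
print(f"HDD sustained:    {HDD_WRITE_MB_S:7.0f} MB/s")
print("Disk keeps up?", stream_mb_s <= HDD_WRITE_MB_S)  # False

Run as written, the two-camera stream is roughly 1,000 MB/s: under the link's raw capacity, but five times or more what a single disk can sustain, which is why logging the full stream for offline analysis usually means an SSD array or decimating the data first.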
Beth, can you clarify your question? What kind of integration are you thinking of?
Regarding 10-Gbit Ethernet, I think that article and the comments attached to it cover those applications in quite some detail. In general, they are medical, military, and high-value quality-inspection applications in multi-camera systems, and any app that can take advantage of the extra speed.
Ann, one of the articles referenced talks about 10-gigabit Ethernet. The question is, do vision systems actually need that much bandwidth? Do you see support for 10-gigabit Ethernet in vision systems in the near future?
I would think that supporting general-purpose protocols is really a must for these new multi-camera vision systems to gain traction in all of the interesting applications you mentioned, Ann. With more and more cameras deployed on the factory floor or in medical applications, there has to be a way to integrate the plethora of images with mainstream systems in real time in order to truly leverage the capabilities and achieve any real benefit. Beyond bus interfaces, are there any other efforts going on to leverage standards and mainstream computing protocols to address this integration challenge?
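To make the integration question concrete, here is a minimal Python sketch of one mainstream-protocol approach: posting a vision system's per-part accept/reject decision to a plant or MES system as JSON over plain HTTP. The endpoint URL and record fields are hypothetical, and only Python's standard library is used.

# Minimal sketch: report a vision decision to a mainstream IT system
# as JSON over HTTP. Endpoint and record fields are hypothetical.
import json
import urllib.request
from datetime import datetime, timezone

def publish_decision(camera_id: str, part_id: str, accepted: bool) -> int:
    record = {
        "camera": camera_id,
        "part": part_id,
        "decision": "accept" if accepted else "reject",
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    req = urllib.request.Request(
        "http://mes.example.com/api/inspections",  # hypothetical endpoint
        data=json.dumps(record).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=2.0) as resp:
        return resp.status  # e.g., 200/201 if the record was accepted

# Example: camera 3 rejects part A-1042.
# publish_decision("cam-03", "A-1042", accepted=False)

Because the transport is ordinary HTTP and JSON, the same record can feed an MES, a database, or a dashboard without camera-vendor-specific middleware, which is exactly the kind of leverage general-purpose protocols buy.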
To give engineers a better idea of the range of resins and polymers available as alternatives to other materials, this Technology Roundup presents several articles on engineering plastics that can do the job.
The first photos made with a 3D-printed telescope are here and they're not as fuzzy as you might expect. A team from the University of Sheffield beat NASA to the goal. The photos of the Moon were made with a reflecting telescope that cost the research team £100 to make (about $161 US).
A tiny humanoid robot has safely piloted a small plane all the way from cold start through takeoff and landing to a full stop on the plane's designated runway. Yes, it happened in a pilot training simulation, but the research team isn't far from doing it in the real world.
Some in the US have welcomed 3D printing for boosting local economies and bringing some offshored manufacturing back onshore. Meanwhile, China is wielding its power of numbers, and its very different relationships between government, education, and industry, to kickstart a homegrown industry.
You can find out practically everything you need to know about engineering plastics as alternatives to other materials at the 2014 IAPD Plastics Expo. Admission is free for engineers, designers, specifiers, and OEMs, as well as students and faculty.
Focus on Fundamentals consists of 45-minute online classes that cover a host of technologies. You learn without leaving the comfort of your desk. All classes are taught by subject-matter experts and all are archived, so if you can't attend live, you can attend at your convenience.