"The EOS-1200 design is based on our vision that the market trend in embedded machine vision platforms is shifting towards small systems with ease of integration and more computing power," Jacky Lin, Adlink's vision product marketing manager, told us. The company plans to introduce an EOS platform based on the Power over Camera Link (PoCL) interface later this year.
Adlink has designed the EOS-1200 for high-end machine vision systems in applications such as design verification, quality control in factory automation, and inspection on automotive production lines, said Lin. That's because the needs of this end of the industry are growing rapidly. What's also growing is the need for small, powerful inspection systems in field maintenance operations to verify product defects before sending them back to a central depot for repair. "We're also seeing a lot of demand from the medical industry," he said. "For example, in the surgery room, doctors need a multiple camera setup, but a small footprint."
Thanks, William, for that description. That's the basic MV app in a nutshell, and a good succinct summary. And it applies, of course, to several different industries. Depending on the type of decision being made, you need a higher- or lower-resolution sensor, a color or B/W one, perhaps some visible-light and some NIR cameras, various lens types, and so on.
naperlou, I would suggest that in the embedded vision systems mentioned in the article, the main goal of the vision system is to collect lots and lots of data, spend some time analyzing the images with the intent of making a decision, and then dump the images after the decision has been made. Perhaps for archival purposes, a modest-resolution consumer video camera can be used in the loop for forensic logging. As the decisions become higher level (for example, reject/accept, routing, multi-sensor target recognition, or even autonomous navigation), the analysis algorithms would love to have as much time as possible to make a correct decision. High-speed networks, such as 10-Gbit Ethernet, deliver the data quickly so the algorithms can start their processing sooner. But if the system calls for logging all those images for offline analysis, I don't think the write heads can keep up.
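naperlou's point about write heads can be sanity-checked with rough numbers. The sketch below is a back-of-envelope calculation, not vendor data: the camera resolution, frame rate, and drive throughput figures are all illustrative assumptions chosen only to show the shape of the comparison.

```python
# Back-of-envelope check: can storage keep up with 10 GbE camera traffic?
# All figures are illustrative assumptions, not measured or vendor-quoted specs.

GBE10_BYTES_PER_SEC = 10e9 / 8           # 10 Gbit/s link is ~1.25 GB/s of raw payload

# Assumed camera: 5 MP, 8-bit monochrome, 60 frames/s
frame_bytes = 5_000_000 * 1              # ~5 MB per uncompressed frame
fps = 60
camera_rate = frame_bytes * fps          # ~300 MB/s per camera

HDD_WRITE = 200e6                        # assumed ~200 MB/s sustained spinning-disk write
SSD_WRITE = 2e9                          # assumed ~2 GB/s sustained NVMe SSD write

cameras_per_link = int(GBE10_BYTES_PER_SEC // camera_rate)
print(f"Cameras per 10 GbE link: {cameras_per_link}")
print("One camera vs HDD:", "OK" if camera_rate <= HDD_WRITE else "HDD cannot keep up")
print("One camera vs NVMe:", "OK" if camera_rate <= SSD_WRITE else "SSD cannot keep up")
```

Under these assumptions a single uncompressed camera stream already exceeds a conventional drive's sustained write rate, which is exactly the logging bottleneck described above, even though the 10 GbE link itself could carry several such streams.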
Beth, can you clarify your question? What kind of integration are you thinking of?
Regarding 10-Gbit Ethernet, I think that article and the comments attached to it cover those applications in quite some detail. In general, they are medical, military, and high-value quality-inspection applications in multi-camera systems, and any app that can take advantage of high speed.
Ann, one of the articles referenced talks about 10-gigabit Ethernet. The question is, do vision systems need that much bandwidth? Do you see support for 10-gigabit Ethernet in systems like these in the near future?
I would think that the trend of supporting general-purpose protocols is really a must in order for these new multi-camera vision systems to gain traction in all of the interesting applications you mentioned, Ann. With more and more cameras deployed on the factory floor or for medical applications, there's got to be a need to integrate the plethora of images with mainstream systems in real time in order to truly leverage the capabilities and achieve any kind of benefits. Beyond bus interfaces are there any other efforts going on to leverage standards and mainstream computing protocols to address this integration challenge?
Artificially created metamaterials are already appearing in niche applications like electronics, communications, and defense, says a new report from Lux Research. How quickly they become mainstream depends on cost-effective manufacturing methods, which will include additive manufacturing.
SpaceX has 3D printed and successfully hot-fired a SuperDraco engine chamber made of Inconel, a high-performance superalloy, using direct metal laser sintering (DMLS). The company's first 3D-printed rocket engine part, a main oxidizer valve body for the Falcon 9 rocket, launched in January and is now qualified on all Falcon 9 flights.
Lawrence Livermore National Laboratory and MIT have 3D-printed a new class of metamaterials that are exceptionally light yet exceptionally strong and stiff. The new metamaterials maintain a nearly constant stiffness per unit of mass density across three orders of magnitude of density.
Smart composites that let the material's structural health be monitored automatically and continuously are getting closer to reality. R&D partners in an EU-sponsored project have demonstrated what they say is the first complete, miniaturized, fiber-optic sensor system entirely embedded inside a fiber-reinforced composite.