A GigE Vision machine vision network can be upgraded to 10GigE speeds and still deploy Camera Link cameras using devices such as this iPORT CL-Ten transmitter, with two Camera Link ports and a 10GigE port.
Source: Pleora Technologies
It would seem that with all the emerging high-bandwidth applications in medical, military and other segments, 10-Gigabit Ethernet would be a natural upgrade path to get the higher performance machine vision infrastructure needs to keep up. What is the downside to going with 10-Gigabit Ethernet offerings? Higher price?
What you'll find in these types of systems is that if the video needs to be transmitted only a short distance, from maybe 1-2 cameras, directly to a PC and no further, then 10 GigE might not be the right technology, cost-wise.
But most high-value systems aren't like that: either they have more than half a dozen cameras (especially web inspection systems), they need to distribute imagery to multiple endpoints (for example, for distributed processing and analysis), the endpoints need to be far away from the inspection areas (especially in dirty environments like steel or textile inspection), or some combination of the above.
In any of those cases, 10 GigE can bring cost savings, especially once you subtract out the cost of frame grabbers and/or expensive cabling and repeaters.
Thanks for clarifying, John. So what you're saying is that for the bulk of applications, where multiple cameras need to distribute images to multiple endpoints, 10 GigE can make a big difference. For the fewer applications where everything is in close proximity, it could be overkill from a price standpoint.
Even in cases where you have only one of the two (multiple cameras or multi-point distribution), you can see significant cost savings. Camera Link Medium cameras are plentiful in the market, well understood, and come in a variety of performance classes. But they suffer from a costly interface: cables, repeaters, and frame grabbers aren't inexpensive. It's beneficial to convert the CL interface into something like 10 GigE.
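To put rough numbers on that conversion argument, here is a minimal sketch (in Python) comparing the peak throughput of the standard Camera Link configurations against a 10 GigE link. The 85 MHz pixel clock and 15% protocol overhead are nominal assumptions, not measured figures:

    # Back-of-the-envelope comparison of Camera Link configurations
    # against a 10 GigE link. Clock rate and overhead are assumptions.
    CL_MBPS = {            # peak throughput at an 85 MHz pixel clock, MB/s
        "Base":   255,     # 24 bits per clock
        "Medium": 510,     # 48 bits per clock
        "Full":   680,     # 64 bits per clock
    }

    TEN_GIGE_RAW_MBPS = 10_000 / 8       # 1250 MB/s raw line rate
    OVERHEAD = 0.15                      # assumed packet/protocol overhead
    usable = TEN_GIGE_RAW_MBPS * (1 - OVERHEAD)

    for config, mbps in CL_MBPS.items():
        verdict = "fits within" if mbps <= usable else "exceeds"
        print(f"Camera Link {config}: {mbps} MB/s {verdict} "
              f"~{usable:.0f} MB/s usable on 10 GigE")

Under those assumptions, even a Camera Link Full stream fits within a single 10 GigE link, which is what makes the interface conversion attractive.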
Camera manufacturers are starting to take a look at offering the same camera (same sensor, same electronics), but with a 10 GigE interface built right in.
But yes, if you have 1-2 cameras that need to be connected to a PC a pace away, then there are other alternatives. But medical, military, and high-value quality inspection applications don't tend to fit this mold.
Beth, a machine vision system with multiple cameras is served well by any speed of GigE backbone and its multipoint-to-multipoint capabilities. Whether you need to ratchet that up to 10 GigE depends on the nature of the data and/or the speed of the transfer.
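As a back-of-the-envelope check on "the nature of the data and/or the speed of the transfer," one can compute a camera's uncompressed data rate and compare it against what each link can realistically carry. The camera parameters and usable-throughput figures below are illustrative assumptions, not taken from any particular product:

    # Rough data-rate check for a hypothetical camera: is 1 GigE
    # enough, or does the system need 10 GigE?
    def camera_mbps(width, height, bits_per_pixel, fps):
        """Uncompressed video data rate in megabytes per second."""
        return width * height * bits_per_pixel / 8 * fps / 1e6

    GIGE_USABLE_MBPS = 115        # ~1 Gbit/s minus assumed overhead
    TEN_GIGE_USABLE_MBPS = 1050   # ~10 Gbit/s minus assumed overhead

    rate = camera_mbps(2048, 1088, bits_per_pixel=8, fps=120)
    print(f"Camera output: {rate:.0f} MB/s")
    print("Fits on 1 GigE: ", rate <= GIGE_USABLE_MBPS)
    print("Fits on 10 GigE:", rate <= TEN_GIGE_USABLE_MBPS)

For this hypothetical 2K camera at 120 frames/s, the output is roughly 267 MB/s, well past 1 GigE but comfortably within 10 GigE, and that is before multiplying by the number of cameras on the backbone.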
Looks like we are again bumping up against the limits of informatics. At some point it stops being a question of how fast and how much data can be transferred, and becomes a question of where we store it and how quickly we can analyze it. Advances in cognitive algorithms are concerned with "attention" -- noticing anomalies in a product inspection line or movement in a normally still scene. We have a great interplay between the development of hardware and software solutions. With GigE 2.0, it looks like it's software's turn to make a move.
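To make the "attention" idea concrete, here is a minimal frame-differencing sketch: flag a frame for analysis only when enough pixels change. The synthetic frames and thresholds are placeholders, not a production anomaly detector:

    # Minimal "attention" sketch: store/analyze a frame only when the
    # scene changes. Synthetic frames stand in for a real camera feed.
    import numpy as np

    def changed_fraction(prev, curr, pixel_delta=25):
        """Fraction of pixels whose intensity moved more than pixel_delta."""
        diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
        return np.count_nonzero(diff > pixel_delta) / diff.size

    rng = np.random.default_rng(0)
    prev = rng.integers(0, 256, (480, 640), dtype=np.uint8)
    curr = prev.copy()
    curr[200:240, 300:360] = 255   # simulate a bright local anomaly

    if changed_fraction(prev, curr) > 0.001:   # assumed trigger threshold
        print("Anomaly detected: flag frame for analysis/storage")
    else:
        print("Still scene: frame can be discarded")

The point of a filter like this is exactly what Bill describes: most frames never need to cross the network or hit storage at all.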
Good point, Bill, and I think the challenge here will be on manufacturing and automation engineers to work with their software counterparts to create what I'd call predictive diagnostic and QC systems, which can make use of that data (not just more data for data's sake, which can't be analyzed, like you say). The objective would be an almost artificial intelligence-like program or, more properly, software that over time builds up a database of patterns from which it can analyze and predict future outcomes, such as potential near-term failures as well as tweaks to improve or maintain production quality.
What factories do--or don't do--with all the data they collect via machine vision is a can of worms, according to what a lot of people told me off the record when I was covering the subject for T&MW. That data can help monitor processes and equipment, as well as product quality, and support the kinds of predictions Alex suggests, but it's often not utilized because of a lack of dollars, time and/or know-how for integrating it into a plant's ERP and QC systems.
Beth, good question. Many say there aren't really any downsides, and higher price is definitely not one of them. Ethernet's ubiquity throughout the enterprise means that most components--network interface cards, cables--are generally quite low cost. Some critics say that although GigE takes away the frame grabber (image capture card), it puts back in the NIC (network interface card). Although this is technically true, NICs cost a lot less than frame grabbers.

To date, the main concerns about using GigE as a backbone for real-time networks, such as those high-speed vision requires, have been CPU loading and latency, both potential sources of slowed data transfers. Enthusiasts say that CPU loading has been fixed with filter drivers, and latency has turned out not to be a problem in 1 Gbps GigE networks. Whether this will all translate well to an order-of-magnitude speed increase is not yet known.
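Rough arithmetic shows why CPU loading was a worry: packet rates, and with them interrupt rates, scale with link speed. The frame sizes below are standard Ethernet values; real NICs mitigate this with interrupt coalescing and offloading, so treat this only as a sketch of the problem's magnitude:

    # Packet rates at full line rate for standard vs. jumbo Ethernet
    # frames; the tenfold jump is the point, not the exact numbers.
    def packets_per_second(link_gbps, frame_bytes):
        return link_gbps * 1e9 / 8 / frame_bytes

    for gbps in (1, 10):
        std = packets_per_second(gbps, 1500)     # standard MTU
        jumbo = packets_per_second(gbps, 9000)   # jumbo frames
        print(f"{gbps:>2} Gbps: ~{std:,.0f} pkt/s standard, "
              f"~{jumbo:,.0f} pkt/s with jumbo frames")

At 10 Gbps and a standard MTU, a host would have to absorb on the order of 800,000 packets per second, which is why filter drivers, jumbo frames, and offload hardware matter more at 10 GigE than they did at 1 GigE.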
That's an observant question, Chuck. 3D machine vision requires multiple cameras and GigE is good at handling data from multiple cameras, so that would make a good fit. But I don't think GigE--over 1 Gbps or 10 Gbps backbones--is currently being positioned that way, although perhaps it should be.