A GigE Vision machine vision network can be upgraded to 10GigE speeds and still deploy Camera Link cameras using devices such as this iPORT CL-Ten transmitter, with two Camera Link ports and a 10GigE port.
Source: Pleora Technologies
It would seem that with all the emerging high-bandwidth applications in medical, military, and other segments, 10-Gigabit Ethernet would be a natural upgrade path to get the higher performance so the machine vision infrastructure can keep up. What is the downside to going with 10-Gigabit Ethernet offerings? Higher price?
What you'll find in these types of systems is that if the video from perhaps one or two cameras needs to travel only a short distance, directly to a PC and no further, then 10 GigE might not be the right technology cost-wise.
But most high-value systems aren't like that: they have more than half a dozen cameras (especially web inspection systems), they need to distribute imagery to multiple endpoints (for distributed processing and analysis, for example), the endpoints need to be far from the inspection areas (especially in dirty environments such as steel or textile inspection), or some combination of the above.
In any of those cases, 10 GigE can bring a cost savings, especially when you subtract out the cost of framegrabbers and/or expensive cabling and repeaters.
Thanks for clarifying, John. So what you're saying is that for the bulk of applications, where multiple cameras need to distribute images to multiple endpoints, 10GigE can make a big difference. For the fewer applications where cameras sit close to the PC, it could be overkill from a price standpoint.
Even in cases where you have only 1 of the 2 (multiple cameras or multi-point distribution), you can have a significant cost savings. Camera Link medium cameras are plentiful in the market, well-understood, and come in a variety of performance classes. But they suffer from a costly interface - cables, repeaters, and frame grabbers aren't inexpensive. It's beneficial to convert the CL interface into something like 10 GigE.
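The economics above follow from simple link arithmetic. As a rough illustration (using the commonly published Camera Link figures: 85 MHz pixel clock carrying 24, 48, or 64 bits per clock for Base, Medium, and Full; the 90% usable-utilization figure is an assumption, not from the discussion), one can check which configurations fit through a 1 GigE versus 10 GigE link:

```python
# Rough link-budget sketch: does a Camera Link camera's peak data rate
# fit through 1 GigE or 10 GigE? Nominal pixel clock is 85 MHz; the
# Base/Medium/Full configurations carry 24/48/64 bits per clock.
PIXEL_CLOCK_HZ = 85e6
CL_CONFIG_BITS = {"Base": 24, "Medium": 48, "Full": 64}

def fits(link_gbps: float, config: str, utilization: float = 0.9) -> bool:
    """True if the camera's peak rate fits the link, assuming only a
    `utilization` fraction is usable (protocol overhead eats the rest)."""
    camera_gbps = CL_CONFIG_BITS[config] * PIXEL_CLOCK_HZ / 1e9
    return camera_gbps <= link_gbps * utilization

for cfg in CL_CONFIG_BITS:
    print(cfg, "| 1 GigE:", fits(1.0, cfg), "| 10 GigE:", fits(10.0, cfg))
```

Even a Base-configuration camera (about 2 Gb/s peak) overruns a single 1 GigE link, while all three configurations fit within 10 GigE, which is why the CL-to-10 GigE conversion John describes is attractive.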
Camera manufacturers are starting to take a look at offering the same camera (same sensor, same electronics), but with a 10 GigE interface built right in.
But yes, if you have 1-2 cameras that need to be connected to a PC a pace away, then there are other alternatives. But medical, military, and high-value quality inspection applications don't tend to fit this mold.
Beth, good question. Many say there aren't really any downsides, and higher price is definitely not one of them. Ethernet's ubiquity throughout the enterprise means that most components, such as network interface cards and cables, are generally quite low cost. Some critics note that although GigE takes away the frame grabber (image capture card), it puts back in the NIC (network interface card). That is technically true, but NICs cost far less than frame grabbers.

To date, the main concerns about using GigE as a backbone for real-time networks such as high-speed vision have been CPU loading and latency, both potential sources of slowed data transfers. Enthusiasts say that CPU loading has been addressed with filters and drivers, and latency has turned out not to be a problem in 1 Gbps GigE networks. Whether this will all translate well to an order-of-magnitude speed increase is not yet known.
Beth, a machine vision system with multiple cameras is served well by any speed of GigE backbone and its multipoint-to-multipoint capabilities. Whether you need to ratchet that up to 10 GigE depends on the nature of the data and/or the speed of the transfer.
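Whether the data rate forces the move to 10 GigE comes down to a back-of-envelope throughput calculation. A minimal sketch (the camera resolution, bit depth, frame rate, and camera count below are hypothetical, chosen only to illustrate the arithmetic):

```python
# Back-of-envelope aggregate throughput for a multi-camera GigE backbone.
# All camera parameters here are illustrative, not from the article.
def camera_gbps(width: int, height: int, bits_per_pixel: int, fps: int) -> float:
    """Uncompressed video data rate of one camera, in gigabits per second."""
    return width * height * bits_per_pixel * fps / 1e9

# Example: six 1920x1080, 8-bit cameras at 60 fps sharing one backbone.
per_camera = camera_gbps(1920, 1080, 8, 60)   # roughly 1.0 Gb/s each
aggregate = 6 * per_camera                    # roughly 6.0 Gb/s total
print(f"per camera: {per_camera:.2f} Gb/s, aggregate: {aggregate:.2f} Gb/s")
```

In this hypothetical case a single camera already saturates a 1 GigE link, while all six fit comfortably within a 10 GigE backbone, which is the kind of calculation that decides whether to ratchet up.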
Looks like we are again bumping up against the limits of informatics. At some point it stops being a question of how fast and how much data can be transferred, and becomes a question of where we store it and how quickly we can analyze it. Advances in cognitive algorithms are concerned with "attention" -- noticing anomalies on a product inspection line or movement in a normally still scene. We have a great interplay between the development of hardware and software solutions. With GigE 2.0, it looks like it's software's turn to make a move.
Good point, Bill, and I think the challenge here will be on manufacturing and automation engineers to work with their software counterparts to create what I'd call predictive diagnostic and QC systems, which can make use of that data (not just more data for data's sake, which can't be analyzed, as you say). The objective would be an almost artificial intelligence-like program, or, more properly, software that over time builds up a database of patterns from which it can analyze and predict future outcomes, such as potential near-term failures as well as tweaks to improve or maintain production quality.
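The "attention" idea raised above, flagging movement in a normally still scene so that only anomalous frames consume analysis and storage, can be illustrated with simple frame differencing. This is a minimal sketch, not any panelist's actual method; the threshold values and frame sizes are invented for illustration:

```python
# Minimal "attention" sketch: flag frames whose pixel-wise change from a
# reference frame exceeds a threshold, so downstream storage and analysis
# are spent only on anomalous frames. Thresholds here are illustrative.
import numpy as np

def is_anomalous(frame, reference, diff_threshold=30, pixel_fraction=0.01):
    """True if more than `pixel_fraction` of pixels changed by more than
    `diff_threshold` gray levels relative to the reference frame."""
    changed = np.abs(frame.astype(np.int16)
                     - reference.astype(np.int16)) > diff_threshold
    return changed.mean() > pixel_fraction

reference = np.zeros((480, 640), dtype=np.uint8)   # a still, dark scene
still = reference.copy()                           # nothing has changed
moving = reference.copy()
moving[100:200, 100:200] = 255                     # a bright intruding object

print(is_anomalous(still, reference))    # False: no pixels changed
print(is_anomalous(moving, reference))   # True: ~3% of pixels changed
```

In a real inspection line the reference would be updated over time and the comparison done on camera frames, but the principle is the same: cheap screening first, expensive analysis only where attention is warranted.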