The USB3 Vision standard also represents a shift from the use of machine vision-specific buses like Camera Link to the practice of leveraging existing buses, such as USB, says Gross. "Machine vision is a very small market compared to many others. By leveraging an existing bus you can gain the resources that already exist for it, such as compatible hardware, cables, and connectors, instead of having to design it all up front."
USB3 Vision targets applications that don't need the long cables and networking capability of GigE Vision. It competes with devices made for Camera Link Base and Medium, but offers faster speeds and eliminates the need for a frame grabber card for each camera. "A frame grabber can give better performance and tighter synchronization in terms of I/Os and the more advanced onboard processing that you can do with FPGAs," says Gross.
The AIA can complete the standards review process for USB3 Vision faster than it has for others because so many lower-level functions about how to transfer data are already defined by USB 3.0. Examples include plug-and-play device discovery and how commands are sent to the device, according to Gross. In contrast, the GigE standard, based on a networking protocol, didn't pre-define those functions, which had to be built up for GigE Vision starting all the way down at the lower levels.
Gross says the AIA hopes to have a draft standard in early 2012, and to finalize it in time to announce the first products at Vision 2012.
It seems like the dizzying array of standards would present huge challenges to engineers building machine vision applications and systems: compatibility issues and a host of extra configuration work. Is this mix of standards holding back adoption of machine vision tools, or is there a pretty standard workaround?
Beth, it's going to make me pause long and hard while trying to decide which protocol to use. This is a sore point for me; anyone who adopts one of these early runs the risk of seeing their chosen protocol fall into disfavor.
Good point, Tim. It's ironic that on the one hand, the automation industry is attempting to get away from proprietary fieldbus protocols, which is leading to widespread adoption of Ethernet and new standards (e.g., for safety) being grafted on top of it. Yet here in the machine vision sector, we see forking and a battle among standards. The technical version of "can't we all just get along." More correctly, this is a typical market battle driven by advancing technology. USB3 is looking strong right now.
TJ and Alex, machine vision standards tend to be fairly long lived, at least in semiconductor "time", including the original Camera Link, now 11 years old. To date, none have fallen by the wayside. Also, none are single-sourced. It's true that some new standards have been promoted by a single company, but that's a normal process within any standards organization. And this seems to be happening less often. Several recent standards, not confined to machine vision, have been promoted by industry groups, as happened with WiFi, DisplayPort, and now USB3 Vision.
Another thing to keep in mind is that, in machine vision, standards govern the camera interface, and very little else. Where things can become confusing is on the customer end, not so much on either the vendor or system integrator end. The confusion has to do with sorting through what are now far more choices than there used to be, and deciding what's best for your particular app.
Beth, I'm all for standards... as long as they make my job easier. The great thing about standards is that in an ever-evolving technology environment they are of most benefit to the early adopters. Standardizing on VHS, ZIP disks, or RS-232 was a fantastic choice at the time -- and then we (predictably) evolved. Spending lots of time fretting about which standard will be around in 10 to 15 years really isn't relevant. In 1998 I was tasked with building a system that integrated six (6) scientific-grade 1-megapixel CCD cameras. The frame grabber boards could only support two (2) cameras each, so the solution was three (3) PCs, with one running as a master and the other two running as slaves. The fastest interface at the time was Ethernet, so after lots of low-level TCP code the system acted as a single unit. I would have killed for USB3. =]
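For the curious, the master/slave trigger scheme I described can be sketched in a few dozen lines. The ports, the one-word trigger protocol, and the localhost simulation of the three PCs are all hypothetical stand-ins; the real system did its frame grabbing behind the acknowledgement, not a dummy reply.

```python
# Sketch of a master PC triggering slave PCs over TCP, simulated with
# threads on localhost. Ports and the TRIGGER/ACK protocol are made up.
import socket
import threading
import time

HOST = "127.0.0.1"
PORTS = [50051, 50052]          # one listening port per "slave PC"

def slave(port, results, idx):
    """A slave PC: wait for the trigger, 'grab' frames, acknowledge."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, port))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            if conn.recv(16) == b"TRIGGER":
                # A real slave would read its two camera frames here.
                conn.sendall(b"ACK")
                results[idx] = True

def connect_with_retry(port):
    """Retry until the slave's server socket is listening."""
    for _ in range(100):
        try:
            return socket.create_connection((HOST, port), timeout=2)
        except ConnectionRefusedError:
            time.sleep(0.05)
    raise RuntimeError("slave never came up on port %d" % port)

def master():
    """The master PC: trigger every slave, count acknowledgements."""
    acks = 0
    for port in PORTS:
        with connect_with_retry(port) as s:
            s.sendall(b"TRIGGER")
            if s.recv(16) == b"ACK":
                acks += 1
    return acks

results = [False] * len(PORTS)
threads = [threading.Thread(target=slave, args=(p, results, i))
           for i, p in enumerate(PORTS)]
for t in threads:
    t.start()
acks = master()
for t in threads:
    t.join()
```

Sequential triggering like this is the simple version; the real system needed much tighter synchronization than a round-robin of TCP messages gives you.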
I think there's some misunderstanding about machine vision standards, and the article's title may be a bit misleading. While some companies decide to adopt only one on the factory floor (for instance GigE, because of its networking capabilities), the fact is that cameras compliant with more than one standard can often be mixed and matched in the same system or network (along with their frame grabbers, if they have one). So there aren't really any major incompatibility issues.
Also, these standards are not created equal. They run at different speeds, use different types of cabling, etc. Only some of them are "proprietary," such as Camera Link and Camera Link HS, in the sense that their protocols are not used outside machine vision. The big deal about GigE Vision and USB3 Vision is that they are based on open, non-vision standards. Each vision standard is a specification, some very long and detailed, some pretty short, and each one governs the camera interface, either to a PC, to a frame grabber, or to some other network device. Whether the spec is long and complicated or short and sweet depends on how much needs to be specified about things like data transfer conventions. In the case of USB3, these already exist, and so do the connectors, so very little is needed there.
What is wrong with CoaXPress? That is a good standard! USB3 is an unneeded standard that is destined to deliver inferior results due to inferior hardware. Besides that, the only ones destined to benefit from USB3 are those selling the hardware. The connector format is neither robust nor particularly reliable, and not suited for any application where a field repair may ever be needed.
I realize that promoting a standard that someone has a financial stake in is logical, but the real purpose of USB3 is to obsolete previous versions and make money for the sellers. It is not really in a position to fill any other real need.
So let us instead consider the other standards that meet the requirements and fill the need for robustness and reliability.
William, the one thing I can think of to justify a USB 3.0 standard for vision is the availability of the interface as a standard feature of many computers. There is still a need for the others because of the limited cable distance that USB supports.
@naperlou -- I wasn't favoring USB3 over any of the competing interfaces, I was just cheer-leading for ALL interfaces. My primary function while in industry was that of integration. Our team would specify a dozen or so different components from a dozen or so different OEMs and it was my function to cobble them together and integrate them into a single system through the creation of custom software. At least for our situation, there was no such thing as going to the catalog, receiving a shrink-wrapped component, having a field service engineer install it and then have it running by the afternoon. It was spend a day to figure out what interface to use, receive those boards and drivers, spend a week or two to write the low-level communication drivers to get it communicating and then moving on to integrating it into our single platform -- all while doing the same for the other dozen or so OEM devices in parallel. When new interfaces come out, such as USB3, the standard has performed much of the low-level driver stuff. Since we had to work with ones and zeros anyway, I'm thankful for any help in the process regardless of how in/complete the standard or universal its current adoption. -Bill =]
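A rough sketch of the integration pattern Bill describes: write a thin adapter per vendor so the platform codes against one interface, and the week or two of low-level driver work stays walled off inside each adapter. The class and method names here are hypothetical illustrations, not any real vendor's SDK.

```python
# Integrator-side abstraction: one common Camera interface, with each
# vendor's low-level driver wrapped in its own adapter class.
from abc import ABC, abstractmethod

class Camera(ABC):
    """The one interface the integration platform sees."""
    @abstractmethod
    def open(self) -> None: ...
    @abstractmethod
    def grab_frame(self) -> bytes: ...

class VendorXCamera(Camera):
    """Adapter wrapping one (hypothetical) vendor's driver calls."""
    def open(self) -> None:
        self._ready = True          # real code: SDK init, link negotiation
    def grab_frame(self) -> bytes:
        assert self._ready, "open() must be called first"
        return bytes(16)            # real code: DMA a frame from the board

def acquire_all(cameras):
    """Platform-level code stays vendor-agnostic."""
    frames = []
    for cam in cameras:
        cam.open()
        frames.append(cam.grab_frame())
    return frames

frames = acquire_all([VendorXCamera(), VendorXCamera()])
```

The point of a standard interface like USB3 Vision is that much of what goes inside those adapters comes pre-written, which is exactly the help Bill is thankful for.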
I'd like to clarify another point: no camera or frame grabber vendor has a financial stake in only one machine vision standard. That would be business suicide on their part, because volumes are so small here compared to electronics, for example. What's happened, though, is that suddenly, instead of only offering maybe two choices, they must now consider offering two or three times that number, depending on where they are positioned in the market. Again, because volumes are low, this can be quite a strain, especially on the smaller vendors. So adding products based on a new standard is not a decision taken lightly.
I did not think that USB3 was common on industrial grade computers, since they are not the first to follow fads such as USB3. In addition to the limited cable length there is the lack of a repairable or field serviceable connector. That alone would be plenty to disqualify it from my consideration. A pair of fatal flaws is a fairly good reason to look at other interconnect standards.
William, my understanding is that USB3 is about to become a lot more common on laptops of all kinds. According to In-Stat, USB of all speeds is the most successful interface ever in terms of the number of electronic units shipped with it. The transition to USB 3 has been slower than the transitions to its previous revs.
Apparently, what's been holding it back has been less immediate need for its high speeds and the fact that it wasn't yet integrated in Intel's core logic chipsets. But that's apparently about to change.
Ann, the challenge of USB3 is exactly its speed, and the resultant demand for a much more tightly controlled connection and transmission every bit of the way. The cables that were fine for USB 1 and USB 2 just won't work for USB3. To top it off, nobody needs that speed except for the marketing departments that are hoping we will all trash our present computers and buy new ones that include this "latest thing". The consumer electronics industry's goal has been to make each development obsolete in less than a year in order to create a market for the "next big thing". It has been this way for quite a few years, as acknowledged in many of the trade publications.
William, thanks for the description of how totally not plug-and-play machine vision system design is. That's pretty much what I've heard several times -- from vendors, about their customers. One of the biggest complaints I've heard over and over is the need to do exactly what you describe: write low-level driver software to get cameras and other components from multiple vendors to talk to each other. Nicely. And quickly. And without too many translation problems.