William, thanks for the description of how totally not plug-and-play machine vision system design is. That's pretty much what I've heard several times--from vendors, about their customers. One of the biggest complaints I've heard over and over is the need to do exactly what you describe--write low-level driver software to get cameras and other components from multiple vendors to talk to each other. Nicely. And quickly. And without too many translation problems.
Ann, the challenge of USB3 is exactly its speed, and the resulting demand for a much more tightly controlled connection and transmission every step of the way. The cables that were fine for USB 1 and USB 2 just won't work for USB3. To top it off, nobody needs that speed except for the marketing departments that are hoping we will all trash our present computers and buy new ones that include this "latest thing". The consumer electronics industry's goal has been to make each development obsolete in less than a year in order to create a market for the "next big thing". It has been this way for quite a few years, as acknowledged in many of the trade publications.
William, my understanding is that USB3 is about to become a lot more common on laptops of all kinds. According to In-Stat, USB of all speeds is the most successful interface ever in terms of the number of electronic units shipped with it. The transition to USB3 has been slower than the transitions to its previous revs.
Apparently, what's been holding it back has been less immediate need for its high speeds and the fact that it wasn't yet integrated in Intel's core logic chipsets. But that's apparently about to change.
I'd like to clarify another point: no camera or frame grabber vendor has a financial stake in only one machine vision standard. That would be business suicide on their part, because volumes are so small here compared to electronics, for example. What's happened, though, is that suddenly, instead of only offering maybe two choices, they must now consider offering two or three times that number, depending on where they are positioned in the market. Again, because volumes are low, this can be quite a strain, especially on the smaller vendors. So adding products based on a new standard is not a decision taken lightly.
@naperlou -- I wasn't favoring USB3 over any of the competing interfaces, I was just cheer-leading for ALL interfaces. My primary function while in industry was integration. Our team would specify a dozen or so different components from a dozen or so different OEMs, and it was my job to cobble them together and integrate them into a single system through custom software. At least for our situation, there was no such thing as going to the catalog, receiving a shrink-wrapped component, having a field service engineer install it, and having it running by the afternoon. It was: spend a day figuring out which interface to use, receive the boards and drivers, spend a week or two writing the low-level communication drivers to get the device talking, and then move on to integrating it into our single platform -- all while doing the same for the other dozen or so OEM devices in parallel. When new interfaces come out, such as USB3, the standard takes care of much of that low-level driver work. Since we had to work with ones and zeros anyway, I'm thankful for any help in the process, regardless of how in/complete the standard or how universal its current adoption. -Bill =]
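To make the integration problem concrete, here is a minimal sketch of the kind of vendor-neutral adapter layer an integrator might write by hand when no common interface standard exists. All names and methods here are invented for illustration; they do not correspond to any real vendor SDK. The point is that each OEM device needs its own adapter, and a standard like USB3 Vision supplies much of that adapter layer ready-made.

```python
from abc import ABC, abstractmethod

class Camera(ABC):
    """Hypothetical vendor-neutral camera interface the integration code targets."""
    @abstractmethod
    def open(self) -> None: ...
    @abstractmethod
    def grab_frame(self) -> bytes: ...
    @abstractmethod
    def close(self) -> None: ...

class VendorACamera(Camera):
    """Adapter wrapping an imaginary Vendor A SDK (stand-in calls only)."""
    def open(self) -> None:
        self._connected = True        # stand-in for a vendor connect() call
    def grab_frame(self) -> bytes:
        return b"\x00" * 16           # stand-in for a vendor read-buffer call
    def close(self) -> None:
        self._connected = False       # stand-in for a vendor disconnect() call

def acquire_one(cam: Camera) -> bytes:
    """System-level code sees only the common interface, never the vendor SDK."""
    cam.open()
    try:
        return cam.grab_frame()
    finally:
        cam.close()
```

With an agreed standard for the transport and control layer, each per-vendor adapter shrinks toward configuration rather than a week or two of driver writing.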
I did not think that USB3 was common on industrial-grade computers, since they are not the first to follow fads such as USB3. In addition to the limited cable length, there is the lack of a repairable or field-serviceable connector. That alone would be plenty to disqualify it from my consideration. A pair of fatal flaws is a fairly good reason to look at other interconnect standards.
William, the one thing I can think of to justify a USB 3.0 standard for vision is the availability of the interface as a standard feature of many computers. There is still a need for the others because of the limited cable distance that USB supports.
TJ and Alex, machine vision standards tend to be fairly long lived, at least in semiconductor "time", including the original Camera Link, now 11 years old. To date, none have fallen by the wayside. Also, none are single-sourced. It's true that some new standards have been promoted by a single company, but that's a normal process within any standards organization. And this seems to be happening less often. Several recent standards, not confined to machine vision, have been promoted by industry groups, as happened with WiFi, DisplayPort, and now USB3 Vision.
Another thing to keep in mind is that, in machine vision, standards govern the camera interface and very little else. Where things can become confusing is on the customer end, not so much on either the vendor or system integrator end. The confusion has to do with sorting out what are now many more choices than there used to be, and deciding what's best for your particular app.
I think there's some misunderstanding about machine vision standards, and the article's title may be a bit misleading. While some companies decide to adopt only one on the factory floor (for instance, GigE because of its networking capabilities), the fact is that cameras compliant with more than one standard can often be mixed and matched in the same system or network (along with their frame grabbers, if they have one). So there aren't really any major incompatibility issues.
Also, these standards are not created equal. They run at different speeds, use different types of cabling, and so on. Only some of them are "proprietary," such as Camera Link and Camera Link HS, in the sense that their protocols are not used outside machine vision. The big deal about GigE Vision and USB3 Vision is that they are based on open, non-vision standards. Each vision standard is a specification, some very long and detailed, some pretty short, and each one governs the camera interface, either to a PC, to a frame grabber, or to some other network device. Whether the spec is long and complicated or short and sweet depends on how much needs to be specified about things like data transfer conventions. In the case of USB3, these already exist, and so do the connectors, so very little is needed there.
Robots that walk have come a long way from simple barebones walking machines or pairs of legs without an upper body and head. Much of the research these days focuses on making more humanoid robots. But they are not all created equal.
The IEEE Computer Society has named the top 10 trends for 2014. You can expect the convergence of cloud computing and mobile devices, advances in health care data and devices, as well as privacy issues in social media to make the headlines. And 3D printing came out of nowhere to make a big splash.
For industrial control applications, or even a simple assembly line, that machine can go almost 24/7 without a break. But what happens when the task is a little more complex? That’s where the “smart” machine would come in. The smart machine is one that has some simple (or complex in some cases) processing capability to be able to adapt to changing conditions. Such machines are suited for a host of applications, including automotive, aerospace, defense, medical, computers and electronics, telecommunications, consumer goods, and so on. This discussion will examine what’s possible with smart machines, and what tradeoffs need to be made to implement such a solution.
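As a toy illustration of "simple processing capability to adapt to changing conditions," the sketch below shows a machine that backs off its feed rate as a temperature sensor approaches a limit. The function name, thresholds, and scaling are invented for illustration; a real controller would be far more involved.

```python
def adjust_feed_rate(current_rate: float, temperature_c: float,
                     max_temp_c: float = 80.0) -> float:
    """Scale the feed rate down as temperature approaches the limit.

    A "dumb" machine runs at current_rate regardless of conditions;
    a "smart" one slows down to avoid overheating and speeds back up
    when conditions improve.
    """
    if temperature_c >= max_temp_c:
        return 0.0  # hard stop at or above the limit
    headroom = (max_temp_c - temperature_c) / max_temp_c
    # Full speed while temperature is below half the limit,
    # then taper linearly to zero at the limit.
    return current_rate * min(1.0, headroom * 2)
```

For example, at 20 C (well below an 80 C limit) the machine runs at full rate; at 60 C it runs at half rate; at the limit it stops. The tradeoff the paragraph above alludes to shows up even here: the sensing, the processing, and the tuning of that taper all add cost and complexity over a fixed-rate machine.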