Once again, thanks, Sir Randy, for an excellent presentation... much appreciated! Another lesson learned, and a chance to get familiar with the new technology. Although that was a 2012 class, the devices are absolutely amazing!
Thanks again to DigiKey and Design News for bringing this continuing education to us PDH certificate earners...
Good lecture. I got very excited about vision technology back in the '80s for inspection systems and tried to get management interested enough to fund a project to explore it, but they could not see any use for this technology... their loss.
This fires the imagination. I did not know these devices existed. I would think the manufacturers offer development kits with some screening process for eligibility. These must be expensive in some cases.
Good overview and pointers to parameters, products and applications.
* "ID" sensing in this context seems to imply the capability to discern a particular shape/part/package, perhaps a label or barcode, versus, say, face recognition.
* You mentioned software; perhaps it will be touched on in terms of integrating sensors with industrial control such as PLCs, LabVIEW, or embedded applications in future classes in this session or in the upcoming Industrial Control classes...
Does anybody know of a vision sensor that can distinguish between a glass tube being filled with, say, sand and a glass tube that is full of sand?
There is (or was) a company in Everett, Washington that made a "Curtain of Light" that would tell you exactly how full the jar was. I worked there briefly but can't remember the name. It's similar to Mannesmann (Tally), but that was a printer company.
I guess the sensor method has a lot to do with "parts on hand" and "just good enough." I think vision is a great non-contact method, but sometimes things like guided-wave or free-space radar may be OK in larger applications.
Q. What general functional architecture is associated with smart sensor design?
Strictly speaking, smart sensor has definitions associated with it that IEEE experts developed (IEEE 1451). There are a lot of specific definitions. However, the term smart sensor is used by many companies and has many meanings associated with it. You really have to look at the data sheets. In more and more semiconductor suppliers' sensors, such as accelerometers, gyroscopes, and pressure sensors, the supplier uses the company's own MCU to provide the smart capability. As these sensors are designed into more complete sensor modules, the architecture can vary significantly.
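The sensing-element-plus-MCU split described above can be sketched in a few lines. This is a minimal, hypothetical illustration (the class names, scale factor, and threshold are assumptions, not any vendor's actual API): the raw counts are converted and thresholded on the sensor's own MCU, so the host sees only processed engineering units.

```python
class SensingElement:
    """Stands in for the analog front end (e.g., a MEMS accelerometer)."""
    def __init__(self, raw_counts):
        self._raw = raw_counts

    def read_raw(self):
        return self._raw


class SmartSensor:
    """MCU side: calibration, unit conversion, and event detection."""
    SCALE_G_PER_COUNT = 0.001   # assumed: 1 count = 1 mg
    ALARM_THRESHOLD_G = 2.0     # assumed shock-detect threshold

    def __init__(self, element):
        self._element = element

    def read_g(self):
        # Convert raw counts to engineering units on-sensor.
        return self._element.read_raw() * self.SCALE_G_PER_COUNT

    def poll(self):
        # Report a processed result, not raw data -- the "smart" part.
        g = self.read_g()
        return {"accel_g": g, "shock": g > self.ALARM_THRESHOLD_G}


sensor = SmartSensor(SensingElement(raw_counts=2500))
print(sensor.poll())   # {'accel_g': 2.5, 'shock': True}
```

In a real module the host-facing side would be a digital bus (I2C, SPI) or an IEEE 1451-style transducer interface rather than a Python object, but the division of labor is the same.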
If the fill stream completely blocks the tube so light cannot penetrate through, then you cannot use a through-beam sensor. Diffuse reflective with background suppression will have a similar issue if the stream is able to block the beam before the tube fills up. You are better off using a capacitive sensor mounted to the tube. Omron has them. Easy enough application for those sensors.
KevinJam: Silicon sensors go from UV into the micrometer (near-IR) range. Chips based on indium telluride or microbolometers extend to the far infrared, 10-50 micrometers. The optics cannot be glass. Check out FLIR or Wahl.
[Q] If a photoelectric sensor shows poor precision and accuracy in its output, what should I check or fix in the part of the system the sensor is attached to? According to your slide #3 yesterday and the website that followed, better accuracy would increase the cost, but I can't accept this; I'd like to change some circuitry or the system architecture, if possible. So I'd like to obtain better precision in the output even if the accuracy will not improve. What are your recommendations?
What kind of processor are you typically using with the vision systems you have developed? What is the least amount of processing you have used?
I have not personally developed a vision system, but I know that several of the well-known, leading semiconductor suppliers have a vision-sensing MCU in their portfolio, especially if they address automotive applications. If I remember right, they are frequently based on the ARM processor.
syakovac: I learned today that it would pay off to get a _module_ that reduces the vision problem as far as possible before your "backend" processing: you'd trigger the sensor, it would "do its thing," and it would report back "match" or "no match."
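That trigger/report pattern can be sketched as below. This is a hedged illustration under assumed names (`VisionModule`, `trigger`, the part IDs); real smart cameras expose the same idea over digital I/O, Ethernet/IP, or a serial protocol, not a Python call.

```python
class VisionModule:
    """Stands in for a smart vision sensor holding a stored reference pattern."""
    def __init__(self, reference_id):
        self._reference_id = reference_id

    def trigger(self, part_id):
        # The module "does its thing" (capture + inspect) internally
        # and reports only pass/fail back to the backend.
        return "match" if part_id == self._reference_id else "no match"


def inspect_parts(module, parts):
    # Backend logic stays trivial: it only acts on the reported result.
    return [(part, module.trigger(part)) for part in parts]


module = VisionModule(reference_id="PCB-REV-B")
print(inspect_parts(module, ["PCB-REV-B", "PCB-REV-A"]))
# [('PCB-REV-B', 'match'), ('PCB-REV-A', 'no match')]
```

The design win is that image processing stays inside the module, so the PLC or host never touches pixel data.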
Snandu, re illumination LEDs... These are usually IR at about 900 nm. The human eye cutoff is about 850 nm, so the output is invisible; however, typical camera chips made using silicon are responsive out past 1 micrometer. Your TV remote uses the same illumination diode, and consumer video cameras can see the digital signal from these LEDs, usually even with an IR-cut filter in place.
Q. What is the effect of the medium in the light path (refraction, dispersion, attenuation) on the effectiveness of the sensor?
With vision sensing, a great question. In non-industrial applications such as automotive, these can all play a limiting role in the effectiveness of the sensor. In fact, to overcome the limitations, two different sensing techniques are often used for vehicles that need to sense traffic for warning systems: vision plus radar or lidar are the choices for these applications. In industrial settings, I would expect that variations in the operating environment would have to be considered only under extreme circumstances; I have seen little about them causing a non-functioning system.
I didn't see many examples of low-cost machine vision. What do you like for low-cost complete solutions for robotic vision for navigation in full light, without relying on high-speed computation to "fix" the image? If you don't have good examples, how would you go about shopping for such a device? What do you look for specifically in terms of specifications and features?
Randy, when we go to greater fps rates combined with higher resolution, memory and CPU power tend to increase, perhaps quadratically. Can you summarize some typical examples of industrial application requirements as a function of static and dynamic resolutions?
If you want more info and to be kept up to date with everything new on the market, you may want to SUBSCRIBE and REGISTER here: http://www.globalspec.com/electronics/ , then select what you are interested in and you will receive all of it in your email.
I found this to be a good "zero effort" source of info.
I don't know what happened to the website today. After I restarted the browser today, having installed some updates, I went to the usual page and it took 20 minutes to get in. Initially it went to an applet with a pretty lady and a button, which didn't take me into the CEC calendar. Very difficult.
Does anyone know of a way to charge a series-connected stack of nine supercaps? I've looked at some supercap chargers, but they only handle two cells. Should I look for a Li-ion chip that does nine? Any comments would be appreciated.
The streaming audio player will appear on this web page when the show starts at 2pm eastern today. Note however that some companies block live audio streams. If when the show starts you don't hear any audio, try refreshing your browser.
The first Tacoma Narrows Bridge was a Washington State suspension bridge that opened in 1940 and spanned the Tacoma Narrows strait of Puget Sound between Tacoma and the Kitsap Peninsula. It opened to traffic on July 1, 1940, and dramatically collapsed into Puget Sound on November 7, just four months after it opened.
Noting that we now live in an era of “confusion and ill-conceived stuff,” Ammunition design studio founder Robert Brunner, speaking at Gigaom Roadmap, said that by adding connectivity to everything and its mother, we aren't necessarily doing ourselves any favors, with many ‘things’ just fine in their unconnected state.
Focus on Fundamentals consists of 45-minute on-line classes that cover a host of technologies. You learn without leaving the comfort of your desk. All classes are taught by subject-matter experts and all are archived. So if you can't attend live, attend at your convenience.