Widely used in manufacturing to ensure quality, vision sensors detect flaws, verify correct assembly or packaging, and sort parts. Self-contained vision sensors, also called smart cameras, automate visual inspections that are too complex for photoelectric sensors but don't require a costly multi-component vision system. Automating the inspection significantly increases both throughput and inspection quality.
Acquire, Analyze, Decide: The Sensor Does the Work
Engineers have designed ease of use into the new generation of vision sensors, so operators on the factory floor can set up and operate them. The operator uses the vision sensor to capture an image of the ideal object — such as a bottle with the correct, properly aligned label — and saves that reference image in the sensor's memory. When the operator runs the inspection, the sensor compares the reference image to the images it captures of objects — in this case, the labeled bottles. If the label is misaligned, is the wrong label, or is missing entirely, the sensor sends an output to trigger a user-determined response, such as diverting the bottle from the line.
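The teach-and-inspect cycle described above can be sketched in a few lines. This is a hypothetical illustration, not any vendor's API: the `teach` and `inspect` functions, the frame representation (lists of grayscale rows), and the pass/fail threshold are all invented here for clarity.

```python
# Hypothetical sketch of the acquire/analyze/decide cycle.
# Frames are modeled as lists of rows of grayscale pixel values (0-255);
# on a real sensor, the camera supplies these images.

def teach(reference_frame):
    """Store the 'golden' reference image of a known-good part."""
    return [row[:] for row in reference_frame]

def inspect(reference, frame, max_mean_diff=10.0):
    """Compare a captured frame to the reference; True means the part passes."""
    total = count = 0
    for ref_row, row in zip(reference, frame):
        for r, p in zip(ref_row, row):
            total += abs(p - r)
            count += 1
    return total / count <= max_mean_diff

# A frame identical to the reference passes; one with a large dark
# region (e.g. a missing label) fails and can trigger an output
# to divert the bottle from the line.
ref = teach([[128] * 64 for _ in range(48)])
good = [[128] * 64 for _ in range(48)]
bad = [row[:] for row in good]
for y in range(10, 30):
    for x in range(20, 50):
        bad[y][x] = 0
print(inspect(ref, good))  # True
print(inspect(ref, bad))   # False
```

A real sensor performs this comparison with far more sophisticated tools than a mean pixel difference, but the acquire-analyze-decide flow is the same.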
Vision Tools
Vision sensors use digital technology and complex algorithms to analyze captured images. While the lowest-cost sensors might use a single algorithm, versatile, full-featured sensors can offer a dozen tools or more. The application determines whether a sensor needs one or more tools. Tools fall into two fundamental groups: linear and area. Both look for a transition in the image, but in different ways.
An area tool is most useful when the location of the target could vary, such as a carton of mustard bottles that could be missing one or more units anywhere in the box. An area tool examines the entire box for any deviation from the norm.
Linear tools are the best choice when the area of interest is predictable, because they are faster and more precise than area tools. For example, a vision sensor could use a linear tool such as the Edge tool to make sure that vials rushing past on an assembly line all have their lids tightly sealed.
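The contrast between the two tool families can be illustrated with a short sketch. This is not any vendor's API; the function names (`edge_tool`, `area_tool`) and thresholds are invented here, but the logic mirrors the description above: a linear tool scans a single line of pixels for a transition, while an area tool checks every pixel in a region.

```python
# Illustrative sketch (not a real sensor API) of the two tool families.
# A frame is a list of rows of grayscale pixel values (0-255).

def edge_tool(frame, row, threshold=50):
    """Linear tool: scan one pixel row and return the column of the
    first sharp intensity transition, or None if no edge is found."""
    line = frame[row]
    for x in range(1, len(line)):
        if abs(line[x] - line[x - 1]) >= threshold:
            return x
    return None

def area_tool(frame, reference, pixel_tol=30, max_bad_pixels=0):
    """Area tool: compare every pixel in the region against the reference
    and flag the part if too many deviate (e.g. a missing bottle)."""
    bad = sum(
        1
        for ref_row, row in zip(reference, frame)
        for r, p in zip(ref_row, row)
        if abs(p - r) > pixel_tol
    )
    return bad <= max_bad_pixels

# The edge tool touches only one row, so it is fast and pinpoints
# position -- e.g. verifying a vial lid sits at the expected height.
frame = [[0] * 10 + [255] * 10 for _ in range(5)]
print(edge_tool(frame, row=2))  # 10 (column of the dark-to-light edge)
```

The sketch also shows why linear tools are faster: the edge tool examines one row of pixels, while the area tool must visit every pixel in the region.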
Three key components determine the success of any vision sensing application:
Lighting. Without question, the most important technical factors in vision sensing, the choice of lighting (LED, fluorescent, halogen, color, etc.) and its location (backlight, area light, ring light, etc.) are crucial for creating the greatest possible contrast between the target and the background.
Lens choice. The quality of the lens affects the quality of the image. Using the wrong lens can increase processing time, decrease system performance, and require costly refixturing to compensate for the incorrect lens.
Resolution. The standard resolution varies with the sensor, but 640 x 480 is typical. High-resolution 1.3-megapixel sensors (1280 x 1024 pixels) are available. Because increased resolution increases processing time, use the lowest resolution the application requires.
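The resolution trade-off comes down to pixel count: analysis cost scales roughly with the number of pixels the sensor must process per image (an approximation; actual timing depends on the tools in use). A quick back-of-the-envelope check shows the jump from standard to high resolution:

```python
# Why resolution drives processing time: pixel counts per image.
standard = 640 * 480    # 307,200 pixels at standard resolution
high_res = 1280 * 1024  # 1,310,720 pixels on a 1.3-megapixel sensor

ratio = high_res / standard
print(standard, high_res)
print(round(ratio, 2))  # roughly 4.27x more pixels to process per image
```

So stepping up to the 1.3-megapixel sensor more than quadruples the data to analyze per image, which is why the guideline is to use the lowest resolution that resolves the features being inspected.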
For other useful vision sensing tools, check out:
iKNOW® Vision uses clear descriptions and full-color pictures to explain basic and advanced concepts about vision sensing.