Part 5: PCB Fabrication: Gerber Files & Ordering PCB
Continuing Education Center 9/28/2012 217 comments Today is the clincher: we'll take our layout and generate the manufacturing files, commonly called Gerber files, as well as the Bill of Materials (BOM). Then we'll use a Gerber viewer to review the files, and we'll go over how to specify the board and get a quote from a PCB manufacturer.
Part 4: Finishing the Layout: Finishing Touches & Design Rule Check
Continuing Education Center 9/27/2012 207 comments To finish the layout, we'll need to add some vias for soldering on the wires. We'll also add some text to the board and fill in some extra copper to help with heat dissipation. Then we'll cover some power-supply-oriented design rules and show how to set up the program to do a Design Rule Check.
Part 3: Using PCB Layout Software: Custom Component Libraries
Continuing Education Center 9/26/2012 317 comments If we need a component that is not in the provided libraries and not found in online user groups, we can always create our own. We'll go through a few examples, using as much copy-and-paste as possible to create custom symbols and footprints for our components. We'll also cover common footprint design guidelines, such as pad size and spacing tolerances, and use the Eagle layout editor to create custom library components to fit our needs.
Part 2: Using PCB Layout Software: Schematic Capture & Component Libraries
Continuing Education Center 9/25/2012 238 comments This second class will start with an overview of circuit board layout software. Continuing on with our LED driver, we'll be using freeware to capture the schematic of our design using built-in library components. Then we'll see the real power of the software as the captured circuit is translated into PCB footprints. This allows us to lay down copper traces on the board.
Part 1: Driver Design & Component Selection
Continuing Education Center 9/24/2012 351 comments We start off this series with an overview of the prototyping process, from circuit design to schematic capture to circuit board layout. We'll be using the example of an LED driver design based on the HV9910 that we discussed in "Advanced LEDs & Displays" in May. In this first class, we'll start with a quick review of the circuit, then go over the Bill of Materials and begin the component selection process using the Digi-Key website.
Day 5: More Algorithms and More on Using OpenCV
Continuing Education Center 9/14/2012 208 comments Here, we present more complex embedded vision algorithm examples, including face detection and object tracking. As in the preceding session, we explain how these algorithms work, through demonstrations built with OpenCV. We also illustrate a quick and easy way to set up your own vision algorithm development environment using OpenCV. Finally, we provide pointers to additional resources for learning about embedded vision.
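The object-tracking idea mentioned above can be reduced to a toy example. The sketch below is not one of the course's OpenCV demos; it's a hypothetical pure-Python illustration of the simplest possible tracker: locate an object's centroid in each binary frame mask and follow it from frame to frame. Production trackers (e.g., OpenCV's built-in tracking algorithms) are far more robust.

```python
# Toy single-object tracker: find the centroid of the "on" pixels in
# each binary frame mask and report it per frame. Illustrative only;
# real embedded-vision trackers handle noise, occlusion, and multiple
# objects, typically via a library such as OpenCV.

def centroid(mask):
    """(row, col) centroid of all nonzero pixels, or None if the mask is empty."""
    pts = [(r, c) for r, row in enumerate(mask)
                  for c, v in enumerate(row) if v]
    if not pts:
        return None
    n = len(pts)
    return (sum(r for r, _ in pts) / n, sum(c for _, c in pts) / n)

def track(masks):
    """Return the object centroid for each successive frame mask."""
    return [centroid(m) for m in masks]

# A 1-pixel "object" moving one column to the right each frame.
frames = [
    [[0, 0, 0], [1, 0, 0], [0, 0, 0]],
    [[0, 0, 0], [0, 1, 0], [0, 0, 0]],
    [[0, 0, 0], [0, 0, 1], [0, 0, 0]],
]
print(track(frames))  # [(1.0, 0.0), (1.0, 1.0), (1.0, 2.0)]
```

The same loop structure carries over to a real pipeline: replace the hand-built masks with thresholded camera frames, and replace `centroid` with a detector from a vision library.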
Day 4: Introduction to Vision Algorithms and Some Free Tools
Continuing Education Center 9/13/2012 253 comments At the heart of embedded vision are algorithms. These include algorithms for improving captured images, identifying features of interest, inferring the presence of objects, and reasoning about objects and motion. In this class, we introduce some fundamental algorithms, such as motion and line detection. We explain how these algorithms work, and illustrate them with demos (which are available for download). We also introduce OpenCV, which is a free, open source vision software library.
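To give a flavor of the motion-detection algorithms introduced here, below is a hypothetical pure-Python sketch of frame differencing, one of the simplest motion detectors: subtract consecutive grayscale frames and flag pixels that changed by more than a threshold. A real implementation would use OpenCV (`cv2.absdiff` plus `cv2.threshold`) on live camera frames; this version uses nested lists so it runs anywhere.

```python
# Minimal frame-differencing motion detector. Frames are same-sized
# 2D lists of grayscale values (0-255). Illustrative sketch only.

def detect_motion(prev_frame, curr_frame, threshold=25):
    """Binary mask marking pixels that changed by more than `threshold`
    grey levels between two frames."""
    return [[1 if abs(c - p) > threshold else 0
             for p, c in zip(prev_row, curr_row)]
            for prev_row, curr_row in zip(prev_frame, curr_frame)]

def motion_detected(mask, min_pixels=1):
    """True if at least `min_pixels` pixels changed."""
    return sum(map(sum, mask)) >= min_pixels

# Two tiny 3x3 "frames": one pixel brightens sharply between them.
frame_a = [[10, 10, 10], [10, 10, 10], [10, 10, 10]]
frame_b = [[10, 10, 10], [10, 200, 10], [10, 10, 10]]

mask = detect_motion(frame_a, frame_b)
print(motion_detected(mask))  # True: the centre pixel changed by 190
```

The threshold is the key tuning knob: too low and sensor noise triggers false alarms, too high and slow or low-contrast motion is missed.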
Day 3: Processor Choices for Embedded Vision
Continuing Education Center 9/12/2012 267 comments Embedded vision applications typically make heavy demands on processors, not just in terms of processing performance but also memory, I/O, and real-time behavior. In this class, we explore the processor requirements of embedded vision applications in quantitative and qualitative terms. We then discuss the six main types of processor used in embedded vision applications, highlighting their key strengths and weaknesses and how they are evolving over time.
Day 2: Fundamentals of Image Sensors for Embedded Vision
Continuing Education Center 9/11/2012 449 comments Image sensors are the "eyes" of embedded vision systems, and their characteristics largely determine the capabilities of the systems built around them. In this session, we introduce the most common types of 2D and 3D sensors used in embedded vision applications and explore their strengths and weaknesses. We also highlight recent developments in sensor technology.
Day 1: Introduction to Embedded Vision
Continuing Education Center 9/10/2012 381 comments In this course we introduce embedded vision, the incorporation of computer vision techniques into embedded systems. Via case studies, we explore the functionality that systems can gain with embedded vision and provide a taste of typical vision algorithms. We also discuss technology trends that are enabling embedded vision to be used in cost-, energy-, and size-limited applications, and we highlight challenges that must be addressed in integrating embedded vision capabilities into systems.
Focus on Fundamentals consists of 45-minute on-line classes that cover a host of technologies. You learn without leaving the comfort of your desk. All classes are taught by subject-matter experts and all are archived. So if you can't attend live, attend at your convenience.