Hey folks. Sorry to be so late, but I needed to spend the afternoon at one of our local TV stations. After that, it was taking care of domestic matters and getting the day's calendar cleared so I could have some peace to focus on our forum. But at least things are archived, so I can avail myself of this added knowledge. So, let's get down to it.
Alex's comment about not knowing what's under the hood resonates with my feeling. I learned at a time when a single person could understand the relatively simple h/w, OS or monitor, and the applications. That ability is going away with current systems.
@EdB_Vt - I don't know how/why they carried the *Jet to LaserJet. It has definitely been part of HP's printer and tools naming: InkJet, ThinkJet, QuietJet, OfficeJet, JetSend, JetAdmin. Several years ago a competitor tried to sell a PhaserJet printer, but HP lawyers managed to squash that.
Planning for the future comes from experience, and that experience comes from maintaining existing systems. One quickly learns the good practices that save time, and learns that just a little better design, one that allowed for future changes, saves a ton of time, effort, and money.
In guessing the future, if you HAVE an 8-bit register and are only using 4 bits, you CAN kinda "predict" those spare bits will come in handy SOMEtime. So attach pads/through-holes for rework ECOs (avoid someone having to solder directly to chip leads, at the very least). Same thing with a 10x4 PLA.
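To make the spare-bits idea concrete on the firmware side, here's a minimal C sketch; every register and bit name here is made up for illustration. The point is that the unused bits get named and masked off today, so a future revision can claim them without surprising older code.

```c
#include <stdint.h>

/* Hypothetical 8-bit control register where only 4 bits are
 * assigned today. Naming the spare bits as RESERVED (and forcing
 * them to zero on every write) means a future revision can claim
 * them without breaking older firmware. */
#define CTRL_ENABLE    (1u << 0)
#define CTRL_FAST_MODE (1u << 1)
#define CTRL_IRQ_EN    (1u << 2)
#define CTRL_SELFTEST  (1u << 3)
#define CTRL_RESERVED  0xF0u    /* bits 4-7: spare for future use */

static uint8_t ctrl_pack(uint8_t bits)
{
    /* Write reserved bits as zero so their future meaning is
     * predictable on hardware that starts using them. */
    return (uint8_t)(bits & ~CTRL_RESERVED);
}
```

Same philosophy as the rework pads: the spare capacity is there, labeled, and cheap to use later.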
Several have commented about not being able to guess the future. No, we cannot guess everything perfectly. But how many times have you guessed right? Though not all my future guesses have been correct, many of them were.
Heh -- you guys got the degree FIRST, THEN started your careers. I did it the other way around, so when I got into those college courses (relatively recently; STARTING with 1.5 years at MIT did help, though), I had a MUCH better feel for "good point" versus "yeah, right" re stuff some instructors were pushing.
@jl - Good observation. My planning ahead doesn't mean implementing the future now, but leaving room so that I can implement it later. We did not hook all 8 of those positions to I/O pins, just 5 of them. But when we needed the 6th, we added it in the register where the space was reserved.
Current college courses DO cover a large amount of this stuff -- unfortunately, it isn't the kind of thing that makes a lot of sense until you've had some time "in harness" and seen a few of the many ways that people can do design WRONG!
The University of Illinois had a Computer Engineering degree in the late 70s -- basically an Electrical Engineering degree with software and computer architecture courses. My graduating class had 33 students in it for that degree.
Right: troubleshooting with design experience puts you, down the road, in a good position as a verifier of ANY documentation... and of course your responsibility to the new generation is to become TEACHERS, as you guys are. Congrats on an excellent job.
The pendulum swings -- it USED to be just "EE"; now there's CS and EE, and those are soon to merge again. At this point, with totally configurable programmable DRAM-type processor cores, and an ASIC where you can program up to four cores, the line between Hard- and Software is fuzzier than ever.
Gary, great week of lectures. I think the most critical point is to use standards. This allows easy collaboration with teams, better-defined projects, and the ability to re-use code and do it right the first time. Thanks for the great points.
A comprehensive survey of design-planning tools might be a valuable resource, if not a whole week of discussion. It kind of calls for a different level of interaction, though -- not clear whether "Chat" may be part of a bottleneck...
I agree, flaredOne. I have worked at several startups, and wearing many hats is something that is valued to some degree. But sometimes (not always) they don't realize how much you are covering till you are gone.
Just checking -- you familiar with programmable logic devices which are like DRAM?? Every time you power them up, you have to load them with their configuration, their logic tables, etc. I mentioned this yesterday, having encountered a situation where there was a huge multi-function ASIC AND an adjunct "boot PROM". Point: make the load-space selectable, so you can access variations and (perhaps) diagnostic versions of the ASIC without having to change out any hardware.
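A tiny C sketch of that selectable load-space idea, with invented addresses; the only interesting part is the selection itself, so it's shown as a pure function, with the actual jumper read and image shift-out assumed to happen elsewhere.

```c
#include <stdint.h>

/* Base addresses of two configuration images in the boot PROM.
 * Addresses are made up for illustration; in practice the jumper
 * state would come from reading a GPIO pin at power-up. */
#define IMG_NORMAL_BASE 0x00000u
#define IMG_DIAG_BASE   0x10000u

static uint32_t select_image(int diag_jumper_installed)
{
    /* Jumper in -> diagnostic build; jumper out -> normal build.
     * Either image is shifted into the SRAM-based logic device at
     * power-up, no hardware swap required. */
    return diag_jumper_installed ? IMG_DIAG_BASE : IMG_NORMAL_BASE;
}
```

One jumper buys you a whole second "personality" for the ASIC without touching the board.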
If you HAVE the option to select "slightly larger than actually needed Right Now", go for it. Excess pins, among other things, simplify layouts (more physical-path choices). Key point: if you HAVE spare pins, attach a trace which runs off to a labeled through-hole (someplace you CAN attach a jumper lead). Priceless.
Re the finance/savings question (which I missed -- just got back): I recently finished up my BSEE (non-traditional, after "a few years" working) -- was amused by a SIGNIFICANT amount of time in an engineering-specific class spent on investment counseling, "The Power of Compound Interest," etc. Have started some trusts for the boys.
What I am hearing from Gary's presentation: planning ahead is not necessarily putting in future functionality, but rather leaving room for expansion in register usage, memory, type definitions, etc.
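On the "type definitions" part of that, here's one hedged example of what leaving room can look like in C; the record layout and field names are entirely invented. A version byte plus explicit reserved words means fields can be added later without reshuffling the layout.

```c
#include <stddef.h>
#include <stdint.h>

/* Hypothetical data record that "leaves room": a version byte and
 * explicit reserved words keep the layout stable when new fields
 * are added later, so old and new code can still share it. */
typedef struct {
    uint8_t  version;     /* bump when a reserved field is claimed */
    uint8_t  flags;       /* a few bits used now, the rest spare   */
    uint16_t sensor_id;
    uint32_t reading;
    uint32_t reserved[2]; /* future expansion; always sent as zero */
} record_v1;
```

Same instinct as the spare I/O positions: the cost of the reserved space now is tiny compared to reworking every consumer of the record later.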
Planning ahead is always a good idea, but the benefit is only as good as your ability to predict the future. If your guess is correct, planning ahead pays off. If you are wrong, there's probably no gain.
I looked at the Leap product and it looks interesting. I thought about signing up as a developer but I don't have the time to take on anything additional right now. I'm getting one of the devices to try out, but having to wait until the end of the year or longer is a bit of a "bummer."
Hi everyone. Another Friday and another section completed. I see Alex has added one more course to the list. I wonder if there will be more to follow on this track or if Digi-Key or someone else will carry this forward to the second 180 days of the year.
Anyone check out that LEAP product re gestural input device with 0.01mm precision in 3D space? There is an opportunity to MAYBE sign up as a developer and get a free device and access to the SDK. Could be a "get in on the ground floor" opportunity. Just my opinion, but I predict this device is going to raise the standard of "Elegant" in graphical user interfaces and virtual functionality -- the device is reasonably priced for market penetration, could influence a "bump" in heads-up display glasses, and might actually 'rescue' a bit of the market for 3-D monitors.
Let's see: anticipating the "usual" questions -- I am predominantly hardware: project, proto, product; with more than enough software to make the hardware walk, talk, sing, dance, see, hear, communicate, be produced and tested with embedded diagnostics -- AND (generally) to be profitable and supportable. Currently seeking gainful employment AND contemplating and researching a wide range of possibly lucrative projects. Last time I checked, it's in the mid 80s outside (again), on its way to the mid 90s.
Good afternoon to all. It appears Chat is STILL losing posts (THIS is my third try for THIS content) so I'm covering my bases: selecting and copying all this text, then pasting it in a text buffer off to one side, just in case it gets lost AGAIN.
Was just reading through some stuff about Microsoft STARTING to "embrace" open-source. I was struck by just how much of the conversation was about the evolving semi-formal knowledge base which revolves around Best Practices for "collaborating" on making use of knowledge bases, i.e., those misc shared libraries of functions, etc which are arguably among THE most useful contents in most software developers' "tool kits".
Gotta love the self-referential stuff, AND the synchronicity of stumbling over that while contemplating "Best Practices" and "Principles" and how best to implement those in a larger-than-just-one-person design environment.
The streaming audio player will appear on this web page when the show starts at 2pm eastern today. Note however that some companies block live audio streams. If when the show starts you don't hear any audio, try refreshing your browser.
Using Siemens NX software, a team of engineering students from the University of Michigan built an electric vehicle and raced in the 2013 Bridgestone World Solar Challenge. One of those students blogged for Design News throughout the race.
Robots that walk have come a long way from simple barebones walking machines or pairs of legs without an upper body and head. Much of the research these days focuses on making more humanoid robots. But they are not all created equal.
The IEEE Computer Society has named the top 10 trends for 2014. You can expect the convergence of cloud computing and mobile devices, advances in health care data and devices, as well as privacy issues in social media to make the headlines. And 3D printing came out of nowhere to make a big splash.
For industrial control applications, or even a simple assembly line, that machine can go almost 24/7 without a break. But what happens when the task is a little more complex? That’s where the “smart” machine would come in. The smart machine is one that has some simple (or complex in some cases) processing capability to be able to adapt to changing conditions. Such machines are suited for a host of applications, including automotive, aerospace, defense, medical, computers and electronics, telecommunications, consumer goods, and so on. This discussion will examine what’s possible with smart machines, and what tradeoffs need to be made to implement such a solution.
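As a minimal sketch of the "adapt to changing conditions" idea, consider a controller that picks an operating mode from sensed inputs rather than running one fixed program; the mode names, inputs, and thresholds below are all invented for illustration.

```c
/* A "smart" machine in miniature: operating mode is chosen from
 * current conditions (load, faults) instead of being hard-wired.
 * Names and thresholds are hypothetical. */
typedef enum { RUN_FULL_SPEED, RUN_REDUCED, RUN_HALT } run_mode;

static run_mode pick_mode(int load_pct, int fault_detected)
{
    if (fault_detected) return RUN_HALT;    /* fail safe first  */
    if (load_pct > 80)  return RUN_REDUCED; /* back off on load */
    return RUN_FULL_SPEED;                  /* normal operation */
}
```

The tradeoff discussed above lives exactly here: each added condition buys adaptability at the cost of more processing and more states to validate.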