Programming has not only given us smartphones, tablets, and extremely useful software; it has also changed the way electrical and computer engineers design circuits. For example, an LED effect such as a blinking light can be built fairly simply with a 555 timer circuit. Using a cheap MCU, however, the circuitry is even less complex, and a few lines of code achieve the same effect. Far more elaborate lighting effects are possible with just a few more lines of code.
The Intel 8008 was the world's first general-purpose 8-bit microprocessor. After its introduction, MCUs quickly began replacing discrete logic; thanks to their small size, ease of use, and programmability, engineers welcomed them into the field. In the 1980s, however, memory was scarce and, more importantly, expensive. In January of 1983, memory cost approximately $2,396 per megabyte, whereas today it costs approximately half a penny per megabyte, a drop of nearly 500,000-fold. Consequently, the cost of memory put a limit on the applications MCUs could be used for. Engineers had to program with caution, keeping their code as small as possible so it did not surpass the memory limit.
Bjarne Stroustrup, creator of C++. (Source: Wiki)
Today, MCUs are cheap and memory is abundant. MCUs have been integrated into products we use every day, such as displays, printers, keyboards, phones, washing machines, microwaves, and, most importantly, cars. A new luxury car today can contain hundreds of MCUs.
With all the great development in software and MCUs up to this point, it is imperative that engineers build a strong foundation in programming. Programming not only gives people deep insight into how computers work; it also opens up a large array of new uses for the computer. Programming is the powerful force driving modern technology, and we still have a long road to drive.
Other notable happenings in 1982:
Sun Microsystems was incorporated.
Adobe was founded; its PostScript page description language would later power the Apple LaserWriter.
Symantec was founded, mostly selling security and information management software.
Hercules, maker of high-resolution graphics cards, was founded.
Maxtor was founded, now absorbed into Seagate.
The Commodore 64, with 64 kilobytes of RAM, went on sale at $595 and became the best-selling single computer model of all time.
Epson's HX-20, the first notebook-sized portable computer, debuted.
I cut my teeth on an Apple II! I learned programming in assembly language on a Z80 processor, then quickly moved to Fortran and C. I have been out of electronics and computer programming for over 20 years and am just amazed at what people can now program. Gone are the days of trying to fit code into 2K of memory!
With my youngest starting college, I am once again getting reacclimated with C++ and am amazed at how far this programming language has been extended. Then I discovered Visual C++. WOW! Even an old-timer can get into the new stuff!
Very nice historical synopsis on coding! I always appreciated having to learn assembly as it gave me an understanding of what was really happening at the bit level, but I also deeply appreciate the evolution of higher level programming languages to get the job done. I think it is important for an engineer to learn low level programming so that they have low level control when needed, but can also use higher level languages to implement solutions that meet customer and project criteria and compatibility requirements.
Remember the Intel MDS80, the development system for the MCS-48 series processors? If I remember right, the system had two 8" floppy drives, and that's the machine we used to write the 1K of code we squeezed into an 8048. At one point the place I worked at had lost the ability to move the source code out of the MDS80 and its 8" drives, so I once had to type all the code into my new flagship machine, a '286 PC clone. It was a huge improvement, though, because I could assemble (not compile, this was assembly) in as little as five minutes. Yup, just type into the command line on the PC, and walk down the hall for a cup of coffee. When I came back from the cafeteria I'd pull a windowed part out of the UV eraser, burn the HEX file with my Needham programmer, and start debugging the latest change.
There isn't a day I don't marvel at how far embedded software development has come, and wonder what the next big change will be.
Nice retrospective, Cabe. I think the time will come when all engineers will have to have good programming skills when they leave college (not just one or two classes in programming). I don't think that mechanical engineers will be able to avoid it. From what I can tell, though, engineering curricula don't seem to be making that a priority yet, but I think the time is coming.
The thing is, when I was covering app programming, everything was about getting simpler and also being more visual, so even people who didn't have coding expertise could use visual tools to code to build applications. Is something similar happening in the engineering world? I am not super up to speed on CAD tools, but I imagine there is a similar trend there, no?
"Electrical engineering is without a doubt one of the fastest growing disciplines today. Between the constantly changing curricula and rapidly advancing technology, engineers are required to keep themselves up-to-date with the latest tech and tools."
Cabe, we can say electrical engineering is the mother of modern engineering. In the '70s there were only three branches: electrical, mechanical, and civil. Later, the electrical branch was divided into electronics, computer, communication, and so on. When I did my graduation in the '90s, all such divisions were still under the electrical department.
Charles, nowadays programming and computer literacy are mandatory for all the engineering branches. Even mechanical and civil graduates are doing programming courses in order to grab a job in the IT domain. Moreover, most branches have now introduced C/C++ programming in their curriculum, either as a main subject or as an elective.
When I was in school FORTRAN counted for foreign language credit. We ran stacks of punchcards to the VAX gnomes. Many a student broke down in tears when they accidentally dropped their stack of (unnumbered) cards.
In the DOS days I was a BASIC guru and a whiz with Turbo BASIC.
For industrial control applications, or even a simple assembly line, a machine can run almost 24/7 without a break. But what happens when the task is a little more complex? That’s where the “smart” machine comes in. The smart machine is one that has some simple (or, in some cases, complex) processing capability that lets it adapt to changing conditions. Such machines are suited for a host of applications, including automotive, aerospace, defense, medical, computers and electronics, telecommunications, consumer goods, and so on. This discussion will examine what’s possible with smart machines, and what tradeoffs need to be made to implement such a solution.