Most of us who attended engineering school had to take a class that involved some type of programming language. Did anybody else out there get trained with punch cards? Yes, I’m clearly dating myself. But those were interesting times. You’d work on your program, punch the cards, and then leave them on a shelf for the Univac operator to feed into the mainframe. You’d come back a few hours later to either a working program or an error message that had to be deciphered.
In those days, the languages were Basic, Fortran, Pascal, and Assembly. Toward the tail end of my education, the new-fangled C language was starting to emerge. That was followed by something even more radical: C++. Today, C is somewhat dated, but still in use. C++ is still “that new programming language” to some, but more than mainstream to others.
To maximize the performance of your embedded system, you may want to give C++ a look. And it may not be as radical or “out there” as you may have been led to believe. To help simplify the process, we’ve arranged for a free class in our Continuing Education Center that breaks down the essentials in a very easy-to-understand manner. Taught by
Colin Walls of Mentor Graphics, the class goes through the basics and provides lots of examples to help get you started. Programming Embedded Systems in C++ is being held live next week, but can be seen in its archived form at any time later.
I went to college and we had Fortran and a little C. I stuck with the C language after college but never really got into it that deeply. Then my oldest son went to college and had to take C++. He would come to me for help with his assignments, and this forced me to learn the C++ side. It's not that hard to make the jump from C to C++, but I could see that if I had only stuck with Fortran, I would have been no help.
The moral: do not be afraid to learn (even if just the basics).
You are right, the transition from C to C++ is "hassle free," but if you really want to learn how things work in the background, you must learn some basic assembly language. Nowadays students are quite comfortable using C and C++ because they are very user friendly, but at the same time they are missing out on what happens behind their code. To get a complete hold of programming, one must know what lies in the details.
I don't think that any transition in software is hassle free, however (maybe I could generalize and say that nothing about software is hassle free - the devil IS in the details).
It seems to me like C++ may be a bit too much for the resources available in lower end processors. Many of the compelling features of C++ might be pretty resource heavy for microcontrollers. I think C is a great choice for microcontrollers. It gets you a step above Assembler, but is still lean enough to be feasible.
I'm interested in seeing what C++ features will be supported in the microcontroller implementations, and what features won't, and whether low end controllers will be supported at all.
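For what it's worth, a lot of C++'s cost on small parts depends on which features you use. As a rough sketch (the names here are my own invention, not from any particular vendor's toolchain), this is the kind of C++ that tends to stay cheap even on low-end controllers: a fixed-capacity container with no heap, no exceptions, and inline methods.

```cpp
#include <cstddef>

// Hypothetical example: a fixed-capacity ring buffer. No dynamic
// allocation, no exceptions, no virtual functions -- the C++ features
// that usually fit fine in a microcontroller's footprint.
template <typename T, std::size_t N>
class RingBuffer {
public:
    // Returns false instead of throwing when the buffer is full.
    bool push(const T& value) {
        if (count_ == N) return false;
        data_[head_] = value;
        head_ = (head_ + 1) % N;
        ++count_;
        return true;
    }
    // Copies the oldest element into 'out'; false when empty.
    bool pop(T& out) {
        if (count_ == 0) return false;
        out = data_[tail_];
        tail_ = (tail_ + 1) % N;
        --count_;
        return true;
    }
    std::size_t size() const { return count_; }
private:
    T data_[N] = {};                 // storage lives inside the object
    std::size_t head_ = 0, tail_ = 0, count_ = 0;
};
```

Templates like this are resolved entirely at compile time, so you pay in code size only for the instantiations you actually use; the per-call cost is comparable to the equivalent hand-written C.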
Daniyal_Ali: I agree with your comments about "working in the background." And I would encourage those that are trying to master these languages to do so. However, today's tools make that (unfortunately) unnecessary. It reminds me of the days when Windows was first popularized. Those of us who knew our way around DOS seemed to have a distinct advantage.
In college, I thought programming was boring and didn't really see why it's important in electrical engineering. After taking a BASIC programming class, I saw how powerful a role software could play in enhancing embedded systems. Programming languages like C and C++ make it possible to create smaller and smarter electronic devices. I tell my students that the ability to make a dumb device smart using C and C++ makes them the ultimate problem solvers, developing tools and products that can change the world.
Rich, I started out with punch cards and FORTRAN. Actually, I was studying physics and was a paid student programmer. We had terminals in the Physics building, so I never really had to submit card decks. The first terminals were printing terminals with paper tape. We then went to fancier printing terminals and then we went to CRTs. It was great. Then I went to work at NASA and was back to submitting cards. It was a letdown.
I, too, started out with punch cards. I had a couple of rather early classes in finite element analysis. I'd take a stack of my punch cards to the guys who ran the computer room, come back the next day, and pick up my results on the old 11" x 17" perforated, folded printout paper. If I made a slight error, I couldn't get my corrected results until the next day. You had to hope you didn't have a dozen errors in your punch cards, because that could represent 12 days of new printouts. Gee, when I describe it that way, it all sounds so old.
In college I worked in the Engineering Dept. while in the EE program. One of my first jobs was to throw away piles of punch cards. How symbolic. Fortran was the language in our required introduction to programming. No punch cards, but we ran jobs using the vi editor on Hazeltine terminals connected to a Digital mainframe in the CS dep't. The experience was more valuable than any specific knowledge. Later came Pascal. By the time C was offered, I was finishing my studies and had no elective slots.
I learned ASM in a microprocessor class on Motorola chips. I agree, learning ASM is so valuable. When you look at the output listing from your compiler, wouldn't you want to have some idea what the compiler generated (particularly when the compiled code is not performing some low-level task as you expected)?
I learned C in self-study while working. C seems the perfect fit for low-end MCUs. C is also a good fit with the procedural code we tend to develop for embedded systems.
Now looking at C++, I see that over the years I've solved some problems using OO-like approaches without having any formal understanding of C++. But C was still powerful enough to allow me to create those things.
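To illustrate what I mean by "OO-like approaches" without formal C++ (this is just a sketch with made-up names, not code from a real project): a struct carrying data plus function pointers acting as methods gets you most of the way to objects in plain C, and the same pattern compiles under C++ as well.

```cpp
// The OO-like pattern many of us used in C: a struct bundling state
// with function pointers that act as "methods" on that state.
// All names here are hypothetical, for illustration only.
struct Motor {
    int speed;
    void (*start)(Motor*);   // "method" pointers take the object
    void (*stop)(Motor*);    // explicitly, like C++'s implicit 'this'
};

static void motor_start(Motor* m) { m->speed = 100; }
static void motor_stop(Motor* m)  { m->speed = 0; }

// Plays the role of a constructor: wires data and "methods" together.
static Motor make_motor() {
    Motor m = { 0, motor_start, motor_stop };
    return m;
}
```

Calling `m.start(&m)` is essentially what a C++ compiler does for you with member functions, minus the syntax sugar and the compile-time checking.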
Focus on Fundamentals consists of 45-minute on-line classes that cover a host of technologies. You learn without leaving the comfort of your desk. All classes are taught by subject-matter experts and all are archived. So if you can't attend live, attend at your convenience.