An excellent, free tool that I've used for years is devFlowcharter. You build the program as a flowchart and the output is C code. The real benefit is coming back years after a project has been completed and easily being able to modify complicated code that would otherwise have been forgotten.
I think the Pi and the many kits available for it are an excellent platform, but the Arduino also does analog, which the Pi does not. So as long as digital on/off is sufficient, you get more out of the Pi.
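To make the analog point concrete: Uno-class Arduinos have a built-in 10-bit ADC (readings of 0–1023 against a 5 V reference), while the Pi's GPIO pins are digital-only, so it needs an external ADC chip for the same job. A minimal sketch of the conversion math (in Python, with the 5 V / 10-bit figures assumed from the Uno):

```python
# Convert a 10-bit Arduino-style ADC reading (0-1023, 5 V reference)
# to a voltage. The Pi's GPIO pins are digital-only, so it would need
# an external ADC chip wired in to produce the same raw value.
def adc_to_volts(raw, vref=5.0, bits=10):
    full_scale = (1 << bits) - 1  # 1023 for a 10-bit converter
    return raw * vref / full_scale

# A mid-scale reading of 512 is roughly half the reference voltage.
print(round(adc_to_volts(512), 3))  # 2.502
```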
The Arduino is now used in commercial applications such as toys (e.g. MakeyMakey) and is one option in the endless array of low cost microcontrollers to choose from. The fact that the Pi sold more units does not mean that the Arduino is bad. The Pi can do more things, but does not do everything an Arduino can do. As it turns out, there are Arduino boards that plug straight onto the Pi combining both worlds and more and more companies make good business with both (e.g. Adafruit Industries).
Lastly, Linux is the name of the kernel for the operating system that runs on the Pi. The choice of kernel/OS does not determine which programming language can be used. The Pi is designed to run a Linux distro (Debian) and be programmed using Python (which is why it was called "Pi").
I used to program CNC machines a while back, and we did straight coding. A few years after that I ran a machine that was all GUI and just spat out the code. This was a new company for me, and everyone there just accepted the code as fine. Once I had been there a while and was comfortable offering my opinion, I told them the programs this thing gives you are terribly inefficient. Because I had learned the code from scratch, I could take so many unnecessary lines out of the machine's code that it would sometimes cut a part's cycle time by over 50%, easy.
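As a toy illustration (not real post-processor output) of the kind of redundancy hand-review catches: generated G-code often repeats modal commands like G01 on every line, even though the control remembers the last one, and a hand programmer would simply drop the repeats:

```python
# Toy illustration of redundancy in machine-generated G-code: modal
# commands (G00/G01) repeated on every line even though the control
# stays in the last mode until told otherwise.
def strip_redundant_modals(lines):
    cleaned, last_modal = [], None
    for line in lines:
        words = line.split()
        if words and words[0] in ("G00", "G01"):
            if words[0] == last_modal:
                words = words[1:]      # modal already active: drop it
            else:
                last_modal = words[0]  # mode change: keep it
        cleaned.append(" ".join(words))
    return cleaned

generated = ["G01 X10 Y0", "G01 X10 Y5", "G01 X20 Y5", "G00 Z2"]
print(strip_redundant_modals(generated))
# ['G01 X10 Y0', 'X10 Y5', 'X20 Y5', 'G00 Z2']
```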
So I guess my point is... yes, this is good... but I feel you should still know the code and not rely on a machine to give it to you. How can you debug or modify code you don't understand anyway?
While you have embraced a nice tool set for getting students excited (which has its value), it is based on assumptions about hardware and software that are becoming obsolete!
The Raspberry Pi, BeagleBoard, and numerous other new platforms demonstrate that very cheap hardware can include enough processing power and memory to compete with the desktop environment (no need to separate the development environment from the target platform in most applications).
My point... Your students should be learning both the macro (trends in industry) AND micro (low level coding) aspects of developing products and the tools available.
Your observations on the trends in the industry (concerning code generation) are oversimplified to the point of being misleading. Similar to some perspectives on "auto-routing" of PCBs: most professionals in the industry use the auto-routing features of their PCB CAD software in a VERY limited manner, for the simple reason that they can route most sections of a design better manually.
While you may see my observations on newer hardware as a confirmation of your conclusions, I see them as an unintentionally distorted perspective on software development (no offense intended; this is meant only as constructive criticism).
Naperlou is correct... code quality comparisons aren't that simple.
Naperlou, I agree with you. I used to work with PLC programmers and their software ladder logic on real-time controls. I'd program the PC as the brains, and their PLCs were the autonomic systems for our systems.
Many times I saw the PLC programmers unable to understand why their scan rates went into the toilet after they added just one more functional block to their programs. Repeatedly I had to tell them that one block contained a massive chunk of code that had to run in order to simulate that function. We'd usually find another set of lighter weight blocks to use instead.
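The scan-rate effect described above can be sketched roughly like this; the per-block costs are invented for illustration (real numbers come from a vendor's instruction timing tables):

```python
# Rough model of a PLC scan loop: scan time is the sum of the execution
# cost of every block in the program. The per-block costs below are
# invented for illustration; one "heavy" block can dominate the whole
# scan even though it looks like just one more block on the diagram.
BLOCK_COST_US = {"contact": 2, "coil": 2, "timer": 10, "pid": 900}

def scan_time_us(program):
    return sum(BLOCK_COST_US[block] for block in program)

base = ["contact"] * 50 + ["coil"] * 20 + ["timer"] * 5
print(scan_time_us(base))            # 190 us
print(scan_time_us(base + ["pid"]))  # 1090 us: one block swamps the scan
```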
Removing them from the nitty gritty of code creation removed them from reality.
My preferred language was FORTH, which let me do peephole optimizations by recoding some of the slower high-level commands in assembler. FORTH lets you intermix high- and low-level language constructs as long as they manipulate the stacks in a similar manner.
And you can implement FORTH in less than 8K of memory, perfect for microprocessors.
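The heart of a Forth system is just a data stack plus a dictionary of words, which is why it fits in a few kilobytes. A minimal, hypothetical illustration of that model (in Python rather than assembler):

```python
# Minimal Forth-flavored evaluator: a data stack plus a dictionary of
# words. A real Forth adds compilation, a return stack, and the ability
# to define new words in terms of old ones (or in assembler), which is
# what lets high- and low-level code intermix freely.
def forth_eval(source, stack=None):
    stack = [] if stack is None else stack
    words = {
        "+":    lambda s: s.append(s.pop() + s.pop()),
        "*":    lambda s: s.append(s.pop() * s.pop()),
        "dup":  lambda s: s.append(s[-1]),
        "swap": lambda s: s.extend([s.pop(), s.pop()]),
    }
    for token in source.split():
        if token in words:
            words[token](stack)   # execute a known word
        else:
            stack.append(int(token))  # anything else is a number
    return stack

print(forth_eval("2 3 + dup *"))  # [25]  -- i.e. (2+3) squared
```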
A Raspberry Pi costs $35, uses an ARM1176JZF-S running at 700 MHz, and has Ethernet, two USB ports, HDMI, stereo audio, and about 20 GPIO pins. It is the size of a credit card and runs from a cell-phone charger. It also runs Debian Linux.
It can encode/decode 1080p video, and can handle wireless simply by plugging in an 802.11 USB dongle.
And if you don't like Linux, you can program the device in ASM, C, Python, etc. A port of Android is in the works.
While I acknowledge the value of these observations (getting more people into engineering), I have some serious reservations concerning the use of these new "tools" (graphical programming environments / auto code generation).
The "other side" of a double-edged sword... easy to create, versus encouraging people who do not have the basic discipline required to understand a problem to automate a solution!
I see a parallel with office staff creating custom spreadsheets (another simplified programming environment). Most often these "programs" are created without any education in structure or quality assurance, or even simple verification of proper operation! Result: people making important financial decisions for a company with bad data!
The other concern: efficiency vs. creativity. Automation of code creation is like power steering in a car: a trade-off of "road feel" for "ease". In this example, "ease" equates to being removed from understanding the problem, resulting in good, but never great, performance.
The last observation: true automation in creating code is VERY different from most of the examples given, which are wrappers for code re-use.
naperlou, I agree. The fun in software development is in hand coding, but what's happening today is that a lot of non-tech entrepreneurs are creating new tech products. Graphical programming languages that auto-generate C code allow non-tech entrepreneurs to rapidly develop a PoC (proof of concept) for feature/function feasibility without spending a lot of detailed time reading datasheets or software design guides. Once the PoC validates the idea as sound, the auto-generated code can be optimized using traditional hand-coding techniques.
Also, to spark interest in electrical, electronics, and computer engineering, tools like MATLAB/Simulink, NI LabVIEW, and Cypress Semi PSoCs keep the next generation of technologists and engineers engaged in problem-solving tasks through a fun, graphical, creative learning environment. As Michael Schrage's book title puts it, it's "Serious Play".
Kevin, as one who has done this kind of thing for over 30 years, I can tell you that much of the fun in solving the problem is in writing the code. The reason you see so many microprocessors in products today is that it is cheaper and easier to write and change code than circuits. And don't even get me started on programming analog circuits.
Comparing the number of defects in manufactured hardware to code is not valid. Software and circuit design are equivalent, not software and manufacture. If you look at the equivalent manufacturing process (i.e., reproducing the code for distribution), then software trumps hardware any day. If any errors are introduced, they are in the underlying hardware medium, not in the software. In fact, with checksums used in code distribution, the errors can be detected before the software is ever used and the information re-transmitted.
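The checksum point can be sketched in a few lines: if a distributed copy's hash matches the published digest, it is bit-for-bit intact; any corruption changes the digest and the copy can be re-fetched. The file contents here are made up for illustration:

```python
import hashlib

# Sketch of checksum verification in code distribution: a copy whose
# SHA-256 digest matches the published one is bit-for-bit intact; any
# transmission error changes the digest, so the bad copy is caught
# before the software is ever run.
def verify(payload: bytes, expected_sha256: str) -> bool:
    return hashlib.sha256(payload).hexdigest() == expected_sha256

firmware = b"some release image"  # stand-in for a real distributed file
published = hashlib.sha256(firmware).hexdigest()

print(verify(firmware, published))            # True: intact copy
print(verify(firmware + b"\x00", published))  # False: corrupted in transit
```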
As for automatic code generation, it is the exception rather than the rule. The companies you mention will use automatic code generation to create a template and to automate the rote communication code that is required. The code that solves the real problem is still done mostly by hand, or hand tuned after some level of generation.
When you are talking about small microprocessors, hand coding is essential. The diagram needed to drive an automated process in many cases is more difficult to draw than just writing the code.
Finally, electronic circuits today are developed using code anyway: VHDL, Verilog, SystemVerilog, and SystemC. These are very detailed, much more detailed than writing code for a microprocessor. Once a circuit is developed, it is successively transformed using very complex and expensive tools until it is in a form that can be manufactured. Even with these tools, it takes a good bit of time.
The first Tacoma Narrows Bridge was a Washington State suspension bridge that opened in 1940 and spanned the Tacoma Narrows strait of Puget Sound between Tacoma and the Kitsap Peninsula. It opened to traffic on July 1, 1940, and dramatically collapsed into Puget Sound on November 7, just four months after it opened.
Noting that we now live in an era of “confusion and ill-conceived stuff,” Ammunition design studio founder Robert Brunner, speaking at Gigaom Roadmap, said that by adding connectivity to everything and its mother, we aren't necessarily doing ourselves any favors, with many ‘things’ just fine in their unconnected state.