"The average programmer writes about 200 lines of code per month. At that rate, a staff of 50 would need 100 months -- more than eight years -- to write a million lines of code."
Knowing the author is simplifying for the sake of brevity, it may not be obvious to some that the total lines of 'good' project code per unit of time is never linear in the number of people devoted to the task. There is a point of diminishing returns, and a point at which adding more people does the project a disservice by making the overall task unwieldy, if not outright unmanageable. Microsoft used to blame IBM for ruining OS/2 because non-technical managers relied on the 'masses of asses' principle to (erroneously) try to get it done faster, then ran roughshod over the coders when the simple arithmetic failed to hold. The people who were party to the overall 'vision' at the outset can become disconnected from what is actually emerging as new people are added late in the project to expedite certain tasks or address new requirements. Moreover, the newcomers may have a completely different view of where the goal posts sit. Say you start out with a few people who all know C well, and then marketing decides the product needs an Android version: bringing in Java experts who have never seen a pointer in their lives may split the team into two camps, and they may end up competing more often than working together.
Regarding operating systems, I agree they should be avoided wherever prudent and possible. For some projects, however, there's no getting away from it. For example, if you're targeting a high-end MPU like a Cortex-A8 or above, you NEED an OS or you'll get bogged down in the minutiae of writing drivers and the like. The first rule of thumb is to abandon all rules of thumb. The second might be that if you're using an MCU like an MSP430 or a Cortex-M0 to M3, you can probably get away without an OS. As stated in the article, concurrency beyond all but the simplest requirements dictates an OS to manage access to shared, inter-process objects.
Thanks to the author for making a software guy feel important for an afternoon. It's time to go home so my teenager can ruthlessly burst that delicate bubble.
Great article, Charles. I am one of those mechanical types and have very limited experience with software, embedded or otherwise. This field fascinates me and I am certainly appreciative of your article highlighting the difficulties with the technology. Your comments about the time-consuming effort and cost of producing the code were revelations. My experience in programming is with C++, Pascal and Visual Basic, which are basically "learning" languages. If I may: I write an educational blog published through WordPress; i.e. www.cielotech/wordpress.com. Would you mind giving me permission to reference your article in an upcoming blog post discussing embedded systems? I think my readers would be equally fascinated by the subject. Many thanks, Bob J.
ChasChas: It can't really be boiled down much further than to say that writing and debugging code is a very slow, tedious, complex process, and many products have hundreds of thousands, or even over a million, lines of code. As RogerD accurately points out here, the numbers cited refer more to larger projects. Still, the stories I've heard seem to indicate that many, many teams don't have a full appreciation for the scale of the software portion, and that misunderstanding (or lack of understanding) gets them into trouble. As for your reference to eccentric behavior by programmers, we'll need some deeper insight from our readers on that one.
I couldn't have said it better. I completely agree that the smallest possible software team will usually minimize development time.
I have seen a one man "team" design, debug, and program a very high performance FPGA/DSP/Microcontroller based motion control and data acquisition system in about a year (hardware and software). I doubt if a whole team could have done it in five years. A government funded team would probably never have finished it. I'm not saying that anybody could have done what he did, but he was the right man for the job, and adding more people to the mix would have only slowed him down.
Unless a software project can be very distinctly divided and conquered, the fewer programmers the better.
One additional thought: when a project takes a COTS module and places it on a product's host PCB, agency approvals (FCC, UL, CE, etc.) are streamlined because the COTS module's existing certifications "grandfather" the host product's approval process. By contrast, embedding the solution eliminates that shortcut and you must face the full scrutiny of each agency. Plan on adding at least another 8-12 weeks before approvals are granted.
To point #1 (it's all about the SW): truer words were never spoken. A recent contract assignment involved placing a standardized (COTS) transceiver on a motherboard. One staff-meeting discussion entertained eliminating the COTS transceiver in favor of a directly embedded chipset solution: easier for the EEs, easier for the MEs. But the SW guys hit the ceiling, citing months of recoding work. All the points of your article are great checkpoints for whole teams, and especially program managers, to post on their walls for continuous awareness.
I went back to my last embedded project and tallied the hours and code: the result was 50 lines of assembly-level code per 8-hour day, and that includes the algorithm development. This was a small medical device (2,100 lines of code, not including LUTs) where I did both the H/W and S/W. The S/W was real-time in nature due to a feedback controller.
So I think there are several caveats to the 200-lines-per-month number. It's no doubt accurate on large projects with a random assortment of programmers, and I find that the sheer problem of having multiple entities involved bogs the process down. In my case I had intimate knowledge of the H/W since I designed it as well, an advantage rarely afforded the typical project. But the critical factor to assess is the motivation and technical ability of the individuals involved, rather than a fixed lines-per-day estimate; in fact I'd argue that using the smallest (carefully chosen) S/W staff possible significantly shortens development time.