@All- it seems from reviewing your Votes that several of the decision criteria I talked about were used in making your decisions. That's great! Just keep these all in mind when you are doing your next design...
@All- a good point was made in a comment below. It is typically good practice to do an initial 'boot' in assembly and then transition to the main program in C. This helps make sure the boot is done correctly (setting clocks, PLL division, POR settings, etc.). You typically want interrupts off during initialization too. Anyway, a good approach to think about using...
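To make that boot ordering concrete, here's a minimal sketch of a reset-to-main sequence. The function names are hypothetical stand-ins (on real hardware these would be vendor register writes or a few lines of startup assembly); here they just log their order so the sequence can be checked on a host:

```c
#include <string.h>

/* Hedged sketch of a typical reset-to-main sequence. These stubs model
 * the ordering only; real code would be chip-specific startup asm or
 * vendor register writes. */

static char boot_log[16];

static void disable_interrupts(void) { strcat(boot_log, "I"); } /* e.g. CPSID i / DI */
static void configure_clocks(void)   { strcat(boot_log, "C"); } /* PLL, dividers, POR checks */
static void init_data_bss(void)      { strcat(boot_log, "D"); } /* copy .data, zero .bss */
static void app_main(void)           { strcat(boot_log, "M"); } /* the C application */

void reset_handler(void)
{
    disable_interrupts();  /* keep ISRs off until the system is sane */
    configure_clocks();    /* clocks/PLL before anything timing-dependent */
    init_data_bss();       /* the C runtime assumes initialized statics */
    app_main();            /* hand off to C */
}
```

The point is the order: interrupts off first, clocks next, C runtime setup before any C code that touches static data.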
? @gartsxt- It can be difficult to implement table jumps in C, but if you have a clean way to implement them (one that is easy for someone else to understand) that can work fine.
OK. I usually use tables as a replacement for calculating values. You are right, computed jumps can be a nightmare to deal with, and I have found some low end development tools get confused when debugging these structures in C.
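A tiny illustration of the "table instead of calculation" idea (my own sketch, not from the thread): a precomputed lookup table replacing a bit-counting loop. It's deterministic, fast, and debugger-friendly in exactly the way computed jumps are not:

```c
#include <stdint.h>

/* Sketch: lookup table replacing a computed value. Popcount of a byte
 * via a 16-entry nibble table -- constant time, no branching, and easy
 * for the next engineer (or a low-end debugger) to follow. */
static const uint8_t nibble_bits[16] = {
    0, 1, 1, 2, 1, 2, 2, 3,
    1, 2, 2, 3, 2, 3, 3, 4
};

uint8_t popcount8(uint8_t v)
{
    return nibble_bits[v & 0x0F] + nibble_bits[v >> 4];
}
```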
? How about C++ (avoiding the new operator to prevent heap issues)? Classes allow better abstraction and each design decision can be encapsulated in its own method. The compiler will inline all the little functions.
@Bob- Yep, malloc() and free() are dynamic functions and require lots of overhead as lists get large. If you can use static structures or build your own linked lists that would be better (more predictable timing).
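One common work-around along those lines is a fixed-size block pool carved from a static array (my own hedged sketch, not anything from the class): allocation and free are O(1), timing is predictable, and the pool can never fragment, at the cost of a single fixed block size.

```c
#include <stddef.h>

/* Sketch of a static block pool as a malloc()/free() replacement.
 * Free blocks are chained through a singly linked free list that
 * reuses the block storage itself. */
#define BLOCK_SIZE  32
#define NUM_BLOCKS  8

typedef union block {
    union block *next;              /* free-list link while unused */
    unsigned char data[BLOCK_SIZE]; /* payload while allocated */
} block_t;

static block_t pool[NUM_BLOCKS];
static block_t *free_list;

void pool_init(void)
{
    for (int i = 0; i < NUM_BLOCKS - 1; i++)
        pool[i].next = &pool[i + 1];
    pool[NUM_BLOCKS - 1].next = NULL;
    free_list = &pool[0];
}

void *pool_alloc(void)              /* returns NULL when pool is empty */
{
    block_t *b = free_list;
    if (b)
        free_list = b->next;        /* pop head of free list */
    return b;
}

void pool_free(void *p)
{
    block_t *b = (block_t *)p;
    b->next = free_list;            /* push back onto free list */
    free_list = b;
}
```

If you need several object sizes, one pool per size keeps the no-fragmentation guarantee.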
? In a Digi-Key class from two weeks ago (Introduction to Real-Time Kernels), the point was made to avoid using C's malloc() & free() because they fragment memory, which can lead to disaster when you finally want to allocate something and there's no space for it to fit. I guess the question is, have you ever been burned by this, Warren? Are there work-arounds?
Example 3 would be a mix of C and asm. C for the wireless network and complex calculations. The support for the high-speed sensors would be written and debugged in C first and optimized in asm as needed.
The only speed issue on a SM seems to be if it's a 3-phase unit. The MSP430 has a specialized part with a Sigma-Delta ADC built in -- three "channels" (7 inputs) for three phases. I would look for specialized hardware if it was 3-phase. One phase is simpler (220V -- household) and can be done with simpler chips.
Assembly, due to code reuse and the likelihood of developers knowing the old system. If memory and performance are considerations, keep in mind the PIC does not have a true stack pointer, which makes it C-unfriendly.
@email@example.com - >Bob Loy: OK...as a learning experience it makes some sense...but I don't think that reasons 1 & 2 are valid. Hey, have fun! You ought to really go hog wild and write the new language in itself and bootstrap the whole thing!!
LOL - I used to think that a language that can be written in itself was the coolest thing in computer science. Actually, I still do, but from a practical standpoint I believe that having an interpret/compile kernel "off to the side" is probably much more efficient. There's enough ivory in my tower already without looking for more!
It is truly embarrassing when you're just buzzing along, turning out some of the most brilliant code you've ever conceived when you hit that stumbling block and it just ain't workin'. Then you realize the include (or function/procedure/called_whatever) you're using is in a completely different language!! (sigh) (grin)
@mharkins- yep, the process is the same. It is important to understand how to measure the tools to see which is best for the job. Hopefully today's class will help folks who need to better understand some of the 'metrics' used for measuring...
@warrenM My point is that all these real world constraints ARE the determining factors of a "winner." I've been coding since the '70s, embedded since the '80s. The selection process is always the same. Processor that does the job, with the most bang, for the least cost, with the best tools. Coding? Get the job done, with the least errors, that meets the requirements. Specifics of tools? The tool that does the job the best, and the quickest. LOL!
@mharkins - An ol' girlfriend bought me a repurposed Twister game spinner. (Remember those?) She taped over the numbers and had me write in all of the coding languages I use – so I wouldn't (ever again) spend an hour writing the wrong syntax for the language I was (supposed to be) using!! That spinner still hangs on my coding room wall, right next to my desk. Yes, I still use it. (grin) Yes, I still MUST use it!! :)
@mharkins- My hope is that the experience of going thru some real world examples (simplified, however, due to our class time constraints) will help develop your skills so that when you have your own selection process to do you will be able to do it more efficiently and with increased confidence...
It is the process we want to learn about more than the actual 'winner' for each example...
I have to ask, is this to be a theoretical exercise, or are all real world constraints part of the consideration (i.e. schedule, cost of time (implementation time), etc)? If execution "speed" is the only consideration, then the outcome is skewed. If ease of readability is the only criteria, then again the outcome is skewed.
Proposed function: in an ISR, read an 8 bit hardware register. If bit 1 is set, do subroutine 1. If bit 2 is set, do subroutine 2, etc. The subroutines are brief and you must call the subroutine plus exit the ISR in the shortest possible time.
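That proposed function maps naturally onto a function-pointer dispatch table in C. Here's a hedged sketch; the register and handler names are hypothetical stand-ins, and the "hardware register" is a plain variable so the dispatch can be exercised on a host:

```c
#include <stdint.h>

/* Sketch of the proposed ISR: read an 8-bit status register once, then
 * dispatch through a function-pointer table, one handler per flag bit. */

static volatile uint8_t status_reg;   /* stand-in for the hardware register */
static int calls[8];                  /* records which handlers ran (for the demo) */

static void handler0(void) { calls[0]++; }
static void handler1(void) { calls[1]++; }
/* ... handlers 2..7 would follow the same pattern ... */

static void (*const dispatch[8])(void) = {
    handler0, handler1, 0, 0, 0, 0, 0, 0   /* unused bits left NULL */
};

void isr(void)
{
    uint8_t bits = status_reg;        /* single read of the register */
    for (int i = 0; bits; i++, bits >>= 1)
        if ((bits & 1) && dispatch[i])
            dispatch[i]();            /* call the routine for each set bit */
}
```

The loop exits as soon as no set bits remain, which keeps time-in-ISR short; whether this beats a hand-coded assembly bit-test chain is exactly the kind of question the Fight Club format is meant to settle.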
@JoeFromOzarks Couldn't agree more. I'm a back yard mechanic at home, but I gotta say I like using the right tool for the right job. Same with coding. Usually first pass in C, optimize, then either recode the subroutines that require it in Assy, or single inlines as needed for efficiency. Both belong in the toolbox.
@FCsuggestion -- image recognition, such as used for check scanning to deposit a check... must interpret orientations, and check and interpret machine-printed, hand-printed, and signature characters for correctness... if standing at an ATM you don't want it to take too long, but longer is OK if sending remotely (like from a smart phone)??
@All- If you have some time prior to the class, think about a specific function you would like to propose as a Fight Club topic. We may have time at the end to pick an additional example and I'd like each of you to propose a candidate.
IMHO, given equality in skill and proficiency coding in C and Assembler, and given the target hardware platform is equally accessible to both coding platforms, I would think it's an error to presuppose one coding platform is "better" than the other.
We can debate (until the moo-cows come home) which one is faster in execution, which one is quicker to code and debug, which one is cheaper (.$$$/1k hardware and/or $$$,$$$ salary) to implement and which hardware platform is "the best."
I like to say "Use the correct tool for the job." Usually, there is a fog-caked clarity determining which tool is most suitable or most effective for a specific task.
Given a choice (and a tie in all other considerations,) I'll choose the "IDE" providing the most entertainment (FUN) for a particular project. (grin) I'm easily entertained though...
C is portable, kind of. Assembler commits you to a particular CPU/architecture. Since I've switched to C I only do assembler for low-level operations (tasking, stack/interrupts, atomic I/O), or for using special CPU features, like the 8051 bit-addressing funnies.
The streaming audio player will appear on this web page when the show starts at 2 PM Eastern time today. Note however that some companies block live audio streams. If when the show starts you don't hear any audio, try refreshing your browser. If that doesn't work, try using Firefox or Google Chrome as your browser. Some users experience audio interruptions with IE. If that doesn't work, the class will be archived immediately following our live taping.
I gotta say, I've designed fire alarm systems (including smoke detectors) and used both assembly and C for those designs at various times. I know in my mind which one wins, so this will be interesting.