@tbh. Thanks for the reply, that's actually a decent post. Me likes... Still, Ada is far from a best practice... I personally would go with the more popular languages until others find/refine its problems. That now could be a best practice. :D
@naperlou, it is an interesting case. I think there may be two reasons behind this anomaly. First, you were probably prepared to find the error in the code and knew the nature of the problem from going through the word exercise. Second, it might be a case of frame of reference: we understand things better when they relate to us somehow.
I agree, McConnell's books are well worth the read. They have a lot of good pointers on writing solid code. My copy is heavily highlighted and dog-eared :) I have not read the Stroustrup pages you referenced; thanks for pointing them out. Looks interesting.
If you are interested in learning about coding practices (and not just arguing about them), I would recommend the book "Code Complete". I think it is a Microsoft Press book. It is a fascinating book about which practices and habits are conducive to producing good code. It is probably a little dated by now, but it is a great read if you are really interested in quality coding. I have read it several times, and it is on my bookshelf at work. The book is not language specific, as it is mainly concerned with programming practices, and it contains a lot of actual data about those practices, not just a bunch of opinions.
If you happen to be a C++ programmer, I also recommend any writings of Bjarne Stroustrup, including the stuff on his web page (http://www.stroustrup.com/). I find his writing style to be very enjoyable, and his insight into using C++ is invaluable. Because he invented the language, and has been involved with almost every detail of it, he knows what the design intent of the language is, as well as how it is most properly used. He is very good at articulating both the strengths and weaknesses of the language.
As this article points out, C/C++ gives programmers every opportunity to write disastrous code (actually, most languages do). But on the other hand, it gives you control at nearly the assembly-language level when you need it, along with high-level tools as well. When used properly, it can produce very high-quality, maintainable code that executes extremely fast.
It's a matter of cost, too. In medical devices we test 100%, yes, every branch, and then a line by line peer review. It's tedious, expensive, and even minor changes trigger a complete review of the module. I'm certain that level of verification is far too costly and time consuming for a device that is largely a consumer product.
If the consumer went to WalMart and saw two computers hanging on the wall, both with the same features, but one was verified with 100% coverage at twice the price, I'm certain the consumer would pick the cheaper model. In many ways, that's already been proven.
I may not be qualified to point a finger at designers and programmers on Apple's payroll, but these are the basics of programming. Programmers are advised to always use braces in C, even around single statements. This brings our attention to a problem most of us face: we tend to turn our backs on the basics. This is a classic example of how we may suffer for doing so.
I don't think the author directly said that about Apple, but I can see how one might have seen that as implied. I'm not an analyst so I cannot comment one way or another on Apple's performance in the market, or whether or not they appear to be "off their game". That wasn't the focus of the article, and seems irrelevant to the argument put forth.
I believe the author was trying to be informative and provide useful information on practices that he values and that others may benefit from knowing about. Unfortunately, the article as it appeared online has major flaws that resulted, at least for me, in loss of the message.
No offense, but the author has one thing right. Apple has not been getting things right for a while now... just look at their market share and stock prices... all I have to say is "SELL" for Apple. The devil is always in the details.
That table below was the % of the sample population: the preferred programming language of those sampled. Obviously a biased view, and I have seen a different one in every major computer magazine out there. However, one thing they all have in common is that Ada is not even listed. So correct me if I'm wrong, but doesn't a best practice require a tried-and-true development environment, language, and compiler? If the community does not practice this language, how could it be a best practice?