Innovation has been around for quite some time, and it often takes an outsider to see its upside. A business graduate may well come up with a groundbreaking idea about the future of cloud computing and leave everyone wondering why they did not think of it first. The changes in computing are almost too fast to keep up with, but given the way people think today, innovations will keep sprouting up.
Cloud computing is being projected as replacing individual computers right now because a few folks have discovered how to profit by selling it. And when you are selling a product and making money at it, the only smart move is to declare loud and long that you are selling "the next big thing, everybody's future." It makes no difference whether that is correct or not; if you are selling a product, you MUST sound like you absolutely believe it is the best choice and the only way forward.
My point is that only one side is talking, and they are talking a lot. There are some serious potential failure modes in cloud computing that are seldom mentioned, let alone seriously discussed. Security and reliability are two of them, and all we hear in response are general assertions that the concerns are invalid. But is that really true? I don't think so.
One other thought is that while computers are one convenient way of saving information, they are by no means the primary aid to creativity.
If we are to innovate, we must look deeper. From my point of view, we are just seeing the initial stage of a large shift in the way we think about and use parallel/distributed computing. We are seeing democratization, low-cost availability, and a reduction in the complexity of developing software for such systems. Individuals now have access to computing resources that were previously available only to governments, large academic institutions, and large firms.
Hadoop is just one tool in the toolbox. Hadoop (MapReduce) might not be the right answer for highly CPU-intensive scientific or real-time applications, but it does have a role to play, one that needs to be evaluated in the context of the specific problem domain.
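To make the programming model concrete, here is a minimal sketch of the classic word-count job written for Hadoop Streaming, which lets the map and reduce steps be plain scripts that read stdin and write stdout (the file names are my own placeholders; any executable pair works):

    #!/usr/bin/env python
    # mapper.py -- emit one "word<TAB>1" pair for every word on stdin
    import sys

    for line in sys.stdin:
        for word in line.split():
            print("%s\t%d" % (word, 1))

    #!/usr/bin/env python
    # reducer.py -- sum the counts for each word; Hadoop sorts the
    # mapper output by key, so identical words arrive consecutively
    import sys

    current_word, current_count = None, 0
    for line in sys.stdin:
        word, count = line.rstrip("\n").split("\t", 1)
        if word == current_word:
            current_count += int(count)
        else:
            if current_word is not None:
                print("%s\t%d" % (current_word, current_count))
            current_word, current_count = word, int(count)
    if current_word is not None:
        print("%s\t%d" % (current_word, current_count))

You can dry-run the pair locally with a shell pipe (cat input.txt | ./mapper.py | sort | ./reducer.py) before submitting it to a real cluster.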
MMoreno, I am very familiar with Hadoop and its ecosystem. It is not typically used for large-scale scientific computing. I once proposed a research project to study this, but it has not come to fruition.
Hadoop is, in the words of one CS department head, a processing-flow paradigm, in contrast to a data-flow paradigm. I found that very interesting when I heard it. I am not sure how that could apply to, say, a CAE program. I think that is because of the finite element model, which is driven by the mathematics and physics, not by the programming paradigm. It would be interesting to see whether it can be adapted, but I know of no commercial product that has done so.
Yes, parallel computing is genuinely hard; fortunately, new paradigms have emerged that are transforming the technology landscape. The most prominent is Hadoop (and its ecosystem: Hive, Pig, ZooKeeper, Mahout, and HBase), which is disrupting conventional ways of thinking about parallel computing.
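Tools like Hive and Pig exist precisely to hide the raw MapReduce plumbing behind higher-level languages. As a rough Python analogue (a sketch, assuming the third-party mrjob library, which is not part of Hadoop itself), the canonical word-count example collapses into one small class:

    # wordcount_mr.py -- word count via mrjob, which generates the
    # Hadoop Streaming plumbing from a single job class
    from mrjob.job import MRJob

    class MRWordCount(MRJob):
        def mapper(self, _, line):
            # one (word, 1) pair per word in the input line
            for word in line.split():
                yield word, 1

        def reducer(self, word, counts):
            # counts is an iterator over the 1s emitted for this word
            yield word, sum(counts)

    if __name__ == "__main__":
        MRWordCount.run()

Run it locally with python wordcount_mr.py input.txt; the same script can target a Hadoop cluster or Elastic MapReduce via the -r hadoop and -r emr switches.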
These technologies are now in production 24/7 at Google, Yahoo, Facebook, LinkedIn, and others (like the NSA), and, by the way, Amazon is making them affordable too: you don't even need to buy the hardware.
Another initiative to follow is OpenStack, which is enabling standardization and management of cloud services (large clusters).
If you want to experiment a bit, see http://aws.amazon.com/elasticmapreduce/
You may also want to visit http://hadoop.apache.org/ and http://www.openstack.org/
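If you want to try Elastic MapReduce programmatically, a streaming job like the word count above can be submitted in a few lines of Python. This is only a sketch, assuming the boto3 AWS SDK; the bucket names, paths, region, and instance type are placeholders, and EMR release labels and default roles change over time:

    # emr_demo.py -- spin up a transient EMR cluster that runs the
    # streaming word count and shuts itself down afterwards
    import boto3

    emr = boto3.client("emr", region_name="us-east-1")

    response = emr.run_job_flow(
        Name="wordcount-demo",
        ReleaseLabel="emr-5.36.0",
        Instances={
            "MasterInstanceType": "m5.xlarge",
            "SlaveInstanceType": "m5.xlarge",
            "InstanceCount": 3,
            "KeepJobFlowAliveWhenNoSteps": False,  # terminate when done
        },
        Steps=[{
            "Name": "streaming word count",
            "ActionOnFailure": "TERMINATE_CLUSTER",
            "HadoopJarStep": {
                "Jar": "command-runner.jar",
                "Args": [
                    "hadoop-streaming",
                    "-files", "s3://my-bucket/mapper.py,s3://my-bucket/reducer.py",
                    "-mapper", "mapper.py",
                    "-reducer", "reducer.py",
                    "-input", "s3://my-bucket/input/",
                    "-output", "s3://my-bucket/output/",
                ],
            },
        }],
        JobFlowRole="EMR_EC2_DefaultRole",
        ServiceRole="EMR_DefaultRole",
    )
    print("Started cluster:", response["JobFlowId"])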
Chuck, I am not sure I agree with him, but his point is that programming massively parallel machines is extremely difficult. The most widely used approach is the Message Passing Interface (MPI). I have studied it and applied it. It is HARD. Many software vendors simply rely on the CPU vendors to keep increasing performance. Two companies I know of currently use MPI in commercial products: SAS just came out with a version of their product that uses it, and an electromagnetic CAE software company called REMCOM is advertising its use in IEEE Microwave (my light reading magazine). I am sure there are others, but not many. MPI has been around for a long time, though.
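For readers who have not seen it, the flavor of MPI is explicit point-to-point messaging: the programmer decides what every process (rank) sends, receives, and computes. Here is a minimal sketch using the mpi4py bindings (the C API is the same idea with more ceremony); run it with at least two processes, e.g. mpirun -n 4 python ring.py:

    # ring.py -- pass a token once around all MPI ranks.
    # Every send must be matched by a receive on the correct rank,
    # in the correct order, or the program deadlocks; scaling this
    # bookkeeping up is what makes real MPI codes so hard to write.
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()
    size = comm.Get_size()

    if rank == 0:
        comm.send(0, dest=1)                 # start the token
        token = comm.recv(source=size - 1)   # wait for it to return
        print("token value %d after visiting all %d ranks" % (token, size))
    else:
        token = comm.recv(source=rank - 1)   # take it from my left
        comm.send(token + 1, dest=(rank + 1) % size)  # pass it right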
So naperlou, the guy who did his Ph.D. thesis in supercomputing says that the massive number of small CPUs is coming to an end? This is mind-boggling to me, because I recall how many people in supercomputing resisted the idea of little CPUs replacing monolithic processor boards. That wasn't so long ago -- maybe 1995. Now the era of little CPUs is passing? The changes are almost too fast to keep up with.
Brian, thought-provoking post. I do agree that users will need to find some compelling benefits before reaching for the cloud. I recently bought a Chromebook Pixel with one terabyte of cloud disk space. I thought I would immediately put my whole photo library there, but that has taken much more time than I expected. Google Drive has been a great convenience, but I agree there is a need for new apps and uses.