Brian, I was at IBM in the early 2000s. At that time, Sam Palmisano took over as Chairman and CEO. His first big initiative at that level was On-Demand Computing. This is just an early name for Cloud Computing. In fact, some people still use the term On-Demand Computing because it describes the offering much better than Cloud Computing. Now, this was over ten years ago. I still have the internal materials that talk about the program. It really is the current idea of Cloud Computing.
There are parallels with what Amazon did to initiate the Cloud Computing era. At General Electric, the Aerospace Group was selling time "out the back door" because they had lots of mainframes (made by GE at the time) and more capacity than they needed. This was a great business, so GE set it up as a stand-alone business. It was the largest worldwide network in its day (1970s-1980s). I worked for them when I first joined GE.
Finally, I was talking to a Computer Science professor the other day. His PhD research was in supercomputing. He says that the era of massive numbers of small CPUs is nearing an end. It is too hard to program. He is looking at increases in performance in the individual chip. If you look at something like the Oracle SPARC chip, I think that you may be looking at the future of high-end computing. IBM's mainframe chips are no slouch either.
@naperlou, thanks for commenting. No doubt the concept has been around for decades. But what fascinates me is what you might call the "familiarity-breeds-contempt" problem. GE and IBM were on top of this years ago, but other companies took the ball and ran with it.
Sometimes as innovators we can't see past a really good idea (familiarity) and so someone else, looking at things from a different perspective, will push the ball downfield.
Yes, cloud computing has been around for a long time under other names, but then e-commerce was around for about a decade before Amazon ever figured it out. I spent the day yesterday with the core group from SCO who created the first e-commerce site for a pizza store in Santa Cruz, CA.
Just because a bunch of engineers have figured out how to do something doesn't mean it catches fire immediately. It generally takes a non-engineer to figure out how to use it on a massive scale, after the engineers have lost interest and are on to other things.
Cloud computing is being projected as replacing individual computers right now because a few folks have discovered how to profit by selling it. And when you are selling a product and making money at it, the only smart move is to declare loud and long that you are selling "the next big thing, everybody's future". It makes no difference whether it is correct or not; if you are selling a product you MUST sound like you absolutely believe it is the best choice and the only way to go for the future.
My point is that only one side is talking and they are talking a lot. There are some serious potential failure modes with cloud computing that are not mentioned very often, and seldom discussed or considered. Security and reliability are two of them, and the only thing that we hear is all sorts of general assertions that the concerns are invalid. But is that really true? I don't think so.
One other thought is that while computers are one convenient way of saving information, they are by no means the primary aid to creativity.
So naperlou, the guy who did his Ph.D. thesis in supercomputing says that the massive number of small CPUs is coming to an end? This is mind-boggling to me because I recall how many people in supercomputing resisted the idea of little CPUs replacing monolithic processor boards. That wasn't so long ago -- maybe 1995. Now the days of little CPUs are passing? The changes are almost too fast to keep up with.
Chuck, I am not sure I agree with him, but his point is that the programming of massively parallel machines is extremely difficult. The most widely used approach is called the Message Passing Interface (MPI). I have studied this and applied it. It is HARD. Many software vendors rely on the CPU vendors to keep increasing performance. Two companies I know of at present use it in commercial products. SAS just came out with a version of their product that uses it. An electromagnetic CAE software company called REMCOM is advertising using it in IEEE Microwave (my light reading magazine). I am sure there are others, but not many. MPI has been around for a long time, though.
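To give a feel for why the message-passing style is hard, here is a minimal sketch of the pattern an MPI program follows -- ranks, explicit scatter of data, explicit gather of partial results. This is not real MPI (no MPI library is used); it just imitates the model with Python's multiprocessing pipes, and all the function names are my own invention for illustration.

```python
# Illustration of the message-passing model behind MPI: each process
# has a rank, and every data exchange is an explicit send or receive.
# Real MPI code adds communicators, collectives, and deadlock hazards.
from multiprocessing import Process, Pipe

def worker(rank, conn):
    # Each "rank" receives its slice of work, computes, and sends back.
    chunk = conn.recv()
    conn.send(sum(x * x for x in chunk))
    conn.close()

def parallel_sum_of_squares(data, nprocs=4):
    # The root rank partitions the data and "scatters" one chunk per rank...
    chunks = [data[i::nprocs] for i in range(nprocs)]
    pipes, procs = [], []
    for rank in range(nprocs):
        parent_end, child_end = Pipe()
        p = Process(target=worker, args=(rank, child_end))
        p.start()
        parent_end.send(chunks[rank])
        pipes.append(parent_end)
        procs.append(p)
    # ...then "gathers" and reduces the partial results.
    total = sum(conn.recv() for conn in pipes)
    for p in procs:
        p.join()
    return total

if __name__ == "__main__":
    print(parallel_sum_of_squares(list(range(100))))  # 0^2 + ... + 99^2
```

Even in this toy, the programmer owns the partitioning, the communication, and the synchronization by hand -- which is exactly what makes large MPI codes so difficult.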
Yes, parallel computing is actually really hard; fortunately, new paradigms have also emerged that are now transforming the technology landscape. The most prominent is Hadoop (and its powerful ecosystem: Hive, Pig, ZooKeeper, Mahout, and HBase), which is disrupting conventional ways of thinking about parallel computing.
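The core idea Hadoop popularized is MapReduce: a map phase emits (key, value) pairs, a shuffle groups them by key, and a reduce phase aggregates each group. Here is a single-machine sketch of that flow using the classic word-count example -- a toy illustration of the model, not Hadoop's actual API.

```python
# Word count in the MapReduce style: map -> shuffle -> reduce.
from collections import defaultdict

def map_phase(documents):
    # Map: emit (word, 1) for every word in every document.
    for doc in documents:
        for word in doc.split():
            yield word.lower(), 1

def shuffle(pairs):
    # Shuffle: group all emitted values by their key.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: aggregate each key's values independently.
    return {word: sum(counts) for word, counts in groups.items()}

docs = ["the cloud", "the grid and the cloud"]
print(reduce_phase(shuffle(map_phase(docs))))  # {'the': 3, 'cloud': 2, ...}
```

The appeal is that the map and reduce steps are embarrassingly parallel and the framework handles distribution and fault tolerance -- the programmer never writes an explicit send or receive.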
These technologies are now in production 24-7 at Google, Yahoo, Facebook, LinkedIn, and others (like the NSA), and by the way, Amazon is making it affordable too: you don't even need to buy the hardware.
Another initiative to follow is OpenStack, which is enabling standardization and management of cloud services (large clusters).
If you want to experiment a bit: see http://aws.amazon.com/elasticmapreduce/
You may want to visit: http://hadoop.apache.org/ http://www.openstack.org/
MMoreno, I am very familiar with Hadoop and its ecosystem. This is not typically used for large-scale scientific computing. I once proposed a research project to study this, but it has not come to fruition.
Hadoop is, in the words of one CS department head, a processing-flow paradigm. This is in contrast to a data-flow paradigm. I thought that very interesting when I heard it. I am not sure how that could apply to, say, a CAE program. I think it is because of the finite element model, which is driven by the mathematics and physics, not the programming paradigm. It would be interesting to see if that can be adapted, but I know of no commercial product that has.
If we are to innovate, we must look deeper. From my point of view, we are just seeing the initial stage of a large shift in the way we think and use parallel/distributed computing. We are seeing democratization, low cost availability, and a reduction in complexity to develop software for such systems. Now individuals have access to computing resources that before were only available to governments, large academic institutions and large firms.
Hadoop is just one tool in the toolbox. Hadoop (MapReduce) might not be the right answer for highly CPU-intensive scientific or real-time applications, but it does have a role to play that needs to be evaluated in the context of the specific problem domain.
Brian, thought-provoking post. I do agree that users will need to find some compelling benefits to reaching to the cloud. I recently bought a Chromebook Pixel with one terabyte of cloud disk space. I thought I would immediately just put all of my photo library there, but that has taken much more time than I thought it would. Google Drive has been a great convenience, but I agree there is a need for new apps and uses.
The innovation has been around for quite some time, and it is true that it takes an outsider to see its upside; probably a business graduate will come up with a groundbreaking idea about the future of cloud computing and leave many wondering why they did not think of it first. The changes in computing are far too fast to keep up with, but with the current dynamics in the way people think, innovations will keep sprouting up.