Looking back after reading this piece, I can't help but feel that you left out one very important development that took place somewhere along the cycle of evolution you outlined: the dawn of Windows 95. Now that really defined computing at the time, and it was very affordable back then too. I believe it has played a very important role in developing the modern computer.
Funny, I never thought of it in that light. The similarities are quite striking when viewed in this respect. Even so, I think cloud computing would never have been feasible without the availability of powerful PCs with ultra-fast processors. If you doubt that, then try accessing cloud storage from a Pentium III processor and see the difference.
I totally agree on this one. Cloud computation can give rise to huge security issues. Perhaps restricting cloud computation to less confidential applications would be a wiser adaptation of this technology!
If we manage to balance local computation and cloud computation along with high connectivity, then we can certainly save huge costs on our personal computers for future high-computation applications.
I once contracted at a large company that had their incredibly expensive CAD system on a server, and each of us users had our own computer connected via the network. So when we used the CAD program, portions somehow arrived on our computers, and we used the CAD program. BUT all of our work was saved on the mainframe, or whatever that beast was. It had to be huge to hold all of those files and that huge program with its huge collections of libraries. A whole lot like the cloud, but not "distributed", and when the computer holding the program was down, about 500 workstations were not able to do CAD. That was indeed a bit of a financial drag, I am sure.
My point is that unless the connection is fast enough, which may be the case in some design office of a rich organization, there does seem to be a slowness that happens when a bunch of folks watch some TV or movies, which happened at another company that had all of the programs on the server. Of course nobody was allowed to watch TV or movies, but it was quite obvious when it was happening. Can cloud computing function adequately without lightning-speed internet connections? I don't believe that it can. And in a lot of plants there is NO INTERNET at any speed available, and even an extension cord or an outlet to plug into is hard to find. All of my service information is always either on the notebook's hard drive or on CDs or floppies. "Never show up without being prepared to do the job" is a very good mantra for anybody who is intending to do service for a profit. Remember that clouds can be quite spotty at times.
Who owns your data? Read the small print; even Gmail claims ownership of your data. This is a very important detail.
As an example of me and VCS: I use Linus Torvalds et al.'s Git. Distributed.
As for US patents, for an outsider they offer no protection; the small print is along the lines of "for the protection, security and benefit of the US people". I am not a US citizen, therefore filing a US patent would mean lowering my protection.
I recommend Linux software RAID, a couple of spare enterprise discs, and basic partitions with ext4 for the important underlying filesystem. WD, Fujitsu or Hitachi (not Seagate/Maxtor!).
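For readers unfamiliar with Linux software RAID, a minimal sketch of that kind of setup using `mdadm` might look like the following. The device names (`/dev/sdb`, `/dev/sdc`) and mount point are placeholders for illustration, not a recommendation of specific hardware; these commands require root and will destroy any data on the named discs:

```shell
# Mirror two spare discs as a RAID 1 array (placeholder device names)
mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sdb /dev/sdc

# Put an ext4 filesystem on the array and mount it
mkfs.ext4 /dev/md0
mkdir -p /mnt/data
mount /dev/md0 /mnt/data

# Record the array so it is assembled on boot (path varies by distro)
mdadm --detail --scan >> /etc/mdadm/mdadm.conf
```

You would also add a matching entry to `/etc/fstab` to mount the array automatically.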
Then again, I do not use mobile anything other than laptops with 17" or 18.3" displays; no WiFi or Bluepoo.
TJ, most of the time the cloud is used for backups, so access to the data may be limited. Anyway, I believe the cloud is still cost-effective, even with the increase in data-transfer costs, compared with a traditional data storage system, which needs more maintenance and physical resources.
dgreig, I believe that cloud services are now much more secure and very cost-effective from an organizational point of view. I would even say data is more secure there than in-house, because there are more ways for in-house data to leak out than data in the cloud.