naperlou
User Rank
Blogger
Really??
naperlou   7/29/2013 9:38:02 AM
Brian, I was at IBM in the early 2000s. At that time, Sam Palmisano took over as Chairman and CEO. His first big initiative at that level was On-Demand Computing, which was just an early name for Cloud Computing. In fact, some people still use the term On-Demand Computing because it describes the offering much better than Cloud Computing does. Now, this was over ten years ago. I still have the internal materials that describe the program, and it really is the current idea of Cloud Computing.

There are parallels with what Amazon did to initiate the Cloud Computing era. At General Electric, the Aerospace Group was selling time "out the back door" because they had lots of mainframes (made by GE at the time) and more capacity than they needed. This was a great business, so GE set it up as a stand-alone business. It was the largest worldwide network of its day (1970s-1980s). I worked for them when I first joined GE.

Finally, I was talking to a Computer Science professor the other day. His PhD research was in supercomputing. He says that the day of the massive number of small CPUs is nearing an end; it is simply too hard to program. He is looking instead at performance increases in the individual chip. If you look at something like the Oracle SPARC chip, I think you may be looking at the future of high-end computing. IBM's mainframe chips are no slouch either.

Brian Fuller
User Rank
Blogger
Re: Really??
Brian Fuller   7/29/2013 2:23:40 PM
@naperlou, thanks for commenting. No doubt the concept has been around for decades. But what fascinates me is what you might call the "familiarity-breeds-contempt" problem. GE and IBM were on top of this years ago, but other companies took the ball and ran with it. 

Sometimes as innovators we can't see past a really good idea (familiarity) and so someone else, looking at things from a different perspective, will push the ball downfield. 

LouCovey
User Rank
Iron
Re: Really??
LouCovey   7/29/2013 2:51:53 PM
Yes, cloud computing has been around for a long time under other names, but then e-commerce was around for about a decade before Amazon ever figured it out. I spent the day yesterday with the core group from SCO who created the first e-commerce site, for a pizza store in Santa Cruz, CA.

Just because a bunch of engineers have figured out how to do something doesn't mean it catches fire immediately. It generally takes a non-engineer to figure out how to use it on a massive scale, after the engineers have lost interest and are on to other things.

NadineJ
User Rank
Platinum
Re: Really??
NadineJ   7/30/2013 12:46:17 PM
@Lou, you make a good point. Sometimes the "relaxed mind" belongs to a non-engineer. Innovation without adaptation is just an art project.

William K.
User Rank
Platinum
Re: Really??
William K.   7/30/2013 7:04:13 PM
Cloud computing is being projected as replacing individual computers right now because a few folks have discovered how to profit by selling it. And when you are selling a product and making money at it, the only smart move is to declare loud and long that you are selling "the next big thing, everybody's future." It makes no difference whether that is correct or not; if you are selling a product, you MUST sound like you absolutely believe it is the best choice and the only way to go for the future.

My point is that only one side is talking, and they are talking a lot. There are some serious potential failure modes with cloud computing that are not mentioned very often, and seldom discussed or considered. Security and reliability are two of them, and all we hear are general assertions that the concerns are invalid. But is that really true? I don't think so.

One other thought is that while computers are one convenient way of saving information, they are by no means the primary aid to creativity.

Charles Murray
User Rank
Blogger
Re: Really??
Charles Murray   7/29/2013 6:46:14 PM
So naperlou, the guy who did his Ph.D. thesis in supercomputing says that the massive number of small CPUs is coming to an end? This is mind-boggling to me, because I recall how many people in supercomputing resisted the idea of little CPUs replacing monolithic processor boards. That wasn't so long ago -- maybe 1995. Now the days of little CPUs are passing? The changes are almost too fast to keep up with.

naperlou
User Rank
Blogger
Re: Really??
naperlou   7/29/2013 8:14:52 PM
Chuck, I am not sure I agree with him, but his point is that programming massively parallel machines is extremely difficult. The most widely used approach is called the Message Passing Interface (MPI). I have studied it and applied it. It is HARD. Many software vendors simply rely on the CPU vendors to keep increasing performance. I know of two companies at present that use MPI in commercial products. SAS just came out with a version of their product that uses it, and an electromagnetic CAE software company called REMCOM is advertising its use in IEEE Microwave (my light reading magazine). I am sure there are others, but not many. MPI has been around for a long time, though.
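
To give a sense of what "HARD" means here: even the most trivial MPI program makes every data movement an explicit, hand-matched send/receive pair. A minimal sketch using the mpi4py Python bindings (my illustration, not from any product mentioned above; assumes mpi4py and an MPI runtime are installed, and "ping.py" is just a hypothetical file name):

    # Run with: mpiexec -n 2 python ping.py
    from mpi4py import MPI

    comm = MPI.COMM_WORLD               # communicator holding every process
    rank = comm.Get_rank()              # this process's ID (0, 1, ...)

    if rank == 0:
        data = {"payload": [1, 2, 3]}
        comm.send(data, dest=1, tag=11)      # explicit send to rank 1
        print("rank 0 sent", data)
    elif rank == 1:
        data = comm.recv(source=0, tag=11)   # matching receive, written by hand
        print("rank 1 received", data)

Scale that bookkeeping up to thousands of ranks, with every exchange coded and synchronized manually, and you can see why he calls it too hard to program.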

MMoreno
User Rank
Iron
Re: Really??
MMoreno   7/30/2013 9:51:40 AM
Yes, parallel computing is really hard; fortunately, new paradigms have emerged that are now transforming the technology landscape. The most prominent is Hadoop (and its powerful ecosystem: Hive, Pig, ZooKeeper, Mahout, and HBase), which is disrupting conventional ways of thinking about parallel computing.

These technologies are now in production 24-7 at Google, Yahoo, Facebook, LinkedIn, and others (like the NSA), and, by the way, Amazon is making them affordable too: you don't even need to buy the hardware.

Another initiative to follow is OpenStack, which is enabling standardization and management of cloud services (large clusters).

If you want to experiment a bit, see http://aws.amazon.com/elasticmapreduce/

You may want to visit:
http://hadoop.apache.org/
http://www.openstack.org/
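
To make the MapReduce idea concrete, here is the classic word-count example in the Hadoop Streaming style, where the mapper and reducer are ordinary Python scripts reading stdin and the framework sorts the mapper's output by key between the two stages (a minimal sketch for illustration, not production code):

    # mapper.py -- emit (word, 1) for every word on stdin
    import sys

    for line in sys.stdin:
        for word in line.split():
            print(word + "\t1")

    # reducer.py -- input arrives sorted by key, so equal words are adjacent
    import sys
    from itertools import groupby

    pairs = (line.rstrip("\n").split("\t") for line in sys.stdin)
    for word, group in groupby(pairs, key=lambda kv: kv[0]):
        print(word + "\t" + str(sum(int(n) for _, n in group)))

You can simulate a run locally with "cat input.txt | python mapper.py | sort | python reducer.py"; on Elastic MapReduce the same pair of scripts can be submitted as a streaming step and run across a whole cluster.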

Manuel.

naperlou
User Rank
Blogger
Re: Really??
naperlou   7/30/2013 10:25:22 AM
MMoreno, I am very familiar with Hadoop and its ecosystem. It is not typically used for large-scale scientific computing. I once proposed a research project to study this, but it has not come to fruition.

Hadoop is, in the words of one CS department head, a processing-flow paradigm, in contrast to a data-flow paradigm. I thought that very interesting when I heard it. I am not sure how it could apply to, say, a CAE program. I think that is because of the finite element model, which is driven by the mathematics and physics, not the programming paradigm. It would be interesting to see whether it can be adapted, but I know of no commercial product that has done so.
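
One way to see the mismatch: a finite-element or finite-difference solver is iterative, and every sweep needs fresh values from neighboring cells. A toy 1-D heat-diffusion sweep in Python (my own sketch, not from any CAE product) shows that tight coupling; under Hadoop, each such sweep would become a full map-shuffle-reduce round over the entire model state:

    # Explicit 1-D diffusion: every interior cell reads its two neighbors
    # on every iteration -- the data dependency MapReduce handles poorly.
    def sweep(u, alpha=0.1):
        return [u[0]] + [
            u[i] + alpha * (u[i - 1] - 2 * u[i] + u[i + 1])
            for i in range(1, len(u) - 1)
        ] + [u[-1]]                      # fixed boundary values

    u = [0.0] * 50
    u[25] = 1.0                          # initial heat spike in the middle
    for _ in range(100):                 # many tightly coupled iterations
        u = sweep(u)

MPI handles this with fast neighbor-to-neighbor messages each step, whereas classic MapReduce would rewrite the whole dataset between iterations.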

MMoreno
User Rank
Iron
Re: Really??
MMoreno   7/30/2013 5:09:12 PM
If we are to innovate, we must look deeper. From my point of view, we are just seeing the initial stage of a large shift in the way we think about and use parallel/distributed computing. We are seeing democratization, low-cost availability, and a reduction in the complexity of developing software for such systems. Individuals now have access to computing resources that were previously available only to governments, large academic institutions, and large firms.

Hadoop is just one tool in the toolbox. Hadoop (MapReduce) might not be the right answer for highly CPU-intensive scientific or real-time applications, but it does have a role to play that needs to be evaluated in the context of the specific problem domain.

I found an interesting article that may give context to others following this conversation: http://hortonworks.com/blog/hadoop-in-perspective-systems-for-scientific-computing/

apresher
User Rank
Blogger
Ice Breakers
apresher   7/29/2013 5:19:13 PM
Brian, thought-provoking post. I do agree that users will need to find some compelling benefits to reaching for the cloud. I recently bought a Chromebook Pixel with a terabyte of cloud disk space. I thought I would immediately put my whole photo library there, but that has taken much more time than I expected. Google Drive has been a great convenience, but I agree there is a need for new apps and uses.

Brian Fuller
User Rank
Blogger
Re: Ice Breakers
Brian Fuller   8/5/2013 3:33:10 PM
New apps and uses, and attention to the user interface. I love Google, but its Drive interface, for all its accessibility, feels like Windows 2.0. But they'll get there.

1 TB storage? Wow. 

AnandY
User Rank
Gold
Re: Really??
AnandY   8/11/2013 4:13:51 AM
The innovation has been around for quite some time, and it is true that it often takes an outsider to see its upside. A business graduate may well come up with a groundbreaking idea on the future of cloud computing and leave many wondering why they did not think of it first. The changes in computing are almost too fast to keep up with, but with the current dynamics in the way people think, innovations will keep sprouting up.


