For this to really work, the network will become the big issue. It is imperative to have the GPU near the CPU and the data. If there is network lag, the experience will still not be good for 3D manipulation. For viewing only, perhaps, this would work over an arbitrary network. Considering the advances in networking, this may really become feasible.
What will be important, though, is that all vendors support this. If some of your software runs in the cloud on this type of technology and some does not, you will still need a video card in your system.
I just upgraded my engineering workstation. I purchased a laptop online, direct from the manufacturer, and went through the step-by-step hardware and software selections during the order process. Because I run SolidWorks 2012, which is a heavy, graphics-intensive application, I ordered my new hardware with an i7 quad-core processor and an NVIDIA (K2) graphics card. Both high-end options make the workstation very pricey, running in the $2,500 range for the machine. (Looking at the simple laptop, it appears equal to any $500 (i3) processor model, so "looks are deceiving.") Anyway, back to the point:
When the new workstation arrived, I went to load my SolidWorks CDs and found the DVD/RW drive would not open. (What the heck?! Was I so focused on the processor that I overlooked the external drives during my order?!) A quick call to my application engineer, and he reminded me that I had opted for the secondary hard drive, which occupies the space of the normal DVD/RW or CD drive. "But don't worry, your SolidWorks application software can be downloaded and activated using a soft key," he assured me.
He went on to explain that the trend is moving quickly toward the elimination of external drives in favor of virtual downloads for all apps. Really?!
When I started my career in CAD/CAM engineering around 1980, workstations were only big, dumb terminals (with hoods); all computing was done on the mainframe CPU. Thirty years later, we are returning to a similar architecture, but the local mainframe is now the "cloud." Amazing how trends cycle back around.
It makes sense to see engineering processing move to the cloud. The need for processing power keeps growing, and there is increasing interest in using mobile devices, especially among young engineers.
It's about time. Although restrictive, terminal PCs have been put to good use for over 50 years (in one form or another). It is a practical idea.
I would love for NVIDIA to extend that concept to PC gamers and/or arts workers. Having to constantly upgrade hardware is costly. Paying for graphics capability the way one pays for a video streaming service is a solid model. People pay $60 a year to play Xbox games online; why wouldn't they do the same to have up-to-date graphical capabilities?
Bandwidth is the ultimate limitation. Take an NVIDIA GeForce GTX 680: it has a memory bandwidth of 192.2 GB/sec. For a home user, matching that is impossible. Few businesses, let alone individuals, have access to InfiniBand or OC-3072 internet connections. And fiber optics to the home, at the moment, still limits bandwidth... for those who can even get it.
When these technologies open up for the average user, terminal computing will take over.
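To put the commenter's numbers in perspective, here is a back-of-the-envelope sketch comparing the GTX 680's quoted memory bandwidth against typical home link speeds. The link speeds are illustrative assumptions, not measurements, and note this compares on-card memory bandwidth to the network; actual remote-rendering services only stream compressed video, which needs far less.

```python
# Rough comparison: GPU memory bandwidth vs. assumed home internet links.
# GTX 680 figure is from its spec sheet; link speeds are illustrative.

GPU_MEM_BANDWIDTH_GBIT = 192.2 * 8  # 192.2 GB/s expressed in gigabits/s

links_gbit = {
    "cable (100 Mbit/s)": 0.1,
    "gigabit fiber": 1.0,
    "10 GbE (rare at home)": 10.0,
}

for name, gbit in links_gbit.items():
    shortfall = GPU_MEM_BANDWIDTH_GBIT / gbit
    print(f"{name}: {shortfall:,.0f}x short of GTX 680 memory bandwidth")
```

Even an unusually fast 10-gigabit home link falls short by two orders of magnitude, which is why these services ship compressed frames rather than raw GPU data.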