Maybe the Manager of Geeks looked at the whiteboard after a cryptic technical presentation by Geeks and identified the object they kept pointing to as a cloud, so that must have been what they were talking about. In fact, Geeks have been using cloud symbols in telecommunications and network diagrams for many years.
I smell the presence of Homeland Security. I think it is of fundamental importance that no matter how secure the cloud may become, organizations can be infiltrated, and espionage is both a political and a technological nightmare. I worked in a building that contained one of the USA's eleven Internet traffic hubs, and HS had a listening room upstairs. Take that one permutation further and there is no such thing as a secure design stored in the cloud. Considering PLM, I would think that putting all of this information in the cloud would be akin to undressing in front of a TSA official instead of submitting to a body scan.
Thanks for wading in, Arenasolutions, and I do agree with you that the cloud is not a passing fad.
@Chuck: I don't have any intel on rates of adoption. My guess is it's pretty close to nil at this point. I think what we're seeing is more education about the benefits of the cloud, along with architecture changes at the vendor level for PLM (and some CAD, but mostly PLM, collaboration tools, and simulation), so vendors are prepared when the shift to this more modern compute architecture dominates.
Hi! Thanks for including Arena in your article. As someone who works for a cloud PLM provider, I have to say I respectfully disagree with naperlou's skepticism about the cloud's long-term viability.
For some time, manufacturers and engineers were slow to adopt cloud solutions, but we have definitely seen increased adoption rates in the last few years, as manufacturers begin to see the benefits of centralized data management. I expect that with the arrival of Autodesk to the cloud PLM scene, this trend will continue.
The way we see it, cloud is not a flash in the pan; it's a response to the demands of modern product development. To be useful, product data must be seamlessly transferable across business lines to both suppliers and customers, especially in high-tech organizations, which often outsource. Companies need to collaborate on changes and send manufacturing data to several contract manufacturers (CMs) at high velocity, which makes cloud-based applications ideal for data transfer. This trend is not going away, and cloud solutions will only become more useful in the future, especially as cloud providers begin to sync up and offer really interesting functionality to customers. (For example, we recently partnered with Octopart to offer real-time supplier availability, cost, and compliance info for BOMs. That sort of collaboration is not possible with on-premise solutions.)
Beth, are they seeing that the first uptake of this concept is among larger companies? If a larger company can afford its own private cloud, it would seem as if one of the big barriers to adoption (entrusting it to a cloud outside its own walls) would be automatically taken care of.
My feeling is similar to yours, naperlou: most engineering organizations aren't yet ready to adopt a cloud approach, for a variety of reasons. That said, IT organizations are flocking to the cloud paradigm, both public clouds and private clouds, and likely, over time, hybrid clouds that mix the two.
There is still a lot of work to be done to move the cloud beyond mere virtualization, but there are benefits to be had. Even if companies aren't ready, I think it's important that design tool vendors have a forward-thinking strategy to get there and continue to evolve their platforms, so they can take advantage of the new capabilities when and where it makes sense.
Beth, as you might have figured out, I am skeptical about the long-term cloud computing situation. Cloud is just the same as the timesharing of the past. That is gone because it became cheaper to get your own computers. Cloud works in some situations, such as where workload is variable and for startups. On the other hand, most cloud spend is for private clouds. This is really just server consolidation and virtualization, which has been going on for some time. In addition, if you add up the long-term costs of being on the public cloud, private or on-site resources are cheaper within 1 to 3 years.
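The break-even argument above can be sketched as a toy rent-vs-buy comparison. All dollar figures below are hypothetical placeholders chosen only to illustrate the shape of the calculation, not real cloud or hardware prices:

```python
# Toy break-even comparison between renting public-cloud capacity and
# owning on-premise hardware. Every dollar figure is a hypothetical
# placeholder for illustration only.

CLOUD_MONTHLY = 2000    # hypothetical monthly public-cloud bill
ONPREM_CAPEX = 30000    # hypothetical up-front hardware purchase
ONPREM_MONTHLY = 800    # hypothetical power, space, and admin cost

def cumulative_cloud(months: int) -> int:
    """Total public-cloud spend after the given number of months."""
    return CLOUD_MONTHLY * months

def cumulative_onprem(months: int) -> int:
    """Total on-premise spend (capex plus running cost) after the given months."""
    return ONPREM_CAPEX + ONPREM_MONTHLY * months

# Find the first month where owning is no longer more expensive than renting.
month = 1
while cumulative_onprem(month) > cumulative_cloud(month):
    month += 1

print(month)  # with these placeholder numbers, on-premise breaks even at month 25
```

With these particular placeholder inputs the crossover lands at about two years, consistent with the 1-to-3-year range claimed above; real numbers depend heavily on utilization, discounts, and staffing costs.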
In the case of PLM, there is often a heavy graphics requirement as well. What makes me think something can be done in this area is watching my son play a fancy video game on his PC. The game is "played" on a number of servers. He has a PC he built with a powerful graphics card. So, even though this game is "in the cloud" (which really means on the network), there is lots of processing going on at the local PC level. I expect cloud computing for engineering products will be similar.
For industrial control applications, or even a simple assembly line, that machine can go almost 24/7 without a break. But what happens when the task is a little more complex? That’s where the “smart” machine would come in. The smart machine is one that has some simple (or complex in some cases) processing capability to be able to adapt to changing conditions. Such machines are suited for a host of applications, including automotive, aerospace, defense, medical, computers and electronics, telecommunications, consumer goods, and so on. This discussion will examine what’s possible with smart machines, and what tradeoffs need to be made to implement such a solution.