How Engineers Can Use the Cloud

DN Staff

October 5, 2010


The National Institute of Standards and Technology defines cloud computing as "a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction." Homing in on the key defining terms, you may recognize that the key features of the "cloud" include on-demand access, shared compute resources and dynamic provisioning.

Clearly this type of dynamic, accessible processing can be useful in many ways and has already found its way into many applications. For companies that have begun using cloud computing, the benefits include rapid reprovisioning, reduced upfront costs, "anywhere" access, pooling resources across users through multi-tenancy, load spike handling and better reliability, uptime and maintenance than they could have achieved on their own. However, this is not the entire story.

Critics will say that utilizing the cloud exposes companies and IP to security threats, failing to offer enough benefit in many circumstances to justify the risk. Nevertheless, a number of areas have begun to emerge as high-level uses for cloud computing in the general engineering space. Details on these areas follow.

Data Aggregation

Data is the lifeblood of engineering. From the definition of cloud computing, it is clear that one of the scalable computing resources is storage, the backbone of a data aggregation strategy. Data aggregation using the cloud is most effective when data sources are globally distributed and require convenient on-demand access. Consider a wind farm monitoring example: one of the most expensive parts to fix on a wind turbine, if it fails catastrophically, is the large rotating assembly behind the blades. There is a whole science behind monitoring rotating machinery vibrations to schedule normal maintenance and analyze warning signs of future sudden events. Using an acquisition system for monitoring coupled with cloud-based storage, companies can aggregate the data in a central, scalable location, automatically sweep the data for alarm conditions and serve the data to any user globally who needs access. This can all be done without large capital investments in onsite servers. A central tenet of the cloud is that computing resources can change dynamically as your business grows (or shrinks).
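As a minimal sketch of the alarm-sweep idea, the snippet below scans aggregated vibration readings for turbines whose RMS vibration level crosses a threshold. The data layout, threshold value and turbine IDs are illustrative assumptions, not taken from any particular monitoring product.

```python
# Sketch: sweep cloud-aggregated vibration data for alarm conditions.
# Threshold and sample data are illustrative assumptions.

def sweep_for_alarms(readings, rms_threshold=0.5):
    """Return IDs of turbines whose RMS vibration exceeds the threshold.

    `readings` maps a turbine ID to a list of accelerometer samples (g).
    """
    alarms = []
    for turbine_id, samples in readings.items():
        rms = (sum(s * s for s in samples) / len(samples)) ** 0.5
        if rms > rms_threshold:
            alarms.append(turbine_id)
    return alarms

fleet = {
    "turbine-01": [0.02, -0.03, 0.01, 0.02],   # healthy vibration levels
    "turbine-07": [0.9, -1.1, 1.0, -0.8],      # elevated vibration
}
print(sweep_for_alarms(fleet))  # -> ['turbine-07']
```

In a cloud deployment, a job like this could run on a schedule against the central store, so every site's data is checked the same way without any onsite compute.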

Software as a Service (SaaS)
Many self-hosted operations report that their servers run at 10-20 percent utilization under normal operating conditions to account for periodic extraordinary usage patterns. Using cloud computing, engineers can mitigate this underutilization of a resource for their Web applications. As an example, a company at Solidworks World 2010 showcased previews of upcoming technology that facilitates game-changing levels of collaboration on design using cloud back ends. The design is "alive" in the cloud, and anyone involved, from design engineer to executive management, can work with a design on any device. Also, at NI Week 2010, National Instruments previewed a completely hosted LabVIEW Web UI Builder, a browser-based development program for creating Web user interfaces, connecting the custom interface to Web services and exporting the UI as a thin client to embed essentially anywhere on the Web. These examples provide a glimpse into how engineering applications can run completely in the cloud or take a hybrid approach, where cloud applications augment a locally running application.
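To see why that 10-20 percent utilization figure matters, a back-of-envelope comparison of always-on peak provisioning versus paying only for the capacity actually used can be sketched as follows. The hourly rate and server counts are invented for illustration.

```python
# Back-of-envelope: static provisioning vs. on-demand scaling, using the
# 10-20 percent utilization figure cited above. Costs are illustrative.

HOURS_PER_MONTH = 730

def static_cost(peak_servers, hourly_rate):
    # Self-hosted: pay for peak capacity around the clock.
    return peak_servers * hourly_rate * HOURS_PER_MONTH

def elastic_cost(avg_utilization, peak_servers, hourly_rate):
    # Cloud: pay roughly only for the capacity actually used.
    return peak_servers * avg_utilization * hourly_rate * HOURS_PER_MONTH

rate = 0.50  # assumed $/server-hour
print(static_cost(10, rate))         # -> 3650.0
print(elastic_cost(0.15, 10, rate))  # -> 547.5
```

At 15 percent average utilization, the elastic bill in this toy model is under a sixth of the always-on one, which is the economic argument the paragraph above is making.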

High Compute Power
Simulation sweeps on large ASIC designs or dynamic analysis of mechanical models are only a few of the computationally intensive tasks one might want to target at a rented bank of computing resources that will scale up quickly to handle the load and scale down quickly when not utilized. In some FPGA simulations, a nanosecond of simulated time can mean minutes of real-time churning. One does not need to be a mathematician to grasp how long a high-fidelity simulation of an FPGA video algorithm could take to crunch a one-minute clip.
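Taking that rule of thumb literally, here is a rough estimate of the wall-clock cost of fully simulating a one-minute clip. The slowdown factor is an assumption for illustration, not a measured figure.

```python
# Rough estimate: how long would full-fidelity simulation of a one-minute
# clip take if one simulated nanosecond costs one minute of wall-clock time?
# The 60x-per-nanosecond slowdown factor is an illustrative assumption.

WALL_SECONDS_PER_SIM_NS = 60          # assume 1 ns simulated ~ 1 min real
clip_ns = 60 * 1e9                    # one minute of clip, in nanoseconds

wall_clock_seconds = clip_ns * WALL_SECONDS_PER_SIM_NS
wall_clock_years = wall_clock_seconds / (3600 * 24 * 365)
print(f"{wall_clock_years:.0f} years")  # prints "114155 years"
```

The absurd total is the point: nobody simulates a whole clip at this fidelity on one machine, which is exactly why a scalable bank of rented compute (or a much coarser model) is attractive.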

EDA tools are already experimenting with this type of system for highly computational tasks. One of the promises of cloud computing is the virtually unlimited scalability of the processor cycles engineers can have at their disposal. Of course, this is an oversimplification, as not all intensive tasks can be parallelized. But in theory, the cloud represents infinite open lanes of instruction crunching. In practice, companies are already seeing benefits. For example, National Instruments is targeting FPGA compilation to cloud-based computers. In fact, the prototypes shown at NI Week display a user selecting the NI Hosted Services compilation server and subsequently using the LabVIEW FPGA tools in the same way. The only difference is that compilation happens on an optimized, high-RAM dedicated computer in the cloud rather than a locally maintained server. The transfer of files and statuses is handled in the background over high-security Web service connections to a set of cloud machines that take care of authentication, license checking, scheduling and the specialized work of an FPGA compile. For more on this, see the Design News Automation & Control blog entry on the topic at: http://bit.ly/DNCloud.
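A hypothetical sketch of such a workflow: submitting a compile job over HTTPS and then polling its status. The endpoint, field names and job states below are invented for illustration; they are not NI's actual API.

```python
# Hypothetical compile-job submission over HTTPS, in the spirit of the
# workflow described above. URL paths, JSON fields and job states are
# invented for illustration; no real service is implied.

import json
import urllib.request

def build_job_request(server, token, sources):
    """Assemble the HTTPS URL and JSON body for a new compile job."""
    url = f"https://{server}/compile/jobs"   # HTTPS only, per the text
    body = json.dumps({"token": token, "sources": sources}).encode()
    return url, body

def submit_compile_job(server, token, sources):
    """POST the design sources; return a job ID for later status polling."""
    url, body = build_job_request(server, token, sources)
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:  # network call; needs a live server
        return json.load(resp)["job_id"]

def poll_status(server, token, job_id):
    """GET the current job state, e.g. queued, compiling, done or failed."""
    url = f"https://{server}/compile/jobs/{job_id}?token={token}"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)["state"]
```

The tool-side client stays thin: it uploads sources, then polls until the job is done, while the cloud side handles authentication, licensing and scheduling.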

Security Concerns
Many critics of cloud computing are quick to point out the drawbacks of cloud technology for certain engineering tasks. While some complex display tasks (such as rendering 3D models) are nearly impossible to perform solely through a network connection to the underlying data, security concerns are the most cited drawback to putting more of an organization's important, sensitive engineering work in the cloud. With the LabVIEW FPGA Cloud Compile server, National Instruments has worked through many of these issues, including:

  • All calls to the server should use a secure user ID, and all access to customer data on cloud servers should be gated by services on the server side, which re-check the end user's permissions on each access;

  • All passwords, as a security best practice, should be stored using strong one-way encryption and never stored in any other way, especially plain text;

  • Even the service provider should not be able to decrypt a user's password;

  • Users who forget their password must have it reset;

  • All calls from the user to the cloud should also be made securely over encrypted HTTP (HTTPS), with added security mechanisms for login authentication and data protection;

  • The service provider should have a CISSP-certified security engineer on staff, or consult one, to audit the security of the solution from design through implementation; and

  • The cloud infrastructure provider, such as Amazon Web Services, should have high-security data centers as well. For example, Amazon's data centers have SAS 70 Type II certification.
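The "strong one-way encryption" practice in the list above can be sketched with a salted, deliberately slow password hash: only the salt and digest are stored, never the password, so not even the service provider can recover it. Parameter choices here are illustrative.

```python
# Sketch of storing passwords with a salted one-way hash (PBKDF2),
# as the best-practice bullets above recommend. Iteration count and
# salt size are illustrative choices.

import hashlib
import hmac
import os

ITERATIONS = 200_000  # deliberately slow, to resist brute-force attacks

def hash_password(password, salt=None):
    """Return (salt, digest); the plain-text password is never stored."""
    salt = salt or os.urandom(16)  # unique random salt per user
    digest = hashlib.pbkdf2_hmac(
        "sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password, salt, stored_digest):
    """Re-derive the digest and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac(
        "sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, stored_digest)

salt, digest = hash_password("correct horse")
print(verify_password("correct horse", salt, digest))  # -> True
print(verify_password("wrong guess", salt, digest))    # -> False
```

Because the hash is one-way, a forgotten password cannot be recovered, only reset, which is exactly why the list includes a reset requirement.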


In the consumer world, the shift toward hosting all types of data, applications and computationally intensive tasks on scalable, on-demand systems is clear. It is only a matter of time until the research and development community starts using similar technology in related use cases for their day-to-day work.

Rick Kuhlman is LabVIEW FPGA product manager at National Instruments.

For more information, visit: http://zone.ni.com/devzone/cda/tut/p/id/11573
