I'm not a real fan of open source. The open source compilers I've used would get the job done, but they were terribly inefficient. I sat down and looked at the object code and wondered why the compiler was loading one register just to pass it to another register, pushing it onto the stack, popping it off the stack, and then testing the bit it was looking for in the first place. Yes, it's open source and I could modify it to suit my needs, but I would rather buy the right tool for my project than build my own. In a mechanical vein: I could build my own drill press from a motor, chuck, and stand that I bought open source, or I could just buy a drill press, drill the hole I needed, and carry on with a single project rather than working on two.
I haven't gone that far yet, mrdon, although this past week suggests that maybe I should have. Somehow I lost something I had spent a week working on, and I couldn't resurrect the file using the standard recovery tools. In this case, I'm thinking it wasn't an electronics failure but PEBKAC.
Absolutely, mrdon! Maybe I'm paranoid, but I like keeping a copy of my backups under my exclusive control. That way, if anything happens, from a natural disaster to a business going under to servers being hacked, I don't lose my stuff.
NadineJ, I agree. Open source is the way to go for designs that have no profit margin associated with them, where placing them on cloud servers is not a concern. I would still use traditional backup storage devices such as hard drives and thumb drives to ensure I always have a copy of my designs. Also, IP theft can happen within the corporate cloud server organization just as easily as from a hacker on the outside. Be responsible and back up your own data.
I wanted to make you aware of (and hopefully get Design News to publicize) an important not-for-profit (volunteer), vendor-neutral activity that should be of interest to your readers. I've joined the "HPC (Ueber-Cloud) Experiment" as a mentor/supervisor of multiple teams, and also to help publicize and grow this project, which investigates the actual processes and obstacles for engineers who currently do desktop/workstation CAE simulation (FEA, CFD) and need to scale up to occasional or heavy HPC/cloud-based compute power for larger problems and faster turnaround. The details of the project so far (it started last summer and is now in Round 2) can be read at these two links:
We are particularly interested in reaching out to more workstation/desktop-level simulation engineers (end users) to take part in Round 3. I am hoping that Design News can mention this effort and provide a link to one or both of the above e-articles. Wolfgang Gentzsch, an acknowledged global expert on grid/cloud computing and co-founder of the HPC Experiment, is also interested in writing a more detailed article for Design News on the purpose and results of the Experiment so far.
Please feel free to contact me (email@example.com) or Wolfgang directly (firstname.lastname@example.org) for more details.
On another blog we were just talking about "the connection." My present access slows down at times, occasionally dropping to perhaps 330 baud. Not kbaud, just baud. Just a bit faster than I type. So a 22K one-page letter takes a while, a two-page PDF takes minutes, and even small executable files take a LONNNGG time. So what good is a slow connection? How slow is a slow cloud?
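To put that in perspective, here is a rough back-of-the-envelope sketch. Only the 330 baud and 22K figures come from the comment above; the PDF and executable sizes, and the assumption of roughly 10 bits on the wire per byte (8 data bits plus start/stop framing), are my own illustrative guesses.

```python
# Rough transfer-time estimate at dial-up-era speeds (illustrative only).
# Assumes ~10 bits on the wire per byte (8 data bits + start/stop framing).

def transfer_time_seconds(size_bytes: float, baud: float, bits_per_byte: float = 10.0) -> float:
    """Approximate time to move size_bytes over a line running at the given baud rate."""
    bytes_per_second = baud / bits_per_byte
    return size_bytes / bytes_per_second

# Example file sizes (assumed, not from the original comment except the 22K letter).
for label, size in [("22 KB letter", 22_000),
                    ("2-page PDF (~200 KB)", 200_000),
                    ("small executable (~1 MB)", 1_000_000)]:
    secs = transfer_time_seconds(size, baud=330)
    print(f"{label}: about {secs / 60:.0f} minutes at 330 baud")
```

Under those assumptions the 22K letter alone takes on the order of 11 minutes, and a 1 MB download runs to several hours, which is why a slow connection makes any cloud workflow painful.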
With the descriptions of the different kinds of "cloud," the one common item is the connection. Running AutoCAD from a local server was bad; I can't imagine how much worse running a more powerful CAD program over a longer link would be.
As for security, backing up to offsite backup servers every night is about as safe from loss as data can be, keeping the servers behind three physically locked levels of security is insurance against physical theft, and a really good firewall plus file encryption is fair protection against having the data hacked. But the most secure system I am aware of keeps all of the vital data on a non-internet-connected machine. Data that must be sent out is copied to a memory stick and then sent from a connected computer. No, it is not very convenient, but it is more secure.
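As a concrete illustration of the "file encryption" step in a setup like that, here is a minimal sketch of encrypting a file before it leaves the air-gapped machine on a memory stick. It assumes Python with the third-party cryptography package; the file names, the mount path, and the key-storage approach are hypothetical, not a prescription.

```python
# Minimal sketch: encrypt a file before copying it to a memory stick.
# Assumes the third-party "cryptography" package (pip install cryptography).
# File names, paths, and key handling here are illustrative only.
from cryptography.fernet import Fernet

# Generate a key once and keep it only on the air-gapped machine.
# In practice the key belongs somewhere safer than a plain file next to the data.
key = Fernet.generate_key()
with open("backup.key", "wb") as key_file:
    key_file.write(key)

fernet = Fernet(key)

# Encrypt the vital data; only the ciphertext goes onto the memory stick.
with open("vital_design_data.dwg", "rb") as src:
    ciphertext = fernet.encrypt(src.read())

with open("/media/usb_stick/vital_design_data.dwg.enc", "wb") as dst:
    dst.write(ciphertext)
```

The connected computer that forwards the file never sees the key, so even if it is compromised, only ciphertext is exposed.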
Hi all - love all the great comments here! Security is absolutely an ongoing challenge and concern, and an area that varies quite a bit between providers. Our public cloud offerings like vCloud are designed with security at the forefront and are secured using Secureworks and monitored 24x7.
Something else to consider is that when data is stored on the device, there is a risk of the device itself being lost or stolen, and then the data may be exposed if not well protected. Many customers (certainly not all, as these comments attest!) are coming to the conclusion that the overall risk profile is lower with cloud models.
Theft: it may be easier for those who are very tech savvy to steal ideas, but that happens in the analogue world today. I learned years ago to 'give it away,' so to speak. Thieves will always look for opportunities to steal ideas, but good designers will always create more ideas.
Accessibility: one unfortunate reality that is rarely addressed. Connecting securely to the internet is expensive for most and unrealistic for some. Many small companies, and even individuals, struggle to pay for a fast and reliable connection. If everything is in the cloud, you have to stay connected.
That's a major issue. I was at a company that simply kept all parts on the server, and the traffic generated as every single screw (we even had one guy who drew the threads on his screws) and washer was pulled into a design brought the company's server to its knees. I can't imagine a staff of engineers doing the same damage to a web gateway.