As the machine-to-machine (M2M) revolution gets underway, there are some interesting developments in the technology that makes it possible and cost effective.
The term normally used is big data, and there are many moving parts where big data is concerned. The most important is the falling cost of storage. Petabyte-scale databases have existed for the past couple of decades, but they were restricted to high-value applications, such as tracking the nuclear stockpile and data mining for very large retail organizations (e.g., Wal-Mart). The storage costs for these projects were huge, but the scale made them cost effective. Now, for much lower-"value" data, we are seeing an explosion in the amount of data kept and analyzed.
As Al Presher mentioned on this site in his article The Industrial Internet of Things, companies like General Electric see continuing growth in M2M data, with a consequent increase in efficiency. What is significant is the ability to analyze this data and apply it to both operation and design. Among the interesting possibilities is real-time analysis of trends for failure prediction, as opposed to after-the-fact failure analysis; large amounts of historical data can thus be included in the control loop. In design, the ability to access large datasets from similar designs can help improve a new one.
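To make the failure-prediction idea concrete, here is a minimal, entirely hypothetical sketch of trend analysis on a live sensor stream: it flags any reading that drifts more than three standard deviations from a rolling historical baseline. The sensor values, window size, and threshold are all invented for illustration; a real system would be far more sophisticated.

```python
# Hypothetical sketch: predicting failures from a live sensor stream
# instead of analyzing them after the fact. Flags readings that drift
# more than 3 standard deviations from a rolling historical baseline.
from collections import deque
from statistics import mean, stdev

def detect_anomalies(readings, window=20, threshold=3.0):
    """Yield (index, value) pairs that deviate from the rolling baseline."""
    history = deque(maxlen=window)
    for i, value in enumerate(readings):
        if len(history) == window:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(value - mu) > threshold * sigma:
                yield i, value
        history.append(value)

# Steady (made-up) bearing-temperature readings with one sudden spike
stream = [70.0 + 0.1 * (i % 5) for i in range(40)]
stream[30] = 85.0
print(list(detect_anomalies(stream)))  # [(30, 85.0)]
```

The appeal is that the "historical data" lives right in the control loop: the rolling window is the history, and the decision is made as each reading arrives.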
The cost of storage keeps coming down thanks to innovations in storage devices and controllers, and that is just for the current technology of spinning disks. As solid-state storage matures, it promises to cut energy consumption and speed up access. It should also lead to further increases in density and, ultimately, capacity.
But there are even more exotic technologies in development that promise even greater efficiencies and capacity. One is the use of DNA to store information. The technology may be a few years off, but it shows promise in the lab today. Rather than recording information on electronic devices, it encodes data in actual strands of DNA. One interesting aspect is that the compounds are stable for very long periods of time. They are also much denser than current storage technologies. Add the fact that they consume no power when sitting idle, and you have the ability to store huge amounts of data far more cheaply.
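To illustrate the basic idea, here is a toy sketch that maps binary data to DNA bases at two bits per nucleotide. This is purely illustrative: real laboratory encoding schemes add addressing and error correction and deliberately avoid long runs of a single base.

```python
# Illustrative sketch only: mapping binary data to DNA bases, two bits
# per nucleotide (A=00, C=01, G=10, T=11). Real schemes add redundancy,
# addressing, and error correction, and avoid long runs of one base.
BASES = "ACGT"

def bytes_to_dna(data: bytes) -> str:
    """Encode each byte as four nucleotides, most-significant bits first."""
    return "".join(
        BASES[(byte >> shift) & 0b11]
        for byte in data
        for shift in (6, 4, 2, 0)
    )

def dna_to_bytes(strand: str) -> bytes:
    """Invert the encoding: four nucleotides back into one byte."""
    out = bytearray()
    for i in range(0, len(strand), 4):
        byte = 0
        for base in strand[i:i + 4]:
            byte = (byte << 2) | BASES.index(base)
        out.append(byte)
    return bytes(out)

strand = bytes_to_dna(b"M2M")
print(strand)               # CATCATAGCATC (four bases per byte)
print(dna_to_bytes(strand)) # b'M2M'
```

Even this naive packing stores two bits per molecule-scale symbol, which hints at why the density numbers reported from the lab are so striking.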
With the M2M population expected to reach 50 billion connected devices in the not-too-distant future, the amount of data generated is going to be tremendous. Storing all that information is truly a big-data challenge.
Connecting one device to another, sharing, working together only seems logical. It's fascinating to think of the network path through my phone, to the refrigerator, to the television, and my watch. The path of some future sci-fi hacker. All to make me late for an event. All for some trivial purpose, a diminished reason in a group of 8.5 billion. In his dark room, screens flicker. He loses the access point through my toaster. Foiled plans. He attempts to find a port through my blender.
The fodder of science fiction for sure. Can't wait.
Great post. I had to do some online looking to bring myself up to speed with DNA storage and found an excellent article titled "Harvard Cracks DNA Storage" (17 August 2012). Harvard has successfully stored 5.5 petabits, or about 700 terabytes, of information on a single gram of DNA. According to the article, it would take 14,000 50-gigabyte Blu-ray discs to store a comparable amount of information. This just blows me away. I'm going "back to school" on this one, and I certainly appreciate you giving us this information. It will be very interesting to see how much time must elapse before this method becomes commercially available.
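The arithmetic checks out, by the way. A quick calculation shows that 5.5 petabits is about 687.5 terabytes, which the press rounded to 700, and 700 TB divided across 50 GB discs does come to 14,000:

```python
# Sanity-checking the figures quoted in the Harvard article.
petabits = 5.5
terabytes = petabits * 1000 / 8   # 1 petabit = 1000 terabits; 8 bits per byte
print(terabytes)                  # 687.5 TB, rounded up to ~700 TB in press

bluray_gb = 50
discs = 700 * 1000 / bluray_gb    # 700 TB = 700,000 GB
print(discs)                      # 14000.0 discs
```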
The idea of intelligently using data that's already available on industrial equipment, and making intelligent decisions based on much simpler algorithms, is by itself already mind-boggling to me. This is an area where what seems possible in the abstract is sometimes a bridge too far. Current developments that use Ethernet network solutions such as PROFIenergy and sercos Energy (plus the ODVA Energy Initiative) to manage energy usage on machines are technology that should make a big impact in just the next few years.
Great blog, Naperlou. As I read your description of where big data is going (especially the part about using DNA), I couldn't help but wonder how this description will look to someone 50 years from now. They'd probably consider it quaint -- the way we might look back at magnetic drum storage today. To me, though, the technology you describe is mind-boggling.
Naperlou, excellent article. I do think the combination of big data with software that implements algorithms for genuinely intelligent decision-making is the key to leveraging the possibilities of M2M.
Great blog naperlou! Did you have a chance to catch the recent NOVA episode on Drone technology? http://goo.gl/xqLRK
I really expect it to be a script error, but the show claims that the new ARGUS 1.8-gigapixel imaging system records 1 million terabytes (that's 1 exabyte) of image data per day. That sounds pretty fantastic... I'm not sure where the DOD would store it all, but perhaps they have technology that can store this amount of data on a USB stick.
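For what it's worth, a rough back-of-envelope check supports the script-error theory. With an assumed frame rate and bytes per pixel (I don't have the real specs, so these are guesses), the raw data rate comes out on the order of petabytes per day, not an exabyte:

```python
# Back-of-envelope check on the "1 exabyte per day" claim, using assumed
# parameters: the frame rate and bytes per pixel are guesses, not specs.
pixels = 1.8e9           # 1.8 gigapixels per frame
bytes_per_pixel = 1      # assume 8-bit grayscale
fps = 12                 # assumed frame rate
seconds_per_day = 86400

bytes_per_day = pixels * bytes_per_pixel * fps * seconds_per_day
print(bytes_per_day / 1e15)   # ~1.87 petabytes per day, uncompressed
print(1e18 / bytes_per_day)   # the claimed 1 EB/day is ~536x this estimate
```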
As you suggest, a real design dilemma is being created in figuring out how to analyze all of this historical data. I think I've mentioned this before in another comment, but Jeff Hawkins of "On Intelligence" fame has a new product called "Grok" that changes the paradigm from analyzing big data to analyzing a large number of data streams in real time. Definitely an emerging area of research. Jeff's introduction to Grok is here: http://youtu.be/1mq8c2Orgso