Connecting one device to another, sharing, working together: it only seems logical. It's fascinating to imagine the network path from my phone to the refrigerator, to the television, to my watch. The path of some future sci-fi hacker, all to make me late for an event, all for some trivial purpose, one diminished motive among 8.5 billion people. In his dark room, screens flicker. He loses the access point through my toaster. Foiled plans. He tries again, hunting for an open port through my blender.
The fodder of science fiction for sure. Can't wait.
Great post. I had to do some online reading to bring myself up to speed with DNA storage and found an excellent article titled "Harvard Cracks DNA Storage" (17 August 2012). Harvard has successfully stored 5.5 petabits, or about 700 terabytes, of information on a single gram of DNA. According to them, it would take 14,000 50-gigabyte Blu-ray discs to store a comparable amount of information. This just blows me away. I'm going "back to school" on this one and I certainly appreciate you giving us this information. It will be very interesting to see how much time must elapse before this method becomes commercially available.
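For anyone who wants to check the article's numbers, the unit conversion works out. A minimal sketch of the arithmetic (assuming the quoted 5.5 petabits per gram and standard 50 GB dual-layer Blu-ray discs):

```python
# Back-of-envelope check of the Harvard DNA storage figures:
# 5.5 petabits per gram vs. 50 GB Blu-ray discs.

PETABIT = 1e15   # bits
TERABYTE = 1e12  # bytes

stored_bits = 5.5 * PETABIT
stored_terabytes = stored_bits / 8 / TERABYTE  # bits -> bytes -> TB

blu_ray_bytes = 50e9  # one 50 GB dual-layer disc
discs_needed = stored_terabytes * TERABYTE / blu_ray_bytes

print(f"{stored_terabytes:.1f} TB ≈ {discs_needed:.0f} Blu-ray discs")
# → 687.5 TB ≈ 13750 Blu-ray discs
```

So 5.5 petabits is about 687.5 TB, which rounds to the "700 terabytes" quoted, and it would fill roughly 13,750 discs, consistent with the article's ~14,000 figure.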
The idea of intelligently using data that's already available on industrial equipment, and making intelligent decisions based on much simpler algorithms, is by itself already mind-boggling for me. This is an area where what seems possible in the abstract is sometimes a bridge too far. Current developments aimed at using Ethernet network solutions such as PROFIenergy and sercos Energy (plus the ODVA Energy Initiative) to manage energy usage on machines are technologies that should make a big impact within just the next few years.
Great blog, Naperlou. As I read your description of where big data is going (especially the part about using DNA), I couldn't help but wonder how this description will look to someone 50 years from now. They'd probably consider it quaint -- the way we might look back at magnetic drum storage today. To me, though, the technology you describe is mind-boggling.
Naperlou, excellent article. I do think that combining big data with software that implements genuinely intelligent decision-making algorithms is the key to leveraging the possibilities of M2M.
Great blog naperlou! Did you have a chance to catch the recent NOVA episode on Drone technology? http://goo.gl/xqLRK
I really expect it to be a script error, but the show claims that the new ARGUS 1.8-gigapixel imaging system records 1 million terabytes (that's 1 exabyte) of image data per day. That sounds pretty fantastic... I'm not sure where the DOD would store it all, but perhaps they have technology that can store this amount of data on a USB stick.
As you suggest, a real design dilemma is being created in figuring out how to analyze all of this historical data. I think I've mentioned this before in another comment, but Jeff Hawkins of "On Intelligence" fame has a new product called "Grok" that changes the paradigm from analyzing big data into analyzing a big number of data streams in real time. Definitely an emerging area of research. Jeff's introduction to Grok is here: http://youtu.be/1mq8c2Orgso
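To make the paradigm shift concrete: instead of batch-analyzing stored history, you keep a small running model per stream and flag anomalies as each sample arrives. This is purely an illustrative sketch of that idea (not Grok's actual algorithm), using an exponentially weighted running mean and variance:

```python
# Illustrative streaming-analysis sketch -- NOT Grok's algorithm.
# One small model per data stream, updated sample by sample.

class StreamMonitor:
    """Exponentially weighted running mean/variance for one data stream."""

    def __init__(self, alpha=0.1, threshold=3.0):
        self.alpha = alpha          # smoothing factor for the running stats
        self.threshold = threshold  # anomaly cutoff, in standard deviations
        self.mean = None
        self.var = 0.0

    def update(self, x):
        """Feed one sample; return True if it looks anomalous."""
        if self.mean is None:       # first sample just seeds the model
            self.mean = x
            return False
        deviation = x - self.mean
        std = self.var ** 0.5
        anomalous = std > 0 and abs(deviation) > self.threshold * std
        # update the running statistics after the check
        self.mean += self.alpha * deviation
        self.var = (1 - self.alpha) * (self.var + self.alpha * deviation ** 2)
        return anomalous

monitor = StreamMonitor()
readings = [10.0, 10.2, 9.9, 10.1, 10.0, 10.2, 9.8, 25.0]
flags = [monitor.update(r) for r in readings]
print(flags)  # only the final spike (25.0) is flagged
```

The appeal of this style is that memory use is constant per stream no matter how long it runs, which is what makes "a big number of data streams in real time" tractable compared with storing and re-mining everything.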
Festo's BionicKangaroo combines pneumatic and electrical drive technology, plus very precise controls and condition monitoring. Like a real kangaroo, the BionicKangaroo robot harvests the kinetic energy of each takeoff and immediately uses it to power the next jump.