Manufacturing plants are on the cusp of producing much more data than they do today. One motivation for the increase in data being gathered for analysis is predictive maintenance actions. The fact that plentiful data can be collected, analyzed, and used to create knowledge that leads to the prevention of downtime is too compelling to ignore. When you compare the cost of a production outage due to machine failure versus the cost of collecting and analyzing the data, the argument makes itself.
How much data is there, and how much data will there be in the future?
What data are we talking about? All control systems and telemetry systems use values called points or tags. These tags hold values such as temperature, amperage, and flow rate. They may include speed settings, directions, or rates of change. These tags are read or written as needed by the processes they support. What we expect to be added are tags that hold additional values.
While a motor is turning, we can also collect lubricant temperature, multiple points of vibration, torque values, and more. If you look at the number of tags configured in a large manufacturing plant today, you will see numbers around 1 million. Simply adding new tags to monitor these additional variables could increase the amount of data by a factor of five or more.
A large factory could have 1 million tags, with each tag at 4 bytes, or 32 bits. If you sample each tag once every 60 seconds, the data rate is only about 533 kilobits per second -- not too much. However, if your tag count goes to 5 million and the scan rate goes to 1 second, the data rate goes to 160 megabits per second.
Or say that I have 500,000 tags and the scan rate is 100 milliseconds: again, 160 megabits per second. This is just SCADA, and it can grow fast as we measure more. Now add a 3D printer that produces terabytes of data per day, and you have many of the same issues data center managers faced as they watched data volumes grow.
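The arithmetic behind these figures is simple enough to sketch. A minimal Python calculation using the numbers above (4-byte tags; the function name is mine, purely for illustration):

```python
def scada_bitrate_mbps(tag_count, tag_bytes=4, scan_seconds=60.0):
    """Aggregate SCADA data rate in megabits per second.

    tag_count:    number of tags being scanned
    tag_bytes:    size of each tag value (4 bytes = 32 bits)
    scan_seconds: interval at which every tag is sampled
    """
    bits_per_scan = tag_count * tag_bytes * 8
    return bits_per_scan / scan_seconds / 1e6

# 1 million tags sampled once a minute -> roughly 0.53 Mbit/s
rate_baseline = scada_bitrate_mbps(1_000_000, scan_seconds=60.0)

# 5 million tags at a 1-second scan -> 160 Mbit/s
rate_grown = scada_bitrate_mbps(5_000_000, scan_seconds=1.0)

# 500,000 tags at a 100 ms scan -> also about 160 Mbit/s
rate_fast = scada_bitrate_mbps(500_000, scan_seconds=0.1)
```

The point of the sketch is how quickly the rate scales: it is linear in both tag count and scan frequency, so a modest change to either multiplies the load.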
A typical data center stores data and serves it through a number of applications that support the business. A manufacturing plant also has applications that produce lots of data, but it has not traditionally been the place where that data is stored or analyzed at scale. The growth in the number of tags defined, and in the rate at which those tags are read, demands data analysis in near real time. Manufacturing plants now have to build an architecture that collects, processes, and stores data locally, because some of the analysis is needed in near real time and must be done on site.
The plant becomes a data center
What is interesting when we look at enterprise IT data centers and manufacturing plants as centers of data is how similar they are when comparing computing, network, and power attributes. Although I have never found an actual data comparison between manufacturing plants and enterprise data centers, my conversations with customers have shown a fairly consistent pattern.
That pattern has been:
- Manufacturing plants
  - 10,000 network ports per million square feet
  - 1,000 computers (PC or PLC) per million square feet
  - 5 megawatts of power
- Data centers
  - 2,000 network ports per 100,000 square feet
  - 1,200 servers (bare metal) per 100,000 square feet
  - 4 megawatts of power
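To put the two profiles on a common footing, the figures above can be normalized to per-square-foot densities. A quick sketch (the numbers are the anecdotal ones from the list; the dictionary layout is mine):

```python
# Figures quoted in the text, normalized to per-square-foot densities.
sites = {
    "manufacturing plant": {"sqft": 1_000_000, "ports": 10_000,
                            "machines": 1_000, "watts": 5_000_000},
    "data center":         {"sqft": 100_000,  "ports": 2_000,
                            "machines": 1_200, "watts": 4_000_000},
}

density = {
    name: {
        "ports_per_sqft":    s["ports"] / s["sqft"],
        "machines_per_sqft": s["machines"] / s["sqft"],
        "watts_per_sqft":    s["watts"] / s["sqft"],
    }
    for name, s in sites.items()
}
```

Normalized this way, the data center is denser per square foot, but at building scale the two host comparable absolute quantities of the same classes of resource -- network ports, computing, and megawatts of power -- which is the point of the comparison.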
The similarity in profiles is fascinating, and it suggests considering the benefits of applying data center technologies to manufacturing plants. For starters, virtualization of compute, storage, and network provides an economic path if the volume of data from manufacturing systems grows and the infrastructure must grow with it -- as appears likely. Virtualization allows greater flexibility and higher availability of systems at lower cost.
A manufacturing plant has the same needs for ease of deployment and operations as a data center does. So can the new paradigms of software-defined networking, OpenStack, and similar tools that abstract the management of these resources be applied to manufacturing?
With these new tools and methods, a data center manager can move an important application to any physical system without having to reconfigure the application in any way. Before these tools, the steps involved in a move caused weeks of delay. With these tools it can be accomplished in minutes.
What if a millwright or robot tech could move a robot across the plant and not change the configuration of the robot or its controller in any way?
Manufacturing plants are centers of data, and the volume of that data and its important uses require us to consider systems and technologies that allow us to store, use, and reuse it. Perhaps those technologies are the same ones we use in data centers.
Dave Cronberger is customer solutions architect for industrial automation and connected vehicles at Cisco Systems. He will be a featured presenter at Atlantic Design & Manufacturing in New York City, June 9-11, a Design News event. He is part of a comprehensive education program on smart factories of the future.
Atlantic Design & Manufacturing, the largest advanced design and manufacturing trade show serving the Northeastern US, will take place in New York, June 9-11, 2015. It's your chance to meet qualified suppliers, get hands-on with the latest technologies, and expand your network.