
What's Making Industrial PCs Get Bigger and Smaller at the Same Time?

Makers of industrial PCs are continuing to take advantage of the Moore's-law expansion of processing power, enabling creative automation and control schemes with multicore processors.

The use of industrial PCs in factory automation and control applications is still evolving as controls suppliers exploit the continuous upward arc of available processing power. That growth is spawning new, creative ideas, including the consolidation of multiple control applications into one box to bring data analytics and visualization tools closer to machines, and the use of multicore processors to separate different automation processes and tasks onto individual cores.
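To picture the core-separation approach, here is a minimal Python sketch, not drawn from any vendor's implementation, that pins each of several hypothetical automation tasks to its own core using the Linux-specific os.sched_setaffinity call:

```python
import os
import multiprocessing as mp
import time

def control_loop(name: str, core: int) -> None:
    # Restrict this process to one dedicated core (Linux-only call), so the
    # control task is isolated from other workloads sharing the IPC.
    os.sched_setaffinity(0, {core})
    while True:
        # Placeholder for the real control work (I/O scan, motion, logging).
        time.sleep(0.01)

if __name__ == "__main__":
    # Hypothetical task-to-core mapping: one automation task per core.
    tasks = {"motion_control": 0, "io_scan": 1, "data_logger": 2}
    workers = [mp.Process(target=control_loop, args=(name, core), daemon=True)
               for name, core in tasks.items()]
    for w in workers:
        w.start()
    time.sleep(5)  # let the loops run briefly for the demonstration
```

The point of the sketch is the isolation, not the placeholder loops: a deterministic control task running on its own core is shielded from a bursty analytics task running on another.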

Edge Nodes vs. Computing Power

“Two areas of convergence that we are seeing are small computing edge nodes and the desire to consolidate applications on powerful industrial PCs,” Vibhoosh Gupta, product management leader for GE’s automation and controls solutions, told Design News during a recent interview.

“The megatrends for industrial PCs are that they are becoming both smaller and bigger,” Gupta said. “With the Internet of Things, more and more applications require some kind of computing edge node. These small nodes can collect data and securely transmit data for better analytics.”
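As an illustration of what such an edge node might look like in software, the following minimal sketch uses the open-source paho-mqtt client library to collect a reading and transmit it over TLS to an on-premises broker. The broker address, certificate path, topic, and tag name are all hypothetical stand-ins, not GE's stack:

```python
import json
import time
import paho.mqtt.client as mqtt  # third-party client: pip install paho-mqtt

BROKER = "broker.plant.local"  # hypothetical on-premises broker address

# paho-mqtt 1.x-style constructor; the 2.x API also requires a
# CallbackAPIVersion argument.
client = mqtt.Client()
client.tls_set(ca_certs="ca.pem")  # encrypt the link; ca.pem is a stand-in path
client.connect(BROKER, 8883)       # 8883 is the conventional MQTT-over-TLS port
client.loop_start()                # network I/O runs on a background thread

while True:
    # Placeholder reading; a real node would poll fieldbus I/O or a sensor here.
    sample = {"ts": time.time(), "line1_temp_C": 71.4}
    client.publish("plant/line1/telemetry", json.dumps(sample), qos=1)
    time.sleep(1.0)
```

QoS 1 asks the broker to acknowledge each message, a common choice when the telemetry feeds downstream analytics.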

GE’s SCADA Edge industrial PC now comes complete with the company’s SCADA/HMI automation software preinstalled.
(Source: GE)

Yet even as industrial PCs (IPCs) go smaller, customers are also trying to consolidate multiple applications into one box, driven by lower maintenance costs and, in some cases, a lower cost of capital, thanks to Moore's law enabling today's PCs to do 10 times more than machines from five or six years ago.

“These factors are leading to a different kind of architecture where companies are running multiple applications, many of which they would not have thought of running together a couple years ago,” Gupta said. “Technology has progressed to a point where it can be done very effectively.”

GE's new SCADA Edge IPC embodies this approach for automation users who want to run data analytics and visualization applications ever closer to the machine. The goal is not only to put the computing edge closer to the machine but also to provide tougher hardware, since industrial applications require much more ruggedness, fanless operation, and computing platforms that support a longer product lifecycle, reaching out five to 10 years, for effective data collection.

The SCADA Edge industrial PC is preconfigured with either GE's iFIX or CIMPLICITY HMI/SCADA software and can be scaled to 500, 1,500, or 3,000 tags. One version of the product is a standalone SCADA, which collects and connects locally but then pushes the data to a remote or centralized historian for analysis and reporting. A second version offers networked-edge SCADA: for users who already have an existing HMI and remote nodes, the system collects and feeds data to a centralized CIMPLICITY or other existing SCADA system.
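The standalone configuration's collect-locally-then-push pattern amounts to a store-and-forward loop. The sketch below assumes a hypothetical REST endpoint on the central historian and made-up tag names; GE's iFIX and CIMPLICITY products expose their own interfaces:

```python
import collections
import time
import requests  # third-party HTTP client: pip install requests

HISTORIAN_URL = "https://historian.example.com/api/tags"  # hypothetical endpoint
buffer = collections.deque(maxlen=10_000)  # local store-and-forward buffer

def read_tags() -> dict:
    # Placeholder for polling the local SCADA tag database.
    return {"ts": time.time(), "pump1_rpm": 1750, "tank2_level_pct": 62.5}

while True:
    buffer.append(read_tags())
    try:
        # Drain the buffer to the central historian; anything unsent
        # survives a network outage and is retried on the next pass.
        while buffer:
            requests.post(HISTORIAN_URL, json=buffer[0], timeout=5).raise_for_status()
            buffer.popleft()
    except requests.RequestException:
        pass  # offline: keep buffering locally until the link returns
    time.sleep(1.0)
```

The bounded deque keeps the newest 10,000 samples through an outage (silently dropping the oldest beyond that) and drains them once the link returns.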

Gupta said the need for Big Data platforms for industrial automation applications is driving these areas of innovation. “The idea is to effectively own and analyze the data, and use data to optimize operations. IPCs are a big part of this because small-form-factor IPCs can be used effectively as edge node collectors, but they offer ruggedness to be effective in collecting data and transmitting it within the factory,” Gupta said.

IPCs with larger form factors provide a way to run multiple applications on a single computing platform implemented close to the machine for machine analytics and visualization. One challenge with these applications is the quantity of data that must be collected: often you don't want to ship all of it up to the cloud, but rather do a certain level of analysis locally and send only a selected subset of the data to the cloud.
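That local-reduction pattern can be sketched in a few lines. In the example below, the sensor read and cloud uplink are placeholder functions; the idea is simply that a minute of raw samples collapses into a few summary statistics before anything crosses the WAN:

```python
import statistics
import time

WINDOW = 60  # seconds of raw samples to reduce locally before uplink

def read_sensor() -> float:
    # Placeholder for a high-rate local reading (vibration, current draw, etc.).
    return 0.0

def send_to_cloud(summary: dict) -> None:
    # Placeholder for the cloud uplink (MQTT, HTTPS, a vendor SDK, ...).
    print("uplink:", summary)

while True:
    raw = []
    for _ in range(WINDOW):
        raw.append(read_sensor())
        time.sleep(1.0)  # sample at roughly 1 Hz
    # Send only a selected subset: a minute of data becomes three numbers.
    send_to_cloud({
        "ts": time.time(),
        "mean": statistics.fmean(raw),
        "max": max(raw),
        "stdev": statistics.pstdev(raw),
    })
```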

“One trend we see is that companies want to have data at their fingertips all of the time, whether it’s for the day-to-day shift operator or supervisors. There are different sets of data, but everyone wants access to the information,” Gupta said. “The goal of this architecture is always to collect locally but also to connect either to a historian, an existing HMI/SCADA system, or the cloud. It’s a field model of data collection and visualization.”

Over time, Gupta said, there will also be a field model of analytics. Companies don't need the cloud to do everything required for analytics: network-level analysis can be done in the cloud, but it often makes sense to do machine-level analytics next to the machine. It will be interesting to see how machine-level analytics is influenced by network-level analytics and vice versa.
