Design News is part of the Informa Markets Division of Informa PLC


How to Build a Better I/O Automation System

Here are details about how to develop an automation system that can deliver on the promise of advanced manufacturing.

The pressures on manufacturing performance have increased as companies revamped operations during the pandemic. They needed to increase production while using fewer workers. Manufacturers were also under the gun to implement Industry 4.0 and IoT technology to improve operations. I/O automation systems have become critical for improving performance.

An I/O automation system is built on point-to-point connections: communications links between two separate endpoints or nodes. Each link is bi-directional, meaning traffic flows in both directions, and it typically operates over a short distance. It is used primarily to communicate with sensors and actuators in plant or factory automation processes.
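The bi-directional link described above can be sketched as a simple control loop: data flows in from a sensor, a correction is computed, and a command flows back out to an actuator over the same connection. This is a minimal illustration, not any vendor's API; `read_sensor` and `write_actuator` are hypothetical stand-ins for whatever fieldbus or driver interface the real hardware exposes.

```python
# Minimal sketch of a bi-directional point-to-point I/O loop.
# read_sensor() and write_actuator() are hypothetical stand-ins for
# the real hardware driver calls.

def read_sensor() -> float:
    """Stand-in for reading a process value from a sensor node."""
    return 72.5  # e.g., a temperature reading

def write_actuator(command: float) -> None:
    """Stand-in for sending a command back to an actuator node."""
    print(f"actuator command: {command:.2f}")

def control_step(setpoint: float, gain: float = 0.5) -> float:
    """One cycle: read the endpoint, compute a correction, write back."""
    measured = read_sensor()      # inbound direction of the link
    command = gain * (setpoint - measured)
    write_actuator(command)       # outbound direction of the link
    return command

control_step(setpoint=75.0)
```

In a real system this loop would run at a fixed cycle time, with the two directions of traffic sharing one short-distance link.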


We asked Ian Fountain, National Instruments' director of technical marketing for the Industrial Internet of Things, to explain how to build a high-performance I/O automation system.

Design News: What attributes of an I/O automation system need to be the highest priority during the design process?

Ian Fountain: There are many standard requirements for an I/O automation system. Requirements include open access to data through industry-standard interfaces such as OPC-UA, support for programming methods widely supported by the development community, and broad support for I/O types. Many of the most popular automation systems check all of these boxes. Given that fact, we advise engineers and managers to look for capabilities that support future-proofing, such as the ability to interface to high-speed measurement subsystems often required in predictive maintenance or provisions for enabling safety-critical scenarios.
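One future-proofing capability Fountain mentions is interfacing to high-speed measurement subsystems for predictive maintenance. A common first step with such data is reducing a high-rate waveform (e.g., vibration) to a health indicator such as its RMS level. The sketch below is illustrative only; the threshold value and the `needs_maintenance` helper are hypothetical, and a production system would compare indicators against baselines learned per asset.

```python
import math

def rms(samples):
    """Root-mean-square level of a high-rate waveform capture."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def needs_maintenance(samples, threshold=1.0):
    """Hypothetical rule: flag the asset when vibration energy rises."""
    return rms(samples) > threshold

# A unit-amplitude sine capture has an RMS of about 0.707.
wave = [math.sin(2 * math.pi * i / 50) for i in range(500)]
print(round(rms(wave), 3))
```

The point of the design advice is that the I/O system must be able to ingest these high-rate captures in the first place, alongside its ordinary low-rate process tags.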

As the pressures on manufacturing performance increase with the promises of Industry 4.0 and IIoT, we believe future-proofing will become a key consideration, weighed equally with the standard considerations above. National Instruments (NI) was an early proponent of the industry-standard networking technology TSN (Time-Sensitive Networking) because it offers the greatest opportunity to enable truly flexible automation approaches. TSN lets engineers tackle their automation challenges with a best-of-breed philosophy that was previously impossible due to vendor-specific industrial network topologies.

DN: Who needs to be involved? What disciplines of engineering?

Ian Fountain: In years past, the typical answer was process engineering or mechanical engineering disciplines, but the world of automation hasn't been immune to the impacts of "software eating the world." It is now common to see teams composed of multiple disciplines, including IT, computer engineers/scientists, and even data scientists. The opportunities for analytics in the factory are immense. Many organizations employ industrial engineers or apply six-sigma and lean manufacturing principles to better leverage the data they already have and to identify additional data that could improve efficiency or quality.

Additionally, in years past, IT team members were often underrepresented in automation system design. Given the ever-changing cybersecurity landscape, excluding serious IT talent from your automation projects is no longer wise.

DN: What is the process for creating this system? What are the software, networking, and hardware needs?

Ian Fountain: There are very well-documented approaches to this challenge. What I think is more interesting is to focus on two concepts that are relatively new to the world of automation systems:

Simulation/MBSE (Model-Based Systems Engineering) - over the years, many industries (e.g., automotive) have leveraged system simulation to improve solution quality, reduce time to market, and cut development costs. In the world of manufacturing, the analogous concept of digital twins deserves to be better understood, as the impact can be incredibly meaningful.

Agile development - most software-centric organizations have long eschewed traditional waterfall systems development approaches in favor of agile methods such as Scrum. Many organizations that are both software-centric and required to manufacture products to support their business model (e.g. new car companies, etc.) have recognized that applying agile development methodologies to their automation systems design projects yields similar benefits as their pure software projects. 
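The digital-twin idea from the first concept above can be illustrated with a toy model: a simple first-order simulation of a heating stage, stepped in software alongside the real process so that drift between the twin and plant measurements flags model error or a developing fault. All parameters here (`gain`, `tau`, the heater power) are purely illustrative, not drawn from any real process.

```python
# Toy digital twin: a first-order lag model of a heating stage.
# All parameters are illustrative assumptions.

def twin_step(temp, power, ambient=20.0, gain=2.0, tau=30.0, dt=1.0):
    """Advance the model one time step toward its steady-state target."""
    target = ambient + gain * power   # steady-state temperature for this power
    return temp + (target - temp) * dt / tau

# Run the twin for 60 steps at constant heater power.
temp = 20.0
for _ in range(60):
    temp = twin_step(temp, power=40.0)

# Comparing this simulated trajectory against live measurements is the
# core of the digital-twin workflow Fountain describes.
print(round(temp, 1))
```

Real digital twins use far richer physics or data-driven models, but the workflow is the same: simulate, compare, and act on the discrepancy.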

DN: What are the quality control and testing processes?

Ian Fountain: I want to take this back to why we build an automation system in the first place. Going to market with a new product means you need to go through two validation processes: product and process. Product validation is geared toward making sure the product has been designed correctly and meets customer expectations, whether those expectations are cost, availability, or quality. Process validation and testing focus on the automation system's ability to reliably and repeatedly build the product and identify any defects before the product makes it out the door. That process validation can be long, expensive, and highly iterative, as it often depends on outside factors (prior process steps, network infrastructure, operator training, etc.).

The best way to quickly tune and validate the process is to leverage the data coming off that process, identify through analytics where the issues are, and get them fixed quickly. As you do this, you're building up a data DNA of that process that will be key to identifying predictive maintenance opportunities, efficiency opportunities, etc. In an end-to-end flow, you can also use product data taken at downstream inspection steps to enhance that data DNA through correlation analysis. Ultimately, any validation plan will place the ability to collect and analyze data at the heart of its validation activities.

Rob Spiegel has covered manufacturing for 19 years, 17 of them for Design News. Other topics he has covered include automation, supply chain technology, alternative energy, and cybersecurity. For 10 years, he was the owner and publisher of the food magazine Chile Pepper.
