This month’s question on our LinkedIn System & Product Design Engineering group (actually multiple questions) comes from a reader who doubles as an industrial automation consultant: When is a project too large for a PLC? When is it best to change to a multiple-PLC system rather than a single PLC? And when does it become cost-effective to develop a custom solution rather than use an off-the-shelf (OTS) PLC?
One response comes from Zeeshan Noorwala:
There could be a number of answers to your question, depending on where you look at it from. Here’s what I learned from my experience: We were to redesign the PLC code for an already-running concrete-brick-making plant, as part of a preemptive disaster recovery plan. It was a big plant with two major control rooms, one for the brick-making line, and one for batching/weighing and mix preparation. Both rooms had a panel with PLC racks, containing I/O modules, analog modules, etc. However, the whole plant was controlled by one PLC. The panel in the batching control room only had I/O and analog modules, connected by an interface module to the main PLC, as an extension to the main rack.
While the system was working fine under the control of one PLC, I see this as an example of cost-cutting. When the PLC finally gave out, it took the whole plant down with it. There was a temporary shutdown in the brick-making section. But the real damage was done in the batch weighing and mixing section: wasted material, plus extra labor charges to practically blast the set mixture out of the mixers/hoppers.
According to Doug Huffman, a senior instrumentation and control system engineer, there may not be a correct answer to the question:
Off-the-shelf products have become a very cost-effective way to tackle big projects. It’s easy to solve an issue like the one Zeeshan encountered by having redundancy for the processor and I/O.
A very specific response comes from David Mertens, who says:
The factors that should be considered when deciding whether to split a project over multiple CPUs are:
Independence, or the ability to completely shut down one part of the plant without affecting other parts.
Cycle time. If your program reaches 80 percent of the design cycle time, you should look for an extra controller. However, if you decide to split the program at this point, the cost will probably be much higher than if you use two controllers right from the get-go.
Memory. Similar argument -- if you use up 80 percent of the available memory, you need to consider splitting.
I/O count. If you run out of addressing space or possible extension space, you’ll be forced to split. However, this usually occurs early in the design.
Distance. It may be cheaper to add another controller at another location rather than pulling all the extra lengths of cable. However, this argument is fading as most PLC systems use remote I/O over Profibus or Profinet.
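David Mertens's criteria above can be collected into a simple checklist. The sketch below is purely illustrative (the function name, the 80 percent threshold applied uniformly, and the sample numbers are assumptions, not vendor logic), but it shows how the cycle-time, memory, and I/O tests combine into a single go/no-go decision on adding a controller.

```python
# Illustrative sketch (not vendor code): evaluating the split criteria
# above against hypothetical project numbers.

def split_reasons(cycle_ms, max_cycle_ms, mem_used_kb, mem_total_kb,
                  io_used, io_max, threshold=0.80):
    """Return the list of criteria that suggest adding a controller."""
    reasons = []
    if cycle_ms / max_cycle_ms >= threshold:
        reasons.append("cycle time")
    if mem_used_kb / mem_total_kb >= threshold:
        reasons.append("memory")
    if io_used >= io_max:
        reasons.append("I/O addressing space")
    return reasons

# Hypothetical numbers: a 9 ms scan against a 10 ms design cycle time,
# with plenty of memory and I/O headroom left.
print(split_reasons(9.0, 10.0, 400, 1024, 512, 1024))  # ['cycle time']
```

As the comment notes, catching the 80 percent crossing early matters: splitting a program that was written for one CPU costs far more than designing for two controllers from the start.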
How would you answer these questions? Tell us in the comments section below.
I don't do that much work with PLC equipment so I'm basically a rookie at application and installation. With that being the case:
QUESTION: Do PLCs often serve as redundant devices? Can one be a "master" with another the "slave"? One more, are PLCs ever used for "pilot duty" enabling other pieces of electrical equipment to function; i.e. start/stop?
While I agree with the majority of the posters that there is no one correct answer, I would recommend that the original questioner follow up on the idea presented in the article by Doug Huffman. A lot of the major PLC manufacturers have a number of options for redundant processors and/or power supplies, depending upon what the most likely failure mode in the plant is. These can go so far as to include hot redundancy that will automatically keep everything running when one fails. The cheap and easy course of action (if you can live with some downtime while a tech takes care of the problem) is just having a spare, pre-programmed processor stored in the cabinet. I have seen that done with a bit of success as well.
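The hot-redundancy idea mentioned above boils down to a standby unit watching the primary's heartbeat and taking over when it goes stale. In real redundant PLC pairs this is handled in firmware over dedicated sync links; the class and timings below are invented purely to illustrate the failover logic.

```python
# Illustrative sketch of hot-standby failover: the standby promotes
# itself when the primary's heartbeat goes stale. All names and
# timeouts here are assumptions for illustration, not vendor APIs.
import time

class StandbyController:
    def __init__(self, timeout_s=0.5):
        self.timeout_s = timeout_s
        self.last_heartbeat = time.monotonic()
        self.active = False

    def on_heartbeat(self):
        """Called whenever a heartbeat message arrives from the primary."""
        self.last_heartbeat = time.monotonic()

    def poll(self):
        """Promote to active if the primary has gone silent too long."""
        if not self.active and time.monotonic() - self.last_heartbeat > self.timeout_s:
            self.active = True  # take over driving the outputs
        return self.active

standby = StandbyController(timeout_s=0.1)
standby.on_heartbeat()
print(standby.poll())   # primary still alive: False
time.sleep(0.2)
print(standby.poll())   # heartbeat stale, failover: True
```

The cold-spare-in-the-cabinet option the commenter describes is the manual version of the same thing: detection and switchover are done by a technician instead of a watchdog.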
"When your only tool is a hammer... all your problems start to look like nails"...
Don't make PLCs your only tool (solution to everything)...
My simplest observations:
- for building automation ... distributed controls (PLC, custom, or specifically made for the application - but for multiple customers/sites) .. the wiring architecture is likely more important than the pieces involved. Unless multiple campuses are involved (10,000+ I/O points).. then other issues become dominant (software architecture).
- for process automation ... too many variables - depends
- for machine controls.. product volume drives the answer (most of the time). What has changed recently is that the volume threshold for custom or semi-custom is much lower than in the past. I can often make a case for a custom solution with volumes of less than 50 units/year. In this environment, having a single (easy-to-troubleshoot) control board can make field maintenance a driving factor.
- everything else.. depends
My point: be aware of ALL the options and DON'T ASSUME TOO MUCH.
The costs for each hardware option are changing all the time.
There is a cost to maintaining the software for each option and the assumptions for this cost are changing also. In some systems, a PLC can demand a full time software effort (constantly changing environment). This MAY be better handled with Custom configurable software than custom PLC software. Reason: configuration changes on some software will not require regression testing before being put into place. Just depends.
I'd agree that there is no simple answer for this. It always depends on the nature of the application. You always have to consider things such as I/O counts, cycle times, physical sizes and areas, failure risks, BUDGET, etc. High-end PLCs can run multiple control sequences concurrently, you can mount multiple CPUs on a single rack, and I/O networking is a very convenient means of wiring and sharing I/O.
For failure, even if you partition areas of an operation into multiple PLCs, failure of a critical PLC can still hamstring the entire operation. You still have to plan for operating with a critical failure until the failure can be repaired. So, you always need a full analysis of the risks. There will always be the situation that you didn't think of, but you should be prepared for the one you do think of.
For my 'tastes', for a simple standalone application where failure is an inconvenience (think, say, an automated car wash), use the single cheapest controller that will work, with standard wiring.
For larger applications, always plan for the future by leaving cabinet space and power capacity for additional PLCs (possibly by at least leaving panel space for a new rack). I/O networking (EtherNet/IP, Profinet, etc.) makes adding I/O much easier... Also, plan for Ethernet access to operations from MES, management, etc. Once plants get a taste for data, they always want more...
In many instances it is better to have multiple PLCs, especially if the process exists in multiple blocks that are not interwoven. In the brick-making example: one controller for mixing, one for forming and handling green product, one for running the kiln and kiln conveyor system, and one for handling the logistics of unloading and storage. On top of other advantages, simpler programming and the ability to run the individual processes manually are major benefits. In addition, if a common PLC type was selected for all the sections, one spare could provide complete backup capability at a lower cost.
Of course, designing the hand-shake between controllers would be an extra task, but the effort would not be as much as the advantage gained.
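The hand-shake mentioned above usually amounts to a ready/acknowledge exchange between the sections. The sketch below is one possible shape for a mixing-to-forming hand-shake; the flag names are invented for illustration, and in a real design they would map to network variables or messages on the fieldbus linking the two controllers.

```python
# Illustrative sketch (invented names): a ready/ack hand-shake between
# a mixing controller and a forming-line controller. The mixer may not
# discharge until the forming side has explicitly accepted the batch.

class BatchHandshake:
    def __init__(self):
        self.batch_ready = False   # set by the mixing PLC
        self.accept = False        # set by the forming-line PLC

    def mixer_offer_batch(self):
        """Mixing side signals that a batch is ready to discharge."""
        self.batch_ready = True

    def former_poll(self, line_clear):
        """Forming side accepts the batch only when its line is clear."""
        if self.batch_ready and line_clear:
            self.accept = True
        return self.accept

    def mixer_may_discharge(self):
        """Mixer discharges only after an explicit acknowledgement."""
        return self.batch_ready and self.accept

hs = BatchHandshake()
hs.mixer_offer_batch()
print(hs.mixer_may_discharge())    # no ack yet: False
hs.former_poll(line_clear=True)
print(hs.mixer_may_discharge())    # acknowledged: True
```

An explicit exchange like this also preserves the independence benefit noted earlier: if one section stops or runs manually, the other simply sees no offer (or no acknowledgement) and waits, rather than faulting.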
Given the nature of multi-tasking operating systems and more advanced hardware, I expect the devil is in the details. What types of tasks must be managed from one platform? What are the overall hardware and processing resources required for the complete set of tasks? I think normally one question is how complicated the application software becomes in terms of maintainability and expandability into the future. Interesting question.
Rich, in an application like PLCs it is probably better to opt for multiple units. This allows the modification of different parts of a plant without affecting others. PLCs will probably not be the major cost driver.