

Engineering Disasters: The Dam Catastrophe That Changed the Profession

Eighty-six years after America's worst civil engineering disaster, the lessons appear painfully obvious.

Engineers now know that each big project has its own set of unique challenges; there's no one-size-fits-all solution. They know that calculations must be made and that project heads must be degreed and registered. And they know that giant concrete dams must be built with the input of geologists.

None of that seemed obvious, however, at 11:57 p.m. on the night of March 12, 1928. At that moment, the St. Francis Dam, located 40 miles northwest of Los Angeles, catastrophically failed. Within seconds, 12.4 billion gallons of water roared down San Francisquito Canyon, creating a wall of water 140 feet high. The deluge demolished one of the dam's heavy concrete powerhouses and quickly took the lives of 64 workmen and their families who lived nearby. It continued on from there, drowning a construction crew of 84 men downriver, washing away the town of Castaic Junction, and devastating the cities of Santa Paula and Fillmore, as well as the town now known as Bardsdale. By the time the torrent had traveled the 54 miles from the dam site to the Pacific Ocean, approximately 600 people were dead. Countless bodies were washed into the ocean, with some found as far away as the Mexican border.

(Image caption: The St. Francis Dam failed catastrophically on March 12, 1928, after a massive landslide occurred along its left abutment. Only the center section of the dam and a partial abutment remained. Source: J. David Rogers, Missouri S&T.)

William Mulholland, the legendary engineer who headed the dam's design, ultimately took the blame. "If there was an error in human judgment, I was the human, and I won't try to fasten it on anyone else," he later said. "I envy those who were killed."

Today, engineers say there were multiple causes of the failure. The dam was unknowingly built on a landslide. During construction, designers raised the height of the dam by 11%, without a corresponding increase in the base width. And workers plugged expansion cracks with oakum on the downstream side, which actually raised the uplift pressure on the dam. In retrospect, experts now say those were just a few of the engineering problems.
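To see why the 11% height increase mattered so much, consider the basic gravity-dam arithmetic: the hydrostatic force on the upstream face grows with the square of the water depth and its resultant acts a third of the way up the face, so the overturning moment about the toe grows with the cube of the depth. The short Python sketch below works through that scaling with round, hypothetical numbers; it illustrates the principle and is not a reconstruction of the St. Francis design.

    # Illustrative only: the depths below are hypothetical, not the actual
    # St. Francis Dam dimensions. The point is the cubic scaling.
    gamma = 62.4                    # lb/ft^3, unit weight of water
    h_original = 100.0              # ft, hypothetical original water depth
    h_raised = h_original * 1.11    # depth after an 11% height increase

    def overturning_moment(h):
        """Overturning moment per foot of dam length about the downstream toe:
        hydrostatic resultant = gamma*h^2/2, acting h/3 above the base."""
        return gamma * h ** 3 / 6.0

    ratio = overturning_moment(h_raised) / overturning_moment(h_original)
    print(f"overturning moment grows {ratio:.2f}x for an 11% rise in depth")
    # -> roughly 1.37x, while the resisting weight of the dam barely changes
    # if the base width is not widened to match.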

"When you look at what causes a levee to break, it's never just a single thing," J. David Rogers, chair in geological engineering at Missouri University of Science & Technology, told Design News. "In the case of the St. Francis Dam, we've looked at eight different failure modes and it failed by all eight."

More disturbing, however, was the approach to the design. Engineers made no formal calculations, instead following examples from books, such as Construction of Masonry Dams, Water Supply Engineering, and The Design and Construction of Dams. The dam's design was simply transferred from the books to the St. Francis site, with little or no modification. "It was a cookbook design," Rogers told us. "Draftsmen simply drafted up their plans based on the books."

Experts contend that the tragedy wouldn't have happened if Mulholland's ideas had been subject to outside review. But Mulholland, widely credited as the force that brought water to Los Angeles, was given carte blanche on the project. Oversight from outside engineers and geologists was almost non-existent.

"If he had had a consulting board looking over his shoulder, it wouldn't have happened," Rogers said. "But he was William Mulholland. He was the engineer who foresaw the need to bring the water supply to the city of Los Angeles. No one in the history of mankind had ever done what he had done."

The problem was that Mulholland had little experience in dam design and none in engineering geology. Equally troubling, he had no formal education. Starting as a ditch digger, he had worked his way up through the Los Angeles Bureau of Water and Power, educating himself in engineering, hydrology, and geology before reaching the level of chief engineer and general manager.

Today, the lessons learned in the wake of the St. Francis Dam disaster are easily visible across much of the engineering profession, particularly in civil engineering. A year after the disaster, the state of California passed a Civil Engineering Registration Bill requiring that anyone practicing in the field be registered. Such laws are now enforced in all 50 states. Moreover, geologic input in the design of dams, which had been all but absent in the 1920s, became commonplace in the '30s and remains standard practice today.

"The world learned a lot from the St. Francis Dam," Rogers said. "But the engineering profession gained even more from it."

Design engineers and professionals, the West Coast's most important design, innovation, and manufacturing event, Pacific Design & Manufacturing, is taking place in Anaheim, Feb. 10-12, 2015. A Design News event, Pacific Design & Manufacturing is your chance to meet qualified suppliers, get hands-on access to the latest technologies, be informed from a world-class conference program, and expand your network. (You might even meet a Design News editor.) Learn more about Pacific Design & Manufacturing here.


Scary IEEE Tech Predictions for 2015

What are the trends that will rock your world in 2015?

The IEEE Computer Society came up with its version of the Top 10 Technologies for 2015, and while they are all interesting, we took a closer look at the five that were just a tad less obvious than 3D printing, wearable technology, and smartphones. We also selected the technologies that come with the greatest potential threat. Take a look at the five top technologies that will give you pause in the coming year.

Building security into software design

The exponential growth of large data repositories of personal and corporate information, combined with the ability to analyze and collect that data, is creating an urgent need to reconsider security and privacy trade-offs. Bad actors such as adversarial governments, criminals, business competitors, and malcontents have a growing ability and determination to gather information about individuals, businesses, and even critical infrastructure.

In 2015, after a year of unprecedented security breaches, IEEE believes we can expect to see continued focus on balancing security and privacy, along with growing momentum to provide software developers with the tools to create more secure software. "Security is becoming more and more important, and now it has to be built into the software design itself," David Alan Grier, IEEE Computer Society board member and associate professor of International Science and Technology Policy at George Washington University, told Design News. "While this is not particularly new, developers are now designing security function into the software. They are getting pushed to make sure protection is in the software itself."
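To make "built into the design" concrete, here is a minimal, generic Python sketch of one such practice: treating user input strictly as data rather than as executable query text. It illustrates the principle Grier describes and is not drawn from the IEEE report; the table and values are invented for the example.

    import sqlite3

    def find_user_unsafe(conn, name):
        # Anti-pattern: concatenating untrusted input into SQL invites injection.
        return conn.execute(
            "SELECT id, name FROM users WHERE name = '" + name + "'").fetchall()

    def find_user_safe(conn, name):
        # Security designed in: the driver binds the parameter, so the input
        # is always treated as data, never as SQL.
        return conn.execute(
            "SELECT id, name FROM users WHERE name = ?", (name,)).fetchall()

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
    conn.execute("INSERT INTO users (name) VALUES ('alice')")
    print(find_user_safe(conn, "alice"))           # [(1, 'alice')]
    print(find_user_safe(conn, "x' OR '1'='1"))    # [] -- the attack string is inert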

Industry will tackle Software-defined Anything (SDx) interoperability and standards

IEEE notes that the concept of Software-defined Anything (SDx), using software to control hardware, is in its infancy. Encompassing Software-defined Storage, Software-defined Infrastructure, Software-defined Data Centers, and Software-defined Networking, at its heart, SDx is about data center interoperability and infrastructure programmability. Driven by automation and DevOps, SDx takes network centralization and virtualization, and especially network control, into the cloud. Similar to the smartphone ecosystem, Software-defined Networking's programmability will turn various network appliances into a warehouse of apps.

Manageable, cost-effective, and adaptable in keeping with today's high-bandwidth, dynamic applications, SDN architectures decouple network control and forwarding functions, allowing network control to be programmable and the underlying infrastructure to be separated from applications and network services. "We're taking the idea of software backwards. The core technology is a layer of control networks that act dynamically to keep things out that you want out. It's a matter of control and optimization," said Grier. "They are also taking a bigger look at a design and control system that brings many items together, including sensors and data gathering."
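That decoupling is easier to picture with a toy model. In the sketch below, a central controller object installs forwarding rules into switch objects that do nothing but match and act; the class names and rule format are invented for this illustration and do not correspond to OpenFlow or any vendor's SDN API.

    from dataclasses import dataclass, field

    @dataclass(frozen=True)
    class FlowRule:
        match_dst: str      # destination address to match
        action: str         # e.g. "forward:port2" or "drop"

    @dataclass
    class Switch:
        """Forwarding plane only: match packets against installed rules."""
        name: str
        rules: list = field(default_factory=list)

        def handle(self, dst):
            for rule in self.rules:
                if rule.match_dst == dst:
                    return rule.action
            return "send-to-controller"     # unknown traffic goes upstream

    class Controller:
        """Control plane: policy lives here, not in the switches."""
        def install(self, switch, dst, action):
            switch.rules.append(FlowRule(dst, action))

    edge = Switch("edge-1")
    ctrl = Controller()
    ctrl.install(edge, "10.0.0.5", "forward:port2")
    ctrl.install(edge, "10.0.0.66", "drop")         # policy: keep this host out
    print(edge.handle("10.0.0.5"))                  # forward:port2
    print(edge.handle("10.0.0.99"))                 # send-to-controller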

Cloud security and privacy concerns grow

The celebrity photo hacking scandal, the Sony hacks, and the iCloud breach in China in 2014 have brought cloud security to the forefront for 2015. According to IBM's annual CISO study, over the next year nearly one-half of chief information security officers expect a major cloud security breach that will cause a substantial number of customers to switch providers. Enterprises are moving workloads to the cloud and expecting enterprise-level security. Although cloud computing provides an effective way to address the constraints of limited energy, capabilities, and resources, security and privacy protection has become a critical concern in the development and adoption of cloud computing.

To defend against vulnerabilities, various cyber security techniques and tools are being developed for cloud systems. "The cloud has been around and it has really taken off since 2008. Price is driving it, but privacy matters. The major issue with cloud computing is the privacy policy," said Grier. "In the marketplace, cloud services have to be more secure and they have to provide better control than a data center. Right now they're not exercising that control. The second issue with cloud security is how clouds talk to each other. How do you secure inter-cloud connections? You have to make sure the conversation between two clouds doesn't reveal anything."

With cloud computing, an outside organization controls access, security policy, and the data center. Currently, according to IBM, three-quarters of security breaches take days, weeks, or even months to be discovered, significantly increasing the damage inflicted by attackers. "Our researchers are trying to find ways to get big cloud providers to look at what they need to do to protect their clients," said Grier. "The problem has reached a critical point, and these issues don't get enough attention. But it's very serious."

Programmable Logic - How Do They Do That?

Designing with programmable logic often leaves design engineers scratching their heads. The details of how programmable logic devices and the associated tools are implemented come with common trade-offs that are sometimes hard to understand. Yet programmable logic devices have become more attractive as they continue to evolve. For one, they have become competitive with other devices. "Perhaps the biggest change in programmable logic has been the growth in dedicated hardened logic," Warren Miller, principal at Wavefront Marketing, told Design News. "This makes FPGAs much more cost competitive with a wide range of ASSPs and MCUs while allowing the creation of complete subsystems in a single FPGA."

Miller will present the Continuing Education Course, Programmable Logic - How Do They Do That? January 12-16 at 2 p.m. Eastern. The course is sponsored by Digi-Key and is free to attendees.

Click here to sign up for the course.

According to Miller, the class will go behind the scenes of FPGA devices, technology, and software to answer some of the questions not typically discussed, such as:

  • Why did 4-input look-up tables become the dominant programmable logic element over other implementations (AND arrays, FPLAs, and muxes)? (See the sketch after this list for a feel for why LUTs are so flexible.)

  • Why do FPGAs include so many registers, and how was synthesis technology able to benefit?
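As a preview of the first question, the appeal of the look-up table is easy to demonstrate: a 4-input LUT is just a 16-entry truth table, so loading a different 16-bit pattern makes the same physical element compute any Boolean function of four inputs. The Python sketch below is a conceptual model of that idea, not a description of how any particular FPGA family stores its configuration.

    def make_lut4(func):
        """Build the 16-entry configuration for an arbitrary 4-input function."""
        return [func((i >> 3) & 1, (i >> 2) & 1, (i >> 1) & 1, i & 1)
                for i in range(16)]

    def lut4(config, a, b, c, d):
        """Evaluate the LUT: the four inputs simply address into the table."""
        return config[(a << 3) | (b << 2) | (c << 1) | d]

    # The same "hardware" configured as two different functions.
    xor4 = make_lut4(lambda a, b, c, d: a ^ b ^ c ^ d)
    maj3 = make_lut4(lambda a, b, c, d: int(a + b + c + d >= 3))
    print(lut4(xor4, 1, 0, 1, 1))   # 1 (odd parity)
    print(lut4(maj3, 1, 1, 0, 1))   # 1 (at least three inputs high)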

Miller brings more than 30 years of experience in electronics to the course. He has held a variety of positions in engineering, applications, strategic marketing, and product planning with large electronics companies such as Advanced Micro Devices, Actel, and Avnet, as well as with a variety of smaller startups. He has in-depth experience with programmable devices (PLDs, FPGAs, MCUs, and ASICs) in industrial, networking, and consumer applications and holds several device patents.

One area of discussion will be the changes that have occurred with the addition of MCUs. "Now that FPGAs have on-chip MCUs they can be used for algorithms with either sequential functions (in the MCU) or parallel functions (in the FPGA), bringing dramatic performance, power, and ease of use improvements," said Miller. "FPGAs can also be used as sequential function accelerators for the computationally intensive inner loops for even larger efficiency gains."

Another area Miller will explore is the way advances in software have affected programmable devices. "Recent tool improvements allow many FPGAs to be used by software-oriented designers who never need to get their hands dirty with the details of hardware design," said Miller. "This opens up FPGA technology to a much larger design community and will accelerate the creation of software-centric IP and the reuse of high-level functions."


Holland Tests World's First Solar Roadway

For a country that doesn't get as much sunshine as its southern European neighbors, the Netherlands is certainly doing some innovative things with solar power.

We've already told you about a bike path designed after Van Gogh's "Starry Night" using solar-energy-harvesting paint in Nuenen, near Eindhoven, Holland, where the artist lived for a time. Now the world's first solar roadway is being tested in Krommenie, Holland, as part of a larger plan to see if solar roadways are a viable option for use throughout the country.

SolaRoad is a 230-foot bike lane made up of concrete modules measuring about 8 feet by 11 feet. Crystalline silicon solar cells are sandwiched between the concrete and a translucent layer of tempered glass about a quarter of an inch thick.
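For a sense of scale, a back-of-envelope yield estimate is straightforward, though every number in the sketch below is an assumption made for illustration (usable cell area, Dutch insolation, and net efficiency through flat, dirt-prone glass) rather than a figure from the SolaRoad consortium.

    # Back-of-envelope only: these values are assumptions, not SolaRoad data.
    length_m = 70.0        # ~230 ft pilot stretch
    width_m = 2.0          # assumed usable cell width of the lane, meters
    insolation = 1000.0    # kWh per m^2 per year, assumed for the Netherlands
    efficiency = 0.10      # assumed net efficiency behind glass, laid flat

    annual_kwh = length_m * width_m * insolation * efficiency
    print(f"~{annual_kwh:,.0f} kWh per year under these assumptions")
    # -> ~14,000 kWh/year, on the order of a few households' consumption;
    # the pilot exists to measure how far reality differs from guesses like these.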

SolaRoad is the work of a consortium of private and government partners: the independent research organization TNO, the Province of Noord-Holland, Imtech, and Ooms Civiel.

Ultimately, those behind SolaRoad hope the three-year test of the initial road surface will mean other parts of the country will follow with their own solar roadways. Plans are for the road to be extended to 328 feet by 2016.

"The ultimate goal is the realization of the dream that large parts of the road surface in the Netherlands will act as a large solar panel," according to the SolaRoad website. "The generated electricity can be used for street lighting, traffic systems, households, and electric vehicles."

Designing the road was not without its challenges. Constructing the glass to protect the solar cells as well as to be the roadway surface itself was particularly tricky. "It has to be translucent for sunlight and repel dirt as much as possible," according to the site. "At the same time, the top layer must be skid resistant and strong enough in order to realize a safe road surface."

SolaRoad also did not come cheaply. Though project coordinators did not immediately respond to a request for comment, a BBC report said the road has cost $1.9 million so far and could reach nearly $4 million by its scheduled completion in 2016.


NI System-on-Module Comes Ready to Go

National Instruments (NI) has released the NI SOM (system on module). It combines the Xilinx Zynq All Programmable system on a chip (SoC) with supporting components such as memory on a small PCB, and it ships with a complete middleware solution and a ready-to-go, Linux-based real-time operating system pre-integrated. The NI SOM gives design teams the customizability of a SOM without the increased time and risk of developing custom software.

The NI SOM was designed to enable design teams to deploy reliable, complex embedded systems faster, since it is based on and has the same rigorous design standards as the LabVIEW reconfigurable I/O (RIO) architecture. This architecture has already been used in high-reliability applications such as unmanned aerial vehicles and cataract surgery machines.

The goal is to speed development time by taking some of the customization out of technology deployment. "Ultimately, this SOM technology makes it possible for end users to regain control of their applications," Ahmed Mahmoud, senior group manager for embedded control and monitoring at NI, told Design News. "From smart manufacturing facilities in France to healthcare in India, this technology can address the need to advance the industrial automation market, as well as advancing medical technology."

Mahmoud noted that Airbus decided to improve its manufacturing processes with the development of adaptive, smart manufacturing devices. "Using NI's approach to embedded design, they were able to quickly develop smart tools for operator integration in their 'Factory of the Future,'" he said. "They estimate that their development costs are a tenth of the cost of alternative approaches because of the productivity gains of NI's approach to system design. Successfully integrating smarter tools into the Factory of the Future is a complex process that relies on a distributed and interconnected system known as cyber-physical systems."

While the concept of SOM is not unique to NI, Mahmoud noted that the integration is a step forward. "The system on module is a common term," he said. "The difference is the integration with the middleware and the software; if you prototype with ours, you don't have to rewire the code."

Key benefits of the NI SOM include:

  • Complete Middleware Solution: The NI SOM is shipped with a complete middleware solution out of the box to remove the time and risk associated with developing an embedded OS, custom software drivers, and other common software components.
  • LabVIEW FPGA Integration: LabVIEW FPGA eliminates a design team's need for hardware description language expertise, making powerful FPGA technology more accessible than ever before.
  • NI Linux Real-Time: The NI SOM offers a robust Linux-based RTOS, which gives design teams access to an extensive community of applications and IP.
  • Shorter Prototyping Phase With CompactRIO: Design teams can use CompactRIO to quickly prototype their applications and then deploy them with the same code used for prototyping, which saves significant time and effort.


Home Sweet Dumpster: How to Transform a Trash Bin Into a Sustainable Living Space

If one person's trash is another person's treasure, then it stands to reason that one person's trash receptacle is another person's idea of a cool place to live.

That's the case with Jeff Wilson, a dean and environmental science professor at Huston-Tillotson University in Austin, Texas, who since February has been living in a 33-square-foot Dumpster on campus.

Let's be clear about something: Wilson was not homeless, nor did he need to find housing desperately when he moved into the Dumpster and spawned The Dumpster Project.

On the contrary, he was living in a "3,000-square-foot house full of Ikea furniture with four bedrooms and four-and-a-half baths," he said -- nearly any American's idea of a very comfortable place to live.

Wilson is living in the Dumpster by choice. He came up with the idea to transform a Dumpster into a residence two years ago to explore the idea of sustainable living in smaller spaces, and to understand what a human being really needs at home to feel comfortable, he told Design News. "It really came out of really wanting to spice up the conversation both in the classroom and general society about not only living on less but sustainability, education, and small houses," Wilson, in a slight southern drawl, told us in an interview.

Wilson is no stranger to unconventional ideas that test how much people really need to live comfortably. For one of his early dates with his now-girlfriend Clara Bensen, he invited her on a journey from Istanbul to London with no baggage, wearing only the clothes they traveled in and carrying just a handful of "necessary" items. (The story, as written by Bensen, has since been optioned for a Hollywood film.)

So living in a Dumpster he handpicked with help from some of his students may be unconventional for many, but for Wilson it is an extension of his own interest in forcing people to take a critical look at unnecessary consumption. "It really is designed to be an engaging idea -- taking this ultimate symbol of waste and turning it into a comfortable, sustainable home," he said.

Wilson said he has "a lot better life" since moving out of that house and into his Dumpster palace. "You think about all the things that happen when you compress your stuff," he said. "Right now with consumerism we're all 'stufficated' (with) all the time spent organizing, washing, moving that stuff around. That's all gone away."

Having a smaller house also forces people outside to interact with their community and nature, inspiring more potential for collaboration or creativity. It also reduces time spent on practicalities like cleaning and maintenance. "I can paint my entire interior faster than most people can clean their house," Wilson said.

Moreover, since the Dumpster is on the university campus, no longer does Wilson need to commute to campus in a car, saving time, money, and vehicle emissions. "My commute time has gone from 35 minutes to one minute and 35 seconds," he said.

Agile Robots Will Rule in 2015

2015 may become known as the year of the robot, just as 2014 was the year of connected everything. The recent shift in robotics is not centered on mammoth packaging or welding machines, but rather on small agile machines that are safe around humans. Rethink Robotics with its ubiquitous Baxter has become one of the faces of the new people-friendly robots.

Factory robots for mass production have been kept in large cages since they are heavy and dangerous. The new robots don't need to be isolated. They are soft and they freeze when they bump into a human. "Traditional robots are in cages because they hurt people. We brought the robots out of the cage," Jim Lawton, chief marketing officer of Rethink Robotics, told Design News. "We have this new category that doesn't need a cage and can't hurt you. Getting the robot out of the cage is a big step."

Part of the reason to get the robot out of the cage is so it can do tasks that are not endlessly repetitive. The new robot can help with a variety of tasks. "We look at how we can get the robots to do real work with real value in an imprecise environment with constantly changing tasks. We need the robot to be flexible," said Lawton.

Robotics for non-programmers

The goal was to create a robot that can do a number of tasks and can be switched from task to task. The software to make this possible is embedded in the robot so users don't have to program every task. "It needs to be a tool that companies can quickly get up and running with little expertise. Companies should not have to be experts in robotics," said Lawton. The ease of operation and flexibility make the robot affordable to companies that watch ROI closely. "They need the robot to be able to do a variety of tasks and do it with a payback in a year or less."

The small flexible robot becomes an automation system for companies that have not invested heavily in automation. "Now that I have these robots, how do I make them valuable to me? In the real world that's challenging. That's why automation has been narrowed to a few industries over the last few years," said Lawton.

Robots for small plants

The newer robots are also becoming less expensive, which makes them attractive to small and medium-sized plants. "We've seen the uptick in interest in robots for small to medium businesses. They got left out of the automation revolution. They don't have the big machines or people to program their automation," said Lawton. "Their products are high mix and low volume. Now there's a robot that's really cheap and you can move it around from job to job. So these folks are now getting into it."

One major advantage of small robots with embedded software is that they don't need to be customized for each deployment. Users can move the robot around to teach it what needs to be done. "In the past, robots have been customized per installation. They had to be programmed every time you put them to use. These new robots don't need that customization," said Lawton.
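Conceptually, that teach-by-guiding workflow reduces to recording poses while a person moves the arm and replaying them later. The toy Python sketch below shows the idea only; it is not Rethink Robotics' software, and the class, poses, and callback are invented for illustration.

    class TeachableArm:
        """Toy record-and-replay model of teaching a task by demonstration."""
        def __init__(self):
            self.waypoints = []

        def record(self, pose):
            # Store a pose captured while the operator moves the arm by hand.
            self.waypoints.append(tuple(pose))

        def replay(self, move):
            # Run the taught task by visiting each recorded pose in order.
            for pose in self.waypoints:
                move(pose)

    arm = TeachableArm()
    for pose in [(0.0, 0.4, 0.1), (0.2, 0.4, 0.1), (0.2, 0.4, 0.0)]:  # pick, carry, place
        arm.record(pose)
    arm.replay(lambda p: print("move to", p))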

Given that a user can take the small robot and teach it what to do, it's often surprising how the robot is used. "Customers send us videos of what the robot is doing. We look at that and think, 'I wasn't sure we could do that,'" said Lawton. "We're learning a lot as we go. Eager customers are trying new things. It's part of how the industry is going to roll out over the next two to five years."

Lawton noted that the startling advances in robotics are coming from smaller robots. "Traditional robots are very mature and well understood. Those high-volume robots are improving, but it's incremental," said Lawton. "The space with collaborative robots is where you can do things in really different ways. You have all this artificial intelligence and the ability to combine the different forms of technology."


Megatrends Shaping PACs & PLCs

Programmable machine control that started with simple ladder logic has extended its tentacles into all aspects of advanced automation, motion control, safety, vision systems, and enterprise connectivity. But controllers themselves, whether they are PLCs, PACs, or industrial PCs, are continuing to evolve as technology moves ahead.

Not surprisingly, the megatrends shaping the factory of the future are also shaping the direction of new machine controllers, since they play a point-forward role in actual control, data collection, and the connection to plant information systems. These directions include integration of advanced functionality into the controller itself, more robust connectivity options, and a push to become the hub for information-enabled machine control.

More Integrated Functionality

One broad trend is a combination of integrated control and integrated information. System engineers are looking to take all their device data, along with machine and station information, to create a point of focus within the programmable automation controller (PAC).

"Customers want to integrate more functionality and move beyond discrete machine control," said Keith Staninger, business manager - Controller and I/O Platforms for Rockwell Automation. "The shift from PLC to PAC is starting to drive buying behaviors which means more push for a multi-disciplined controller. There is a move to integrate batch processes into the machine control, and integrated safety is becoming the standard for most projects. Overall, the focus for customers is extending beyond machine control to what the PAC or PLC can provide to the application as a whole."

"The trends we see in the systems are focused on tighter controller integration, and the ability for fewer engineers to do more work," said Mike Chen, pan-america marketing group manager for Omron. "The goal is for one engineer to do all the controls work in one environment, ideally with one vendor and on one platform. That is where we like to focus strategically and where we think the broader trends in machine control will go."

Technology partnerships at the vendor level are also becoming very important. "There is a trend to large partnerships, for example Rockwell Automation and Cognex, for vision systems," Chen said. "What we do is to create our own solutions internally for the best interoperability. Many technology companies cannot provide customers complete solutions. Companies either need to develop technology internally to do advanced applications, or need to partner with other technology providers."

"Applications on the high end require integrated vision, integrated motion, and advanced safety systems. Suppliers need to have a partner or develop their own robotic solutions. Customers are expecting these types of more complex integrated solutions to integrate with the controller platform," he added.

According to Jeff Payne, product manager - Automation Controls Group for AutomationDirect, the top technology trends for PACs and PLCs used in machine control focus on the capabilities of the hardware platform and the software programming tools that shape application development. Increasingly, there is a focus on the ability to improve data collection and easily implement higher-level enterprise connectivity. This not only includes links to internal ERP and MES, but also access to important data points using mobile devices outside of the facility.
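As a rough illustration of what that connectivity can look like at the data level, the sketch below publishes a few controller tags as JSON over HTTP so an MES dashboard or a phone could poll them. Real installations would more likely use OPC UA or a vendor-specific interface; the tag names, values, and port here are invented.

    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    # Hypothetical tag snapshot a gateway might mirror from the controller.
    TAGS = {"line1.parts_made": 18432, "line1.downtime_min": 7, "line1.running": True}

    class TagHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            body = json.dumps(TAGS).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)

    # GET http://<gateway>:8080/ returns the tag snapshot as JSON.
    HTTPServer(("", 8080), TagHandler).serve_forever()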

Payne said the biggest technology trends for PACs and PLCs used in machine control focus on quality and up-time performance, which are two top goals for every machine operation.

"Memory is cheap; it is a small price to pay to have a processor with ample memory to be able to store all of the machine's control program documentation. This will greatly improve troubleshooting and reduce machine downtime," Payne said.

(Image caption: Rockwell has a three-pronged approach for machine control: converging the system infrastructure using standard unmodified Ethernet; converging its programmable automation controllers with Control Logix; and converging its design environment with Studio 5000.)
