I can see where managing ECNs (aka product change data) could actually be a more difficult, or at least more time-consuming, task than many of the actual steps in the design process. Who among us has not lost some vital piece of information that was at their fingertips just two minutes earlier? This data becomes ever more critical as SKUs proliferate and time-to-market pressures increase.
Managing ECOs (engineering change orders) is one of the low-hanging-fruit applications of PLM, and you're right, Alex, about the significant amount of time spent trying to track down and stay abreast of that data--especially in light of mounting time-to-market pressures. With the new Service and Quality modules of Windchill 10.0, PLM is branching out into territory that has been talked about for a while but not really implemented in any grand fashion. It will be interesting to see how companies respond.
Given the advantages of PLM -- and its ever-developing new tools -- I would guess it is being adopted widely. In the radio show, you asked which industries are the leaders (besides the obvious aerospace). The answer you received was vague. I would guess auto and electronics are big. What are you seeing in terms of adoption and industries?
Automotive, aerospace, and electronics have been the traditional sweet spots for PLM. The big companies adopted the platforms long ago, and even smaller suppliers in their respective value chains have gotten on board. Some of the newer industries where PLM is seeing traction are medical devices, shipbuilding, consumer goods, and retail, particularly footwear. Anywhere there are far-flung partners and lots of product configurations, or particularly large and integrated assemblies (shipbuilding is a lot like aerospace), is showing interest.
This posting is the first that I have seen that provides some believable description of what PLM may be able to offer. So thanks for the education. It is clear now that not all organizations need to buy PLM software.
Beth, it sounds like PLM is mostly useful in larger companies with lots of different products and product lines to manage, is that correct? And perhaps also products with lots of different, or differently-sourced, components?
Ann, PLM definitely came into prominence via big companies, particularly those in the aerospace and automotive sectors, where development projects are large and complex and frequently involve a network of design partners. Today, PLM has evolved, both as a discipline and as a technology, to the point where it's offered in a format that has appeal and value even for smaller manufacturers.
The idea is to centralize all product-related data and materials so there is a so-called "one version of the truth" and the different disciplines are working off the same vision of the product. As a platform, in addition to the central repository piece, PLM comprises capabilities for creating cross-functional workflows as well as a variety of extended modules for handling everything from early requirements gathering to field service and support procedures and processes, all as part of the same integrated system.
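To make the "one version of the truth" idea concrete, here is a toy Python sketch of a central part record with an ECO trail. This is purely illustrative -- the class, part number, and ECO ID are invented for the example and do not reflect any vendor's actual API or data model.

```python
from dataclasses import dataclass, field

@dataclass
class PartRecord:
    """Single authoritative record for a part: the 'one version of the truth'."""
    part_no: str
    revision: str = "A"
    history: list = field(default_factory=list)  # audit trail of applied ECOs

    def apply_eco(self, eco_id: str, new_revision: str, note: str) -> None:
        """Record an engineering change order and bump the revision."""
        self.history.append((eco_id, self.revision, new_revision, note))
        self.revision = new_revision

# One central repository keyed by part number; every discipline
# (design, manufacturing, service) reads and updates the same record,
# so they all see the same revision state.
vault = {}
vault["BRKT-100"] = PartRecord("BRKT-100")
vault["BRKT-100"].apply_eco("ECO-0042", "B", "thicken flange 0.5 mm")
```

The point of the sketch is the design choice: changes flow through one record with a retained history, rather than through copies of files scattered across departments.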
Thanks for the clarification, Beth. A long time back I wrote about some of the earlier attempts at managing such life cycle data and integration of databases, so it's interesting to see how this all worked out.
It is interesting to watch the progression. I've been covering this stuff since early 2000 when it first started being discussed in its own right as a formalized business software category and business process. In some ways, while the technology has come a long way, it's really just now starting to do what it was positioned to do more than a decade ago. I guess what I'm saying is the PLM vision may be a decade-plus old, but the reality of the platforms supporting that vision on a broad, enterprise scale is really just coming into its own.
That time period sounds right. In the early 90s, I was writing about various efforts to make TQM (total quality management) a reality in US companies, and the related efforts that sprang up around it. Most of the initial work I wrote about then was focused on being able to trace components all the way through a product's design and development process and out the door into the field, to first analyze and then reduce causes of field failures. While much of this was prompted by mil/aero apps--and became today's incredibly complex part tracking system in commercial aircraft manufacturing--there were also attempts at integrating other databases, like manufacturing and test data, covering other aspects of the product's/system's life cycle before shipment to the customer. Needless to say, the technology for doing so was quite primitive by today's standards.