

Good Reads: Top 10 February Design News Articles

Design News readers demonstrated wide-ranging interests in February, lighting up our site for stories on the Super Bowl, cool cars, scary flights, and maybe even scarier robots. Which ones were your favorites, and where do they fall on the top 10 list?

How to Build a Better Telepresence Robot


A telepresence robot is a machine that lets people interact remotely with others and with the surrounding environment. These robots enable virtual visits at hospitals and nursing homes, enable virtual walk-throughs for home buyers located far away from a property, and keep students connected with their classwork and peers when they are out sick.

A telepresence robot is, effectively, HMI on steroids.


The OhmniLabs telepresence robot.

We caught up with Jared Go, CTO at OhmniLabs, to get a description of how to build a better telepresence robot.

Design News: What are the qualities that are critical to building a telepresence robot?

Jared Go: A telepresence robot seems like a simple thing, right? You have a video chat, like any tablet, and it moves around. But when it comes to building one, you'll quickly find there are so many things that go into it. You have to design a power system and optimize power so these robots last several hours between charges. People rarely use these robots for a quick call; they usually log in and stay on them for three to five hours. With that comes designing a battery subsystem with lithium-ion batteries, and with lithium-ion batteries comes safety, and with that comes so many other things - and again, that's just the batteries. And because it's a robot that someone else is remotely controlling, you need a way to charge it without requiring somebody to physically plug it in after they're done using it - so you have to develop some kind of docking system and build the robot to navigate itself to the docking station safely so that it can charge.

You also need the charger cable to supply sufficient power for quick charging, so that the robot is always ready to go. Additionally, you need it to be safe for a multitude of environments, including homes, hospitals, office buildings, nursing homes, and more, so it has to be non-hazardous to children, the elderly, and pets. You also need to make it unobtrusive: if you have a massive charger and you're trying to deploy it in a cramped hospital room, it's a no-go. So, there are all these small factors to consider in the design.

You have to be an expert at all sorts of layers. There are, of course, the mechanical layers, the electrical layers, and the software layers. On the mechanical side, there are seemingly simple things to engineer, like putting a screen and camera up on a long, tall robot, but in doing this you get movement with every dip or crack in the floor, right? You also get flex in the tube; you get flex in the system. And so the challenge is designing the system to respect the dynamics of everything you've created. You can do that in different ways. You can have firmware that makes your acceleration smooth, you can design the mechanics to be stiffer or to absorb the shock, you can have software that does image stabilization. There are so many ways to solve these problems, but it requires a team that's capable of looking across all the layers that go into each feature and each component and optimizing the total solution across all of them.
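As a concrete illustration of the firmware-level acceleration smoothing Go mentions, here is a minimal sketch of a slew-rate limiter that caps how much the commanded velocity can change per control cycle. It is a hypothetical example, not OhmniLabs' firmware; the rate limit and loop period are assumed values.

```python
# Hypothetical sketch of firmware-style acceleration smoothing (slew-rate limiting).
# Not OhmniLabs' actual firmware; the limits and timing are illustrative assumptions.

MAX_ACCEL = 0.5   # max change in velocity, m/s per second (assumed)
DT = 0.02         # control loop period, seconds (50 Hz, assumed)

def smooth_velocity(current: float, target: float) -> float:
    """Move the commanded velocity toward the target, limited to MAX_ACCEL."""
    max_step = MAX_ACCEL * DT
    delta = target - current
    # Clamp the per-cycle change so the drive never jerks the tall mast.
    delta = max(-max_step, min(max_step, delta))
    return current + delta

# Example: a sudden joystick step from 0 to 0.6 m/s is ramped over about 1.2 s.
v = 0.0
for _ in range(10):
    v = smooth_velocity(v, 0.6)
print(round(v, 3))  # 0.1 after 10 cycles (0.2 s)
```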


In our case, we run a variant of Android, a fork we call Ohmni OS, and there's a tremendous amount of work just to integrate all the peripherals: external cameras, sensors, and even simple things like external power buttons and battery-charging logic. With those considerations, you start to break out of the typical Android form factor, which is a single integrated device, into a set of separate components that all have to be connected, and that alone becomes challenging. And on top of that, of course, the key challenge is having a good video streaming infrastructure.

So, just like the big players - Zoom, Skype, Teams, and all the others - we have to do a really good job of building a seamless infrastructure, one that handles not just video and audio but also the need to drive the robot with low latency. We need that video-conference solution enhanced with a very low-latency control model where you can use a gamepad, a mouse, a keyboard - any type of controller - to move the robot. And as you can see, the number of things that go into this starts multiplying; it starts getting crazy.
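To make the low-latency control path concrete, the sketch below shows one generic way to stream small, timestamped drive commands from an operator's input device so the receiver can discard stale packets. It is an illustration over plain UDP with an assumed message format and address, not OhmniLabs' actual streaming infrastructure.

```python
# Minimal sketch of a low-latency teleoperation command channel over UDP.
# Generic illustration only; the message format, address, and port are assumptions.
import json
import socket
import time

ROBOT_ADDR = ("192.168.1.50", 9000)  # assumed robot address

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_drive_command(linear: float, angular: float) -> None:
    """Send one small, timestamped velocity command; stale packets can be discarded."""
    msg = {
        "t": time.time(),   # sender timestamp for staleness checks on the robot
        "lin": linear,      # forward velocity, m/s
        "ang": angular,     # turn rate, rad/s
    }
    sock.sendto(json.dumps(msg).encode(), ROBOT_ADDR)

# Example: stream commands at ~20 Hz from whatever input device is in use.
for _ in range(20):
    send_drive_command(0.3, 0.0)
    time.sleep(0.05)
```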


The electronics in the base of the robot.

Another thing that's really interesting, and difficult, is audio. We all take it for granted - people buy things like Alexa and Google Home and take two-way communication with devices for granted - but designing a moving platform with a really good mic and speaker setup is a challenging task. There's a lot of acoustic design that goes into that, and the acoustics tie back to the industrial design: depending on the speakers and materials you use, there are some designs where you simply can't engineer a good spot for a speaker, where you can't create enough air volume behind it or isolate the speaker and microphone from each other. All of these things add complexity.

It’s very hard to know what will work the first time you design a system like this. So, it's really important that you can prototype quickly, you're able to test in the real world, you're able to assess these kinds of things, and then evolve your product accordingly before it goes all the way to market. One of the things about us is that we're an organization that was founded on the belief that, to be a good robotics company, you need to be very lean, and you need to know how to iterate very quickly and cheaply. And that's because robotics is still in the early, early years.

Even though they’ve been around for decades, it's still early, and people don't just see a robot, assume it's going to solve problem X, and bust out their wallets to buy one - it’s new to them, and it can be scary. So, what we've done is create an entire team, an entire infrastructure, and a process to do robotics development 10 times faster than the traditional model. In doing so, we're able to iterate our product extremely fast.

The 13th generation of the Ohmni robot is going to be coming out later this year, and along the way we've learned so many lessons about what works and what doesn't. Even today, we find surprising things out about our products and the assumptions that go into developing them. Sometimes we assume customers want a feature, then we go test it with them and learn they didn't care for the feature, or that the problem we thought we solved was a non-issue to begin with. There are all these interesting things you learn when you can iterate like that and put real products in front of users.

So, to summarize, the most important thing we’ve found when it comes to building telepresence robots is probably the same thing most engineering companies find: having a multidisciplinary team that can communicate well across all parts of the design process - engineering, production, and so on - because to make a robot, all of those steps need to work and work together well. On top of that, it is important to have a really good process for prototyping, evolving, iterating, and learning in your organization and in your engineering and production process, so that you can improve and better meet the needs of your target customers.

The folded version of the robot.

DN: What are the steps from design through production?

Jared Go:

Phase 1: Concept and Design

This phase begins with a business discussion where we work with you to gather high-level requirements. Then we move to design idea generation, where initial designs are developed as rough sketches. These sketches and presentations are shared with you, the client, to review and consider for more detailed design exploration.

Following your selection of the desired idea, we further develop the high-level concepts and explore the overall contours, surfaces, colors, and detail elements. These detailed sketches incorporate feedback from the OhmniLabs production team to ensure that the design is manufacturable and that all components have appropriate fitment.

Once you confirm the final design, we refine the design and details into 3D CAD models and provide high-quality renderings.

Phase 2: Engineering

During the engineering phase, we develop the complete system architecture from your requirements. We conduct design reviews with your team and with external teams, along with system-level and component trade studies. We also complete system modeling and simulation, create technical specifications, and design the product development roadmap.

Mechanical Design & Engineering

  • CAD & analysis
  • Kinematics design
  • Mechanical system design
  • System-level design
  • Complex electro-mechanical system engineering
  • CAD & simulation tools

Electrical Design & Engineering

  • Complex custom-designed control hardware
  • Wiring harness schematics, design, & build
  • Low & high voltage power distribution design
  • Power electronics development
  • Custom Printed Circuit Board (PCB) design & assembly

Software Design & Engineering

  • Custom mobile applications
  • Web apps & backends
  • Hardware & software integration
  • ROS module integration

At the end of this stage, we deliver an engineering proposal, including effort and risk assessment. Together we agree on what product will be delivered, including the cost and timeline.

Phase 3: Prototyping

The goal of the prototype phase is to quickly develop a first robotic solution with which you can test out a market or a specific use case. You’ll want to consider if there is enough of a need in the market. Is the customers’ pain big enough to justify paying for your robotic solution? Is the robot usable and convenient? And is the customer willing to pay enough for your product for you to make a large enough profit?

The deliverable at this stage is a robotic solution that focuses on functions sufficient to test your MVP or minimum viable product. Below are the goals and deliverables for each of the 3 validation testing phases.

 

Engineering Validation Testing (EVT)

Goal: To build a product tailored to your needs, based on your specific requirements document and engineering specifications, with all system functions validated and integrated holistically into your design.

Deliverable: Typically, 1-5 robots that have been custom designed and hand-built by the engineering team.

Development and Validation Testing (DVT)

Goal: To test the final aesthetics of your design using rigorous systems, to get your design certified, and to finalize packaging.

Deliverable: Typically, 10-30 custom-designed, engineering-built robots intended to look like the final version of your product.

Production Validation Testing (PVT)

Goal: To start full production using robotics technicians, contract manufacturers, vendors, and suppliers.

Deliverable: Typically, production-ready samples followed by thousands of market-ready robots.

Phase 4: Manufacturing

At this point, you’re ready for full-speed production of your product with a full warranty, shipping, and fulfillment.

As mentioned in the initial table, the challenges here are much larger than when designing a prototype alone. At this step, we can help ensure that your product works in various environments when controlled by a range of users; that packaging is suitable for worldwide shipping; that your product receives the appropriate safety certification; and that your project is staffed with trained technicians that are continuously managed.

We can deliver those volumes at a fraction of the time and cost of traditional manufacturing, thanks to an advanced additive manufacturing process that we have been perfecting over the last five years. We build our own production-grade 3D printers, optimized for 24/7 operation, and we can spin up lean manufacturing lines while maintaining a complete quality and production process.

Phase 5: Deployment

Unlike many traditional robotics companies that only offer concept, design, and prototyping services, OhmniLabs offers end-to-end solutions that support manufacturing at scale as well as seamless deployment via cloud robotics. This cloud infrastructure manages all robots in the field, tracks unit health and usage statistics, and handles automated upgrades with a simple click.

We help you get your robots to market fast by leveraging our comprehensive library of modular robotic tech modules and components. This collection of pre-designed, production-ready technologies includes, but is not limited to:

  • integrated lithium battery and power management systems
  • brushless motor control systems and direct drivetrains
  • self-docking charging system
  • multiple parametric chassis designs and structural components
  • custom multitouch IPS HD displays and embedded systems
  • long-range microphone array and acoustically tuned speaker systems
  • fully integrated OS and drivers for sensors, motors, etc.
  • cloud-based fleet management and update system
  • deep learning and autonomy stack
  • ultra-lightweight teleoperated and soon autonomous robotic arms
  • certified UV-light/disinfection kit with tech modules that are expandable on an infinite scale

Most of these components are already in mass production and we've invested the millions of dollars and thousands of person-hours required to take these components from prototype to production-ready.

On top of that, these components have been tested and refined through continual real-world usage.  Every day, we're shipping and supporting these components in the production of Ohmni telepresence robots as well as in other clients' custom robots.

By leveraging our library of tech modules and our lean manufacturing process, we enable you to reap the benefits of economies of scale even at lower volumes, drastically reducing risk and shortening time to market.

Depending on the scope of the customizations, however, there's often still a lot of work to be done to create a high-quality, finished product, so plan accordingly and be prepared.

DN: What type of professionals are involved in the process? What is the team like?

Jared Go: You need an army of design, engineering, production, manufacturing, inventory, cloud development, quality assurance, quality control, product management, shipping, receiving, customs, marketing, sales, and more. Beyond that, you need support. You need customer success. You need fundraising. And above all, it's not just who is involved inside the company but also externally. So, it's really important to us that the partnerships we build with our vendors, our shipping carriers, and everybody else involved are an instrumental part of delivering the Ohmni robot to our customers.

You might be surprised, but when you have a robot with hundreds of parts, inventory itself becomes a large issue - the physical arrangement of storage, the tracking, and all the processes associated with that. And when you work with vendors, your vendors must deliver consistency, providing you the same part every time. Managing parts vendors is a critical job that takes an amazing amount of skill and experience. So, it's a large team, and if you are ever going to make your own robot, never underestimate how many people it takes. It takes a village to make a robot.

I guess the one area where we are a little bit unique is that our hardware team is very multidisciplinary. Relative to a lot of traditional organizations that have very structured electrical, mechanical, maybe quality, or production teams that are a little more siloed, we try to have everyone cross-train and understand the challenges in each domain. The benefit of doing that, even though it's more difficult, is that each person can make optimizations at the right level. So, I think having a unified team is unique to how we operate, and again, it's a key to how we can iterate quickly and build a product quickly.

DN: What are the greatest challenges in building a telepresence robot?

Jared Go: Connectivity is by far the toughest challenge when creating robots because the quality of the experience depends on it. Normally, when a video or audio call gets a little laggy, it's inconvenient, but it's not terrible. With robots, spotty internet could mean losing control of a moving robot in a manufacturing plant - a massive no-no! So, the bar we hold ourselves to is much higher than a typical call. And the challenges are many: it could simply be that the user has a weak access point, or the way access points have been deployed leaves dead zones. The problem can stem from a million different places. Anyone who's ever worked in IT will tell you that consistent Wi-Fi coverage is challenging.
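A common safeguard against exactly this lost-connectivity scenario is an on-robot watchdog that halts the drive when fresh commands stop arriving. The sketch below illustrates that generic pattern with an assumed timeout; it is not a description of OhmniLabs' implementation.

```python
# Generic sketch of a connectivity watchdog ("deadman") for a teleoperated robot.
# The timeout value is an assumption; this is not OhmniLabs' implementation.
import time

COMMAND_TIMEOUT = 0.5  # seconds without a fresh command before stopping (assumed)

def apply_velocity(linear: float, angular: float) -> None:
    """Placeholder for the motor-controller interface."""
    pass

class DriveWatchdog:
    def __init__(self) -> None:
        self.last_command_time = 0.0

    def on_command(self, linear: float, angular: float) -> None:
        """Called whenever a drive command arrives from the remote operator."""
        self.last_command_time = time.monotonic()
        apply_velocity(linear, angular)

    def tick(self) -> None:
        """Called every control cycle; halts the drive if the link looks dead."""
        if time.monotonic() - self.last_command_time > COMMAND_TIMEOUT:
            apply_velocity(0.0, 0.0)  # stop rather than keep executing a stale command
```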

Beyond connectivity is understanding how diverse real-world conditions can be and making sure you're strict enough about your requirements and your testing to design well. Sometimes it takes an iteration to discover the weak points, but learning about and solving the little issues that arise is a challenge you need to face head-on. You never want to learn after the fact that you’ve made thousands of defective robots. So, from a process perspective, it's about staying organized, learning fast and continuously, assessing risk, and making sure you're finding these issues before you scale.

DN: What are the testing procedures?

Jared Go: So, for us, the testing procedure always comes from two things: one is observing the conditions the product will be used in, and the other is assuring operation in real-world conditions. We start by asking customers how and where they plan on using our robots - typical questions like: How are you using it? Is it dusty? Is it bumpy? From that, we develop our product requirements document. With that, every component has a functional test of its own, and there are also system-wide tests that span all the different areas - mechanical, electrical, even software and cloud servers.

So, the normal testing strategy starts with deciding which parts are worth testing and at which level. Anything at the component or sub-assembly level we typically test on the line. We also do full system integration testing in a bunch of different ways, plus comprehensive tests that help us uncover defects or critical areas. It's really important to us that the system used to track defects is part of a closed-loop design process, so that the information you get when things go wrong immediately gets back to the other teams, and they can prioritize and work on it.
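To give a flavor of what a component-level functional test can look like, here is a hypothetical pytest-style sketch in which each check asserts a measured value against limits that would come from the product requirements document. The component interfaces and limit values are invented for illustration.

```python
# Hypothetical pytest-style functional tests for individual components.
# The component interfaces and limits are illustrative, not OhmniLabs' real test suite.

def read_battery_voltage() -> float:
    """Placeholder for a fixture that queries the battery subsystem under test."""
    return 25.1

def measure_docking_error_mm() -> float:
    """Placeholder for a fixture that measures docking alignment error."""
    return 4.0

def test_battery_voltage_within_spec():
    # Limits would come from the requirements document (a 7S li-ion pack is assumed here).
    assert 22.0 <= read_battery_voltage() <= 29.4

def test_docking_alignment_within_tolerance():
    # Assumed tolerance: the robot must dock within 10 mm of center to charge reliably.
    assert measure_docking_error_mm() <= 10.0
```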

DN: What are the quality assurances involved?

Jared Go: So, we have a strict set of criteria for every robot that goes out the door, and it covers everything from fasteners and fastener torques to observed behavior - for example, wobble, displacement, fitment, screen quality, and microphone and speaker quality - basically everything that constitutes normal operation of the robot. We also regularly test our servers and web service for quality of experience, because the robot itself is only part of the experience provided by a telepresence robot.

DN: If you were to summarize some key points on building a better telepresence robot, what would they be?

Jared Go: In robotics, being able to iterate fast and test in the real world is critical.

It is ideal to maintain a multidisciplinary team that communicates well and makes design tradeoffs across all domains simultaneously, rather than optimizing each area individually, because otherwise you'll end up with a globally sub-optimal solution.

Don't underestimate the amount of time and capital it will take to get a functional product to market, because there's a lot you'll learn along the way that will force you to change or improve the design.

For us, we’ve found success in quick iteration and cost-effective manufacturing through our additive manufacturing capabilities. We do 3D printing at production scale, which allows us to create several thousand robots a year. This lets us do two things. First, we can change our design in a matter of hours instead of months - if there's a defect or some area of a part needs to be strengthened, it gives us tremendous flexibility to solve those problems immediately. It also allows us to print certain geometries that are very challenging for typical molding processes. Leveraging all of the advantages of 3D printing is one of the things that truly makes us stand out and has helped us move faster.

Rob Spiegel has covered manufacturing for 19 years, 17 of them for Design News. Other topics he has covered include automation, supply chain technology, alternative energy, and cybersecurity. For 10 years, he was the owner and publisher of the food magazine Chile Pepper.

 

 

Snap-On Connector Straps, Brushless DC Motors, and More Supplier News

In addition to a plethora of motors, this week’s supplier news includes cobots for first-time users and an all-weather multitouch sensor.

Friday Funny: An Apple Ad We’d Like to See

Have you always wanted an iPhone with eight cameras? This video shows what it would be like if an Apple ad leveled with you.


Tech Tidbit: Building an All-Wheel-Drive Minivan

2021 Chrysler Pacifica Pinnacle AWD. (Image source: Stellantis)

Drivers have flocked to crossover SUVs in part because they value the all-weather security provided by their available all-wheel-drive systems. Minivans check even more of shoppers’ practicality requirement boxes than SUVs do, but their sales are lagging. In response, Chrysler added all-wheel-drive to the 2021 Pacifica minivan and managed to do so while retaining the vehicle’s popular Stow-n-Go in-floor seat stowage bins.

Design News had the chance to test the effectiveness of this new all-wheel-drive system during some winter snow and sleet and we came away impressed by the system’s seamless performance and the resulting stability.

The Pacifica’s ability to start from a stop on hills and on ice without just spinning the front tires unproductively is particularly noteworthy. Normally, with front-drive vehicles like the regular Pacifica, when weight transfers to the rear due to a hill or acceleration, the front tires lose traction.

Not so with the AWD Pacifica, which not only can shift all of its drive power to the rear wheels if needed but also controls grip between the left and right sides by applying braking to a spinning wheel to force torque to the wheel with traction.
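That brake-based side-to-side transfer can be pictured as a simple control rule: when one rear wheel spins noticeably faster than the other, apply brake torque to the spinning wheel so the differential routes torque to the wheel with grip. The sketch below illustrates the idea; the thresholds and gains are assumptions, not Chrysler's calibration.

```python
# Illustrative sketch of brake-based torque vectoring across an open differential.
# Thresholds and gains are assumptions, not Chrysler's actual calibration.

SLIP_THRESHOLD = 1.15   # left/right wheel-speed ratio that counts as slip (assumed)
BRAKE_GAIN = 40.0       # Nm of brake torque per unit of excess slip (assumed)

def brake_torque_command(left_speed: float, right_speed: float) -> tuple[float, float]:
    """Return (left_brake_Nm, right_brake_Nm) to push torque toward the gripping wheel."""
    if right_speed > 0 and left_speed / right_speed > SLIP_THRESHOLD:
        # Left wheel is spinning; brake it so torque flows to the right wheel.
        return (BRAKE_GAIN * (left_speed / right_speed - 1.0), 0.0)
    if left_speed > 0 and right_speed / left_speed > SLIP_THRESHOLD:
        return (0.0, BRAKE_GAIN * (right_speed / left_speed - 1.0))
    return (0.0, 0.0)

# Example: a left wheel spinning 30% faster than the right gets ~12 Nm of brake torque.
print(brake_torque_command(13.0, 10.0))  # approximately (12.0, 0.0)
```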

The system comprises a laundry list of new or revised hardware for the van, including the power transfer unit, a three-piece drive shaft, the rear-drive differential module, and tweaks to the brakes, wheel hubs and bearings, and the suspension’s hub carriers/knuckles. Self-sealing tires round out the changes for still more security.

The net of these changes is the addition of about 300 lbs. to the Pacifica’s mass, according to chief engineer Brian Swanson. The suspension geometry changes are in response to a 20 mm increase in ride height needed to provide clearance for the new drivetrain parts, but which has the benefit of providing a smidge more clearance when driving in snow too.

The system employs clutches in the rear-drive system to completely disconnect that portion of the drivetrain in regular driving to preserve the Pacifica’s fuel efficiency. The EPA dings the AWD by 2 mpg in its combined fuel economy score compared to the front-drive version, with 17 mpg city, 25 mpg highway, and 20 mpg combined ratings.
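For reference, the EPA's combined figure is not a simple average but a harmonic mean weighted 55 percent city and 45 percent highway, and a quick check shows the AWD Pacifica's 17/25 mpg ratings do work out to roughly the published 20 mpg combined:

```python
# EPA combined fuel economy: harmonic mean weighted 55% city / 45% highway.
def epa_combined(city_mpg: float, highway_mpg: float) -> float:
    return 1.0 / (0.55 / city_mpg + 0.45 / highway_mpg)

print(round(epa_combined(17, 25)))  # ~20 mpg, matching the AWD Pacifica's rating
```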


While some vehicles require the driver to engage all-wheel drive with a switch, the Pacifica’s system is invisible, sending power to the rear wheels only when it is needed. “It is all automatic,” Swanson observes. “We turn it on using various triggers,” he says. “The first one is temperature. In cold weather, we actuate the system. It looks at different sensors like wheel slip, electronic stability control actuation, whether the vehicle is on a steep grade.”

But the so-called “loose nut behind the wheel” is another significant variable, so the Pacifica takes the driver's steering into consideration too. “We look at inputs from the driver,” Swanson continues. “If we see a lot of steering input we’ll assume they are avoiding an obstacle and turn the system on. A heavy throttle passing maneuver is another trigger we use to activate the system. We want it only to be there when they need it.”
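Swanson's triggers amount to a continuously evaluated OR of conditions. The sketch below illustrates that logic with made-up threshold values; it is not Stellantis' actual calibration.

```python
# Illustrative AWD-engagement logic based on the triggers Swanson describes.
# All threshold values are assumptions, not Stellantis' calibration.

COLD_TEMP_C = 4.0          # engage below this ambient temperature (assumed)
SLIP_LIMIT = 0.10          # fraction of wheel slip that counts as slipping (assumed)
GRADE_LIMIT = 0.08         # road grade considered "steep" (assumed)
STEER_RATE_LIMIT = 200.0   # deg/s of steering input treated as evasive (assumed)
THROTTLE_LIMIT = 0.85      # heavy-throttle passing maneuver (assumed)

def should_engage_awd(temp_c, wheel_slip, esc_active, grade,
                      steer_rate_dps, throttle) -> bool:
    """Engage the rear drive module if any trigger condition is met."""
    return (
        temp_c < COLD_TEMP_C
        or wheel_slip > SLIP_LIMIT
        or esc_active
        or grade > GRADE_LIMIT
        or steer_rate_dps > STEER_RATE_LIMIT
        or throttle > THROTTLE_LIMIT
    )

# Example: warm, dry pavement, but an abrupt evasive steering input engages the system.
print(should_engage_awd(20.0, 0.0, False, 0.0, 350.0, 0.2))  # True
```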

Bone-Stimulating Device Uses NASA-Developed Tech to Help Prevent Osteoporosis

OsteoBoost from Bone Health Technologies. (Image courtesy of Bone Health Technologies)

After bone density peaks, typically when people reach their 30s, most experience a slow decline from bone cells dying and being resorbed without being entirely replaced, said Laura Yecies, CEO of Bone Health Technologies, in an interview with MD+DI. “And now that we're living longer, that slow loss of bone catches up to us to the point where the bones can become very fragile, and then we have fractures that can be deadly and life-altering,” she said.

However, it has long been known that exercise, particularly high-impact exercise, stimulates bones to be healthier and denser, Yecies said. And per research with animals, vibration has been shown to essentially “trick” bones into behaving as if they are undergoing high-impact exercise.

NASA extended this research, studying the use of whole-body vibration technology for its astronauts, who were at risk of losing bone density in a zero-gravity environment where their bones were not subjected to even the force necessary to walk. NASA’s system consisted of a vibrating plate that a person would stand on with their knees locked to prevent or reduce attenuation. Attenuation refers to the dissipation of the vibration as it travels up the body.

Bone density improved for people who complied with the treatment. “That was a pretty significant finding,” Yecies said. However, she noted that compliance was not ideal, due to the amount of time needed to stand with knees locked and the plate’s high cost.

Yecies said her company saw this research and the issues surrounding it, and wondered how they could use vibration more effectively, especially targeting the hip and spine. “Hip fractures can be devastating—20 percent of people who break their hip die in the year following the hip fracture, and another 40 percent lose their independence,” Yecies said.

The company designed the OsteoBoost belt to be worn at the sacrum where the vibrations can be applied directly to the hips and spine. “We believe that the belt is both going to apply the vibration where it's needed, and also be very convenient, comfortable, and easy to use,” she said, noting that the frequency and strength of the vibration are specifically calibrated, based on the research from NASA.

“Very important,” Yecies said, “is that we confirm the level of vibration that is transmitted to the skeleton,” citing an example of two people--one very thin and one who has more fat deposits around their hips. “If the belt vibrates the same, the person with more body fat will have less vibration being transmitted to the skeleton.” Therefore, she said OsteoBoost has a calibration feature with an accelerometer that measures how much vibration is getting to the skeleton at the iliac crest. The measurement is completely dynamic, and the vibration is adjusted daily to accommodate for thicker clothing, for example.
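The calibration Yecies describes is essentially a feedback loop: measure the vibration actually reaching the iliac crest with the accelerometer, then nudge the drive amplitude until the measurement hits the target. The sketch below illustrates that idea with invented numbers; it is not Bone Health Technologies' algorithm.

```python
# Illustrative closed-loop calibration of vibration amplitude from accelerometer feedback.
# The gains and target are invented; this is not Bone Health Technologies' actual algorithm.

TARGET_G = 0.30     # desired vibration measured at the iliac crest, in g (assumed)
GAIN = 0.5          # proportional adjustment gain (assumed)

def adjust_amplitude(current_amplitude: float, measured_g: float) -> float:
    """Raise or lower the drive amplitude so the measured vibration approaches the target."""
    error = TARGET_G - measured_g
    new_amplitude = current_amplitude + GAIN * error
    return max(0.0, min(1.0, new_amplitude))  # clamp to the motor's usable range

# Example: thicker clothing attenuates the measured signal, so the amplitude is stepped up.
amp = 0.5
for measured in (0.22, 0.26, 0.29):
    amp = adjust_amplitude(amp, measured)
print(round(amp, 2))  # amplitude increases toward the target over successive checks
```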

OsteoBoost can be worn during almost any activity except sitting. “What we found in the clinical trial is that we have very high compliance rates,” Yecies said. “People even say that the device feels pleasant. It feels a little like massage. It's comfortable and it's pretty easy to incorporate it into your day.” It should ideally be used for 30 minutes a day, five days a week.

Anyone can use OsteoBoost, but postmenopausal women may benefit the most from the device. “One thing that I advocate for is knowledge about your bone density,” Yecies said. “Medicare pays for a DEXA scan, which is one of the most common tests for bone density, at age 65, but by then it may be quite late to intervene,” she explained. “In the postmenopausal period, women will typically lose up to 20 percent of their bone density. This is as much as five or six percent per year that you're losing. We believe that we can cut that in half and slow it to two to three percent,” she said.
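To put those percentages in perspective, a quick compounding check shows the difference between losing 5 percent of bone density per year and the slowed 2 to 3 percent Yecies describes; the four-year window below is an assumption used only for illustration.

```python
# Compounding check of the bone-density loss rates cited above.
# The four-year window is an assumption for illustration only.
def remaining_density(annual_loss: float, years: int) -> float:
    return (1.0 - annual_loss) ** years

print(1 - remaining_density(0.05, 4))    # ~0.185, i.e. roughly 19% lost at 5% per year
print(1 - remaining_density(0.025, 4))   # ~0.096, i.e. roughly 10% lost at the slowed rate
```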

OsteoBoost was granted breakthrough device designation by FDA, and CMS has announced that Medicare will cover all breakthrough products for four years. “FDA agreed with our assessment that this is a significant unmet need and that there's nothing else on the market that is similar,” Yecies concluded. 

Automotive Aftermarket Trending Strong through 2026

Disassembled car. (Image: Nosorogua/Adobe Stock)

The automotive aftermarket components market is projected to show robust growth from 2020 through 2026, according to a report from Global Market Insights released at the end of December 2020. The industry is primarily consumer-driven and undergoes rapid changes owing to constantly evolving consumer preferences and demands, the report noted.

Shifting market competitiveness in emerging economies, such as China, India, Thailand, and Vietnam, is another contributing factor enabling manufacturers to offer economical goods to the global market, aiding global expansion. Advancement in technology coupled with rapid improvement in logistics also has allowed manufacturers to offer high-quality goods faster and farther.

In the United States, increasing used-vehicle sales are further propelling the maintenance and repair side of the automotive aftermarket. Global Market Insights provided statistics from the US Bureau of Transportation that cite used car sales rising from 37,255,000 in 2015 to 40,805,000 in 2019. Aging vehicle fleets are also considered to be a growth factor in the demand for aftermarket services and maintenance, as consumers keep their vehicles longer.

Replacement parts dominated the automotive aftermarket share in 2019, driven by factors ranging from rising vehicle ownership to surging car accidents. The Alliance of American Insurers (AAI) said that building a $25,000 vehicle using only OEM parts could cost over $100,000.

Evolving consumer lifestyles coupled with the rising need for functional and specialized gadgets are supporting consumer demand. The interior accessory segment leads the market due to an increase in the adoption of audio/video accessories, gauges, and switches. Fast-paced innovation and rapid prototyping are enabling manufacturers to quickly meet industry needs, said Global Market Insights.

Speaking of rapid prototyping in the automotive aftermarket, Xometry Inc. noted in a blog post on its website that as the lifespan of machinery increases, so does the need for spare parts to replace failed or worn original components. “Replacement car components are in high demand,” said Xometry. “The US automotive aftermarket industry, for example, totaled $318.2 billion in 2013, contributing more than 2.3% to GDP. Additionally, the average age of registered automobiles in the United States has been growing steadily and is expected to reach 11.7 years by 2019. . . . But, the extended lifespan of the automobile has increased the window for component failure, increasing the need for aftermarket parts.”

Xometry also notes that demand for aerospace aftermarket parts is growing rapidly, “as the average fleet age of the world’s five largest airlines (by passengers carried) was 12.2 years.”

Xometry is an on-demand custom manufacturer specializing in 3D printing through a wide range of processes. The company also provides injection molding services. “With tooling costs largely driving the high price of aftermarket components, OEMs can harness the benefits of 3D printing to produce low-volume spare parts on an as-needed basis,” said Xometry. “This has the potential to dramatically shrink the supply chain, thereby freeing resources associated with the production, delivery, and warehousing of parts.”

OEMs also have the opportunity to realize profit from the production and sale of aftermarket components while protecting their intellectual property and fending off competition from third-party manufacturers who reverse-engineer and sell components at prices that OEMs often cannot match with traditional manufacturing methods, Xometry said.

Hedges & Company, an automotive digital marketing and research agency for the automotive aftermarket industry, released its most recent report, "Automotive Aftermarket: Transforming the Auto Parts Market 2021." It said that personal consumption of auto parts hit an all-time high in June 2020, reaching an adjusted $50.509 billion; July (revised) was $49.884 billion, and August improved to $50.303 billion, Hedges reported.

Hedges’ research shows the North American online auto parts market will be about $26 billion in 2021, up 30% or more from 2020. The light-duty automotive aftermarket is projected to be $290 billion in 2021, and the entire automotive aftermarket/auto-care industry, including medium and heavy duty, will be about $388 billion in 2021.

GM Fuel Cells Will Power Navistar Electric Trucks

Navistar RH truck. (Image source: Navistar, Inc.)

Heavy truck maker Navistar has named General Motors as its partner to provide hydrogen fuel cells to power the company’s signature tractor-trailer rigs.

Although rapidly improving technology has made lithium-ion batteries the energy source of choice for electric passenger vehicles and around-town commercial vehicles, long-haul trucks are better suited to hydrogen fuel cells.


The companies are working with OneH2 for hydrogen fueling equipment to provide trucking companies a turn-key hydrogen fuel cell fleet solution. The pilot customer is J.B. Hunt Transport, Inc., whose yellow logo is familiar to anyone who spends many miles on Interstate highways.

"Hydrogen fuel cells offer great promise for heavy-duty trucks in applications requiring a higher density of energy, fast refueling, and additional range," said Persio Lisboa, Navistar president and CEO. "We are excited to provide customers with added flexibility through a new hydrogen truck ecosystem that combines our vehicles with the hydrogen fuel cell technology of General Motors and the modular, mobile and scalable hydrogen production and fueling capabilities of OneH2."

Navistar’s International RH series trucks will debut the fuel cell powertrain in model year 2024, with testing to be completed by the end of 2022, according to Navistar. The aim is for the vehicle to have a range of more than 500 miles and for re-fills to take less than 15 minutes.


Each truck will pack a pair of GM’s Hydrotec fuel cell power cubes. Each of those contains more than 300 hydrogen fuel cells and its own independent thermal and power management systems. Compact 48” x 31” x 22” dimensions make the power cubes easy to package into many different applications.

"GM's vision of a world with zero emissions isn't limited to passenger vehicles. We believe in EVs for everyone," said Doug Parks, GM executive vice president of Global Product Development, Purchasing and Supply Chain. "We're thrilled to work with like-minded companies like Navistar and OneH2 to offer a complete solution for progressive carriers that want to eliminate tailpipe emissions with a power solution that can compete with diesel."

OneH2 will provide the hydrogen fueling infrastructure to support the trucks. That will include hydrogen production, storage, delivery, and safety. Navistar is taking a minority stake in OneH2 as part of the deal. The partners anticipate supporting 2,000 International RH Series fuel cell electric vehicles “in the near term,” Navistar said.

 

Robotic Pick, Place, and Ecommerce Fulfillment Pays Off

This must-see video case study at DCL Logistics provides a highly detailed business case for the direct-to-consumer automation system the company installed. It features Universal Robots’ UR10e cobot. Detailed metrics begin with a 500% efficiency increase, 50% labor savings, a three-month return on investment (ROI), and 100% order accuracy. See it all from the user’s point of view from four DCL leaders: David Tu, president; Brian Tu, chief revenue officer; Isaac Toscano, automation engineer; and Walter Perchinumio, senior software engineer.

Artificial Intelligence not Lost in Space


Engineers of a certain age might well remember the B9 Robot—usually just called “Robot”—that roamed the universe with the Robinson family in the TV series, Lost in Space. Robot was probably one of the earliest examples of an autonomous, advanced AI robot, i.e., not a sentient being but a robot with human-like intelligence.

The technical community has yet to create such a robot and, if Ray Kurzweil’s prediction from a 2011 Time magazine article is correct, it won’t achieve such AI prowess until the year 2045. Still, AI continues to dominate the news, even though the reality has not always lived up to even a modest level of the hype. For example, the goal of fully autonomous vehicles for the general public remains elusive in the automotive space.

The space industry is a notable exception to the AI hype. Currently, AI is actually being used to help in the manufacturing of satellites and spacecraft systems. It’s easier to prevent biological contamination during satellite assembly if fewer humans are involved, and AI is being used to help robots function more efficiently in such tasks.

AI-enhanced imagery is often the key functionality for many satellites in orbit. Some estimate that satellites process about 150 terabytes of data every day to capture weather and environmental images, to name a few examples.

Monitoring the health of satellites is another growing application for AI. By constantly watching sensors and equipment, AI can detect failures, provide alerts, and sometimes carry out corrective action. SpaceX, for example, uses AI to keep its satellites from colliding with other objects in space.
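A minimal illustration of this health-monitoring idea is a rolling statistical check on a telemetry channel that flags readings far outside recent behavior. The sketch below is a generic z-score detector with assumed window and threshold values, not any agency's or vendor's production system.

```python
# Generic rolling z-score anomaly detector for a satellite telemetry channel.
# Illustrative only; the window size and threshold are assumptions.
from collections import deque
from statistics import mean, stdev

WINDOW = 50        # number of recent samples used to model "normal" behavior (assumed)
Z_THRESHOLD = 4.0  # deviation, in standard deviations, that counts as anomalous (assumed)

history = deque(maxlen=WINDOW)

def is_anomalous(reading: float) -> bool:
    """Flag a reading that deviates strongly from the recent window of samples."""
    if len(history) >= 10:
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(reading - mu) / sigma > Z_THRESHOLD:
            return True  # do not add the outlier to the baseline
    history.append(reading)
    return False

# Example: a sudden temperature spike on an otherwise steady channel is flagged.
for value in [20.1, 20.3, 19.9, 20.2, 20.0, 20.1, 19.8, 20.2, 20.0, 20.1, 45.0]:
    if is_anomalous(value):
        print("alert:", value)
```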

To learn more about the application of AI and machine learning (ML) technology in space, Design News sat down with Ossi Saarela, Space Segment Manager, MathWorks. 

Design News:  What potential does AI have in the space industry, specifically for engineers?

Ossi Saarela: The space industry has a long history of striving to increase the autonomy of spacecraft, which goes all the way back to the early days of spaceflight. Today’s machine and deep learning technologies are being used to address the industry’s most challenging problems.

The uses of AI in the space industry will enable engineers to work on more capable systems with improved efficiency. Further, AI and ML have the potential to help engineers throughout the design, test, and operational phases of a space program:

  • AI will enable engineers to design capabilities for spacecraft that were previously prohibitively complex to implement.
  • AI will help to assure the quality of these complex systems by providing new insights into test data.
  • AI will continue to be used as an aid to operations engineers by monitoring spacecraft health data for potential issues.

Design News: What does the future of autonomy look like for spacecraft?

Ossi Saarela: The more decisions space vehicles can make on their own, the more valuable they are for space exploration. In deep space, if these vehicles depend on humans on the ground for decisions, they are left idle while waiting for ground commands to reach them for minutes, hours, or even days due to the distances involved. Autonomy is a crucial limiting factor for spacecraft and increasing it will allow improvements to existing space applications and enable completely new missions.

New missions are experiencing an increase of autonomy requirements, which is usually driven by more ambitious goals like deep space exploration missions to other planets, moons, and asteroids with a focus more on landings and returns as opposed to fly-bys. These types of time-critical maneuvers are not feasible without autonomy. The trend holds true even for crewed programs, where autonomous docking to other spacecraft and autonomous moon landings are becoming the required norm, allowing valuable astronaut training time to be spent learning other tasks.

Design News: How are machine learning and deep learning techniques pushing autonomy forward?

Ossi Saarela: The primary autonomy applications of machine learning and deep learning today are improving the perception capability of spacecraft through computer vision, allowing spacecraft to make better sense of what they are seeing with their cameras and other instruments. For example, the European Space Agency (ESA) is flying a technology-demonstrator Earth observation satellite called PhiSat-1, which uses AI to detect and filter out pictures of clouds, since for many Earth observation purposes they are not useful. For deep space exploration, potential applications include safer landings on planets or moons using autonomous hazard detection, as well as autonomous driving capabilities for rovers.
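A toy version of that on-board cloud filtering would run a trained classifier on each captured tile and discard the ones judged mostly cloud, so only useful imagery consumes downlink bandwidth. The sketch below shows the filtering loop only, with a placeholder classifier; it is not the actual PhiSat-1 flight software.

```python
# Toy sketch of on-board image filtering: keep only tiles a classifier judges cloud-free.
# Placeholder classifier and threshold; not the actual PhiSat-1 flight software.
from typing import List

CLOUD_THRESHOLD = 0.7  # discard tiles with more than 70% predicted cloud cover (assumed)

def predict_cloud_fraction(tile: dict) -> float:
    """Placeholder for an onboard neural-network inference call."""
    return tile.get("cloud", 0.0)

def select_tiles_for_downlink(tiles: List[dict]) -> List[dict]:
    """Drop mostly-cloudy tiles so only useful imagery consumes downlink bandwidth."""
    return [t for t in tiles if predict_cloud_fraction(t) <= CLOUD_THRESHOLD]

# Example: two of three captured tiles survive the cloud filter.
captured = [{"id": 1, "cloud": 0.9}, {"id": 2, "cloud": 0.1}, {"id": 3, "cloud": 0.4}]
print([t["id"] for t in select_tiles_for_downlink(captured)])  # [2, 3]
```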

Design News: How does engineering software play a key role in the advancement of AI?

Ossi Saarela: Machine learning and deep learning inherently require engineering software because the models are generated by computers rather than by humans. The software used to test deep learning and machine learning prototypes often has a steep learning curve, which can lead to inefficient development cycles. There is a growing demand for software tools that allow engineers who are domain experts in their industry to more easily evaluate and deploy machine learning and deep learning models into their production programs. This is where tools such as MATLAB and Simulink, which have machine learning and deep learning capabilities but are designed for engineering workflows, can help.

Setting deep learning and machine learning aside for a moment, even traditional autonomy algorithms benefit from using modern engineering software. As decision-making capabilities are delegated from human operators to the spacecraft, the complexity of the design increases dramatically. Design errors in these complicated AI systems can be subtle and hard to catch. For example, it’s difficult to assess a vision-based sensing and perception algorithm’s reliability in multiple lighting and perspective conditions through review alone; doing so requires extensive simulation and testing. Engineering software tools can enable simulation and testing capability throughout the design lifecycle. They also enable engineers to assess the design at different levels of abstraction—from static architecture to dynamic behavior modeling, all the way to the source code.  These are crucial capabilities for good systems engineering.

John Blyler is a Design News senior editor, covering the electronics and advanced manufacturing spaces. With a BS in Engineering Physics and an MS in Electrical Engineering, he has years of hardware-software-network systems experience as an editor and engineer within the advanced manufacturing, IoT and semiconductor industries. John has co-authored books related to system engineering and electronics for IEEE, Wiley, and Elsevier.

Caption: The PhiSat-1 satellite is providing AI for Earth observation. (Image Source: ESA, CERN/M. Brice)