Testing the Unanticipated: How Instrumental Brings AI to Quality Inspection

Instrumental's founder and CEO Anna-Katrina Shedletsky talks about how her company applies machine learning to quality assurance inspection, lessons she learned as an engineer at Apple, and what Industry 4.0 has gotten wrong.
Instrumental's software displays inspection images of a product through every phase of manufacturing. (Image source: Instrumental)

DN: Your system is partially cloud-based, but there are also some edge processing capabilities within it. Given this, how do you handle scalability?

Shedletsky: The cloud is very scalable. We use GPU acceleration. Also, for our cloud processing we don't have to provide a result in a split second. It's okay for it to take a couple of seconds or minutes to get a result because our cloud results are latent – our users log on after the fact to see what has happened.

When someone is expecting a real-time result, that's when we push it down to the edge, where we can control our GPU hardware on the line, which is pretty cheap these days. Thanks to Nvidia and other companies like that, we're able to provide pretty high-powered processing units with every station for not a lot of cost, which enables us to do our computations very quickly.
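
To make that split concrete, here is a minimal sketch of latency-based routing: jobs that need a split-second verdict run on a local GPU at the station, while everything else goes to a cloud queue to be reviewed after the fact. All names, functions, and the routing flag below are invented for illustration; this is not Instrumental's actual architecture.

```python
from dataclasses import dataclass

@dataclass
class InspectionJob:
    image_path: str
    needs_realtime: bool  # e.g., a pass/fail gate that stops the line

def run_on_edge_gpu(job: InspectionJob) -> str:
    # Stand-in for inference on the station's local GPU (milliseconds).
    return f"edge verdict for {job.image_path}"

def enqueue_for_cloud(job: InspectionJob) -> str:
    # Stand-in for a cloud queue; seconds or minutes of latency is fine
    # because users log on after the fact to review results.
    return f"queued {job.image_path} for cloud analysis"

def dispatch(job: InspectionJob) -> str:
    # Route on latency requirements: real-time work stays at the edge,
    # everything else takes the scalable cloud path.
    if job.needs_realtime:
        return run_on_edge_gpu(job)
    return enqueue_for_cloud(job)

print(dispatch(InspectionJob("unit_0042.png", needs_realtime=True)))
print(dispatch(InspectionJob("unit_0043.png", needs_realtime=False)))
```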

DN: Does the cloud also offer advantages in terms of deploying updates to your algorithms?

Shedletsky: Absolutely. Our algorithms are constantly learning, so each individually tuned algorithm essentially learns on its own what it's supposed to be doing based on very limited feedback.

We're constantly evaluating our algorithms against a kit of aggressive cases to improve our detection in challenging situations. We've made significant strides in that area, and we continue to invest there.
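
A regression harness of the sort Shedletsky describes can be sketched in a few lines: score a candidate model against a curated kit of hard cases and hold it back if it regresses against the current model. The detectors, case names, and pass criterion below are placeholders, not Instrumental's tooling.

```python
# Curated "aggressive" cases: (image identifier, expected defect verdict).
HARD_CASES = [
    ("glare_01", True),
    ("occluded_02", True),
    ("clean_03", False),
]

def evaluate(detector, cases):
    """Fraction of cases where the detector's verdict matches the label."""
    hits = sum(1 for image_id, label in cases if detector(image_id) == label)
    return hits / len(cases)

# Stub detectors standing in for the current and candidate models.
baseline = lambda image_id: image_id != "clean_03"
candidate = lambda image_id: image_id.startswith(("glare", "occluded"))

if evaluate(candidate, HARD_CASES) >= evaluate(baseline, HARD_CASES):
    print("candidate passes the regression kit")
else:
    print("candidate regresses on hard cases; hold it back")
```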

In the last quarter, for example, we actually built new algorithms that can automatically read barcodes from an image. The barcode is important because that's how you get traceability – to look up a unit later and go back and figure out what went wrong with it.
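
As a generic illustration of that traceability step, here is how decoding a barcode from an inspection image might look using the open-source pyzbar library (pip install pyzbar pillow). The file name is a placeholder, and Instrumental's own pipeline is not public.

```python
from PIL import Image
from pyzbar.pyzbar import decode

# Hypothetical inspection image captured at a station on the line.
image = Image.open("station3_unit_0042.png")

for symbol in decode(image):
    serial = symbol.data.decode("utf-8")
    # The decoded serial is the traceability key: it lets you look the
    # unit up later and review every build stage it passed through.
    print(symbol.type, serial)
```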

DN: Is there any sort of crosstalk going on behind the scenes, where the algorithm is learning not just from where it's deployed but also from algorithms deployed at other companies?

Shedletsky: There is no portability in the learning – meaning if we're working with a company and their proprietary data is in our system, they own their data. So one customer's data is not being used directly to inform another customer's results. We're working on improving the algorithm's performance for their specific detection needs.

However, there are opportunities, with access across the data set, to make universal algorithmic improvements and enable some really useful automation. We're actually working on that now with customers who have given us permission to use their data.

DN: Instrumental sits in a very interesting space at a time when there are a lot of conversations happening around AI, robotics, and factory automation. How do you see your work playing a role in the larger trend towards Industry 4.0?

Shedletsky: As far as the future goes, what I would say is this: when people sit back and dream about the future of the manufacturing space, they see robots everywhere. But they're actually dreaming of something much bigger than that.

People think automation and Industry 4.0 are the answer, but that's only halfway there. What people are really dreaming about and want is autonomy. But automation is not autonomy.

DN: So you see Industry 4.0 as only a stepping stone to something larger?

Shedletsky: Autonomy is a whole other level on top. It's a brain that will create factories that can actually self-heal and self-optimize. That's what Instrumental is building. We're building brains. We have an application that has to do with visual inspection today, but within the next year we'll be working on data that's not just visual in nature.

I talked about all the waste that occurs in manufacturing to bring one product to market. We can start to chip away at that if the line itself can actually improve. And I'm thinking much bigger than an individual line. You've got to zoom way out, because there's a whole supply chain, and there is an opportunity to connect the data all the way from the top of the supply chain down to the customer.

DN: Ultimately, would Instrumental like to link all of these pieces of the supply chain together in an autonomous way?

Shedletsky: Yes, and the way that we're doing that is, frankly, very different from other companies who talk about Industry 4.0. We're a manufacturing data company that does not sell to the big manufacturers. Our customers are the brands that design and build these products, whose logos go on the product – because the brands own the supply chain.

Having access across the supply chain is really where the value is going to be in the long term – having all of that data and creating a brain that can process that data and push back corrections.

In the first steps, it will be pushing insights to a person who's still evaluating those insights before they implement an action. The next step is to actually tell the human why the feedback is happening, so they don't have to do that evaluation either. Eventually, we'll get to a point where we can even take the action ourselves. There will be robots on the line to handle those things, and we'll just plug right in.

That's the blue-sky opportunity of the space. And I think that the Industry 4.0 people have it wrong. To make an analogy to automotive: if the goal is a sort of Level 5 autonomy for manufacturers, Industry 4.0 is only Level 3.

*This interview has been edited for content and clarity.

Chris Wiltz is a Senior Editor at Design News covering emerging technologies including AI, VR/AR, blockchain, and robotics.

 
