AI Moves Down to the Edge and into the Robot
2024 will see generative artificial intelligence move from the cloud to the edge as it also enters robotics.
November 27, 2023
At a Glance
- Generative AI at the edge
- Robots gain AI
- Responsible AI and ML
2024 is likely to be another major year for developments in artificial intelligence, and many of the coming changes involve the practical use of AI in industrial applications, including AI that drives production from the edge to robotics. We asked two AI experts to prognosticate on likely 2024 developments. Krishna Rangasayee, founder and CEO of SiMa.ai, an edge ML company, looked at coming attractions in AI and machine learning at the edge. Brendan Englot, associate professor and director of the Stevens Institute for Artificial Intelligence (SIAI) at Stevens Institute of Technology, offered a look at AI and robotics.
Krishna Rangasayee sees AI moving from the cloud down to the edge.
How will generative AI be used at the edge?
Krishna Rangasayee: As we move further into the digital age, the convergence of AI and physical devices is poised to drive transformative changes across industries. The first wave of generative AI happened in the cloud, mostly in the form of a consumer-like experience. The second, and more meaningful, wave will happen at the edge. OpenAI's recent pause on ChatGPT Plus sign-ups was a glaring indicator that the cloud can't deliver the scale and performance required to support the mission-critical work happening at the edge (think what such a "pause" would do to unmanned drones or medical devices actively in use). While much remains to be determined about the look and feel of multimodal AI at the edge, one thing is clear: 2024 will bring fundamental changes to the machines humans rely on.
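To make the edge shift concrete, here is a minimal sketch of on-device generative inference, assuming a small quantized model in GGUF format and the open-source llama-cpp-python runtime; the model path and prompt are placeholders, not a SiMa.ai workflow. Because the model runs locally, a cloud-side sign-up pause or outage does not interrupt the workload.

```python
# Minimal sketch of on-device generative inference with llama-cpp-python.
# The model file and prompt below are hypothetical placeholders.
from llama_cpp import Llama

# Load a small quantized model from local storage on the edge device.
llm = Llama(model_path="models/tiny-model.Q4_K_M.gguf", n_ctx=2048)

# Inference runs entirely on the device: no network round trip, so a
# cloud-side outage never stalls the workload.
result = llm(
    "Summarize the last ten vibration readings from sensor A3:",
    max_tokens=64,
    temperature=0.2,
)
print(result["choices"][0]["text"])
```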
Will AI and ML be driven by responsible industrial use?
Krishna Rangasayee: Machine learning model accuracy will underpin responsible AI and ML in 2024 and beyond. Chatbots, agents, and copilots have taken off in 2023 despite near-constant hallucinations. In 2024, responsible AI/ML will become the topic du jour, particularly as use cases expand and more demographics begin to interact with generative AI. That push will be driven largely by a focus on the accuracy of ML models, particularly as generative AI continues to push into highly regulated fields like healthcare and finance.
As edge ML becomes more prominent, this will get easier. Cloud-based models generally operate on pre-processed data, making it much harder for developers to understand why their models make the decisions they do, and even harder to correct them. 2024 will bring increased scrutiny of companies like OpenAI and Anthropic, and that scrutiny will trickle down to smaller providers and open-source developers as engineers adopt edge AI/ML, smaller models, and further fine-tuning.
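As an illustration of the accuracy-first approach Rangasayee describes (not code from the interview), the sketch below gates low-confidence predictions for human review and tracks accuracy on labeled spot checks; the confidence floor and labels are hypothetical.

```python
# Illustrative sketch: defer low-confidence model outputs to a human and
# keep a running accuracy estimate from labeled spot checks.
from dataclasses import dataclass

@dataclass
class AccuracyGate:
    confidence_floor: float = 0.85  # hypothetical threshold
    correct: int = 0
    checked: int = 0

    def gate(self, label: str, confidence: float):
        # Below the floor, route the prediction to a human reviewer
        # instead of acting on it autonomously.
        if confidence < self.confidence_floor:
            return ("needs_review", label, confidence)
        return ("accepted", label, confidence)

    def record_spot_check(self, predicted: str, actual: str) -> None:
        # Periodic labeled checks give an auditable accuracy figure.
        self.checked += 1
        self.correct += int(predicted == actual)

    @property
    def accuracy(self) -> float:
        return self.correct / self.checked if self.checked else float("nan")

gate = AccuracyGate()
print(gate.gate("defect", 0.62))             # routed to a human reviewer
gate.record_spot_check("defect", "defect")
print(f"spot-check accuracy: {gate.accuracy:.2f}")
```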
Beyond AI at the edge, is Industry 4.0 generally moving onto the factory floor?
Krishna Rangasayee: Yes, finally! Advanced technologies – sensors, machine learning, computer vision, robotics, edge computing, etc. – have proven to increase supply chain resiliency for manufacturers who adopt them. While robotics and industrial automation companies have touted these capabilities for years, 2024 is the year they become real. As tech companies realize the need to diversify their operations and embrace Industry 4.0 technology to be more resilient, factories will become smart manufacturing ecosystems, where AI-driven systems are seamlessly integrated into every stage of the production process.
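One hedged sketch of what such an AI-driven production step can look like: a camera frame is classified on the device itself using ONNX Runtime, a common edge inference engine. The model file, input shape, and class names here are hypothetical stand-ins for a real inspection model.

```python
# Sketch of an on-device visual inspection step for a production line.
# "defect_classifier.onnx" and the 3x224x224 input are hypothetical.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("defect_classifier.onnx")
input_name = session.get_inputs()[0].name

def inspect_frame(frame: np.ndarray) -> str:
    """Classify one camera frame as 'ok' or 'defect' locally, without the cloud."""
    batch = frame.astype(np.float32)[np.newaxis, ...]  # add batch dimension
    logits = session.run(None, {input_name: batch})[0]
    return ["ok", "defect"][int(np.argmax(logits))]

# Placeholder tensor standing in for a camera frame from the line.
print(inspect_frame(np.zeros((3, 224, 224), dtype=np.float32)))
```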
Artificial Intelligence Meets the Robot
We checked in with Brendan Englot of the Stevens Institute for Artificial Intelligence (SIAI) at Stevens Institute of Technology for his predictions on AI and robotics.
What’s the role of generative AI in robotics?
Brendan Englot: Generative AI has made a great impact on robotics. Researchers are working hard to establish foundation models inspired by large language models like GPT, capable of generalizing across large classes of robotic systems. In the future, these models may help complex, sophisticated robots and autonomous vehicles achieve high levels of performance with very little real-world practice.
You’ve mentioned generative AI and model-based planning. How does that work?
Brendan Englot: We’ve seen amazing advances this year in generative AI capabilities, but also the various ways it can "hallucinate" when answering our questions. Soon, sophisticated tools will merge generative AI with state-of-the-art model-based planning, simulation, and optimization frameworks. These tools will engage in back-and-forth conversations until they converge upon solutions for complex problems that obey all the rules of a given domain and are free of hallucination.
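A conceptual sketch of that back-and-forth, with a stubbed proposer standing in for a generative model and a toy rule standing in for a full planning or simulation framework; this is not code from Englot's lab, it only illustrates the propose-and-verify loop he describes.

```python
# Propose-and-verify loop: a generative proposer suggests a plan, a
# model-based checker reports rule violations, and the feedback drives
# another proposal until the plan obeys every rule.

def propose_plan(feedback: list[str]) -> list[str]:
    # Hypothetical stub: a real system would query a generative model
    # with the checker's feedback appended to the prompt.
    plan = ["pick part", "move arm at 2.5 m/s", "place part"]
    if "speed limit exceeded" in feedback:
        plan[1] = "move arm at 1.0 m/s"
    return plan

def check_plan(plan: list[str]) -> list[str]:
    # Model-based validator encoding a toy domain rule (arm speed limit).
    violations = []
    for step in plan:
        if "2.5 m/s" in step:
            violations.append("speed limit exceeded")
    return violations

feedback: list[str] = []
for _ in range(5):                      # bounded number of exchanges
    plan = propose_plan(feedback)
    feedback = check_plan(plan)
    if not feedback:                    # converged: plan is violation-free
        break
print(plan)
```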
Explain how generative AI can be used in engineering systems.
Brendan Englot: In the long term, we'll see generative AI, human domain experts, and optimization tools working together to design and analyze complex engineering systems in a fraction of the time it takes today.
Finally, you have talked about robots growing legs.
Brendan Englot: After many years of development and refinement, robots with legs are finally poised to take center stage. Biped and quadruped robots are now capable of navigating autonomously in complex real-world environments with high reliability and safety. If you haven't seen them yet, expect to witness them in action in the next few years, taking on various tasks, including moving freely in challenging wilderness or hazardous environments.