NVIDIA: Real-time AI drives industrial automation’s next phase


The heavy lifting in manufacturing, factory logistics, and robotics is getting assistance from real-time AI, according to NVIDIA CEO Jensen Huang’s keynote at GTC 2024.

A simulation-first approach is paving the way for the next phase of automation in these industries, which often involve bulky products, expensive equipment, collaborative robot environments, and logistically complex facilities.

Huang demonstrated how developers could use digital twins to develop, test, and refine their large-scale, real-time AI entirely in simulation before deploying it in industrial infrastructure—saving significant time and cost. NVIDIA’s Omniverse, Metropolis, Isaac, and cuOpt platforms interact in “AI gyms” where developers can train AI agents to help robots and humans navigate unpredictable or complex situations.
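
To make the "AI gym" idea concrete, the sketch below shows the general simulation-first pattern in plain Python: behaviour is refined against a stand-in digital-twin environment before anything touches real equipment. The TwinWarehouseSim class and the random search standing in for training are hypothetical simplifications for illustration only, not Omniverse or Isaac APIs.

```python
import random


class TwinWarehouseSim:
    """Toy stand-in for a digital-twin environment (hypothetical, not an NVIDIA API)."""

    def __init__(self, size=5):
        self.size = size
        self.robot = (0, 0)
        self.goal = (size - 1, size - 1)

    def step(self, move):
        """Apply a move, keep the robot inside the facility, reward reaching the goal."""
        x, y = self.robot
        dx, dy = move
        self.robot = (min(max(x + dx, 0), self.size - 1),
                      min(max(y + dy, 0), self.size - 1))
        done = self.robot == self.goal
        return (1.0 if done else -0.01), done


def train_in_simulation(episodes=200, max_steps=200):
    """Refine behaviour entirely in the twin; only the result would be deployed."""
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    best_score, best_path = float("-inf"), []
    for _ in range(episodes):
        sim, score, path = TwinWarehouseSim(), 0.0, []
        for _ in range(max_steps):
            move = random.choice(moves)   # stand-in for a learned policy
            reward, done = sim.step(move)
            score += reward
            path.append(move)
            if done:
                break
        if score > best_score:
            best_score, best_path = score, path
    return best_path  # the behaviour that would be handed to the real AMR fleet


if __name__ == "__main__":
    print(f"Best simulated run took {len(train_in_simulation())} moves")
```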

In a demo, a 100,000-square-foot warehouse’s digital twin built with Omniverse operated as a simulation environment. It included dozens of digital workers, multiple autonomous mobile robots (AMRs) running NVIDIA Isaac’s multi-sensor stack, vision AI agents, and sensors. Metropolis created a centralised occupancy map by fusing data from 100 simulated camera streams to inform optimal AMR routes calculated by cuOpt’s complex routing optimisation AI.
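
The fused occupancy map is the glue between perception and routing in that demo. As a rough illustration of the concept, per-camera detections can be projected into a shared grid that downstream route planning treats as ground truth. The class and field names below are invented for this sketch and are not Metropolis interfaces.

```python
from dataclasses import dataclass


@dataclass
class CameraDetection:
    x_m: float        # detected obstacle position in metres (shared warehouse frame)
    y_m: float
    confidence: float


class OccupancyGrid:
    """Minimal centralised occupancy map fused from many camera streams."""

    def __init__(self, width_m, height_m, cell_m=0.5):
        self.cell_m = cell_m
        self.cols = int(width_m / cell_m)
        self.rows = int(height_m / cell_m)
        self.cells = [[0.0] * self.cols for _ in range(self.rows)]

    def fuse(self, detections, threshold=0.6):
        """Mark cells occupied when any camera reports a confident detection."""
        for d in detections:
            if d.confidence < threshold:
                continue
            col = min(int(d.x_m / self.cell_m), self.cols - 1)
            row = min(int(d.y_m / self.cell_m), self.rows - 1)
            # keep the strongest evidence seen for that cell
            self.cells[row][col] = max(self.cells[row][col], d.confidence)

    def is_blocked(self, row, col):
        return self.cells[row][col] >= 0.6


# Example: fuse frames from two (of many) simulated camera streams.
grid = OccupancyGrid(width_m=100.0, height_m=100.0)
grid.fuse([CameraDetection(12.3, 40.1, 0.9), CameraDetection(55.0, 7.2, 0.4)])
print(grid.is_blocked(int(40.1 / 0.5), int(12.3 / 0.5)))  # True: confident detection
```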

This all happened in real time while Isaac Mission Control coordinated the AMR fleet using cuOpt’s mapping and routing data. When an incident blocked an AMR’s path, Metropolis updated the occupancy grid, cuOpt planned a new optimal route, and the AMR responded accordingly to minimise downtime.
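
That re-planning behaviour can be pictured as a loop over the shared map: when a cell on the active route becomes blocked, a fresh shortest path is computed. The breadth-first search below is a deliberately simple stand-in for cuOpt’s routing optimisation, shown only to illustrate the react-and-re-route idea.

```python
from collections import deque


def shortest_path(grid, start, goal):
    """Breadth-first search over free cells; grid[r][c] == 1 marks a blocked cell."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        cur = queue.popleft()
        if cur == goal:                      # rebuild the route back to the start
            path = []
            while cur is not None:
                path.append(cur)
                cur = prev[cur]
            return path[::-1]
        r, c = cur
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 and nxt not in prev:
                prev[nxt] = cur
                queue.append(nxt)
    return None                              # no viable route


# A small free map: plan a route, then an incident lands mid-route and forces a re-plan.
grid = [[0] * 5 for _ in range(5)]
route = shortest_path(grid, (0, 0), (4, 4))
print("planned:   ", route)
blocked = route[len(route) // 2]             # the incident reported by the vision stack
grid[blocked[0]][blocked[1]] = 1
route = shortest_path(grid, (0, 0), (4, 4))  # fresh optimal route around the obstruction
print("re-planned:", route)
```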

Using Metropolis vision models and the Visual Insight Agent framework, developers can build AI agents that help operations teams answer questions like “What happened in aisle three?” using insights drawn from video analysis. These visual AI agents let industries extract actionable insights from video through natural language.
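
Under the hood, such an agent pairs language understanding with structured events extracted from video. The toy below shows only the retrieval half of that idea, with an invented event schema; it is not the Visual Insight Agent or Metropolis API.

```python
from dataclasses import dataclass


@dataclass
class VideoEvent:
    timestamp: str
    location: str      # e.g. "aisle 3", filled in upstream by vision models
    description: str


EVENTS = [
    VideoEvent("09:12", "aisle 3", "pallet left blocking the lane"),
    VideoEvent("09:14", "aisle 3", "AMR re-routed around the obstruction"),
    VideoEvent("09:20", "dock 1", "truck arrived for loading"),
]

WORD_NUMBERS = {"one": "1", "two": "2", "three": "3", "four": "4", "five": "5"}


def answer(question, events=EVENTS):
    """Tiny retrieval step: return events whose location the question mentions."""
    q = question.lower()
    for word, digit in WORD_NUMBERS.items():
        q = q.replace(word, digit)           # "aisle three" -> "aisle 3"
    matches = [e for e in events if e.location in q]
    if not matches:
        return "No recorded events match that question."
    return "; ".join(f"{e.timestamp} {e.location}: {e.description}" for e in matches)


print(answer("What happened in aisle three?"))
# -> 09:12 aisle 3: pallet left blocking the lane; 09:14 aisle 3: AMR re-routed around the obstruction
```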

The AI capabilities demonstrated are enhanced through continuous simulation training and deployed as modular NVIDIA inference microservices, driving the next phase of industrial automation powered by real-time AI.

(Photo by CHUTTERSNAP)

See also: Flying taxis and delivery drones set for UK skies by 2030

Want to learn about the IoT from industry leaders? Check out IoT Tech Expo taking place in Amsterdam, California, and London. The comprehensive event is co-located with other leading events including Cyber Security & Cloud Expo, AI & Big Data Expo, Edge Computing Expo, and Digital Transformation Week.

Explore other upcoming enterprise technology events and webinars powered by TechForge here.

Tags: ai, artificial intelligence, automation, cuopt, digital twin, industrial automation, IoT, isaac, metropolis, nvidia, omniverse
