Metaverse 2.0: Evolving from Virtual Worlds to Industry Giants
The Metaverse isn’t dead; it’s evolving to reshape how we connect and interact, blending real-time digital communication with GenAI.
Metaverse, Enterprise
“The Metaverse is dead!” is a common claim as AI takes the spotlight. But the metaverse is very much alive, quietly evolving beyond avatars and virtual worlds. As digital transformation spreads, metaverse-like technology is reshaping how we interact with data, machines, and each other, especially in manufacturing and the enterprise, and often driven by generative AI.
Common Perceptions of the Metaverse
When people hear "metaverse," they often picture virtual worlds seen through VR headsets: digital avatars, sometimes AI-generated, interacting and exploring imaginative environments. Tools like Midjourney help create futuristic visuals, reinforcing the image of an immersive digital universe for socializing, working, and playing with cutting-edge creativity.
Beyond Avatars and Islands
The future of the metaverse goes beyond a single launch or virtual worlds populated by avatars. It is emerging gradually through real-time digital interaction across many domains, including autonomous robots. The shift resembles the move from paper maps to GPS: a broad, incremental transformation that mirrors the metaverse's evolution.
Thought Experiment: Metaverse and Autonomous Robots
Render visualization of technology integration in a futuristic urban environment: a fleet of drones flying in formation, with grid-like patterns and precision in autonomous operations. Photo: compoundY
Autonomous robots are typically trained in simulation, in virtual worlds and digital twins where AI models are refined over countless iterations before being transferred to real machines. This prompts a question: are we, in effect, the metaverse for these physical robots?
When we interact with them, are we engaging with the product of their digital training? Seen this way, autonomous robots become extensions of the metaverse, which does not replace the physical world but enhances and expands it through technology.
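To make that idea concrete, here is a minimal, purely illustrative Python sketch (not from the talk): a control parameter is tuned entirely inside a toy simulation, and only the resulting model is handed over for deployment on a physical robot. Every name, formula, and number in it is hypothetical.

```python
# Hypothetical sketch of sim-to-real: tune a policy in a toy simulation,
# then hand only the learned parameter to the "physical" robot.
import random


def simulate_step(position: float, command: float) -> float:
    """Toy physics: the robot drifts toward its commanded position, with noise."""
    return position + 0.5 * (command - position) + random.gauss(0, 0.01)


def train_in_simulation(target: float, iterations: int = 1000) -> float:
    """Search for a command gain that drives the simulated robot to the target."""
    best_gain, best_error = 1.0, float("inf")
    for _ in range(iterations):
        gain = random.uniform(0.5, 2.0)
        position = 0.0
        for _ in range(20):                      # roll out one short episode
            position = simulate_step(position, gain * target)
        error = abs(position - target)
        if error < best_error:
            best_gain, best_error = gain, error
    return best_gain


if __name__ == "__main__":
    gain = train_in_simulation(target=1.0)       # countless digital iterations
    print(f"Deploying gain {gain:.3f}, learned entirely in the 'metaverse'")
    # a real pipeline would now load this model onto the physical machine
```

The point of the sketch is only the shape of the workflow: everything the robot "knows" was produced in a digital environment long before it touches the physical world.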
Industrial Metaverse
Industrial Internet of Things (IoT): Sensors and devices collect data from machines, equipment, and operations.
Data Gathering Leads to Digital Twins: This data creates digital replicas (digital twins) of physical assets.
Run 'What If' Scenarios on Digital Copies: Digital twins are used for AI-based predictive maintenance, machine-learning optimization, employee training, supply chain modeling, energy efficiency exploration, and interaction with factories via large language models (LLMs). In essence, when real-time interaction occurs with a digital twin, it forms part of the metaverse (a brief code sketch of this loop follows the list).
Digitally Train Autonomous Robots: Simulated environments allow robots to be trained and optimized.
Deploy Physical Robots: The AI models created in the digital environment are applied to physical robots.
Start Digitally and Repeat (Digital Thread): This continuous feedback loop between digital and physical assets forms the backbone of the industrial metaverse.
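As a rough illustration of the first three steps, the sketch below models a single machine as a digital twin fed by sensor readings and asks it a 'what if' question. It is a hypothetical toy, not a real industrial API; the class, fields, and wear formula are invented for the example.

```python
# Hypothetical sketch: IoT readings feed a digital twin, which then answers
# 'what if' questions before anything changes on the factory floor.
from dataclasses import dataclass, field


@dataclass
class MachineTwin:
    """Digital replica of one physical machine, built from sensor data."""
    name: str
    temperatures: list[float] = field(default_factory=list)
    wear: float = 0.0                      # 0.0 = new, 1.0 = end of life

    def ingest(self, temperature_c: float, hours_run: float) -> None:
        """Steps 1-2: gather IoT data and update the digital replica."""
        self.temperatures.append(temperature_c)
        self.wear += 0.001 * hours_run * (1 + max(0, temperature_c - 70) / 30)

    def what_if(self, extra_hours: float, avg_temperature_c: float) -> float:
        """Step 3: run a scenario on the copy, never on the real machine."""
        projected = self.wear
        projected += 0.001 * extra_hours * (1 + max(0, avg_temperature_c - 70) / 30)
        return projected


if __name__ == "__main__":
    twin = MachineTwin("press-07")
    twin.ingest(temperature_c=68.0, hours_run=8)    # streamed from real sensors
    twin.ingest(temperature_c=75.0, hours_run=8)
    risk = twin.what_if(extra_hours=160, avg_temperature_c=78.0)
    print(f"Projected wear after an extra month of night shifts: {risk:.2%}")
```

Real industrial twins are vastly richer, of course, but the pattern is the same: the scenario runs on the digital copy, and only the conclusions flow back to the physical asset.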
Industrial Metaverse: A New Space Race
Floating 3D digital environments showcase the potential of digital twins in technology innovation. Virtual landscapes simulate urban planning, industrial setups, and interactive spaces. Digital twins powered by large language models (LLMs) enable predictive analysis, interactive design, and decision-making, vital for industries merging physical and digital realms in the metaverse.
The industrial metaverse is becoming a competitive space, with Siemens, NVIDIA, BMW, and Microsoft leading the way. These collaborations enable digital-first factories, where entire production systems are simulated in detail—“down to the last bolt”—before any physical build begins.
Likewise, Siemens, NVIDIA, and HD Hyundai have created a 7-million-part digital ship model, cloud-rendered and accessible on any device rather than only on high-end workstations. Once unimaginable at this scale, these innovations are reshaping both enterprise and consumer applications of the metaverse, expanding the potential of digital twins and virtual simulation.
OpenUSD: Open Universal Scene Description
Created by Pixar, Universal Scene Description (USD) is pioneering open-source software designed to exchange 3D scenes built from many assets, sources, and animations efficiently and reliably, while supporting collaborative workflows.
OpenUSD (Open Universal Scene Description) is a rising standard for 3D asset interoperability in metaverse environments. Initially built for Hollywood CGI, it is now used across industrial, enterprise, and consumer 3D applications. Supported by major engines such as Unity and Unreal, OpenUSD aims to unify 3D content creation and management, helping standardize the metaverse.
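For a feel of what that looks like in practice, OpenUSD ships Python bindings (the pxr module, installable via the usd-core package). The minimal sketch below authors a tiny scene; the file and prim names are made up for illustration.

```python
# Small OpenUSD example: author a tiny scene with the pxr Python bindings.
from pxr import Usd, UsdGeom

# Create a new stage: the composed 3D scene that USD describes.
stage = Usd.Stage.CreateNew("factory_cell.usda")

# Define a transform as the root prim and a simple placeholder shape under it.
cell = UsdGeom.Xform.Define(stage, "/FactoryCell")
robot_base = UsdGeom.Cube.Define(stage, "/FactoryCell/RobotBase")
robot_base.GetSizeAttr().Set(0.5)                 # half-metre placeholder cube

stage.SetDefaultPrim(cell.GetPrim())
stage.GetRootLayer().Save()                       # writes factory_cell.usda
```

The resulting .usda file is plain text, which is part of why USD layers are easy for different tools to exchange and compose.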
Final Thought: Reality Blends Digital and Physical
When digital creations become physical—like 3D-printed items or factories—they affect the real world. The metaverse isn’t just a virtual destination; digital and physical experiences influence each other. Some things need physical presence, others thrive digitally. It’s about balance, like Yin and Yang.
Beyond VR headsets and avatars, the metaverse will include surprising, imaginative elements—like flying penguins—adding whimsy to its diverse experiences.
Note: This article is based on the BCS Sussex branch event "Metaverse, Did You Think It Had Gone Away?" The presentation provides a mix of both old and new perspectives on the significance and ongoing evolution of the metaverse.
Speaker Biography
Ian Hughes, known online as epredator, is the Chair of the BCS Animation and Games Development Specialist Group. In 2006, he played a key role in introducing colleagues and companies to virtual worlds like Second Life, beginning with the Wimbledon Championship. This work earned him the title of metaverse evangelist, exploring innovative ways to communicate online, many of which are now seeing a resurgence. A lifelong gamer since 1970, his passion led to a career in software engineering, as detailed in the BCS article Memoirs of a Bedroom Coder.
Venue:
University of Sussex
Chichester Lecture Theatre
Falmer BN1 9RH
About the Author
Razvan Chiorean is a published author of compoundY and a cutting-edge researcher in quantum computing, AI-ML, and blockchain technology. Through his #AIResearch handle, Razvan continues to conduct research, blog, and educate, bridging cultures and inspiring technological progress while consistently sharing his findings and insights. He collaborates with leading tech companies, contributes to open-source projects, and is dedicated to fostering ethical standards and inclusivity in technology, ensuring a future where advancements benefit everyone.