Nvidia GTC: How to build the industrial metaverse

Nvidia’s Omniverse is a precursor to the open metaverse, the universe of interconnected virtual worlds depicted in novels such as Snow Crash and Ready Player One. And it is rapidly moving beyond science fiction into the realm of industrial applications.

That’s because the metaverse isn’t going to be just about fun and games. Enterprises are using the open platform of the Omniverse to simulate things in the digital world before they build them in the real world. These “digital twins” of things like BMW factories, and even a climate model of the entire Earth, are giving the metaverse real meaning for enterprises and other non-game applications.

For the third time at an Nvidia GTC event, I moderated a panel on the Omniverse and the progress that companies are making as they build 3D assets, simulate their engineering designs, and share them across company lines. Our panel assessed how much progress has been made and what kind of progress to expect in one, five, or ten years.

The panelists included Rev Lebaredian, head of the Omniverse and simulation technology group at Nvidia; Virginie Maillard, global head of research for simulation and digital twins for Siemens; Amy Bunszel, executive vice president of architecture, engineering, and construction design solutions at Autodesk; Timoni West, vice president of augmented and virtual reality at Unity; and Lori Hufford, vice president of engineering collaboration at Bentley Systems.


Here’s an edited transcript of our panel.

Nvidia’s GTC panel on the industrial metaverse.

VentureBeat: My name is Dean Takahashi, and I’m the lead writer for GamesBeat at VentureBeat. I’ve organized a couple of metaverse conferences, and so I’ve heard a lot about the metaverse lately. I’m organizing another event dubbed GamesBeat Summit 2022 on April 26-28. This is also the third metaverse session I’ve moderated at a GTC event in the past year.

During all this time, the best definition I’ve heard of the metaverse is that it’s the “spatial internet.” While I believe that the metaverse isn’t here yet, thanks to these sessions, I believe I’ll know it when I see it. I feel like the point of this session is to figure out where we are on the path to the metaverse, what industries are involved in it, and how far we still have to go.

Normally we hear a lot about the metaverse from the gaming point of view, but this is about the industrial metaverse, about things like digital twins of factories and the Omniverse. With that in mind, I’d like to have our panelists introduce themselves. Let’s start with Virginie.

Virginie Maillard: I’m the global head of research for simulation and digital twins for Siemens.

Amy Bunszel: I’m the executive vice president of architecture, engineering, and construction design solutions at Autodesk.

Timoni West: I’m the vice president of augmented and virtual reality at Unity.

Lori Hufford: I’m vice president of engineering collaboration at Bentley Systems.

Rev Lebaredian: I lead our Omniverse and simulation technology group at Nvidia.

VentureBeat: Let’s start by talking about your short view of the metaverse and your company’s place in it. Virginie, can you start us off again?

Maillard: For Siemens, the industrial metaverse has been here for a while in the form of our digital twins and digital threads. We’ve applied these in factories, transportation, buildings, and smart cities, all along the full life cycle of design, production, and operation. Our industrial engineering software already provides key building blocks for the industrial metaverse, including immersive user experiences with AR and VR capabilities. We can already provide that, for example, in Simcenter or Process Simulate.

In the future we believe the industrial metaverse will expand because of the availability and combination of multiple technology enablers, offering more realistic immersion, real-time feedback, and more openness and interoperability between tools. These new capabilities will allow our customers to extend their experience, improve interaction with the digital world, enable collaboration between different people in the digital world, and combine physical and digital to more easily and quickly test different scenarios for the real world. For Siemens, the industrial metaverse will be an evolution of what we already provide to enable better digital twins and digital threads.

Jensen Huang, CEO of Nvidia, introduces Omniverse Avatar.

Bunszel: One thing that excites me about the industrial metaverse is how it will break down barriers to collaboration and enable innovation, while also helping us operate all of our built assets more effectively. This extends all the way into renovation and decommissioning, which is going to be important for sustainability.

On the one hand, the industrial metaverse enables the creation of digital twins of places, processes, and real-world objects. But I also want to talk about how we’re using the metaverse to provide rich context for new designs. Our customers can create a metaverse that helps them visualize, simulate, and analyze physical and functional performance before they ever build anything. This is good for buildings and infrastructure, and it will let them engage with stakeholders broadly across the entire world, exploring more ideas, from the crazy to the practical, before construction ever begins.

Another great thing about the metaverse is that it will enable our customers to bring together data from any vendor. The metaverse is inherently open, letting our customers aggregate data from anywhere. I think we all know that people use tools from multiple vendors, and pulling everything together can sometimes be a challenge.

As far as the role of Autodesk, in case it’s not obvious, we’re not building the metaverse. Our solutions contribute to the metaverse. We’re looking at ways we can help our customers unlock the value of the data they create in our solutions by enabling them to easily bring that data to the metaverse and explore all that’s possible there.

West: My view is a bit different. I tend to think of the metaverse as a metaphor, in much the same way we talked about the information superhighway in the ‘90s. It’s less of a thing that will come to exist and more of a way to describe what I consider a much more fundamental and interesting shift, which is how computers have changed over the last decade. They went from being networked and portable, which is a big change we saw up to 2010, to now having a ton of miniaturized and high-fidelity sensors and machine learning. This gives us the ability to have computers that truly understand things like scanning a space, recognizing objects, recognizing faces, being able to recognize voices, and being able to recognize the input parts of a system.

When we talk about the metaverse, we’re talking about the application of what we call contextual computing. Computers have context now through sensors and machine learning and so on. The metaverse is what people are going to do with this new wave of technology. That’s how I tend to think about it. Having realistic, real-world data – either real time or recorded, that can be simulated and virtualized – is what allows us to have highly realistic simulations and real time updates. Real time 3D mirrors of the world, for example. That gets us to some very interesting use cases for industry.

As for Unity’s place in that, we like to take in all this data from all of our partners, from the real world itself, directly from the device, and allow our developers and our community to be able to make apps that make sense of this data. That’s our place in the ecosystem.

Hufford: Some great insights on the metaverse from everyone. Bentley is the infrastructure engineering software company. So what does the metaverse mean for infrastructure, and why do we need it for infrastructure?

There’s an urgency in how we balance our relationship to the natural world. Infrastructure plays a key part in that. The intersection of digital twins, the metaverse, and the physical world is the perfect place, we believe, to invest so that teams can work better together and leverage talent from anywhere to solve these problems.

Breaking this down, a digital twin for infrastructure is a digital representation of the physical world that’s continuously updated with information from the physical world. It has to contain rich engineering data, and it has to be open, able to take in that data from any source. It also has to be able to track and visualize changes in real-world conditions from IoT-connected devices such as sensors and drones, that sort of thing.
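
To make that definition concrete, here is a minimal sketch of the update loop Hufford describes, in Python. All names, readings, and thresholds are invented for illustration; a real infrastructure twin would live on a platform like iTwin rather than in a dataclass.

```python
import time
from dataclasses import dataclass, field

@dataclass
class BridgeTwin:
    """Toy infrastructure digital twin: engineering data plus live sensor state."""
    design_max_deflection_mm: float            # limit from the engineering model
    readings: dict = field(default_factory=dict)

    def ingest(self, sensor_id: str, deflection_mm: float) -> None:
        # Continuously update the twin with information from the physical world.
        self.readings[sensor_id] = (deflection_mm, time.time())

    def out_of_spec(self) -> list:
        # Track where real-world conditions diverge from design intent.
        return [sid for sid, (value, _) in self.readings.items()
                if value > self.design_max_deflection_mm]

twin = BridgeTwin(design_max_deflection_mm=12.0)
twin.ingest("midspan-01", 9.4)    # made-up IoT sensor readings
twin.ingest("midspan-02", 14.1)
print(twin.out_of_spec())         # ['midspan-02'] flags a condition to inspect
```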

But there’s a key difference from entertainment. In entertainment, engineering accuracy and the laws of physics don’t have to apply, because that’s one of the things that makes it fun. In infrastructure, they do. Digital twins must provide millimeter-precision accuracy, be geospatially aligned, and support complex engineering schemas.

So where does the metaverse come in? The metaverse allows humans to teleport or immerse into these digital twins, or to augment the physical world with information from digital twins as they design, construct, and operate the world’s infrastructure.

Lebaredian: Those were some really interesting responses. I have to agree with everything I’ve heard; it’s all correct. It’s interesting, because this is such a big thing, and it’s kind of amorphous right now. There are many views of what the metaverse is. It’s sort of like back in 1993, when the first web browser, Mosaic, came out. People would ask, “What is the internet? What is the web?” It’s hard to answer that question when it’s just happening.

Omniverse Showroom is a new addition for viewing digital assets.

The way we look at it, this metaverse thing, whatever it ends up being called – maybe that word will go the way of “information superhighway” and we’ll stop using it at some point, or it might stick – is clearly about virtual worlds and the representation of our world in digital form, with the computer as a key part of that. How humans interact with these virtual worlds, our interface into them, is going to be something more like the natural human experience that predates computers. We’ve been using computers through interfaces that are very two-dimensional and abstract. With all of this 3D spatial technology overlaid on top of our information, we can now experience it the same way we’ve been experiencing the world around us forever.

Many opportunities open up. Largely, the discussions about the metaverse have focused on entertainment, social, and media, and around VR in particular, which is just one mode of experiencing it. But really it’s going to be much larger than that. This panel is interesting because it brings a very different perspective, and in my opinion a more immediate and more important use of the metaverse and all of these technologies for mankind.

As far as Nvidia’s place in all of this, we feel that the metaverse, just like the internet, is bigger than any one company, greater than any one company. For us to get to where we want to go, we all have to help contribute to it and build it together. What Nvidia can do to help this is take our core capabilities, our skills and our passions, which is computing and computing stacks, and help power this metaverse, the simulation of these virtual worlds.

To do that, the very first thing we need is a way of describing these virtual worlds that everybody agrees upon, so we can exchange these worlds between tools and between simulators. The best tool, the best simulator, the best set of tools for each job can all work together in harmony. That’s where we’re focused.

VentureBeat: I wish we could just take everything you’ve said and visualize it. We’ll see whether we have a match up here or something totally different from everyone. We’ll get there one of these days, I think. Our first point of discussion is around how we get to the metaverse. Where have we started from already? The first wave of metaverse development is coming from game developers and content creators in places like Roblox. Rev, I wonder if you can tell us what kind of head start we already have from that perspective.

Lebaredian: I believe that if you want to know what’s coming in the future in terms of technology, you should look at games, at any moment in time. What our children and teenagers are doing today is what adults are going to be doing 10 or 15 years from now. All of the interesting things that have come about in technology in recent decades started in gaming first. Voice over IP, social media, digital currencies–all these things were concepts in the gaming world well before they became mainstream. Simulation of 3D worlds has been around in video games for more than 20 years. Now we’re applying that to many non-gaming purposes.

But we can look there first to see how these gamers interact with each other, how they use these tools. They’re the canary in the coal mine for what’s coming. Then we can go apply it to non-gaming, non-entertainment purposes.

West: It’s funny. Every game, as it turns out, requires a game system. There are game designers who create game systems. I read a wonderful article once that explained the different roles of different people in a game studio. The game designer is the one who says, “When we come to a door, does the door open? Do I have to press a button? Does it magically disappear? Can I pull it? Do I just walk through it?” Everything that describes what’s possible in the world.

Every game engine needs the ability to let you describe, in real time, whatever the user is doing, with that level of fidelity around the system. The curious thing is that when you want to make a real-world simulation, or feed in data that simply reflects a real-world situation, you basically need what we used to call a game design. Now we’ve started calling it world design, to reflect the fact that it takes in real-world data.

It was incidental. I don’t think there was a very long-term plan around that, although there have always been very interesting futurists in the game design community. I’m sure people predicted this long ago. But the reality is, everything you need to design a very complex game world, realistic or unrealistic – the physics system, the animation system, the terrain system – can then be used either to reflect real-world information or to run simulations that predict it. That’s when we have this really interesting moment, when the tools are here just as the computers are starting to come online to give that to us.

This GTC will have a lot about robots.

VentureBeat: Timoni, can you talk about how gaming technology is leading to the industrial metaverse? I happened to talk to Jason Rubin at Meta recently, and he was saying that he doesn’t think you can build the metaverse without a game engine. It seems like a fundamental tool here. Do you have your own view on how we’re getting to the industrial metaverse?

West: I think he’s right. We’ll see an evolution of how the tools are used. For example, back in the ‘70s, the killer app for the first business microcomputers was spreadsheets. But they were used in a very specific way. They were used for finance, for inventory. Now you can see spreadsheets being used constantly. Airtable is a great example of the evolution of a spreadsheet into something more like a data manipulator for people who understand what they want to do with their data.

In the same way, game engines were created for games, but they’ve started to be used for things like education, scientific visualizations, data simulation for AI data set training for self-driving cars. As sensor data gets more and more high fidelity–Lori talked about this. The reality is that in order to be considered industrial quality, you have to have a level of precision that you don’t necessarily need to have in video games. You need to be able to input and react to information much more quickly. You need to have much better visualizations, extremely high fidelity.

We’re just getting to a point where computers are fast enough, and our technology is advanced enough, that we can start to consider game engines for this. We’re at a point where we’re actively developing tools directly to support these types of use cases.

VentureBeat: Let’s talk about the industrial side of the metaverse here. The metaverse is often showcased in VR in consumer experiences, but the industrial opportunity seems much more expansive. Virginie, can you kick us off on that front?

Maillard: We identify different application domains of the industrial metaverse in all phases of the product life cycle, from design engineering to testing and validation to production, operation, and even training. Let me give you an example in each domain.

The first one, in the design and engineering phase: the industrial metaverse will allow more collaboration between stakeholders, with the added possibility of involving non-technical stakeholders in the process thanks to new user experiences. From this perspective, the industrial metaverse drives a democratization of simulation, allowing experts to communicate more effectively with non-experts. It will become possible to explore more design and manufacturing options more interactively, involving more viewpoints to create better products, with shorter time to market as a benefit. This is the first concrete vision we have for the metaverse.

The second one is in the testing and validation phase: the combination of photorealistic environments with multidisciplinary, multiphysics simulations gives us the possibility to virtually explore more operational scenarios. For example, we can generate synthetic data to train and test autonomous systems through machine learning, like AGVs (automated guided vehicles) in factories, robots, or autonomous cars. Since everything is virtual, we can investigate a large range of critical situations, like accidents, and perform massive testing that we wouldn’t be able to afford in real life. This intensive virtual training and testing of systems can ensure more safety and, again, shorter time to market.
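
As a rough illustration of that synthetic-data idea (a sketch, not Siemens’ actual tooling), the loop below randomizes one scenario parameter, an obstacle appearing in front of a simulated AGV, and emits the perfect ground-truth label a virtual world provides for free, including the rare collision cases that would be too dangerous or expensive to stage physically. The braking model and all numbers are invented.

```python
import json
import random

def generate_scenario(seed: int) -> dict:
    """Randomize one virtual test scenario and return it with ground truth."""
    rng = random.Random(seed)
    obstacle_distance_m = rng.uniform(0.5, 20.0)   # where the obstacle appears
    agv_speed_mps = rng.uniform(0.5, 2.0)          # how fast the AGV is moving
    # Assumed braking model: v^2 / (2a) with 0.8 m/s^2 deceleration.
    braking_distance_m = agv_speed_mps ** 2 / (2 * 0.8)
    return {
        "seed": seed,
        "obstacle_distance_m": round(obstacle_distance_m, 2),
        "agv_speed_mps": round(agv_speed_mps, 2),
        # Perfect label, free in simulation: does this scenario end in a collision?
        "label_collision": braking_distance_m > obstacle_distance_m,
    }

# Mass-produce labeled scenarios, including the rare dangerous ones.
dataset = [generate_scenario(seed) for seed in range(10_000)]
print(sum(s["label_collision"] for s in dataset), "collision cases generated")
print(json.dumps(dataset[0], indent=2))
```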

The third example I have for you is in production. The major use case for the industrial metaverse here is virtual commissioning. That means you can virtually install and test new devices and software without disturbing an ongoing production line. For example, you can visualize new equipment in the factory with augmented reality, and you can also simulate its interaction with existing manufacturing assets. This already exists, but the industrial metaverse will bring an augmented user experience to this domain.

The last example I have for you is in operation. The major use case for the industrial metaverse here is already pretty well-known: visualization of workflows and data around physical assets. For example, the operating data of a machine can be displayed in augmented reality on a tablet screen. The next step, we think, will be to bring information through new interfaces to the shop floor, to the operators, so they can make faster and better decisions. We call these executable digital twins: reduced, self-contained models that can run on edge devices and distill the full scope of information we can have on the shop floor. As you can see, there’s a wide range of industrial applications for the metaverse.
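
To give a feel for what an “executable digital twin” might mean in code, here is a deliberately tiny sketch with invented physics and numbers: a reduced, self-contained model that predicts a machine’s temperature, is corrected by each live sensor reading, and raises an alert on the shop floor before the asset overheats.

```python
class ExecutableTwin:
    """Reduced-order thermal model small enough to run on an edge device."""

    def __init__(self, ambient_c: float, heating_rate_c_per_s: float):
        self.ambient_c = ambient_c
        self.heating_rate = heating_rate_c_per_s
        self.temp_c = ambient_c

    def step(self, dt_s: float, load: float) -> None:
        # Predict: heating under load, first-order cooling toward ambient.
        self.temp_c += dt_s * (self.heating_rate * load
                               - 0.01 * (self.temp_c - self.ambient_c))

    def correct(self, measured_c: float, gain: float = 0.8) -> None:
        # Blend the prediction toward the live sensor reading.
        self.temp_c += gain * (measured_c - self.temp_c)

twin = ExecutableTwin(ambient_c=22.0, heating_rate_c_per_s=0.05)
for reading in [23.0, 31.5, 44.0, 58.2]:   # made-up shop-floor samples
    twin.step(dt_s=60.0, load=0.9)         # one minute of machine operation
    twin.correct(reading)
    if twin.temp_c > 50.0:
        print(f"alert: twin predicts {twin.temp_c:.1f} C, schedule maintenance")
```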

Nvidia’s Earth-2 will be a digital twin of the planet.

VentureBeat: Amy, what are opportunities that you’re focused on for your metaverse activities?

Bunszel: First and foremost, there’s a lot of overlap with what Virginie talked about. In general, the themes of collaboration and innovation, bringing all this data together and all the stakeholders together, are going to be revolutionary for all the industries we work in. There’s one area in particular I’ll double-click on a little bit, and that’s AEC – architecture, engineering, and construction.

We’re looking at design, build, and operate. One of the great things about looking across that whole life cycle is you can also start to look at sustainability, moving decisions you would have been making in the build phase up into the design phase, where you can leverage all the context you get in a metaverse or a digital twin to really understand the downstream implications of decisions you might make about materials or construction methods. We’re excited about the opportunity, for both infrastructure and buildings, that the metaverse will give our customers to make high-impact decisions earlier in their design cycle, when they can have the most impact in the long term.

VentureBeat: Lori, where are you focusing your development for industrial metaverses?

Hufford: I agree with a lot of the same themes Virginie and Amy are talking about. For us, again, it’s the melding of the digital and physical worlds to produce better outcomes for infrastructure. That’s where we’re focused. For example, during design, humans interact with digital twins to conduct interdisciplinary design reviews globally, across the supply chain. In the world we live in today, it’s important to be able to leverage talent anywhere, any place, to end up with the best design possible.

With an infrastructure digital twin, we’re able to bring in models, reality context, and maps. We can identify issues and visualize changes as they’re introduced into the design process, communicate with markups, and use digital tools in the metaverse to validate the design. In construction, we’re also looking at using augmented reality to visualize and review construction sequencing, minimizing risk and leading to better outcomes. I can also echo some of the things Virginie said about operations: using augmented reality with infrastructure digital twins to improve safety, do simulation training, that sort of thing. There are so many opportunities to meld the worlds together in ways that result in better outcomes.

VentureBeat: Let’s talk a bit about portability and interchange. Rev, how important is it for people to be able to take assets from one virtual world to another in the metaverse? How are you overcoming the universal interchange issues here, moving data from one industrial world to another?

Lebaredian: If you step back again to 1993, when the first web browser came out, it’s hard to imagine that the web and everything we know about the internet would exist today if we didn’t have a common way of describing a web page. HTML–it wasn’t just nice to have a format that could be exchanged between browsers. It was essential. If every web browser that was created had a different way of describing content, then we wouldn’t be where we are today.

If we look at the metaverse as this evolution and extension of the internet and the web, a spatial overlay, then the content inside this metaverse has to be in a form that’s interchangeable, that everyone can contribute to and understand and participate in. We’re still very early, though. Just like HTML 1.0 in 1993, it can’t do much yet. Back then you could put some text together with an image and hyperlinks. It took about 20 years to add all of the features we enjoy today, with video and interactive content and procedures and all these things. It’s going to take a long time to get where we need to go with the metaverse, but there’s a lot we can do today.

We settled on Universal Scene Description (USD) as the beginning of this. We didn’t see anything better out there, and quite frankly we think it’s quite good. But there’s still a lot of work to do. We need to layer on more standards for how to describe everything we might want in a world, and how to do that in a very physically accurate way, one that matches the real world, not just something that looks good. We’re contributing heavily to USD, and we invite all of our partners and everyone else to do so as well. Over the coming years we’ll sort out the specific forms, the specific patterns of how we describe different aspects of these worlds for different industries, layered on top of a common scheme.
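
For readers who haven’t touched it, USD is already scriptable through Pixar’s open-source pxr Python bindings. The fragment below, with invented asset names, authors a trivially small scene and saves it as human-readable .usda, the kind of common description Lebaredian is talking about; any USD-aware tool, Omniverse included, can then open, layer onto, and extend the same file.

```python
from pxr import Usd, UsdGeom, Gf  # Pixar's open-source USD Python bindings

# A stage is the composed scene; CreateNew backs it with a .usda text file.
stage = Usd.Stage.CreateNew("factory_cell.usda")
world = UsdGeom.Xform.Define(stage, "/World")
stage.SetDefaultPrim(world.GetPrim())

# Author a placeholder robot base as a cube and position it in the cell.
robot = UsdGeom.Cube.Define(stage, "/World/RobotBase")
robot.GetSizeAttr().Set(1.0)
UsdGeom.XformCommonAPI(robot.GetPrim()).SetTranslate(Gf.Vec3d(2.0, 0.0, 0.0))

stage.Save()  # writes factory_cell.usda for any USD-aware tool to open
```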

Nvidia’s Earth 2 digital twin goals unveiled at its 2021 GTC conference.

VentureBeat: Timoni, do you want to take a crack at that as well? How important is portability and interchange, and how are we going to do that?

West: Rev gave a great background. USD is awesome. We’re also big fans. I like how Rev talked about the web in the early days. There’s always the question of what we actually need interoperability for. The reason I’m not quite full-on “rah rah interoperability” – and I’m big into web standards, I still am to this day – is that it is really, really tricky, and here I go back to thinking like a game designer, to describe the behavior of an object in one space versus another.

Maybe with industrial it’s a bit easier, because they’ve already created amazing standards and levels of interoperability to make sense of the massive amounts of data they deal with. But a classic example we see all the time is: I want to take a sword from one game and play with it in another game. Or another space, I should say, a virtual world. What does it do? What if this is a game with no swords? What if it’s a pacifist world? We may be able to solve the file format aspect, but that layer of behavioral interoperability and interactivity, the crucial aspect that differentiates the metaverse from the internet we know and love today–I really don’t have an answer to that. My guess is that it’s going to very much be a needs-must situation. As people demand this, as they find a need for it, we’ll see homebrew efforts to enable the type of behavior users are asking for.

But as far as standards and file formats and that type of interoperability, I think we’re making tremendous headway. We’re having great conversations and doing great work, actively discussing these things all the time. We talk to Nvidia and Autodesk quite a bit along these lines.

Lebaredian: I totally agree. It’s super hard. There are no easy answers to any of these things. That being said, I don’t think it means we shouldn’t try. There are a lot of smart people in the world, a lot of engineers out there who can solve these problems. If there’s a will, I think we’ll figure it out.

If you go back to the early days of the web, the same things were being said about how to make applications with behavior and interactivity. And we had many false starts. We had things like Flash and ActiveX and all this other stuff. It sorted itself out. Strangely, we ended up with JavaScript as the way we describe everything.

West: How did that happen? But it happened.

Lebaredian: If you told me 20 years ago that not only would JavaScript be powering the internet and the web, but we’d actually compile C++ into JavaScript for WebAssembly, I’d say you were insane. That’s impossible. It will never work. But engineers do miraculous things when they need to. If there’s a need for this, I think we’ll figure it out. We just have to be willing to go and try. We’re going to have a lot of experiments, and many will fail, but some things will work.

West: I don’t mean to sound negative. I was thinking more about–I think we’re saying the same things. I know people are going to want this. I’m very curious to see what gets pushed out. Like you said, we could not have predicted JavaScript as the de facto lingua franca of the dynamic internet. But it’s going to be very interesting to see. It’s a fun time to be here. I’m glad to be at this level of exploration right now.

A scene from an Ericsson Omniverse environment.

VentureBeat: I’m curious, for our other panelists–can you think of a scenario for portability, a reason you might want to use something from somebody else in your applications for the metaverse? Gamers buy a lot of digital items and create avatars. They want to carry that stuff with them from one place to another. But what about on the industrial metaverse side? Can you think of a reason right now that you would want that kind of portability?

Maillard: For sure. What we call openness or interoperability is an enabler of the future industrial metaverse, for the simple reason that it’s a way to connect tools together and provide appropriate interfaces and consistent data management between them. The second aspect is easily transferring virtual assets into different virtual environments. For example, a model of a car can flow from design and validation into production, test, and operation, at each stage gaining more maturity to feed back and improve the next iteration of the design as a fully interconnected system. To do that we need interoperability, because we need different tools to talk to each other.

VentureBeat: I suppose that in a digital factory, you might have people making cars, but the parts of those cars might be commonly used across different factories and different digital twins. If you don’t have to recreate the same part for two different digital twin factories, that’s where portability becomes useful.

I am curious about what’s here and now about the metaverse. What’s tangible for you? What are some concrete examples of metaverse workflows?

West: It’s interesting. Again, it comes down to what you think the metaverse is. In terms of tangibility, the biggest examples of what I would see as solid, concrete uses of contextual computing involve a combination of devices. The metaverse originally was an idea about virtual reality. When I got into VR, I was very much thinking, “This is it! It’ll replace all our screens with AR glasses! No hands!” In the last seven years I’ve come to realize that screens are great, and so are keyboards. Every computing device we have comes with its advantages and disadvantages. VR is great for presence. It’s terrible for writing code. If you want to write the great American novel or do transcription, you need a keyboard. All these devices are just tools to help achieve your goal.

And so a great example of what I would call a futuristic use case is when you have multiple different people who are able to use these different devices in the way that best suits the workflow they have. For example, somebody is wearing a HoloLens and they’re looking at a detailed report on an engine. Someone else has an iPad, and a third person is joining them from far away in VR. That type of interoperability between devices, that real time interactivity and ability to display information in the ways that best suit the needs of each person via their device, to me that’s the best and truest expression of what excites me about this new type of computing.

Hufford: I couldn’t agree more with Timoni’s comment about–we have to be able to meet people where they are. We have to be able to incorporate this technology into their existing processes and workflows. That’s important to us. It has to be accessible and it has to be open, so that users can imagine their own applications and develop their own metaverse applications – in our case for infrastructure design, operation, and construction.

I can give a specific example about how a leading organization is using infrastructure digital twins in the metaverse today. The ITER Project in France consists of 35 countries collaborating to build the world’s largest tokamak, a magnetic fusion device designed to prove the feasibility of nuclear fusion as a large-scale, carbon-free source of energy. It’s built on the same principle that powers the sun and stars – nuclear fusion, not fission. This is a huge design and construction project with a lot of complexity, complex engineering models, millions of parts. It has to be constructed off site and lifted into place.

ITER wanted to allow humans to get into the digital twin and experience it. The container models were brought into Bentley Synchro, which is powered by the iTwin platform, for 4D construction modeling and sequencing. Then, leveraging Unreal and Omniverse and using CloudXR, all of this came together to create an amazing experience that immerses the team.

The level of fidelity is important here. We have to be able to do this without reducing the complexity of the model in order for it to provide engineering value. It’s the first time that the client has been able to be inside the model. The photorealistic lighting enhances the experience, and the infrastructure digital twin, which contains all of the engineering data needed, allows the client and the users to fully make use of the experience.

Another example in use today is remote bridge inspections. There’s so much existing infrastructure in the world. Performing bridge inspections to ensure safety and identify maintenance needs can be dangerous and time-consuming. The metaverse can help with that. Again, there has to be a high level of detail to be able to visualize cracks and rust and that sort of thing. Drones are used to capture reality models of a bridge, and then inspectors use HoloLens to inspect the resulting models. Inspectors can perform more inspections with less travel and work remotely, improving their safety as well.

Lockheed Martin is using Omniverse to simulate wildfires.

VentureBeat: I’m curious about what the next evolution of the metaverse is going to be. What do you think is going to be real in, say, one year or five years? What might take as long as 10 years?

Maillard: As I said at the beginning, for Siemens, the industrial metaverse is an evolution of what we already provide to enable digital twins and digital threads. In the future we see that realistic immersion, real time feedback, and openness will allow our customers to extend their experience of the industrial metaverse.

We already discussed interoperability, so let me talk about immersion and real time. By immersion, I mean the photorealistic quality of virtual 3D worlds. It will definitely allow for a better fit between human cognition and interaction with the digital world. The second benefit is training autonomous systems, as I mentioned earlier, using synthetic data. That’s why immersion is important as an evolution. It won’t necessarily involve VR and AR devices, though. A laptop or tablet can be enough, because they already provide immersion. That’s true in gaming today.

The other important point is real time collaboration. In the future, users should get instant feedback from their actions and interaction with others. They’ll feel a synchronous experience. The challenge here is not to sacrifice the accuracy of the simulation and prediction. In the end, the purpose of this real time simulation is to test what-if scenarios quickly at the early stages of design and production and solve problems. That’s why, at Siemens, we already provide a set of multi-physics high-fidelity tools, and we continue to work on the next generation of technologies that will accelerate simulation while keeping the needed accuracy. Even if we already have a good foundation, the building blocks of the industrial metaverse, we still have an exciting time to come.

Hufford: A lot of folks here have been talking about communication. In our view, communication is going to evolve and improve over the course of time. People to people, of course–we’ve been talking about that–but also people to assets, assets to people, and assets to assets. A variety of modalities, including XR and others, will be involved.

To give some examples: assets that are able to alert that they’re experiencing an operating condition and require maintenance, or that can guide a maintenance worker through a repair process. Virtual lobbies where people and assets can interact. It’s going to allow us to better leverage talent to design, construct, and operate infrastructure–for example, using holograms to bring remote experts on site for consultation, expanding our ability to leverage talent across the globe.

As I mentioned before, we need to manage our relationship to the natural world. As time goes on and the metaverse continues to evolve, we at Bentley want to be good stewards of the planet. The volume and complexity of infrastructure data is enormous, and it’s that data that’s going to power the metaverse. We need to look for architectures and software designs that minimize compute and reduce the carbon footprint along the journey.

VentureBeat: What are some of the tech blockers that are getting in the way of these goals?

Bunszel: First, I want to reiterate that all of our customers are able to experiment today with the technology that exists out there. Digital twins have been happening for many years. They’re giving people a great context to jump off from and do even more. But when I think about blockers, three come to mind.

First, in many cases we’re taking existing design data and trying to use it for a new purpose. We’re going to need to automate how we get the right amount of information into the metaverse. To the point about being mindful of compute and the carbon footprint of everything we run in the metaverse, there are things we need to do to prep the data so it’s most effective for what we’re asking to be done there.
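
One everyday flavor of that data prep is geometry reduction: design-grade models carry far more triangles than a real-time world needs. As a minimal sketch, not Autodesk’s actual pipeline, the open-source Open3D library can decimate a mesh down to a renderable budget; the file names here are placeholders.

```python
import open3d as o3d  # open-source 3D data processing library

# Load a heavyweight mesh exported from a design tool (placeholder path).
mesh = o3d.io.read_triangle_mesh("as_designed_pump_station.ply")
print("source triangles:", len(mesh.triangles))

# Decimate to a budget a real-time client can afford, keeping overall
# shape while discarding fabrication-level detail.
lod = mesh.simplify_quadric_decimation(target_number_of_triangles=50_000)
lod.compute_vertex_normals()  # recompute shading normals after decimation

o3d.io.write_triangle_mesh("pump_station_lod1.ply", lod)
```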

Another topic we haven’t really discussed yet is trust. In many cases our customers are putting their intellectual property into the metaverse. Figuring out how we leverage all of the technology we have in other areas around security, privacy rights, compliance, reputation, fraud–there’s a lot of potential to do amazing things, but we also need to make sure we’re securing all the IP that goes into these environments.

The third part for me is, again, this notion of dynamic data–I think Virginie mentioned this. The metaverse is not static. Things will change in the metaverse, and those changes might need to be reflected back on the original data, and vice versa. Having this real-time, interactive, always-up-to-date environment is going to unlock even more capability for collaboration once we figure out how to transfer all that data back and forth seamlessly.

Virtual Jensen Huang of Nvidia.

VentureBeat: What are the next major technological leaps that need to occur to get to this next level of the metaverse?

Lebaredian: Everything we’re talking about here today with the metaverse implicitly assumes that we have the capability of accurately simulating these worlds and having it match the physical world, the real one that we’re in. As of today, the world simulators that we have are largely designed for entertainment. They’re designed for games. We need to find ways of expanding these world simulators to be physically correct and take advantage of the computing power that’s available to us, using it to its maximum so we can attain these superpowers.

Once you can simulate a world accurately, you get a few things. If we simulate how light interacts with matter, we can see these worlds, how they’re going to look. If we can ingest a place from the real world, reconstruct it inside these virtual worlds, and simulate it there, we can virtually teleport anywhere in the world. And if we can run a physics simulation, see how the state of that world changes, and actually have confidence in how it evolves, we essentially get a time machine. We can go into the future and see what this world is going to look like, how it’s going to behave some time from now. That allows us to explore many possible futures. We can change the initial state, explore all the possibilities, and choose the best future, the one we think is going to be the most efficient, the most sustainable, the safest, or whatever you’re optimizing for.
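
Stripped of the rendering and the physics engines, Lebaredian’s “time machine” is forward simulation plus a search over initial states. This toy sketch, with invented dynamics and an invented objective, shows the pattern: roll many candidate presents forward and keep the one with the best future.

```python
import random

def simulate(initial_buffer: int, steps: int = 480, seed: int = 7) -> float:
    """Toy factory-line model: roll one possible future forward in time."""
    rng = random.Random(seed)
    buffer, throughput = initial_buffer, 0
    for _ in range(steps):
        buffer += rng.randint(0, 3)            # parts arriving upstream
        made = min(buffer, 2)                  # station capacity per step
        buffer -= made
        throughput += made
    # Invented objective: reward output, lightly penalize held inventory.
    return throughput - 0.1 * initial_buffer

# "Time travel": evaluate many candidate initial states, choose the best future.
candidates = range(0, 50, 5)
best = max(candidates, key=simulate)
print(f"best starting buffer: {best}, score: {simulate(best):.1f}")
```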

Our challenge here is to get to that physical accuracy. That’s going to require a lot of computation, but it also requires a lot of cooperation between all of the tools and all of the simulators out there, so we can combine all of the elements of these worlds we already have into a form that’s simulatable. If you think about it, every man-made thing around us exists in some 3D digital form somewhere. Somebody used a CAD tool for every product you’re using, every building you’re in. Unfortunately, those models diverged at some point from what was actually built, from what’s in reality. The better we make these tools and collaboration frameworks, and the more we allow synchronization between the real-world and digital versions of things, the sooner we’ll attain those superpowers of teleportation and time travel.

VentureBeat: Timoni, any closing thoughts?

West: I’d just add to that. We do have CAD files that provide references for a lot of things that exist today. But we don’t have that for things that are handmade, things that existed prior to computer-aided design. I do love that we’re leaning deeply into things like photogrammetry and object reconstruction and everything else we need to bring the real world online, in a sense, and start to merge the digital and physical to the point where we can have these types of super-realistic simulations. That will help us all make better decisions and, more importantly, help computers respond to us in a way that’s truly democratizing, allowing all people to use computers in a way that makes sense to them.

VentureBeat: Thank you, everybody. I do hope we get to that digital twin of the Earth that Jensen Huang has been talking about. If we get there, that should solve everything for everyone. We’ll see.
