Building an Active Digital Twin Using NVIDIA Omniverse and Project Gemini

The Acceleration Agency, a digital innovation and product design firm, is working on an active digital twin framework and toolkit called Project Gemini. Inspired by the United States space program of the same name, Project Gemini uses active sensor fabric data and a wide range of data from sources like Google Sheets and Customer Relationship Management (CRM) platforms to replicate real-world settings in the virtual world. 

The active digital twin framework and toolkit will be fully connected to NVIDIA Omniverse, a scalable platform for design and collaboration, using Universal Scene Description (USD).

The project launched with a digital replication of The Acceleration Agency’s main office located in Austin, Texas. Instrumented with a dense sensor fabric for real-time and historical spatial computation, the digital twin of the office includes employees and employee information (job title, ID#, gender, and date of birth) provided by Salesforce. It also tracks inventory items on site and can display information such as quantity, date of last interaction, temperature, and orientation.

With real-time, true-to-reality physics from NVIDIA PhysX and physically accurate RTX rendering in NVIDIA Omniverse, the team anticipates that the Gemini active digital twin can be simulated with an unprecedented level of visual and physical fidelity, including complex simulations.

Leveraging USD and Omniverse Nucleus, users of the Project Gemini digital twin platform will be able to collaboratively update content in real time across a variety of tools, instead of waiting for new builds.

Connecting Google Sheets to NVIDIA Omniverse with a Kit Extension

Multiple abstraction layers and a sensor fabric layer allow a variety of sensors, databases, CRMs and object integration tools to connect to Omniverse. The connection allows real-time updates to inventory objects and information like temperature, humidity, and location.

To accomplish this, the team created a simple Omniverse Kit extension, enabled by a Python script, that reads data from a Google Sheet and attaches the data to an object in Omniverse Kit. The extension lets users control the location, scale, and rotation of any selected object in Omniverse applications like Omniverse Code or Omniverse Create using the metadata in the spreadsheet. The code is available in the AccelerationAgency/omniverse-extensions repository on GitHub.
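The core of such an extension is reading spreadsheet rows into transform values. The sketch below shows one way this could look, assuming the sheet is published as CSV; the column names (`pos_x`, `scale_x`, `rot_x`, and so on) and the sample row are hypothetical, based on the layout shown in Figure 1.

```python
# Minimal sketch: parse object transform rows from a Google Sheet published as CSV.
# Column names and the sample data are assumptions, not the actual Project Gemini schema.
import csv
import io

SAMPLE_CSV = """object,pos_x,pos_y,pos_z,scale_x,scale_y,scale_z,rot_x,rot_y,rot_z
tesla,12.0,0.0,-4.5,1.0,1.0,1.0,0.0,90.0,0.0
"""

def load_transforms(csv_text):
    """Map object name -> translate/scale/rotate tuples parsed from the sheet."""
    transforms = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        transforms[row["object"]] = {
            "translate": tuple(float(row[f"pos_{axis}"]) for axis in "xyz"),
            "scale": tuple(float(row[f"scale_{axis}"]) for axis in "xyz"),
            "rotate": tuple(float(row[f"rot_{axis}"]) for axis in "xyz"),
        }
    return transforms

# Inside a Kit extension, these values would then be written to the selected
# prim's xformOp:translate / xformOp:scale / xformOp:rotateXYZ attributes
# through the USD API.
print(load_transforms(SAMPLE_CSV)["tesla"]["translate"])  # (12.0, 0.0, -4.5)
```

In a live deployment, the CSV text would come from the sheet's export URL rather than a string literal, but the parsing step is the same.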

Using database and CRM tools with the extension makes the task of manipulating object data more scalable. When building digital twins at the scale of factories, stadiums, warehouses, and even cities, hundreds, thousands, and even millions of objects may need to be manipulated rapidly.

The Acceleration Agency loaded the USD version of their office digital twin into the Omniverse stage and used the extension to select and manipulate object data. 

The images below show an example of how this process was done for a Tesla in the parking lot outside the agency office. Building this was fairly straightforward and only took a few days for a single developer to create. It can be extended to any data source.

Figure 1. Google Sheet with object location, scale, and rotation information
Figure 2. Selecting the Project Gemini-enabled extension from the extensions tab in Omniverse Code
Figure 3. The object before running the extension to pull in the data from the Google Sheet

Figure 4. After running the extension to pull in the data from the Google Sheet, the object now has different parameters

Figure 5. Running the extension using the USD version of the office digital twin as the data source, then selecting the Tesla as the data object to manipulate
Figure 6. Tripling the scale factors of the Tesla in the Google Sheet updates through the extension and then propagates into the stage

Watch the extension in action with Starr Long, Executive Producer at The Acceleration Agency.

Adding RTX Renderer and Nucleus Collaboration

The next step for Project Gemini is to render in real time with the NVIDIA RTX Renderer and allow for real-time modifications through Nucleus. These real-time modifications are one of the advantages of working with the powerful USD 3D framework and composition engine. They will be coupled with historical recordings of real data that, when played back, can be mixed with live modifications to try different scenarios. Some of the use cases the team is targeting include construction sites, hospitals, and live event venues. To learn more, visit the Project Gemini website.
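The playback idea can be illustrated with a small sketch: replay a recorded stream of timestamped sensor readings while layering "what-if" overrides on top. The data shapes and object names here are assumptions for illustration, not the actual Project Gemini format.

```python
# Sketch: replay historical sensor recordings mixed with live scenario overrides.
# Record format (timestamp, object_id, attributes) is a hypothetical example.
def replay(recording, overrides):
    """Yield (timestamp, object_id, attrs) with overrides merged onto recorded values."""
    for ts, obj, attrs in recording:
        merged = {**attrs, **overrides.get(obj, {})}
        yield ts, obj, merged

recording = [
    (0.0, "forklift_01", {"temp_c": 21.5, "pos": (0.0, 0.0)}),
    (1.0, "forklift_01", {"temp_c": 21.6, "pos": (1.2, 0.0)}),
]

# Scenario: what if the forklift had been running hotter than recorded?
for ts, obj, attrs in replay(recording, {"forklift_01": {"temp_c": 35.0}}):
    print(ts, obj, attrs["temp_c"], attrs["pos"])  # temp overridden, pos recorded
```

Each merged record would then drive the corresponding USD prim in the stage, so overridden attributes and recorded motion play back together.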

Figure 7. Digital twin of The Acceleration Agency office running in the NVIDIA RTX Renderer

Figure 8. Sensors and tags that send real-time data about location, temperature, and other factors to the digital twin

Learn more about building custom USD-based applications and extensions for NVIDIA Omniverse in the Omniverse Resource Center and with these USD-specific resources. 

Don’t miss NVIDIA at SIGGRAPH, August 8-11, 2022. Watch the Omniverse community livestream at SIGGRAPH on August 9 at noon, Pacific time, to learn how NVIDIA Omniverse and other design and visualization solutions are driving breakthroughs in graphics and GPU-accelerated software.

You’re also invited to enter the inaugural #ExtendOmniverse developer contest, open through August 19, 2022. Create an Omniverse Extension using Omniverse Code for a chance to win an NVIDIA RTX GPU.

Follow NVIDIA Omniverse on Instagram, Twitter, YouTube and Medium for additional resources and inspiration. Check out the Omniverse forums, and join our Discord server and Twitch channel to chat with the community.
