The Extended Reality (XR) wave has been dampened by a rapidly changing technology ecosystem and a wide spectrum of capabilities across Virtual Reality (VR) and Augmented Reality (AR) hardware. On one end is Enterprise VR, with limited deployment and select installations of high-powered VR devices. On the other end, mass-market VR is taking off with lightweight, untethered Head-Mounted Displays (HMDs) that have significantly less onboard computing power. In between, consumer-grade VR and AR products run in larger enterprises alongside professional software, GPUs, and workstations: professional-consumer, or "prosumer", hardware. This has led to some unfortunate compromises, often limiting access to high-end VR hardware and software to select departments, while mass-market hardware sees wider use in other areas of the enterprise.
IC.IDO users are accustomed to specialized hardware: the need for original CAD geometry without additional data optimization, real-time solid mechanics and elastic hose/cable simulations, and collaborative workflows initially limited deployments to VR centers with CAVE and Powerwall clusters. More recently, the introduction of VR headsets tethered to engineering workstations enabled individual exploration and in-process reviews, but the struggle to include more review participants remains. Human-Centric Product and Process Validations require the power of a professional Graphics Processing Unit (GPU) to render complex products and environments and provide rich interactive experiences, leaving more portable and accessible hardware for mass-market uses.
NVIDIA CloudXR, announced today, shifts the entry level for Enterprise VR to include untethered VR devices. By streaming VR experiences from powerful rendering nodes over 5G enterprise networks to wireless VR devices, virtual reality can get out of the CAVE and engage more enterprise stakeholders across a broader range of hardware.
When NVIDIA's development team offered ESI early access to their CloudXR SDK, which streams server-side-rendered virtual and augmented reality, ESI's existing enterprise customers had already sought portable and lightweight implementations of IC.IDO to expand their human-centric workflows. Based on earlier collaborations with NVIDIA and our contributions to technical conferences like GTC and SIGGRAPH, we had many of the required building blocks. Pilot programs were launched with select industry leaders to grow the deployment of their IC.IDO subscription base and reach more of their engineering, manufacturing, and service method planning teams. The intent was to lower application complexity for distributed collaborative reviews and reduce hardware restrictions for participants, without sacrificing the capabilities for which they selected IC.IDO in the first place.
The opportunity to get a head start on NVIDIA CloudXR and support our customers' business outcomes seemed like a perfect convergence, until a global pandemic canceled the physical GTC event, postponing the announcement of NVIDIA CloudXR and stalling many companies' technology and digitalization projects. It also shifted attention toward different challenges, or at least different perspectives on the same challenges. We used that delay to develop a high-fidelity demonstrator that leverages the core of IC.IDO, NVIDIA CloudXR, NVIDIA Quadro RTX-powered server nodes, and some exceptionally clever new code to successfully link wireless HMDs, AR tablets, and even mobile phones to a collaborative virtual workspace.
In the video below, we see the user explore, interact with, and experience the virtual product. In a full-scale review with a large, empty walking space, they can walk around the entire truck to access the engine after the cab tilts forward, validate the ease of access to components requiring service, evaluate the reachability of critical parts during assembly, and assess the comfort or safety of completing any task. Because this scenario runs a full IC.IDO installation on a server, a new design revision could be swapped in on the fly, mid-session; no optimization is needed to downscale complexity for wireless VR. If the virtual interactions for the process under validation need to change, no reprogramming is required to reflect physics changes or to code new gamified interactions. It functions just like the IC.IDO they use for such tasks on full engineering workstations.
This initial proof of concept is still evolving with input from our pilot program partners. Together we are determining the functionality required to turn this minimum viable product into a maximally desirable product for their enterprise: a solution that fulfills their virtual build and virtual maintenance needs, adds value beyond what they get from IC.IDO releases today, and ultimately becomes a commercial offering. In the meantime, enterprises still need to establish the 5G networks and server nodes required for spatial computing, and untethered HMDs need 5G connectivity. Rendering and interactions computed on remote servers do risk additional latency between a user's action and the streamed visual feedback. That risk is greatly reduced by the acceleration NVIDIA CloudXR unlocks on Quadro RTX GPUs.
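The latency concern above can be made concrete with a simple motion-to-photon budget calculation. The stage names and millisecond values below are purely illustrative assumptions for a remote-rendered XR pipeline, not measured figures for CloudXR, IC.IDO, or any particular network:

```python
# Illustrative motion-to-photon latency budget for streamed (remote-rendered) XR.
# Every number here is a hypothetical assumption, not a measured CloudXR figure.

BUDGET_MS = 20.0  # a common rule-of-thumb target for comfortable VR

# Hypothetical per-stage latencies, in milliseconds, for one streamed frame.
stages = {
    "head-pose capture and upload": 2.0,
    "server-side render (GPU)": 7.0,
    "encode (hardware video encoder)": 3.0,
    "network transit (5G, one way)": 4.0,
    "decode and display on HMD": 3.0,
}

total_ms = sum(stages.values())
status = "within" if total_ms <= BUDGET_MS else "over"
print(f"total motion-to-photon: {total_ms:.1f} ms ({status} the {BUDGET_MS:.0f} ms budget)")
```

The point of the exercise is that every stage eats into a fixed comfort budget, so shaving even a few milliseconds off encode or render time, as GPU-accelerated pipelines aim to do, decides whether a streamed experience feels responsive.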
The ESI team looks forward to working with its industrial partners on this expansion of the industrial applications of IC.IDO to address Human-Centric Product and Process Validation concerns. Meanwhile, IC.IDO 14.0, expected to release mid-2020 with new innovations in collaborative virtual workflows, continues to run on individual or network-connected workstations with tethered HMDs, single-node Powerwalls, and multi-surface projection CAVE cluster environments, helping users experience their proposed product designs and planned processes in fully immersive VR.
For more information, visit Assembly Process Validation
Click here to watch the complete GTC Conference presentation on CloudXR and IC.IDO