In early May, the GPU Technology Conference (GTC) will open a window into new advancements in the computing industry. Sessions ranging from deep learning and AI to big data analytics will feature companies speaking on their latest developments. Mixed reality will have a huge spotlight as well, with a handful of VR and AR sessions and speakers taking the stage at the event May 8 – 11 (specific dates and times TBD):
Computational Focus-Tunable Near-Eye Displays
Description: We’ll explore unprecedented display modes afforded by computational focus-tunable near-eye displays, with the goal of increasing visual comfort and providing more realistic and effective visual experiences in virtual and augmented reality. Applications of VR/AR systems range from communication, entertainment, education, collaborative work, simulation, and training to telesurgery, phobia treatment, and basic vision research. In every immersive experience, the primary interface between the user and the digital world is the near-eye display. Many characteristics of near-eye displays that define the quality of an experience, such as resolution, refresh rate, contrast, and field of view, have improved significantly in recent years. However, a pervasive source of visual discomfort remains: the vergence-accommodation conflict (VAC). Furthermore, natural focus cues are not supported by any existing near-eye display.
Presented by: Nitish Padmanaban (CTO) at Stanford Computational Imaging Lab
Assembly Chain Training with Professional VR
Description: Optis has been involved in advanced optical simulation for the past 25 years and has recently invested in VR for virtual prototyping. Its latest HIM, built for human ergonomics evaluation, combined with advanced, real-time, physics-based rendering, enables precise environment reproduction for appropriate prototyping or training. We’ll present the latest integration for assembly line training with the HTC Vive and feedback powered by NVIDIA® PhysX®. Companies such as Tesla Motors and Bentley are proud early adopters of this solution. We’ll demonstrate our software and show customer use cases and their data to explain how to improve the VR experience with haptics and audio simulation in the future.
Presented by: Nicolas Dalmasso (Innovation Director) at Optis
How To Bring Engineering Datasets To Head-Mounted Displays
Description: Hear visualization experts explain why people in professional visualization, in particular virtual engineering, are great candidates to unleash the full potential of HMDs, and how close today’s technology brings application developers to the finish line of exploring massive datasets with HMDs. Learn about new hardware (NVIDIA Pascal™-powered NVIDIA Quadro® GPUs), extensions, APIs (NVIDIA VRWorks™: NVIDIA SLI® VR, Single Pass Stereo), techniques (GPU culling), and next steps that enable ESI to create amazing VR experiences even with high node and triangle counts.
Presented by: Andreas Mank & Ingo Esser (Software Development Team Leaders) at ESI Group and NVIDIA
Improving Patient Care Using EchoPixel’s Interactive Virtual Reality Technology
Description: Get the latest information on how virtual reality is being used to change healthcare outcomes. EchoPixel, a company focused on VR in healthcare, has developed the True 3D Viewer, a real-time, interactive VR platform. It offers physicians an unprecedented opportunity to view and interact with patient tissues and organs in an open 3D space as if they were real, physical objects. The resulting improvement in clinical efficacy and workflow has had a significant positive impact on patient care.
Presented by: Janet Goldenstein (Lead Engineer) at EchoPixel
NVIDIA VRWorks Audio: Improving VR Immersion with Acoustic Fidelity
Description: The demand for realism increases dramatically the instant a player puts on a head-mounted display (HMD) – images, sounds, and interactions make or break the immersiveness of the experience. We’ll provide an overview and examples of the NVIDIA VRWorks Audio SDK, a geometric acoustics rendering toolkit that helps developers improve realism and immersion through realistic acoustic simulation and audio rendering.
Presented by: Tony Scudiero (Developer Technology Engineer) at NVIDIA
GPU Computing for the Construction Industry, AR/VR for Learning, Planning, and Safety
Description: We’ll dive head-first into some of the current challenges of the construction industry and how we’re planning to use virtual/augmented reality and real-time GPU computing to address them. To optimize the construction of a building, site logistics must be planned and all systems analyzed and coordinated to confirm constructability. With the use of building information modeling (BIM) and the advent of inexpensive GPU and AR/VR hardware, we’re building tools to redefine the planning and analysis process for construction management. Virtual and augmented reality systems aren’t just for entertainment anymore; they can help us plan faster, confirm our clients’ design goals, and facilitate stronger communication among our team members before and during the construction process.
Presented by: Kyle Szostek & Ken Grothman (Senior Virtual Construction Engineers) at Gilbane Building Company
If you’re a creator, you still have a little time to submit to NVIDIA’s GTC contest; the deadline is the 15th of this month. The details can be found here. Note that you must be able to attend the event with at least two representatives from your team (the exhibition floor space must have at least one person on hand at all times).
This is sponsored content which has been produced by UploadVR and brought to you by NVIDIA. NVIDIA did not have any input into the creation of this content.