IEEE VR

March 22nd - 26th

Research Demos

Days 1 and 2

Demo ID: 1005

CalorieCaptorGlass: Food Calorie Estimation based on Actual Size using HoloLens and Deep Learning

Shu Naritomi, Keiji Yanai

University of Electro-Communications

Hubs Link: See in Hubs

Abstract: “We propose “CalorieCaptorGlass”, a food calorie estimation system implemented on HoloLens. We take advantage of the 3D environment recognition capability of HoloLens to estimate the actual size of foods, enabling accurate estimation of food sizes and calories. We combine this with CNN-based food image segmentation and estimate the calorie content of 46 kinds of meals based on the estimated 2D sizes and meal-dependent quadratic curves relating meal size to calories.”
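
As a rough illustration of the size-to-calorie step described in the abstract, the sketch below converts a hypothetical segmentation mask into a real-world area and applies a per-meal quadratic curve. The coefficients and helper names are invented for illustration and are not taken from the authors' 46-category model.

```python
# Minimal sketch of the size-to-calorie step (assumptions: the segmentation
# mask and the per-meal quadratic coefficients are hypothetical; the demo's
# actual 46-category model is not reproduced here).
import numpy as np

# Hypothetical quadratic coefficients (a, b, c) per meal category:
# calories = a * area_cm2**2 + b * area_cm2 + c
CALORIE_CURVES = {
    "rice": (0.002, 1.1, 15.0),
    "curry": (0.004, 1.8, 30.0),
}

def real_area_cm2(mask: np.ndarray, cm_per_pixel: float) -> float:
    """Convert a binary segmentation mask to real-world area, using the
    pixel-to-centimeter scale recovered from the 3D environment data."""
    return float(mask.sum()) * cm_per_pixel ** 2

def estimate_calories(mask: np.ndarray, cm_per_pixel: float, meal: str) -> float:
    a, b, c = CALORIE_CURVES[meal]
    s = real_area_cm2(mask, cm_per_pixel)
    return a * s ** 2 + b * s + c

# Example: a 120x80-pixel mask at 0.05 cm/pixel (about 24 cm^2 of rice).
mask = np.ones((120, 80), dtype=np.uint8)
print(round(estimate_calories(mask, 0.05, "rice"), 1))
```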


Demo ID: 1008

DRoom: a gamified demonstration of Real Haptics technology

Alvaro Villegas, Pablo Perez, Redouane Kachach, Francisco Pereira, Ester Gonzalez-Sosa

Nokia, Bell Labs

Hubs Link: See in Hubs

Abstract: “We present DRoom, a gamified demonstration of our Real Haptics technology, a novel interaction method for virtual environments. Participants in this demonstration will experience the benefits of Real Haptics while playing an escape room game in which they will have to solve several puzzles using objects in the room within a predefined time frame. And they will have fun.”


Demo ID: 1012

Aditya Raikwar, Francisco R. Ortega, Jaclyn Stephens

Colorado State University

Hubs Link: See in Hubs

Abstract: “Virtual Reality is gaining popularity due to the gaming industry, but there are many other applications of VR such as training, rehabilitation, and simulation. This is our effort to combine these applications to create a technique for assessing return-to-play readiness in athletes with concussions using Virtual Reality (VR). The subjects perform various soccer-related simulations. By measuring their responses during the execution of these tasks, we can assess their ability to return to the game.”


Demo ID: 1013

High-Resolution Interactive Immersive Renderings of Real-World Environments

Kevin Ponto, Ross Tredinnick

University of Wisconsin-Madison

Hubs Link: See in Hubs

Abstract: “Recent advances in scanning technology have enabled the highly detailed capture and generation of 3D models. However, due to the complexity of these data sets, they are often not experienced immersively. This demo will showcase recent advances in rendering technologies that enable complicated real-world environments to be experienced in VR. The ability to experience a virtual environment replicating a physical space has applications for areas such as CSI, healthcare and cultural heritage.”


Demo ID: 1015

Simulating Next-Generation User Interfaces for Law Enforcement Traffic Stops

Jeronimo Grandi, Zekun Cao, Mark Ogren, Regis Kopper

University of North Carolina at Greensboro, Duke University

Hubs Link: See in Hubs

Abstract: “We present the design of a next-generation user interface for law enforcement officers, developed to assist with current traffic procedures. Our design leverages the futuristic capabilities of augmented reality displays, integrating real and virtual elements. Our team has created a traffic stop scenario in immersive virtual reality, where the participant assumes the role of a police officer and interacts with a simulated augmented user interface and a virtual driver.”


Demo ID: 1018

Adjustable Pointer in Virtual Reality for Ergonomic Interaction

Powen Yao, Tian Zhu, Michael Zyda

University of Southern California

Hubs Link: See in Hubs

Abstract: “In a conventional virtual reality system, the user is provided with interaction tools that are fixed to the hardware. We propose moving the interaction capability from one that is hard-fixed to the hardware to a virtual one that is loosely tied to the hardware's spatial reference point. In our demonstration, we provide two methods for the user to quickly adjust their pointer to interact with objects in atypical situations without the need for additional buttons.”
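
The idea of a pointer loosely tied to the hardware's reference point can be sketched as a ray whose origin and direction are user-adjustable offsets applied to the controller pose. The sketch below is a minimal illustration with invented names and values, not the authors' implementation.

```python
# Minimal sketch of a pointer that is loosely tied to the controller pose:
# instead of a ray fixed to the hardware, the ray origin and direction are
# offset from the controller by user-adjustable parameters. All names and
# values are illustrative.
from dataclasses import dataclass
import numpy as np

@dataclass
class Pose:
    position: np.ndarray      # (3,) world position
    rotation: np.ndarray      # 3x3 rotation matrix

@dataclass
class PointerOffset:
    local_origin: np.ndarray  # ray origin, in the controller's frame
    local_dir: np.ndarray     # ray direction, in the controller's frame

def pointer_ray(controller: Pose, offset: PointerOffset):
    """World-space ray of the adjustable pointer."""
    origin = controller.position + controller.rotation @ offset.local_origin
    direction = controller.rotation @ offset.local_dir
    return origin, direction / np.linalg.norm(direction)

# Example: tilt the ray 30 degrees downward and push its origin 20 cm forward,
# so the user can point at objects below or behind obstacles without extra buttons.
ctrl = Pose(np.array([0.0, 1.2, 0.0]), np.eye(3))
tilt = np.radians(-30)
offset = PointerOffset(np.array([0.0, 0.0, 0.2]),
                       np.array([0.0, np.sin(tilt), np.cos(tilt)]))
print(pointer_ray(ctrl, offset))
```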


Demo ID: 1020

XREye: Simulating Visual Impairments in Eye-Tracked XR

Katharina Krösl, Carmine Elvezio, Matthias Hürbe, Sonja Karst, Steven Feiner, Michael Wimmer

TU Wien, Columbia University, Medical University Vienna

Hubs Link: See in Hubs

Abstract: “Many people suffer from visual impairments, which can be difficult for patients to describe and others to visualize. To aid in understanding what people with visual impairments experience, we demonstrate a set of medically informed simulations in eye-tracked XR of several common conditions that affect visual perception: refractive errors (myopia, hyperopia, and presbyopia), cornea disease, and age-related macular degeneration (wet and dry).”
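
One of the listed conditions, a refractive error such as myopia, can be approximated as a depth-dependent defocus blur. The sketch below is a simplified, non-medical illustration of that idea; the far point, gain, and blur model are assumptions, not the medically informed simulation used in XREye.

```python
# Illustrative approximation of a refractive error (myopia): pixels whose
# depth lies beyond the eye's far point are blurred in proportion to the
# defocus in diopters. Not the demo's actual model.
import numpy as np
from scipy.ndimage import gaussian_filter

def simulate_myopia(image: np.ndarray, depth_m: np.ndarray,
                    far_point_m: float = 0.5, gain: float = 3.0) -> np.ndarray:
    """Blur each pixel more strongly the farther it lies beyond the far point."""
    # Defocus in diopters: difference between required and available focus.
    defocus = np.maximum(0.0, 1.0 / far_point_m - 1.0 / np.maximum(depth_m, 1e-3))
    out = image.copy()
    # Quantize defocus into a few blur levels and blend the blurred layers in.
    for sigma in (1.0, 2.0, 4.0):
        blurred = gaussian_filter(image, sigma=(sigma, sigma, 0))
        mask = (defocus * gain >= sigma)[..., None]
        out = np.where(mask, blurred, out)
    return out

# Example: a random RGB frame with a near (0.4 m) and a far (3 m) half.
img = np.random.rand(64, 64, 3)
depth = np.concatenate([np.full((64, 32), 0.4), np.full((64, 32), 3.0)], axis=1)
print(simulate_myopia(img, depth).shape)
```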


Demo ID: 1022

OnBodyVR: Virtual Reality Application with Eyes-Free On-Body Interface Based on Three Tracked Points

Manuel Dixken, Tobias Schultze, Matthias Bues

Fraunhofer Institute for Industrial Engineering IAO

Hubs Link: See in Hubs

Abstract: “In this work, an application is presented that uses a heuristic method to allow eyes-free on-body interaction with a virtual reality system that tracks only three points. Based on how the user wears the head-mounted display and holds the two hand controllers, combined with anthropometric findings on the maximum rotations of the involved joints, on-body interaction can be enabled. In this research demonstration, a game-like application is presented in which the user can try out the on-body interface.”
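
A heuristic of this kind can be sketched by estimating torso landmarks from the head-mounted display pose and checking whether a controller comes within reach of one of them. The targets, offsets, and thresholds below are illustrative assumptions, not the authors' anthropometric values.

```python
# Minimal sketch of a three-tracked-point heuristic: torso landmarks are
# estimated from the head-mounted display pose using rough anthropometric
# proportions, and a controller touching one of them triggers an on-body
# action. All values are illustrative.
import numpy as np

# Hypothetical on-body targets, in a torso frame derived from the HMD
# (x: right, y: up, z: forward), in meters.
BODY_TARGETS = {
    "chest":     np.array([0.00, -0.30, 0.10]),
    "left_hip":  np.array([-0.15, -0.70, 0.05]),
    "right_hip": np.array([0.15, -0.70, 0.05]),
}
TOUCH_RADIUS = 0.12  # meters

def touched_target(hmd_pos, hmd_yaw_rad, controller_pos):
    """Return the name of the on-body target the controller is touching, if any."""
    c, s = np.cos(hmd_yaw_rad), np.sin(hmd_yaw_rad)
    yaw = np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])  # torso follows head yaw only
    for name, local in BODY_TARGETS.items():
        world = hmd_pos + yaw @ local
        if np.linalg.norm(controller_pos - world) < TOUCH_RADIUS:
            return name
    return None

# Example: right controller resting near the right hip.
print(touched_target(np.array([0.0, 1.7, 0.0]), 0.0, np.array([0.16, 1.0, 0.06])))
```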

Days 3 and 4

Demo ID: 1002

Manipulating Virtual World with Props in Real-world

Toshiro Kashiwagi, Kaoru Sumi

Future University Hakodate

Hubs Link: See in Hubs

Abstract: “We present a system that makes it possible to use familiar hand-held tools in Mixed Reality space.”


Demo ID: 1003

Design of Visual Deficit Simulation for Integration into a Geriatric Physical Diagnosis Course

Drew Alexander, Thuy Nguyen, Patrick Keller, Jason Orlosky, Shilpa Brown, Elena Astapova Wood, Onyeka Ezenwoye, Wanda Jirau-Rosaly

Augusta University

Hubs Link: See in Hubs

Abstract: “One major challenge in the field of medical education is teaching students to empathize with patients. To help address this need, we have designed a system in virtual reality (VR) that can simulate macular degeneration and that requires medical students to carry out a self-medication task from the perspective of a patient. The simulation includes a home environment, interactive medication bottles and pills, a reading requirement, a task list, and completion goals.”


Demo ID: 1004

AVoidX: An Augmented VR Game

Rafail Athanasoulas, Prodromos Boutis, Anargyros Chatzitofis, Alexandros Doumanoglou, Petros Drakoulis, Leonidas Saroglou, Vladimiros Sterzentsenko, Nikolaos Zioulis, Dimitrios Zarpalas, Petros Daras

Centre for Research and Technology Hellas

Hubs Link: See in Hubs

Abstract: “This work will demonstrate volumetric capture in an Augmented VR obstacle avoidance game.”


Demo ID: 1006

SImBa: An Augmented Reality approach for creating Smart Immersive Bays in Software Delivery Environments

Vibhu S. Sharma, Rohit Mehra, Vikrant Kaulgud, Sanjay Podder

Accenture Labs

Hubs Link: See in Hubs

Abstract: “In this paper, we present SImBa, a novel prototype implementation that transforms traditional software delivery bays into Smart Immersive Bays by virtually augmenting the physical space with software delivery insights (both textual and metaphorical). This allows the project manager to keep track of ongoing project progress and potential future issues while on the move, and to engage with the relevant stakeholders, armed with rich context, when most pertinent.”


Demo ID: 1007

Demonstrating COLIBRI VR, An Open-Source Toolkit to Render Real-World Scenes in Virtual Reality

Gregoire Dupont de Dinechin, Alexis Paljic

PSL University

Hubs Link: See in Hubs

Abstract: “This demonstration showcases an open-source toolkit we developed in the Unity game engine to enable authors to render real-world photographs with motion parallax and view-dependent reflections in virtual reality (VR). First, we illustrate the toolset’s capabilities by using it to display interactive, photorealistic renderings of a museum’s mineral collection. Then, we invite audience members to be rendered in VR using our toolkit, thus providing a live, behind-the-scenes look at the process.”


Demo ID: 1009

Virtual Reality for Post-Stroke Rehabilitation

Tobias A. Boyd, Eric Nahe, Brian A. Cohn, Roghayeh Barmaki

University of Delaware, Microsoft Research

Hubs Link: See in Hubs

Abstract: “Rehabilitation of the upper extremities after stroke helps patients regain important functions of their upper limbs. However, many patients report feelings of frustration, boredom, and powerlessness during traditional physical therapy. Our goals are to provide patients with therapy that engages their minds as well as their bodies and to provide physicians with an objective tool with which to evaluate patients’ progress.”


Demo ID: 1010

TEllipsoid: Ellipsoidal Display for Videoconference System Transmitting Accurate Gaze Direction

Taro Ichii, Hironori Mitake, Shoichi Hasegawa

Tokyo Institute of Technology

Hubs Link: See in Hubs

Abstract: “We propose “TEllipsoid”, an ellipsoidal display for videoconference systems that can realize not only accurate eye gaze transmission but also practicality in conferences, namely convenience and the identity of the displayed face. The display consists of an ellipsoidal screen, a small projector, and a convex mirror, where the bottom-installed projector projects the facial image of a remote participant onto the screen via the convex mirror.”


Demo ID: 1011

MeteorologyAR: A Mobile AR App to Increase Student Engagement and Promote Active Learning in a Large Lecture Class

Scottie D Murrell, Fang Wang, Eric M. Aldrich, Xinhao Xu

University of Missouri

Hubs Link: See in Hubs

Abstract: “A study is presented using a mobile AR App to enhance large lecture classes. The App was tested in an Introductory Meteorology class. A survey was administered to collect feedback on students’ attitudes, engagement and satisfaction level, app performance, and other user experiences. Responses from 95 students were received and analyzed. The survey results indicate that the majority of students found the App engaging and satisfactory and considered it a promising learning tool.”


Demo ID: 1016

Modified Playback of Avatar Clip Sequences Based on Student Attention in Educational VR

Adil Khokhar, Christoph W. Borst, Andrew Yoshimura

University of Louisiana at Lafayette

Hubs Link: See in Hubs

Abstract: “We present a system that sequences teacher avatar clips in response to eye tracking in order to investigate the subjective suitability of avatar responses to student misunderstandings or inattention. Three different avatar behaviors are demonstrated to allow a teacher pedagogical agent to respond more appropriately to student attention or distraction monitored by eye tracking. We use an in-game mobile device that monitors rig conditions as a control mechanism for two levels of distraction.”
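
The clip-sequencing logic can be sketched as a selection rule driven by how long the student's gaze has been off the lecture target. Clip names, thresholds, and the mapping to the two distraction levels below are illustrative assumptions, not the authors' design.

```python
# Minimal sketch of attention-driven clip selection: if eye tracking shows the
# student has looked away from the lecture target for too long, the teacher
# avatar's next clip is a redirection rather than the next lecture segment.
# All names and thresholds are illustrative.
MILD_DISTRACTION_S = 2.0     # seconds of off-target gaze
STRONG_DISTRACTION_S = 5.0

def next_clip(lecture_queue, seconds_off_target):
    """Choose the avatar clip to play next based on gaze-derived attention."""
    if seconds_off_target >= STRONG_DISTRACTION_S:
        return "pause_and_call_attention"       # stronger intervention clip
    if seconds_off_target >= MILD_DISTRACTION_S:
        return "glance_and_gesture_at_content"  # subtle redirection clip
    return lecture_queue.pop(0) if lecture_queue else "idle"

# Example: the student has looked away for 3 seconds.
queue = ["intro", "slide_1", "slide_2"]
print(next_clip(queue, 3.0))   # -> "glance_and_gesture_at_content"
```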