IEEE VR 2019, Osaka, March 23rd - 27th

Sponsors


Diamond

Osaka International Convention Center

Platinum

Mercari
OSAKA CONVENTION & TOURISM BUREAU

Gold


DELL

Microsoft

Tateishi Science and Technology Foundation

The Telecommunications Advancement Foundation

Silver


DAQRI
Osaka Electro-Communication University
Solidray

Bronze

BARCO
Huawei Japan
Knowledge Service Network

Flower

Archivetips
KYOHRITSU
SoftCube
Sumitomo Electric Industries

Exhibitors

Advanced Realtime Tracking (ART)
Archivetips
Computer Network Information Center, Chinese Academy of Sciences
Crescent
DELL
Fujitsu

Microsoft
PhaseSpace
PoSTMEDIA
QD Laser
SenseTime Japan
West Unitis

Supporters


IEEE Kansai Section

Society for Information Display Japan Chapter

VR Consortium

The Institute of Systems, Control and Information Engineers

Human Interface Society

The Japanese Society for Artificial Intelligence

The Visualization Society of Japan

Information Processing Society of Japan

The Robotics Society of Japan

Japan Society for Graphic Science

The Japan Society of Mechanical Engineers

Japanese Society for Medical and Biological Engineering

The Institute of Image Information and Television Engineers

The Society of Instrument and Control Engineers

The Institute of Electronics, Information and Communication Engineers

The Institute of Electrical Engineers of Japan

The Society for Art and Science

Japan Ergonomics Society

The Japanese Society of Medical Imaging

Tutorials

The following tutorials will be held at IEEE Virtual Reality 2019:


(AR) Hack Our Material Perception in Spatial Augmented Reality

Organizers:

  • Toshiyuki Amano, Wakayama University, Japan
  • Daisuke Iwai, Osaka University, Japan
  • Keita Hirai, Chiba University, Japan
  • Takahiro Kawabe, NTT, Japan
  • Katsunori Okajima, Yokohama National University, Japan
  • Yoshihiro Watanabe, Tokyo Institute of Technology, Japan

Abstract: Spatial augmented reality (SAR), a.k.a. projection mapping, alters the appearance of a real surface by visually overlaying computer-generated images onto it. Compared to other AR approaches, which rely on either video or optical see-through displays, SAR has an important advantage: no display device blocks the observer's view, so observers see the augmentation directly on the surfaces, with a wide field of view and natural 3D cues, and without any physical constraints on their bodies. The ultimate goal of SAR is “believably manipulating the material properties of real-world surfaces.” To this end, previous efforts solved fundamental technical problems such as geometric registration and color correction, enabling desired images to be displayed on non-planar and textured surfaces. However, these solutions work only in limited situations: the surface typically must be static, only view-independent material properties can be manipulated, and so on.
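To make the color-correction problem concrete, here is a minimal per-pixel radiometric compensation sketch in Python. It is our illustration, not the tutorial's material, and it assumes a simple linear light-transport model (camera = reflectance * projector + ambient); the function and variable names are hypothetical.

    import numpy as np

    def compensation_image(target, reflectance, ambient, eps=1e-3):
        """Solve camera = reflectance * projector + ambient for the projector image.

        target, reflectance, and ambient are float RGB arrays in [0, 1].
        reflectance can be estimated by projecting a white image onto the
        surface and capturing the result with a calibrated camera.
        """
        # Invert the linear model per pixel; eps guards against division
        # by near-zero reflectance (e.g., very dark surface regions).
        proj = (target - ambient) / np.maximum(reflectance, eps)
        # Clamp to the projector's displayable range; values outside [0, 1]
        # are appearances the hardware cannot physically reproduce.
        return np.clip(proj, 0.0, 1.0)

Real systems must additionally handle geometric registration between projector and camera pixels, color mixing across channels, and the limited projector dynamic range that makes the final clamp lossy.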

In the 2010s, Japanese researchers from multidisciplinary fields including computer science, psychophysics, and neuroscience tackled challenging technical issues to relax these limitations, supported by two MEXT Grants-in-Aid for Scientific Research on Innovative Areas: “Brain and Information Science on SHITSUKAN” and “Understanding human recognition of material properties for innovation in SHITSUKAN science and technology”. By adopting advanced technologies such as high-speed imaging and digital fabrication, they achieved technical innovations that enable projection mapping onto dynamic and even deformable objects, view-dependent material property manipulation, and faithful appearance control in the spectral domain. Beyond relaxing these previously recognized limitations, they have also opened up new SAR research directions, such as shape manipulation of real surfaces and material property manipulation in modalities other than vision. In this tutorial, we would like to share these advanced technologies with the audience. (SHITSUKAN is a Japanese word meaning the perceptual qualities of a material.)


(EYETRACKING) Eye-tracking in 360: Methods, Challenges, and Opportunities

Organizers:

  • Olivier Le Meur, IRISA, University of Rennes, France
  • Eakta Jain, University of Florida, USA

Abstract: As eye-trackers are built into commodity head-mounted displays, applications such as gaze-based interaction are poised to enter the mainstream. Gaze is a natural indicator of what the user is interested in. Eye-tracking in virtual environments offers the opportunity to study human behavior in simulated settings, both to create realistic virtual avatars and to learn models of saliency that apply to a three-dimensional scene. Research findings, such as the consistency in where people look in images and videos, and biases in two-dimensional eye-tracking (e.g., center bias), will need to be replicated and/or rediscovered in VR (e.g., equator bias as a generalization of center bias). These are only a few examples of the rich lines of inquiry waiting to be explored by VR researchers and practitioners who have a working knowledge of eye-tracking.
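As one hypothetical illustration of how 2D analysis carries over to 360 content: gaze analysis in VR typically begins by mapping a 3D gaze direction onto equirectangular image coordinates, the space in which effects like the equator bias are measured. A minimal Python sketch, assuming a headset frame with +z forward, +y up, and +x right (the convention is ours, not the tutorial's):

    import numpy as np

    def gaze_to_equirect(gaze_dir, width, height):
        """Map a 3D gaze direction to pixel coordinates in an equirectangular image.

        Assumed convention: +z forward, +y up, +x right; longitude 0
        (straight ahead) maps to the horizontal center of the image.
        """
        d = np.asarray(gaze_dir, dtype=float)
        d /= np.linalg.norm(d)               # normalize to a unit direction
        lon = np.arctan2(d[0], d[2])         # longitude in [-pi, pi]
        lat = np.arcsin(d[1])                # latitude in [-pi/2, pi/2], 0 = equator
        u = (lon / (2 * np.pi) + 0.5) * width
        v = (0.5 - lat / np.pi) * height     # row 0 is the top of the image
        return u, v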

In this tutorial, we will cover three topic areas:

  • The human vision system, the eyes, and models/parameters that are relevant to virtual reality
  • Eye-tracking technology and methods for collecting reliable data from human participants
  • Methods to generate heat maps from eye-tracking data (a minimal sketch follows this list)
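As a taste of that third topic, the sketch below is a rough Python analogue of heat-map generation (our illustration; the tutorial's own materials use MATLAB): fixations are accumulated into a 2D histogram and smoothed with a Gaussian whose spread loosely approximates foveal extent.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def fixation_heatmap(fixations, width, height, sigma_px=30):
        """Accumulate (x, y) fixation points into a smoothed, normalized heat map.

        sigma_px is an assumed smoothing radius in pixels; in practice it
        is chosen from the eye-tracker's accuracy and viewing geometry.
        """
        heat = np.zeros((height, width), dtype=np.float64)
        for x, y in fixations:
            xi, yi = int(round(x)), int(round(y))
            if 0 <= xi < width and 0 <= yi < height:
                heat[yi, xi] += 1.0          # one count per fixation
        heat = gaussian_filter(heat, sigma=sigma_px)
        if heat.max() > 0:
            heat /= heat.max()               # normalize to [0, 1] for display
        return heat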

Intended Audience: The tutorial will be of interest to students, faculty, and researchers who want to quantify user priorities and preferences from eye-tracking data, develop gaze-based interaction techniques, or apply eye-tracking data to generating virtual avatars.

Expected Value for Audience: Eye-trackers are now being built into commodity VR headsets (e.g., the FOVE headset, and Tobii eye-trackers integrated into HTC Vive headsets). As a result, researchers and practitioners of VR must quickly develop a working understanding of eye-tracking. Audience members can expect to leave this tutorial with the following:

  • A basic understanding of the eye and the human visual system, with a focus on the parameters that are relevant to eye-tracking in VR
  • An understanding of methods for collecting eye-tracking data, including sample protocols and pitfalls to avoid
  • A discussion of methods to generate saliency maps from eye-tracking data, including pseudocode and MATLAB implementations


(STATISTICS) The Replication Crisis in Empirical Science: Implications for Human Subject Research in Mixed Reality

Organizer:

  • J. Edward Swan II, Mississippi State University, USA

Abstract: TBA


(UNITY) Unity3D for Behavioural Experiments: A Tutorial for Intermediate Users

Organizer:

  • Nikolaos Katzakis, University of Hamburg, Germany

Abstract: TBA

Contacts

For more information, to inquire about a particular tutorial topic, or to submit a proposal, please contact the Tutorials Chairs:

  • Tabitha Peck, Davidson College, USA
  • Stephan Lukosch, Delft University of Technology, The Netherlands
  • Masataka Imura, Kwansei Gakuin University, Japan

tutorials2019 [at] ieeevr.org