
Improving VR/AR Experiences by Understanding the Human Visual System - Livestream

Poster

Virtual and augmented reality (VR/AR) wearable displays strive to provide perceptually realistic user experiences while constrained by the limited compute budgets, hardware, and transmission bandwidths of wearable computing systems. This presentation describes two ways in which a greater understanding of the human visual system can assist in achieving this goal. The first shows how studying the anatomy of the eye reveals inaccuracies in how disparity depth cues are currently rendered, causing objects to appear closer than intended or, in the case of AR, poorly aligned with target objects in the physical world. This can be corrected with gaze-contingent stereo rendering, enabled by eye-tracking. The second derives a spatio-temporal model of the visual system that describes the gamut of visible signals for a given eccentricity and display luminance. This model could enable future foveated graphics techniques with more than 7x the bandwidth savings of those available today.

Speaker: Brooke Krajancich, Stanford University

Register at weblink to receive connection information

Wednesday, 04/19/23


Cost:

Free


SF Bay Association of Computing Machinery