Shared Reality

The Shared Reality Project investigates remote collaboration ecosystems that are spatial, immersive and widely accessible. The DualStream augmented reality prototype enables users to spatially share information about themselves and their surroundings in real time. Going beyond traditional audio and video conferencing tools, DualStream can convey three-dimensional information about facial expressions, creating avatars that look and move as we do in real life. Leveraging the mobility and ubiquity of phones, DualStream enables users to simultaneously feel that they are “being there” in a remote location and that remote participants are “being here” in their local environment. DualStream envisions spatial computing that is more widely accessible than experiences that rely on expensive head-worn devices. By building cross-reality ecosystems with stronger connections across mobile devices, PCs and immersive setups, we can better support collaboration between people no matter where they are located or what tools they have access to.
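
To make the idea of “dual streams” concrete, the sketch below shows one possible message schema for the two kinds of spatial data described above: a self stream (device pose and facial-expression coefficients) and a surroundings stream (sampled 3D points from the local scene). This is a minimal illustrative sketch; the class names, fields and serialization are assumptions for explanation and are not taken from the DualStream implementation.

```python
# Hypothetical sketch only: names and fields are illustrative assumptions,
# not the actual DualStream data model.
import json
import time
from dataclasses import dataclass, asdict, field
from typing import List


@dataclass
class SelfStreamFrame:
    """Spatial information about the user: device pose and expression weights."""
    user_id: str
    position: List[float]      # x, y, z in a shared world frame (meters)
    rotation: List[float]      # orientation as a quaternion (x, y, z, w)
    blendshapes: List[float]   # facial-expression coefficients from face tracking
    timestamp: float = field(default_factory=time.time)


@dataclass
class SurroundingsStreamFrame:
    """Spatial information about the user's environment."""
    user_id: str
    point_cloud: List[List[float]]  # sparse 3D points sampled from the local scene
    timestamp: float = field(default_factory=time.time)


def encode(frame) -> bytes:
    """Serialize a frame for transport over any real-time channel."""
    return json.dumps(asdict(frame)).encode("utf-8")


# Example: one tick of the two parallel streams a mobile client might publish.
self_frame = SelfStreamFrame(
    user_id="alice",
    position=[0.0, 1.6, 0.0],
    rotation=[0.0, 0.0, 0.0, 1.0],
    blendshapes=[0.1] * 52,
)
surroundings_frame = SurroundingsStreamFrame(
    user_id="alice",
    point_cloud=[[0.5, 0.0, -1.2], [0.7, 0.1, -1.0]],
)
payloads = [encode(self_frame), encode(surroundings_frame)]
```

Keeping the two streams as separate, lightweight messages is one way a receiving client on a phone, PC or headset could subscribe only to the data it can render, which is in the spirit of the cross-reality ecosystems described above.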

ACME Lab

Infographic: DualStream system overview

Associated Researchers

Publications

Rishi Vanukuru and Ellen Yi-Luen Do. 2024. "". In: Proceedings of the 2024 ACM International Conference on Interactive Media Experiences (IMX'24). (Stockholm, Sweden, June 12-14, 2024).

Rishi Vanukuru and Ellen Yi-Luen Do. 2023. "". In: IEEE ISMAR 2023 Workshop on Cross-Reality Interactions. (Sydney, Australia, October 16-20, 2023).

Rishi Vanukuru, Suibi Che-Chuan Weng, Krithik Zanjan, Torin Hopkins, Amy Banic, Mark D. Gross and Ellen Yi-Luen Do. 2023. "". In: IEEE International Symposium on Mixed and Augmented Reality (ISMAR). (Sydney, Australia, October 16-20, 2023).