CRAIVE-Lab

Project Description

This project develops a specialized virtual-reality (VR) system that introduces a new concept for an immersive virtual environment optimized for collaborative group activities. The system addresses the auditory and visual senses with equal priority, an approach currently championed only by head-mounted display systems. Unlike the latter, this system also enables direct communication between participants without the need for telecommunication devices. The system's haptic floor display can recreate floor vibrations (e.g., to simulate concert stages) with an accuracy currently achieved only by much smaller displays (e.g., car simulators).

The work addresses the need for a specialized virtual-reality (VR) system for studying and enabling communication-driven tasks with groups of users who are immersed in a high-fidelity multi-modal environment while sharing the same physical space. While current multi-modal VR systems have achieved a high degree of realism, they focus either on immersing a single user or a very small group of users, or on presenting material to a larger group of users in a cinema-type environment. In both cases, the systems provide homogeneous visual and acoustic fields. For group communication tasks, inhomogeneous fields that provide personalized visual and acoustic perspectives for each user could provide better access to relevant information from the VR system's display and, at the same time, increase the experiential degree of presence and perceived realism for interactive tasks.

The project addresses the technical hurdles that need to be surmounted to establish a large-scale (18 m × 12 m × 4.3 m), multi-user, multi-perspective, multi-modal display. For the visual domain, multiple point-of-convergence rendering techniques will be used to (re-)create scenes on a 7-projector display. For the acoustic domain, a 192-loudspeaker-channel system will be designed for Wave-Field Synthesis (WFS), supported by Higher-Order Ambisonics (HOA) sound projection, to render inhomogeneous acoustic fields. A haptic display, consisting of 16 platform elements, will be used to simulate floor vibrations and to provide infrastructure for other vibrating objects (e.g., handheld devices). An intelligent position-tracking system estimates current user positions and head orientations as well as positioning data for other objects. For the tracking system, a hybrid visual/acoustic sensor system will be used to emulate the human ability to extract robust information by relying on multiple modalities.
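To illustrate the kind of per-loudspeaker computation involved in rendering a virtual sound source on a large array, the sketch below shows a heavily simplified, delay-and-gain style WFS driving rule for a point source behind one linear segment of loudspeakers. The array geometry, channel count, and sample rate in this example are illustrative assumptions, not CRAIVE-Lab specifications, and the real system would use the full WFS/HOA driving functions rather than this minimal form.

```python
import numpy as np

# Simplified, illustrative WFS-style rendering of a virtual point source on one
# linear loudspeaker segment. Geometry and sample rate are assumptions for the
# sketch only, not CRAIVE-Lab parameters.

C = 343.0    # speed of sound in air, m/s
FS = 48000   # audio sample rate, Hz

# 32 loudspeakers spaced 0.15 m apart along the x-axis (one wall segment)
speaker_x = np.arange(32) * 0.15
speakers = np.stack([speaker_x, np.zeros_like(speaker_x)], axis=1)  # shape (32, 2)

def wfs_delays_gains(source_xy):
    """Per-loudspeaker delays (in samples) and gains for a virtual point source.

    Simplified rule: delay proportional to the source-to-loudspeaker distance,
    gain falling off as 1/sqrt(distance).
    """
    source = np.asarray(source_xy, dtype=float)
    dist = np.linalg.norm(speakers - source, axis=1)      # metres
    delays = np.round(dist / C * FS).astype(int)          # samples
    gains = 1.0 / np.sqrt(np.maximum(dist, 1e-3))         # avoid division by zero
    gains /= gains.max()                                   # normalize loudest channel to 1
    return delays, gains

def render(mono, source_xy):
    """Delay-and-gain render of a mono signal onto all loudspeaker channels."""
    delays, gains = wfs_delays_gains(source_xy)
    out = np.zeros((len(mono) + delays.max(), len(speakers)))
    for ch, (d, g) in enumerate(zip(delays, gains)):
        out[d:d + len(mono), ch] = g * mono
    return out

# Example: a 1 kHz tone placed 2 m behind the array, offset 1 m to one side
t = np.arange(FS) / FS
channels = render(np.sin(2 * np.pi * 1000 * t), source_xy=(1.0, -2.0))
print(channels.shape)  # (num_samples, 32)
```

In the full system, such per-channel delay/gain (or filter) computations would be updated as the tracked listener and source positions change, which is where the position-tracking subsystem feeds into the audio rendering chain.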

Broader Impacts:
With new tools to study human perception, the system will enable research exploring new multi-modal, multi-user data displays that strategically utilize the human ability to integrate cross-modal sensory information. It will also serve as a platform to study interactions between humans and humanoid intelligent systems that can simulate environments at different degrees of complexity. The instrumentation supports research that includes the development of new interfaces for people with disabilities. The CRAIVE system will be an integrative facility that enables three school-wide centers, the Center for Cognition, Communication, and Culture (CCC), the CCNI supercomputer, and the Experimental Media and Performing Arts Center (EMPAC), to reach their full potential for collaborative research in this area, and it will serve as a training platform for students who will engage in the enabled research areas.