Published: Feb 03, 2017
Updated: Dec 13, 2017

Cross-Platform VR-AR: Developing a multi-user virtual and augmented reality experience

Author: John Sunda Hsia

Guest post co-authored by Jarrell Pair, Senior Manager, Product Concepting, AT&T Entertainment Group and Sam Joly, AR/VR Developer


For the July 2016 AT&T SHAPE Tech Expo, we developed a cross-platform prototype illustrating how networked entertainment, social interaction, and collaborative work environments can be built using different types of AR and VR devices.


The prototype was designed to allow three users to interact simultaneously. One person used a position-tracked VR headset with handheld controllers. The second user experienced the virtual world through a spatially aware AR tablet. The third user controlled a desktop PC-based 3D workspace application. Users could interact by picking up and moving objects, drawing in space, and talking to each other. Three playfully designed flying TV sets served as user avatars.
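The prototype's actual implementation is not shown here, but the key design idea, a common set of interactions (grab, move, draw) that every device supports while only the input method differs, can be sketched in Python. All class and identifier names below are hypothetical and purely illustrative:

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass

@dataclass
class Vec3:
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0

class InteractionClient(ABC):
    """Interaction capabilities shared by every device in the 3D workspace."""

    def __init__(self):
        self.held = None  # id of the object this user is currently holding

    @abstractmethod
    def device_name(self) -> str:
        """Identifies the device type, e.g. for labeling the user's avatar."""

    def grab(self, object_id: str) -> None:
        # Grabbing behaves the same on every device; only the input that
        # triggers it (controller trigger, touchscreen button, mouse click)
        # differs per subclass.
        self.held = object_id

    def release_at(self, position: Vec3):
        # Returns the (object, position) pair that a networking layer would
        # broadcast so the other two users see the object move.
        obj, self.held = self.held, None
        return obj, position

class HeadsetClient(InteractionClient):
    def device_name(self) -> str:
        return "vr-headset"

class TabletClient(InteractionClient):
    def device_name(self) -> str:
        return "ar-tablet"

class DesktopClient(InteractionClient):
    def device_name(self) -> str:
        return "desktop-pc"
```

Centralizing the shared behaviors in one base class is what lets a single virtual world serve three very different devices at once.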


For the prototype, we developed a 3D virtual space consisting of a circular room, anchored by a platform and interactive pedestal in the center. Small cubes were scattered throughout the environment. The cubes could be placed on the pedestal to trigger a variety of content experiences that would appear on the platform.


Figure 1: Shared 3D environment displayed alongside cross-platform AR/VR devices – spatially aware AR tablet, position-tracked VR headset with handheld controllers, and desktop PC with gamepad or mouse/keyboard control.


Each cube represented a different AR/VR application area such as archaeology, mechanical engineering, art, and video entertainment. For example, if the archaeology cube was placed on the pedestal, a 3D dinosaur skeleton would appear on the platform. One of the entertainment cubes could transform the space into an outdoor movie theater on a building rooftop.
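The cube-on-pedestal mechanic amounts to a simple lookup: each cube type maps to the content it triggers, and placing or removing a cube swaps what is shown on the platform. This Python sketch is hypothetical (the cube and content names are invented for illustration; the real prototype's internals are not shown here):

```python
# Hypothetical mapping from cube type to the content experience it triggers.
CUBE_CONTENT = {
    "archaeology": "dinosaur_skeleton",
    "engineering": "animated_engine_model",
    "art": "gallery_scene",
    "entertainment": "rooftop_movie_theater",
}

class Pedestal:
    """Central pedestal: placing a cube swaps the content on the platform."""

    def __init__(self):
        self.active_content = None  # what the platform is currently showing

    def place_cube(self, cube_type: str):
        # An unrecognized cube simply triggers nothing (returns None).
        self.active_content = CUBE_CONTENT.get(cube_type)
        return self.active_content

    def remove_cube(self) -> None:
        self.active_content = None
```

Because the trigger logic lives in shared state rather than on any one device, all three users see the same content appear the moment a cube lands on the pedestal.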


Figure 2: Cube triggers an animated 3D engine model, pointing to AR/VR’s collaborative engineering applications.


Figure 3: The archaeology cube makes a dinosaur skeleton appear in the shared 3D workspace.


Figure 4: The shared 3D space is transformed into a rooftop outdoor movie theater.


Figure 5: A volumetrically captured actor wearing a spacesuit appears (for more information on volumetric video, see this post).


Each device presented a different level of immersion while offering a common set of interaction capabilities. For the VR headset, we used the HTC Vive. With its room-scale tracking capabilities and wide field of view, it maximized the user’s sense of presence or “being there.” The handheld controllers provided an intuitive way of working in the virtual space, allowing the user to walk, grab, and move objects as they would in the real world.


For the AR tablet, we used the Google Tango development kit. Its motion tracking and depth perception technology gave users the ability to move and interact within the virtual space by using touchscreen buttons to grab or draw while walking. The 1:1 scale mapping between the virtual and real worlds supported seamless movement and manipulation in the shared 3D space. The AR tablet provided a way to fully interact with the virtual environment without completely disengaging from the real world. We also implemented a video see-through AR mode for the tablet that allowed the platform and its triggered content to be placed anywhere in the real environment. For further information on the Tango platform see this post.
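At 1:1 scale, one real meter equals one virtual meter, so mapping a tracked device position into shared-workspace coordinates reduces to a translation relative to where the workspace was anchored at startup. The sketch below (hypothetical function name, translation only, rotation omitted for brevity) illustrates the idea rather than the prototype's actual math:

```python
def device_to_workspace(device_pos, anchor_pos):
    """Map a tracked device position (meters, relative to where tracking
    started) into shared-workspace coordinates. With 1:1 scale this is a
    pure translation by the anchor point; rotation is omitted for brevity."""
    return tuple(d - a for d, a in zip(device_pos, anchor_pos))
```

For example, a tablet standing two meters in front of a workspace anchored one meter ahead of the tracking origin appears one virtual meter from the workspace center, which is what makes physically walking around the room feel seamless in the shared space.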

Figure 6: Tango tablet app’s video see-through AR mode places a virtual TV in the real world.


Our desktop PC client was developed to show how this traditional computing platform can be integrated into AR/VR experiences. Though the PC’s degree of visual immersion was limited, it delivered a broadly accessible and highly functional way to participate in the shared virtual environment. The mouse/keyboard or gamepad control options provided an interaction and navigation scheme familiar to console and PC game players. A key advantage of the desktop PC client was that it let the user easily multitask, interacting in the virtual world while also using web browsers, spreadsheets, and other applications outside the shared 3D workspace window.


Which Experience Was the Favorite?

During the AT&T SHAPE Tech Expo, the prototype was used by visitors with a variety of backgrounds including school-age children, creative artists, software developers, and hardware engineers. Their reactions were overwhelmingly positive and quite diverse. The drawing tool was extremely popular as it allowed the three users to add comments or highlight different aspects of the virtual environment. Many enjoyed using the space as a playground where they could interact, leave messages for each other, and create pictures. Drawings of stars and dinosaurs were popular! Other users were very interested in learning how the demonstration was developed and how the devices functioned.


We observed that an individual’s comfort in using one device versus another varied widely, which highlighted an advantage of developing AR/VR experiences that support multiple platforms.

Figure 7: Three users interacting in the shared 3D workspace using different devices.


Currently, developers have an array of augmented and virtual reality platforms to choose from. In the years ahead, the number of AR and VR devices on the market will continue to increase, along with improvements in their capabilities and performance. Many developers will face the challenge of building AR/VR experiences that can be used across multiple device types without being bound to a particular platform’s capabilities and market penetration. Our prototype of a shared 3D workspace application provides a compelling design approach for developers that can accommodate established platforms such as the PC as well as the AR/VR devices of both today and tomorrow.
