VR Multi User Experience

Our own research and testing confirmed a well-known finding in Virtual Reality development: a shared VR experience is far more engaging when several users take part at the same time.

Using our NATS Virtual Lab as a base, we expanded the experience so that multiple VR users can view and interact with the content at once.

The scaled-down model of Heathrow Airport has been moved to the centre of the room and is now powered by real-time data. Using systems provided by NATS, we can show each plane in relation to the airport, so you can actually see when a real plane is landing or taking off!
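One way to place live aircraft on a scaled-down model is to convert each plane's real-world latitude, longitude, and altitude into metre offsets from the airport, then multiply by the model's scale factor. The sketch below is a minimal illustration of that idea, not the actual NATS pipeline: the reference coordinates, the 1:2000 scale, and the function name are all assumptions, and it uses a simple equirectangular approximation that is adequate over the short distances around a single airport.

```python
import math

# Hypothetical reference point: approximate coordinates of Heathrow Airport.
HEATHROW_LAT = 51.4700
HEATHROW_LON = -0.4543
EARTH_RADIUS_M = 6_371_000  # mean Earth radius in metres

def to_model_coords(lat, lon, alt_m, model_scale=1 / 2000):
    """Map a real-world aircraft position onto the scaled-down model.

    Returns (x, y, z) in model-space metres, where x points east,
    y points up, and z points north from the airport reference point.
    """
    lat0 = math.radians(HEATHROW_LAT)
    # Metre offsets from the reference point (equirectangular approximation).
    east = math.radians(lon - HEATHROW_LON) * EARTH_RADIUS_M * math.cos(lat0)
    north = math.radians(lat - HEATHROW_LAT) * EARTH_RADIUS_M
    # Scale everything down to model size.
    return (east * model_scale, alt_m * model_scale, north * model_scale)
```

A plane directly over the reference point at ground level maps to the model's origin; a plane a kilometre north maps to half a metre north on a 1:2000 model.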

This lifelike 3D view, not previously available, could provide valuable insight, allowing users to optimise routes and patterns in the flight data!

Both users can also interact with the model of Heathrow Airport. They can select any of the buildings with their laser pointers to find out which building it is and what role it plays in the airport. The resulting information is shown to both users, so they can discuss what they are seeing together.
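Keeping a selection visible to every user means replicating one piece of shared state and notifying each connected client when it changes. The sketch below illustrates that pattern only in outline; the class and method names are hypothetical, and a local observer pattern stands in for the network replication a real multi-user VR engine would provide.

```python
class SharedSelection:
    """Minimal sketch of shared selection state for a multi-user scene.

    In a real networked experience, select() would be replicated to
    every client; here, locally registered callbacks stand in for
    each user's view of the model.
    """

    def __init__(self):
        self._listeners = []
        self.selected_building = None

    def subscribe(self, callback):
        # Each user's view registers a callback to be told about selections.
        self._listeners.append(callback)

    def select(self, user, building):
        # Record the new selection and notify every connected user's view.
        self.selected_building = building
        for notify in self._listeners:
            notify(user, building)
```

Because every view reacts to the same state change, both users see the same building information at the same moment, which is what makes the shared discussion possible.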