“You can actually feel like you’re in the same place”: Microsoft Mesh powers shared experiences in mixed reality

For years, Cirque du Soleil co-founder Guy Laliberté received countless proposals for virtual reality technologies, but they couldn’t match the magic of his intensely visual and mesmerizing live performances. Now, with a new platform provided by Microsoft, he’s rethinking that.

On Tuesday, he appeared at Microsoft’s Ignite digital conference via holoportation, which uses 3D capture technology to beam a lifelike image of a person into a virtual scene. In the company’s first keynote experience designed entirely for mixed reality, people attending the conference from living rooms and home offices around the world could experience the show as avatars watching events unfold in a shared holographic world.

It was the company’s first opportunity to showcase some of the experiences made possible by Microsoft Mesh, a new mixed-reality platform powered by Azure that allows people in different physical locations to join collaborative and shared holographic experiences on many kinds of devices.

“This has been the dream for mixed reality, the idea from the very beginning,” said Microsoft Technical Fellow Alex Kipman. “You can actually feel like you’re in the same place with someone sharing content or you can teleport from different mixed reality devices and be present with people even when you’re not physically together.”

Kipman appeared on the Ignite virtual stage as a fully realized holoportation of himself, rendered as rays of light that simulated his physical body, narrating the show’s opening experience in real time.

James Cameron, the filmmaker and ocean explorer, and John Hanke, CEO and founder of leading augmented reality company Niantic, Inc., also joined Kipman remotely to spotlight how Microsoft Mesh is helping them create shared experiences across the virtual and physical worlds.

Laliberté chatted with Kipman about a new collaboration to help Lune Rouge, another company Laliberté founded, realize a project called Hanai World. It’s a social mixed reality platform he has thought about for years, one that would connect live and digital entertainment experiences into single events, but only now have technologies like Microsoft Mesh caught up with that vision.

Microsoft Mesh will also enable geographically distributed teams to have more collaborative meetings, conduct virtual design sessions, assist others, learn together and host virtual social meetups. People will initially be able to express themselves as avatars in these shared virtual experiences and over time use holoportation to project themselves as their most lifelike, photorealistic selves, the company said.

The new platform is the result of years of Microsoft research and development in areas ranging from hand and eye tracking and HoloLens development to persistent holograms and artificial intelligence models that can generate expressive avatars.

Built on Azure, Microsoft’s cloud computing platform, Microsoft Mesh also benefits from Azure’s enterprise-grade security and privacy features, as well as its vast computational resources, data, AI and mixed reality services.

“More and more we are building value in our intelligent cloud, which is Azure,” Kipman said. “In these collaborative experiences, the content is not inside my device or inside my application. The holographic content is in the cloud, and I just need the special lenses that allow me to see it.”

With Microsoft Mesh-enabled applications, designers or engineers who work with 3D physical models — anything from bicycles to high-end furniture to jet engines to new sports stadiums — could appear as themselves in a shared virtual space to collaborate and iterate on holographic models, regardless of their physical location.

Image: Avatars appear around a three-dimensional hologram of car schematics, illustrating a virtual design review session in mixed reality. Microsoft Mesh, a new mixed reality platform, will allow geographically distributed teams to meet and collaborate in shared mixed reality sessions where participants appear as digital representations of themselves. Image by Microsoft.

Architects and engineers could physically walk through a holographic model of a factory floor under construction, seeing how all the pieces of equipment fit together in three dimensions, potentially avoiding costly mistakes.

Engineering or medical students learning about electric car engines or human anatomy could gather as avatars around a holographic model and remove parts of the engine or peel back muscles to see what’s underneath. Colleagues could simply get together and chat in a shared virtual space, or companies could use Microsoft Mesh-enabled apps to offer virtual all-hands meetings or trainings to employees around the world.

In the coming months, the Microsoft Mesh platform will offer developers a full suite of AI-powered tools for avatars, session management, spatial rendering, synchronization across multiple users and holoportation, so they can build collaborative solutions in mixed reality, the company said.
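The article doesn’t include developer documentation, but the kind of multi-user synchronization it describes generally works like this: each device reports its user’s changes to a shared session in the cloud, and the session fans those changes out so every participant renders the same hologram state. The sketch below is a minimal, hypothetical illustration of that pattern; the `SharedSession`, `Participant` and `Pose` names are invented for this example and are not the Microsoft Mesh SDK.

```typescript
// Hypothetical sketch of multi-user state synchronization for a shared hologram.
// Names (SharedSession, Participant, Pose) are invented for illustration only;
// this is not the Microsoft Mesh SDK.

type Pose = { x: number; y: number; z: number; rotationY: number };

interface Participant {
  id: string;
  // Called whenever the shared hologram's pose changes, so this device can re-render it.
  onPoseChanged(pose: Pose): void;
}

class SharedSession {
  private pose: Pose = { x: 0, y: 0, z: 0, rotationY: 0 };
  private participants: Participant[] = [];

  join(p: Participant): void {
    this.participants.push(p);
    p.onPoseChanged(this.pose); // New joiners immediately see the current state.
  }

  // Any participant can move the hologram; the session relays the update to everyone else.
  updatePose(senderId: string, pose: Pose): void {
    this.pose = pose;
    for (const p of this.participants) {
      if (p.id !== senderId) p.onPoseChanged(pose);
    }
  }
}

// Usage: two "devices" join the same session and stay in sync.
const session = new SharedSession();

const makeClient = (id: string): Participant => ({
  id,
  onPoseChanged: (pose) => console.log(`${id} renders hologram at`, pose),
});

session.join(makeClient("hololens-user"));
session.join(makeClient("tablet-user"));

// The HoloLens user rotates the model; the tablet user sees the same change.
session.updatePose("hololens-user", { x: 0, y: 1.2, z: 0.5, rotationY: 90 });
```

In a real platform the session state would live in the cloud rather than in one process, which is what allows HoloLens headsets, VR headsets, phones and PCs to all render the same holographic content.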

Though users will have the richest experiences in mixed or virtual reality, Microsoft Mesh’s open standards will give developers the freedom to build solutions that will work across many different devices: HoloLens 2, a range of virtual reality headsets, smartphones, tablets and PCs.

At Ignite, Microsoft announced two apps built on the Microsoft Mesh platform.

Those include a preview of the Microsoft Mesh app for HoloLens, which allows team members to remotely collaborate and is available for download. Customers can also request access to a new version of Mesh-enabled AltspaceVR, which will allow companies to hold meetings and work gatherings in virtual reality with enterprise-grade security features including secure sign-ins, session management and privacy compliance.

Over time, the company said it expects customers will be able to choose from a growing set of Microsoft Mesh-enabled applications built by external developers and partners, and also to benefit from planned integration with Microsoft products such as Microsoft Teams and Dynamics 365.

“This is why we’ve been so passionate about mixed reality as the next big medium for collaborative computing,” Kipman said. “It’s magical when two people see the same hologram.”

Exploring the world together

On board OceanXplorer, one of the most advanced research and deep sea exploration vessels ever built, there’s only so much room to host all the scientists clamoring to learn from the data constantly collected by instruments and cameras on its deep sea vehicles. Those vehicles can probe everything from coral reefs and brine pools to sea life around deep hydrothermal vents and minerals around underwater volcanoes.

At Ignite, OceanX, a nonprofit that merges cutting-edge science with compelling storytelling and product and technology experiences to support ocean education and awareness, announced a new collaboration with Microsoft to create a Mesh-enabled “holographic laboratory” on the ship. Scientists could gather there, either in person or virtually from labs and offices around the world, to see 3D holograms of the areas the vehicles are exploring.

Image: A woman wearing an OceanX shirt and a HoloLens interacts with a hologram, with sea life visible in the background. At Ignite, OceanX announced a new collaboration with Microsoft to create a Microsoft Mesh-enabled “holographic laboratory” on its research ship OceanXplorer. Image courtesy of OceanX.

Researchers trying to figure out why sperm whales hunt in certain areas, for instance, might see a holographic representation of a deep sea canyon with data collected from tags put on the whales, overlaid with information about salinity, temperature and ocean chemistry changes and integrated with data from fish finders showing where squid and other prey might be.

“The idea is to take all this amazing scientific data we’re collecting and bring it into a holographic setting and use it as a way to guide scientific missions in real time,” said Vincent Pieribone, vice chairman of OceanX.

The goal is to allow any researcher with a HoloLens 2 or other compatible device to appear around a table as an avatar via Microsoft Mesh, point to a particular area of the holographic seafloor they have a question about and converse in real time with other scientists about what they are seeing.

On OceanX’s research missions, there are often groups of people huddled around video feeds, posing questions and having sidebar conversations with their colleagues. Researchers who aren’t on the boat, even if they’re watching the same footage on a screen in their office, don’t always benefit from those interactions, Pieribone said.

“There’s a social component to this that’s essential,” he said. “We want to bring everyone into the same ‘room’ so they can bounce things off of each other and have that human connection.”

Turning to an entirely different kind of exploration, Niantic demonstrated at Ignite a proof-of-concept Pokémon GO experience running on HoloLens 2. It was designed to showcase the vision for a new collaboration that will build on Microsoft’s and Niantic’s mixed and augmented reality capabilities.

In the demonstration, which does not represent a consumer product, Hanke and a bevy of Pokémon at his favorite park were joined by Veronica Saron, product marketing manager for Pokémon GO, to battle in a shared mixed-reality session.

Niantic’s mission is to create technologies that allow people to socialize and explore the world together, Hanke said, whether that’s kids using Pokémon GO to explore their neighborhoods with parents or friends, or thousands of people gathering at parks for festivals.

“Microsoft Mesh offers a whole new way of doing that,” he said. “This notion of bringing my virtual friends along with me as I go out and walk and explore the world — I just love that concept and I’m really interested to see what we can do with that.”

The demonstration showed the potential of a Pokémon GO experience built on Niantic’s planet-scale platform, which has enabled millions of people to have augmented reality experiences in the real world, enhanced with Microsoft Mesh features that let people be present together in shared experiences across space and time, and running on HoloLens.

“Our part of this is the work of stitching the digital and physical worlds together, connecting the bits and atoms so these experiences can be possible using the Niantic platform,” Hanke said. “But social connections are really at the heart of everything we do, and Microsoft Mesh innovations just enrich that.”

‘Another layer of human connection’

Lune Rouge, the Quebec-based initiative founded by Cirque du Soleil’s Laliberté, is also beginning to explore how Microsoft Mesh might enable people to virtually attend concerts, theatrical performances, DJ events or even family celebrations from remote locations.

The Hanai World project, inspired by a Hawaiian word that loosely translates to choosing someone as family, aims to forge new connections between digital and physical entertainment experiences.

The aim is to create digital representations of entertainment venues around the world and capture live performances with enough 3D fidelity that people could experience the same event in the flesh or from their living room in mixed or virtual reality. The platform would curate a mix of Lune Rouge and user-generated content across a wide variety of media and genres.

“It would be a nice complement to live entertainment,” said Alexandre Miasnikof, executive director for production at Lune Rouge. “It brings in another layer of human connection, and it brings entertainment to people who wouldn’t normally be able to come to an event, whether because of geography or access.”

Two friends living on opposite coasts could join the same concert as avatars and experience the show together, or one day a holoportation of someone’s grandmother living in another country could interact with family members in real time at a reunion.

“What we have today is the promise, and how soon we can realize that promise, we don’t know,” Miasnikof said. “But we think we have a good foundation with Microsoft Mesh and we’ll build from there.”

That’s precisely the goal, Kipman said: to see how solutions that might previously have been dismissed as impossible or too time-intensive to stand up can now be built much more easily with the Microsoft Mesh platform.

“When you think about what it actually takes to usher in a new medium for computing, you have to make deep investments across the ecosystem, which is really what Microsoft has done,” he said.

“Now we invite people to go create value on top of that and benefit from the years of really hard R&D we’ve done to offer them these features in a turnkey way.”

Related:

Microsoft Mesh

Microsoft Mesh – A technical overview

Introducing Microsoft Mesh

Microsoft Ignite 2021

Microsoft Ignite 2021: Satya Nadella’s keynote

Ignite Book of News

Jennifer Langston writes about Microsoft research and innovation. Follow her on Twitter.

Top image: During Microsoft’s Ignite conference keynote, Technical Fellow Alex Kipman showcased Microsoft Mesh, a new mixed-reality platform powered by Azure that enables shared collaborative and holographic experiences. Image by Microsoft.