New HoloLens 2 gives Microsoft the edge in the next generation of computing

Chelsey Potts felt so much pressure to succeed when she first stepped from the manufacturing floor into the cab of a Kenworth truck.

“When you get a job at Kenworth, most people are doubling what they made at their last job. And, you know, they’re nervous,” she said. “I have a 5-year-old and a 10-month-old, and saying, ‘Mommy’s not going back to work the next day,’ that’s my biggest fear.”

Her new job — outfitting the interior of a sleeper cab — required her to perform an intricate sequence of mechanical and physical tasks that were challenging to learn at first. For workers wielding tools in complex industrial settings, flipping through training manuals or consulting a screen takes focus and hands away from the work.

That’s not the only downside to a paper manual. Few people’s brains are good at translating between paper instructions and three-dimensional objects. New employees worry about slowing production with questions. Even when watching someone else install the bed that allows long-haul truckers to sleep comfortably, it was hard for Potts to see exactly where the other worker was putting his hands or figure out where the screw holes were.

Today, Microsoft is introducing HoloLens 2 — the next generation of its wearable holographic computer — with an integrated suite of new mixed reality services, out-of-the-box apps for businesses and sensors with the capacity to perceive and predict.

It’s exactly the device that Potts said she would have loved to have when she had that nerve-racking experience of first learning how to outfit a sleeper cab.

“If I had had this when I was trained, I would have been less nervous,” she said. “This shows you where the tools go and which way they turn and all the things you can’t see under a truck. It’s just there — step by step — however you want to learn.”


PACCAR employee Chelsey Potts works on a Kenworth truck assembly line. Photo by Microsoft.

Her employer, PACCAR, a global leader in the design and manufacture of commercial trucks, is one of the first companies to test Dynamics 365 Guides, a new mixed reality app also released by Microsoft today. It allows companies to easily create heads-up, hands-free holographic training materials for employees, so they don’t need to flip through a book or consult a search engine to get the information they need.

Also today, Microsoft is releasing a new Azure Kinect device that gives developers new possibilities for creating AI-powered experiences. Azure Kinect combines the same depth-sensing camera technology found in HoloLens 2 with a circular microphone array and a color camera, and it works with artificial intelligence services in Microsoft Azure. It enables developers to build new perception capabilities, such as identifying when a saw is operating dangerously based on the sound it makes, helping robots judge distance when packing pallets or recognizing which item has been selected from a store shelf.
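
The kind of on-device perception described here can be sketched in a few lines. The toy detector below is not Microsoft’s implementation and does not use the Azure Kinect SDK; it simply flags a saw that “sounds dangerous” from two simple audio features. The synthetic audio frames, feature choices and thresholds are all assumptions made for illustration.

    import math

    def rms(samples):
        """Root-mean-square loudness of one audio frame."""
        return math.sqrt(sum(s * s for s in samples) / len(samples))

    def zero_crossing_rate(samples):
        """Crude stand-in for pitch: fraction of adjacent samples where the signal flips sign."""
        flips = sum(1 for a, b in zip(samples, samples[1:]) if (a < 0) != (b < 0))
        return flips / (len(samples) - 1)

    def saw_sounds_dangerous(samples, loudness_threshold=0.6, pitch_threshold=0.25):
        """Flag a frame that is both loud and unusually high-pitched (hypothetical rule)."""
        return rms(samples) > loudness_threshold and zero_crossing_rate(samples) > pitch_threshold

    # Synthetic stand-in frames: a quiet low hum versus a loud, high-frequency screech.
    quiet_hum = [0.1 * math.sin(2 * math.pi * 60 * t / 8000) for t in range(8000)]
    screech = [0.9 * math.sin(2 * math.pi * 3000 * t / 8000) for t in range(8000)]
    print(saw_sounds_dangerous(quiet_hum))  # False
    print(saw_sounds_dangerous(screech))    # True, so stop the saw or alert a worker

A production system would feed real microphone-array frames to a trained model rather than hand-set thresholds, but the shape of the decision, made locally on the device, is the same.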

These new technologies are powered by intelligent services that can perform computations wherever it makes the most sense, whether that’s inside a device — so it can quickly spot unsafe conditions — or in the cloud, where virtually limitless computing resources can tackle complex problems. In short, this is the intelligent edge and intelligent cloud made real, Microsoft says.

Microsoft also says these mixed reality and perception tools will make it more practical for companies to adopt an entirely new wave of computing that bridges the digital and physical world. It’s only been made possible by recent advances in the intelligent cloud and at the intelligent edge — a diverse array of increasingly connected sensors and devices in everything from home appliances to warehouse floors to HoloLens 2 that can offer instantaneous insights about their surroundings.

“We are now in a place where this technology is solving real-world problems. You can really begin to see what this new wave of computing looks like and how it translates into real business outcomes, and I love that,” said Julia White, Microsoft corporate vice president for Azure marketing.

From flat screen to hologram

Until now, most people have experienced computing through a flat piece of glass: laptop, computer monitor, phone, tablet, video games on a TV. Microsoft’s mixed reality offerings draw digital information out of rectangular screens and allow people to interact with holograms in physical space. These can exist independently, like a three-dimensional rendering of a human heart that medical students can grab, resize and rotate to see all the structures clearly. Or they can relate to physical objects in the real world, like instructions superimposed on a furnace that show you how to change a filter.

Microsoft says the new HoloLens 2 provides a far more immersive, instinctual and comfortable experience for first-line workers whose hands are occupied by physical tasks. It can help them diagnose a problem with a jet engine or access step-by-step holographic instructions to assemble an electric bus battery. The person using it can go backwards to double-check a step with a nod of her head or a voice command. The person is able to see in three dimensions — on the physical equipment she is working on — precisely where each screw needs to go, or what direction to turn a ratchet.

Compared to the first generation of HoloLens, HoloLens 2 also offers new features like the ability to grab and rotate holograms as you would a real object rather than having to learn specific gestures. Eye tracking can sense when someone’s eyes land on a particular part of a machine and call up useful digital information about it. Words automatically scroll as you read. The end result is like going from watching a cartoon flip book to the truly immersive experience of actual cinema.

The new device’s field of view is more than double the size of the first-generation HoloLens, while new display technologies make holograms more vibrant and realistic.

“For the first time, you’re going to feel what it feels like to touch a hologram, to interact with a hologram and to play with it, almost where you forget that this is a piece of digital content you’re looking at as opposed to it just existing in the real world,” said Alex Kipman, technical fellow in Microsoft’s Cloud and AI group.

The business opportunities HoloLens 2 presents are powered by expanding capabilities at the intelligent edge, which comprises an exploding array of devices — ranging from baby monitors and refrigerator sensors to something as sophisticated as HoloLens — that can process information quickly and locally.

These increasingly perceptive devices can tell a self-driving car when to turn, a piece of factory equipment when to shut off, or store workers when to clean up a puddle where customers could slip. They can do so in situations where split-second decisions are necessary, and even in locations where connectivity is limited.

But as edge devices become more intelligent, they also need to become more secure. Apps that previously needed to run in the cloud and on local servers now need to work across a rapidly expanding collection of devices, with more developed each day for specific industrial processes. Azure now delivers secure computing power to and from the edge with solutions like Azure Sphere and Azure IoT, and it allows developers to design products that run seamlessly across diverse computing environments.

And when these edge devices do connect to the intelligent cloud, they provide deeper insights and allow for truly collaborative computing.

“HoloLens 2 is an incredibly versatile edge device — it will work offline, and it can connect to any cloud,” Kipman said. “But it was designed with Azure in mind. When it connects to Azure, it becomes a shared experience that anyone can access from any device or any platform. When they work together, that’s when the magic happens.”

Origins in Xbox and research

These advances in mixed reality and perception have been shaped by decades of research and business development at Microsoft: gesture recognition devices originally developed for Xbox, investments in AI, lessons learned from the first generation of HoloLens, applications that work wherever they’re needed, expertise in security and identity management, and a long history of collaborating with customers and developing enterprise solutions.

But to enable the truly collaborative and immersive computing experience that mixed reality can deliver, the tools also need to operate in an open environment and on devices that people already have in their homes or their pockets.

“The promise of mixed reality is that all of these devices are lenses into this connected content that exists in the world,” Kipman said. “Let’s say I want to place a hologram in the middle of the room. If I leave the room and you come in with your HoloLens or your phone or your tablet, should you see the hologram? Assuming privacy and permissions all apply, the answer is yes.”

A new cloud service called Azure Spatial Anchors, also released today, allows people to create these holograms that persist in a specific physical space. They can be accessed by multiple people using HoloLenses, phones or tablets — allowing, for instance, a clothing store manager to “leave” holographic images of outfits next to each mannequin. The next day, an employee could walk in, point an iPhone at each mannequin, see how it should be dressed and begin pulling clothes.
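
Conceptually, a spatial anchor is a persistent record that ties digital content to a real-world pose so that any authorized device can retrieve it later. The sketch below illustrates that idea with an in-memory stand-in; it is not the Azure Spatial Anchors SDK, and every class, field and value in it is hypothetical.

    from dataclasses import dataclass
    import uuid

    @dataclass
    class Anchor:
        anchor_id: str
        position: tuple   # (x, y, z) in a shared spatial coordinate frame
        payload: dict     # e.g., which outfit hologram to display

    class AnchorService:
        """Stand-in for a cloud service that persists anchors across devices and sessions."""
        def __init__(self):
            self._store = {}

        def create_anchor(self, position, payload):
            anchor = Anchor(str(uuid.uuid4()), position, payload)
            self._store[anchor.anchor_id] = anchor
            return anchor.anchor_id

        def resolve_anchor(self, anchor_id):
            return self._store.get(anchor_id)

    # The store manager's device "leaves" an outfit next to a mannequin...
    service = AnchorService()
    anchor_id = service.create_anchor((2.0, 0.0, 1.5), {"outfit": "spring-collection-07"})

    # ...and an employee's phone resolves the same anchor the next morning.
    anchor = service.resolve_anchor(anchor_id)
    print(anchor.payload["outfit"])   # spring-collection-07

In the real service, resolving an anchor also depends on the device recognizing the surrounding space and on the privacy and permission checks Kipman describes above.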

Technology for first-line workers

Information workers — people who do much of their work at a computer to produce words, design things, respond to emails, manage people or make business decisions — have benefited from an explosion of technologies allowing them to learn nearly anything with the touch of a mouse and communicate instantaneously. That’s driven unprecedented productivity gains for individuals and businesses.

But most people don’t work that way. People who use their hands to assemble, care for, repair, troubleshoot or interact with things have too often been an afterthought in this technological revolution. That’s left an enormous opportunity to give first-line workers the technology they need to make similar productivity gains, said Lorraine Bardeen, Microsoft general manager for Dynamics 365 Mixed Reality at Work.

“The first-line workforce in so many companies is vital not only to day-to-day operations, but also in the way they craft their products. And they’re often the majority of employees,” Bardeen said. “And yet they’ve experienced very little of the empowerment that technology has brought to people who work in offices or more traditionally compute-friendly environments.”

When Microsoft started asking companies how mixed reality could benefit them, the same needs bubbled up across industries. They wanted help in connecting workers in remote locations or disparate workplaces with experts who can troubleshoot problems, in envisioning how equipment, furniture or other physical objects will actually fit in three-dimensional spaces, and in training new employees who need their hands free to perform work.

To deliver more out-of-the-box value — in the way that programs like Microsoft Word and Excel helped people find value in a new operating system called Windows — Microsoft has created mixed reality applications for Microsoft Dynamics 365. These allow companies to almost immediately use HoloLens 2 to meet workplace needs without needing to hire a small army of developers.

Dynamics 365 Guides now joins Dynamics 365 Remote Assist and Dynamics 365 Layout as Microsoft’s pioneering mixed reality applications for business. Guides allows companies to move training materials from flat paper and screens into an immersive three-dimensional experience.

“Everyone here wants to be successful, and the ability to get new employees in and make them productive quickly is invaluable,” said Rob Branson, senior director of global technologies and operations for PACCAR ITD, which has worked with an early version of Dynamics 365 Guides.

“If you think about the way adults learn, it’s very visual. And the ability to see step-by-step instructions overlaid on an actual physical object will really accelerate how an employee learns a new task,” he said.

HoloLens 2 was designed with these workers in mind — it’s lighter and far more comfortable than the previous generation of HoloLens, with a more balanced center of gravity, so a person can wear it all day. A new flip-up visor lets workers switch easily between physical and holographic worlds.

Wearing a device that slips on as easily as a hat, PACCAR employees are able to access step-by-step holographic instructions to guide them through unfamiliar tasks like assembling a truck door. In Dynamics 365 Guides, lighted arrows create a path from each instruction card to the precise hole where a wire needs to be threaded or to the location of the correct tool on the factory floor.

Holographic drawings superimposed on the actual door show how to perform that task and light up structures behind the steel panel that normally can’t be seen without superpowers like X-ray vision.

In industries with aging workforces, there’s also an urgent need to impart workplace wisdom that employees have accumulated through years of apprenticeship or decades on the job to the next generation of workers.

At Alaska Airlines, for instance, it can take roughly two years for a new mechanic to get fully trained and up to speed. The hope is that mixed reality tools might reduce that learning curve significantly. The immersive training environment also resonates with employees who have grown up with video games and nearly instantaneous access to digital information.

“It brings the paper to life,” said Mike Lorengo, director of Architecture and Strategy at Alaska Airlines. “Rather than seeing a flat piece of paper, I’m seeing 3D projected onto an engine.”

Intelligent conversations between people and things

Mixed reality is even more powerful when it takes advantage of the intelligent edge and intelligent cloud’s different capabilities.

In some scenarios, you want to quickly process information on the intelligent edge without sending that data to the cloud, such as in cameras that can alert you to imminent safety risks or algorithms that control braking systems. On a factory floor, not all the data from each sensor on each piece of equipment is relevant at any given time. So running less-complicated AI services on the edge can help filter out irrelevant information or perform tasks that don’t require the power of the cloud.
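
A minimal sketch of that edge-filtering idea: a lightweight check runs locally against each sensor reading, and only readings outside an assumed healthy range are forwarded upstream. The sensor names, values and threshold are hypothetical placeholders, not a real Azure IoT pipeline.

    # Assumed healthy operating range for one sensor; names and values are placeholders.
    NORMAL_RANGE = (20.0, 80.0)

    def filter_on_edge(readings, normal_range=NORMAL_RANGE):
        """Return only the readings worth forwarding to the cloud."""
        low, high = normal_range
        return [r for r in readings if not (low <= r["value"] <= high)]

    readings = [
        {"sensor": "spindle-temp", "value": 42.0},
        {"sensor": "spindle-temp", "value": 97.5},   # out of range, so forward it
        {"sensor": "spindle-temp", "value": 55.3},
    ]
    to_cloud = filter_on_edge(readings)
    print(to_cloud)   # only the 97.5 reading ever leaves the device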

If you need a hologram to help potential customers envision how a new car or potential remodel will look with different options, a single HoloLens 2 using on-board capabilities in a showroom or living room will offer plenty of computing power and resolution.

But connecting that device to the new Azure Remote Rendering mixed reality cloud service can quickly produce intricate, three-dimensional digital models that begin to rival the sculpted clay or detailed architectural models that a company might spend days or months building today. That simply wouldn’t be possible without the graphics processing power of the cloud.

“Suddenly mixed reality goes from something that’s a novel way to augment what you’re already doing to being able to replace an entire business process — for example, using full digital construction in a way that just couldn’t happen before,” said White.

PTC’s IoT and mixed reality tools help companies minimize downtime by empowering on-site workers to quickly diagnose and repair machines that are critical to their operations.

PTC, one of Microsoft’s partners, has developed integrated systems that combine IoT edge solutions, the Azure cloud and mixed reality tools to digitally transform businesses of all kinds, from aerospace and defense contractors to clothing brands and life science companies.

Think about a lab technician who comes into work one morning and finds a critical machine that processes blood samples isn’t working, said Jim Heppelmann, president and CEO of PTC.

Several years ago, a blinking light or vague error message might be the only clue to what’s wrong. She’d probably call the manufacturer, who might or might not be able to diagnose the problem over the phone. Most likely, they’d have to dispatch a repair person for that specialized machine, who might or might not work in that city. It could take hours or days of downtime to get it back up and running. Meanwhile, patients worried about their blood results would be left in the dark.

Today, with the ThingWorx for Azure service, she could put on a HoloLens 2 device and see a holographic dashboard with each component’s health and status mapped onto the physical machine. The data collected by tiny IoT sensors and sent to the Azure cloud might diagnose a problem with one of the cartridges. The lab tech could access step-by-step holographic instructions showing her how to open the cover, which lever to flip, how to insert the new part. If she can’t figure it out, a repair expert sitting in the manufacturer’s office in Nebraska could look at a screen, see exactly what she sees through HoloLens 2 and walk her through the job.
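
As a rough illustration of that closed loop, the sketch below maps a piece of telemetry to a fault and the fault to a step-by-step repair guide. The fault codes, diagnostic rule and guide steps are invented for the example; this is not the ThingWorx or Dynamics 365 Guides API.

    # Hypothetical fault codes and repair steps, invented for this example.
    REPAIR_GUIDES = {
        "cartridge_pressure_low": [
            "Open the front cover",
            "Flip the release lever on cartridge bay 2",
            "Insert the replacement cartridge until it clicks",
            "Close the cover and run the self-test",
        ],
    }

    def diagnose(telemetry):
        """Very simplified rule: low cartridge pressure points to a bad cartridge."""
        if telemetry.get("cartridge_pressure_kpa", 100) < 60:
            return "cartridge_pressure_low"
        return None

    # Telemetry reported by the machine's IoT sensors (made-up values).
    telemetry = {"cartridge_pressure_kpa": 41, "motor_temp_c": 35}
    fault = diagnose(telemetry)
    if fault:
        for number, step in enumerate(REPAIR_GUIDES[fault], start=1):
            print(f"Step {number}: {step}")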

“It’s a closed loop between humans and things,” Heppelmann said. “The IoT devices tell me what’s wrong, and the mixed reality solutions allow me to repurpose that blood test technician into someone who’s able to fix a simple problem that saves time and money on airplane tickets and rental cars.”

For any first-line worker who might wear a mixed reality headset for a good portion of the day, the improved comfort and larger field of view in HoloLens 2 — which allows people to see multiple holograms, read text and view intricate details in 3D — will be transformative, Heppelmann said.

“Those two things take HoloLens from a device that’s interesting to play with and prototype with to one that could be put into widespread production in factories, in hospitals, in construction sites today. This is a big step forward,” he said.

Those features are also important to Bentley Systems, another Microsoft partner that develops software for engineers, architects and construction firms building massively complicated infrastructure projects.

When overhauling an urban train station or building a new soccer stadium with lots of moving parts and heavy equipment, looking down to access information on a phone or tablet can be dangerous, said Noah Eckhouse, Bentley senior vice president for project delivery. HoloLens headsets allow workers to access digital information while remaining aware of their physical surroundings.

Through HoloLens, the company’s SYNCHRO software allows workers to zoom in on a particular location on the construction site and access important digital information, like safety guidelines or installation instructions, for that particular job or area. Managers can see in three dimensions what the project is expected to look like two days or three weeks from now — based on constantly changing realities and projections — and anticipate any scheduling conflicts.

“A construction site is like a giant ballet — it’s a very highly choreographed operation with movements of materials and people that all have to exist within a certain space,” he said. “And the plan changes the first day you’re on the job.”

While it might be possible to store and update plans for a two-bedroom bungalow on a single device, it would be impossible to track all the moving parts on a massive infrastructure project without the cloud, Eckhouse said.

By connecting each HoloLens device on a job site to a master model that’s constantly updating in Azure, SYNCHRO ensures that everyone works from the same shared reality, with the latest information to sequence jobs, plan crane movements, track progress and keep workers safe.

“The cloud connectivity is critical because in these large projects the amount of information going back and forth between the field and the engineers and designers is continual,” Eckhouse said. “And the consequences of working on infrastructure projects in the physical world are very real.”

Bringing powerful perception tools to the edge

Two defining achievements in computer vision and AI contribute to HoloLens 2’s immersive experience. The ability to interpret physical spaces with semantic understanding allows the device to differentiate between walls and windows or a couch and coffee table. Natural hand-tracking now allows people to grasp, rotate and expand the holograms more instinctively, rather than having to learn gestures that mimic mouse movements.

Those advances are enabled by the fourth generation of Kinect, combined with AI tools that operate on the edge. That depth- and motion-sensing technology was originally developed nearly a decade ago to create a gesture-recognition accessory for Xbox. But the ability to sense depth accurately and pinpoint how human bodies are moving in space turned out to have far broader applications than gaming.

Ocuvera, for instance, is working with Azure Kinect in a system that aims to help prevent the roughly 1 million falls that occur in U.S. hospitals each year, and even more worldwide. It can sense when a patient who needs help walking is trying to get out of bed unassisted, with enough advance warning to alert a nurse to go help.

Using a depth-sensing camera and AI algorithms, the system recognizes patterns of movements before a patient gets out of bed, like sitting up or swinging their legs around. Initial results from pilot studies at 11 clinical sites found that unassisted and unobserved bed exits decreased by more than 90 percent after the technology was implemented.
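
One way to picture that kind of early-warning rule, purely as an illustration and not Ocuvera’s actual algorithm: given tracked body points from a depth camera, raise an alert when the torso has risen to a sitting height and a foot has crossed the edge of the bed. The joint positions, bed geometry and thresholds below are all hypothetical.

    BED_EDGE_Y = 0.9        # lateral edge of the mattress, metres from the bed's centreline
    SITTING_TORSO_Z = 0.45  # torso height above the mattress (metres) that suggests sitting up

    def bed_exit_risk(frame):
        """frame: dict of tracked points, each an (x, y, z) tuple in bed coordinates."""
        torso_up = frame["torso"][2] > SITTING_TORSO_Z
        foot_over_edge = (abs(frame["left_foot"][1]) > BED_EDGE_Y
                          or abs(frame["right_foot"][1]) > BED_EDGE_Y)
        return torso_up and foot_over_edge

    lying_down = {"torso": (0.0, 0.0, 0.2),
                  "left_foot": (1.6, 0.2, 0.15), "right_foot": (1.6, -0.2, 0.15)}
    swinging_legs = {"torso": (0.3, 0.4, 0.6),
                     "left_foot": (0.9, 1.0, 0.3), "right_foot": (0.9, 0.7, 0.3)}
    print(bed_exit_risk(lying_down))     # False
    print(bed_exit_risk(swinging_legs))  # True, so alert a nurse before the patient stands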

CEO Steve Kiene said Ocuvera’s team has investigated every depth-sensing camera in the world and even tried to build its own. When it comes to distinguishing whether a patient is moving forward or just rolling over, or detecting the first wiggle of a foot, none has come close to the accuracy and resolution of Azure Kinect.

“It’s like looking for tells when you’re playing poker,” he said. “Only Azure Kinect gives us the data to really see what’s going on with a patient in a hospital bed and predict their intent with enough accuracy. When we do a pilot with a hospital, they often tell us that’s just not possible, but then they find out it does work, and they’re amazed. It’s kind of like magic.”

Ocuvera is working with Azure Kinect’s depth-sensing camera to help prevent hospital falls by predicting when a patient is trying to get out of bed unassisted. Photo by Microsoft.

The value of the new Azure Kinect, said Kiene, is that it marries sensor hardware with Azure tools like Cognitive Services, which allow developers to deploy AI solutions easily and quickly. That could help the Ocuvera team more easily integrate voice recognition or translation services into their system, enabling patients to call out to a nurse in multiple languages, for instance.

As more of those services move to the edge, they can be run locally, without having to send data up to the cloud and eat up precious seconds, Kiene said.

“Azure Kinect isn’t just a camera — it’s a connection to all these other services that are really important, like speech recognition and body tracking. It’s the whole package that’s valuable,” Kiene said.

Kipman, who has invested more than a decade in making HoloLens 2 the most immersive holographic computer and most intelligent edge device in the world, said the real reward for any inventor is to see what people do with their creations.

“We put our hearts, bodies, souls and all our waking hours into creating this vision and bringing it into practice,” Kipman said. “This is now the moment where we get to see how this technology empowers our customers to compete, to digitally transform, to achieve something they weren’t able to achieve before, to do something that we’ve never imagined. All of those things I’m excited about.”

Top Image: Dynamics 365 Guides and HoloLens 2 can help employees learn new tasks, like working on a fuel injection system, with step-by-step holographic instructions. Photo by Microsoft. 

Jennifer Langston writes about Microsoft research and innovation. Follow her on Twitter.