In the aftermath of an earthquake, a snakelike robot that can crawl through rubble and tight air pockets can reach places where no person could — or should — go.
The Sarcos Guardian S, a small robotic visual inspection platform, is designed for exactly those scenarios: searching for cracks in industrial pipelines, finding people trapped in unstable buildings, sensing whether hazardous gases at an accident site could pose a safety risk to first responders.
Today, the robot is controlled by someone working at a safe distance, who sees the scene through its cameras and guides it with the equivalent of a video game joystick. Now, Microsoft and Sarcos are collaborating to add intelligent capabilities to the Guardian S that would allow it to navigate more autonomously — freeing the operator to focus on more important decisions.
The idea of automating industrial applications with robots isn’t new. Robot arms now move products along an assembly line, machines turn hunks of metal into parts, a car shifts gears without your input.
But that’s a far cry from systems that are actually autonomous — ones that are capable of sensing their surroundings and knowing what to do when confronted with unfamiliar situations. Instead of performing specific tasks repeatedly without variation, these autonomous systems can dynamically respond to changing environments to solve a difficult problem. They also have vast potential to augment how people do their jobs or to perform work that is unsafe or cost-prohibitive for people to do.
Microsoft is building an end-to-end toolchain to help make it easier for every developer and every organization to create autonomous systems for their own scenarios — whether that’s a robot that can help in life-threatening situations, a drone that can inspect remote equipment or systems that help reduce downtime in a factory by autonomously calibrating equipment.
“Machines have been progressing on a path from being completely manual to having a fixed automated function to becoming intelligent where they can actually deal with real-world situations themselves,” said Gurdeep Pall, Microsoft vice president for Business AI. “We want to help accelerate that journey, without requiring our customers to have an army of AI experts.”
Today at the Microsoft Build developers conference, the company is announcing the platform’s first component: a limited preview program for developers to work with its experts to build intelligent agents using Microsoft AI and Azure tools that can autonomously run physical systems. That team includes longtime Microsoft researchers and engineers and experts from Bonsai, which Microsoft acquired last year.
Microsoft’s platform to help developers create autonomous systems employs:
- Unique machine teaching tools that enable domain experts to use their knowledge to create AI systems without data science skills
- Simulation technologies, such as Microsoft’s AirSim or industry simulators, that allow machines to learn in safe yet highly realistic environments
It will also draw from Microsoft’s diverse portfolio of Internet of Things services, an easy-to-use deep reinforcement learning platform and other AI solutions, and tools like ROS for Windows that allow developers to build intelligent robotic systems — all running on a trusted and secure platform, whether it’s on a device or in the cloud.
Early customers who participate in the limited preview program will learn how to use the same autonomous systems tools as companies like Toyota Material Handling, which is working with Microsoft to develop intelligent and autonomous forklifts.
Sarcos, for instance, was looking for an autonomous systems solution that would combine the best of what machines have to offer with human intellect and intuition, said Kristi Martindale, executive vice president and chief marketing officer for Sarcos.
Today, the person controlling a commercial Guardian S robot has to direct some of his or her attention to pushing buttons and levers on a joystick to guide it through tight spaces and over varied terrain. It can take several steps to appropriately manipulate each segment of the snake over a common landscape feature like stairs.
Using elements of Microsoft’s toolchain, engineers were able to develop an autonomous control system that enables the snakelike robot to avoid obstacles, navigate stairs and climb a metallic wall on its own.
In a real-world scenario, the operator would still play a role in guiding the robot. But if the Guardian S can sense its surroundings and perform all the intermediate motions to traverse stairs on its own, the operator can focus on assessing the scene and making more critical judgment calls, Martindale said.
“We are looking to offload the tasks that can be automated — how does the robot climb a stair? How does it move around an obstacle? — so the operator can focus on the more important parts of the job,” she said. “The human is still there to say, ‘No, you actually want to go to that obstacle over there because maybe that obstacle is a person who is hurt.’”
A journey from automated to autonomous systems
When people think of autonomous systems, many go straight to the vision of the fully autonomous car that drives itself while you sit in the back seat and read a book, said Mark Hammond, former Bonsai CEO and Microsoft general manager for Business AI.
But car manufacturers have been integrating autonomous features into cars for years, like cruise control or anti-lock braking systems that sense what a driver is trying to do when they encounter a hazard on a wet, slippery road. If that person slams on the brakes in a way that might lock the wheels, that control system takes over and prevents the car from losing traction.
Microsoft’s vision is to help other types of companies — from smart building and energy companies to industrial manufacturers — achieve these incremental steps towards autonomy in their own industries. As the Sarcos robot example shows, many will find the greatest value with humans still in the loop, Hammond said.
“In any sort of operation where you have a mechanical system that interacts with the physical world, you can probably make it smarter and more autonomous,” Hammond said. “But keeping people in the loop is still very desirable, and the goal is really to increase the capabilities of what those humans can do.”
Reinforcement learning is a branch of AI in which algorithms learn by executing a series of decisions and are rewarded or penalized based on which actions get them closer to an end goal. It’s well suited to help machines learn how to do autonomous control tasks, like deciding how to steer an underground drill or angle a tractor blade depending on whether the earth is lumpy or sandy or rocky.
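The reward-and-penalty loop described above can be seen in a minimal sketch. This is generic tabular Q-learning on a toy one-dimensional "track," not code from Microsoft's platform; all names and values here are illustrative:

```python
import random

# Toy reinforcement learning loop: tabular Q-learning on a 1-D "track".
# The agent starts at position 0 and is rewarded for reaching position 4;
# every other step carries a small penalty, so wandering is discouraged.

N_STATES, GOAL = 5, 4
ACTIONS = [-1, +1]                      # step left or step right
Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
alpha, gamma, epsilon = 0.5, 0.9, 0.1   # learning rate, discount, exploration

random.seed(0)
for episode in range(200):
    s = 0
    while s != GOAL:
        # Explore occasionally; otherwise take the best-known action
        a = random.choice(ACTIONS) if random.random() < epsilon \
            else max(ACTIONS, key=lambda act: Q[(s, act)])
        s2 = min(max(s + a, 0), N_STATES - 1)
        r = 1.0 if s2 == GOAL else -0.01
        # Reward or penalize the decision based on whether it nears the goal
        Q[(s, a)] += alpha * (r + gamma * max(Q[(s2, b)] for b in ACTIONS) - Q[(s, a)])
        s = s2

# After training, the greedy policy steps right (+1) from every state.
policy = {s: max(ACTIONS, key=lambda act: Q[(s, act)]) for s in range(GOAL)}
```

The same loop structure, with a far richer state and reward, is what lets an algorithm learn to angle a tractor blade or steer a drill.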
But while deep reinforcement learning algorithms have successfully beaten people in video games, mastering real world tasks has been more challenging. In the physical world, the dynamic environments that an autonomous system might encounter — with people and objects moving in unpredictable ways or minute-by-minute changes in temperature or weather — can be far more complicated. Pinpointing exactly where the system went wrong in a long sequence of steps is a difficult computational task.
Microsoft’s autonomous systems platform overcomes some of these challenges by using a unique approach called machine teaching. It relies on a developer’s or subject matter expert’s knowledge — someone who may not have a background in AI but understands how to steer a drill or keep the airflow in an office building at safe levels — to break a large problem into smaller chunks.
Instead of having reinforcement learning algorithms explore how to solve a problem randomly or naively, which could take forever, that person uses a programming language called Inkling to show the system how to solve simpler problems first and provide clues about what’s important. This shortcuts the learning process and enables the algorithms to hit on a solution much faster.
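Why breaking a problem into easier lessons shortcuts learning can be shown with a runnable toy (illustrative Python, not Inkling): searching for a whole 6-digit code at random means sifting through a million candidates, but an expert's hint to solve it digit by digit shrinks the search to at most a few dozen guesses.

```python
import random

# Machine-teaching intuition as a toy: decompose one hard search problem
# (guess a 6-digit code) into six easy lessons (guess one digit at a time).
# The code, lessons and trainer below are all hypothetical illustrations.

SECRET = [3, 1, 4, 1, 5, 9]

def solved_prefix(guess, n):
    """Lesson n is passed when the first n digits are correct."""
    return guess[:n] == SECRET[:n]

def learn_with_lessons():
    guess, tries = [0] * 6, 0
    for lesson in range(1, 7):          # easiest lesson first, then build up
        while not solved_prefix(guess, lesson):
            guess[lesson - 1] = random.randrange(10)
            tries += 1
    return guess, tries

random.seed(1)
code, tries = learn_with_lessons()      # dozens of tries, not a million
```

Real machine teaching stages are richer than prefix checks, but the principle is the same: expert-supplied structure prunes the space the algorithm must explore.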
Microsoft’s platform also enables non-AI experts to establish and tweak the reward system, which is key to arriving at a solution that truly works. And it selects and configures the algorithms to tackle the task, eliminating the need for machine learning experts to custom-build solutions.
For instance, team members worked with Schneider Electric, a global company working to digitally transform energy management in homes, buildings and industries, to test whether AI could help reduce the carbon footprint of HVAC systems that are used to heat and cool large commercial buildings.
“Schneider is very focused on sustainability, and large buildings are a top contributor to carbon pollution. So there’s a really important mandate to make HVAC systems more energy efficient,” said Barry Coflan, senior vice president and chief technology officer for Schneider Electric’s EcoBuildings Division.
Building on a longstanding relationship, the companies conducted a proof-of-concept test using the Microsoft toolchain and a Schneider-supplied simulation to train an AI system to autonomously run the HVAC systems that controlled airflow and heating in a conference room. It had to balance saving energy with other goals, such as keeping the temperature comfortable for people inside and making sure there’s enough fresh air to keep carbon dioxide levels from building up.
Optimizing for all those factors — which are controlled by different physical systems — requires far more intelligence than a simple thermostat, says Microsoft’s Hammond. The system has to account for environmental variables that are constantly changing: energy costs that fluctuate throughout the day, people coming and going from the room, what the outside weather is doing, the physics of how air flows.
Using a machine teaching approach, Schneider and Microsoft experts first taught the reinforcement learning system to control temperature well. Then the AI system learned how to control air flows to keep air quality at healthy levels. Then it learned to consider how room occupancy affected those outcomes.
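One way the layered objectives above could be folded into a single reward signal is sketched below. The weights, setpoints and function here are invented for illustration, not Schneider Electric's or Microsoft's actual formulation:

```python
# Hypothetical multi-objective HVAC reward: always prefer less energy, but
# penalize discomfort and stale air only when people occupy the room.

def hvac_reward(energy_kwh, temp_c, co2_ppm, occupied,
                target_temp=21.0, co2_limit=1000.0):
    reward = -0.5 * energy_kwh                     # energy cost always counts
    if occupied:                                   # comfort matters with people present
        reward -= abs(temp_c - target_temp)        # deviation from setpoint
        if co2_ppm > co2_limit:
            reward -= (co2_ppm - co2_limit) / 100  # CO2 building up
    return reward

# An empty, efficient room scores better than an occupied, stuffy one:
good = hvac_reward(energy_kwh=1.0, temp_c=23.0, co2_ppm=600, occupied=False)
bad = hvac_reward(energy_kwh=3.0, temp_c=25.0, co2_ppm=1400, occupied=True)
```

Because each term is a separate, named penalty, a domain expert can tweak one objective without retraining intuition about the others — the "ability to layer in different rewards" Coflan describes below.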
Taking all those factors into account, Microsoft’s AI system was able to reduce energy consumption in the room by about 20 percent, while preserving comfort and high air quality when it mattered. The teams are now embarking on a second phase of collaboration to scale the simulation across different types of rooms and further boost energy savings.
Coflan said the laddered approach to teaching and the ability to layer in different rewards enabled Schneider Electric to understand how the AI system was learning and track which factors contributed to the biggest gains.
“A lot of what we do has safety ramifications so we really need to understand how the AI system is making decisions,” Coflan said. “This approach lets you see how the system is getting smarter and gives you an audit trail that is essential for safety and reproducibility. Our customers would want that too — you can’t just put a system out there and say ‘Trust us.’”
Running simulation at scale in Azure
Because no company can afford to let a robot or an intelligent control system make millions of mistakes in a real-world factory or wind farm or highway as it is learning, reinforcement learning algorithms need to practice in a simulated environment that can replicate the thousands or millions of different real-world scenarios they might encounter.
The Microsoft toolchain also includes AirSim, an open source simulation platform originally developed by Microsoft researchers to use AI to teach drones, self-driving cars or robots to learn in high fidelity simulated environments. Or, the team can work with customers to train autonomous systems using existing industry-specific simulators.
In either case, running these data-hungry simulations in the Azure cloud enables the system to test thousands of different decision-making sequences in parallel, which allows the AI models to learn what does and doesn’t work much faster.
“If I have the ability to spawn thousands of simulations at once and in each one the pedestrian crossing the street is different and the curve of the road is different, suddenly the AI system is able to gather much more diverse experience in a short amount of time,” said Ashish Kapoor, Microsoft principal research manager. “Azure gives us the ability to run these simulations at scale, which is really important.”
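The scenario randomization Kapoor describes can be sketched in a few lines. This is a stand-in, not AirSim or Azure code: each episode gets its own randomized world, and a local thread pool stands in for the cloud fan-out:

```python
import random
from concurrent.futures import ThreadPoolExecutor

# Sketch of randomized, parallel simulation episodes. The "simulator" here
# is a placeholder function; in practice each episode would run a full
# physics simulation, fanned out across cloud machines rather than threads.

def make_scenario(seed):
    rng = random.Random(seed)
    return {"pedestrian_x": rng.uniform(0, 100),     # where the pedestrian crosses
            "road_curve": rng.uniform(-0.2, 0.2)}    # curvature of the road

def run_episode(scenario):
    # Placeholder simulator: score how demanding the episode was.
    return abs(scenario["road_curve"]) + scenario["pedestrian_x"] / 100.0

scenarios = [make_scenario(seed) for seed in range(1000)]
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(run_episode, scenarios))
```

A thousand varied episodes complete in the time a handful would take sequentially, which is the diverse-experience-per-unit-time gain the quote describes.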
AirSim also allows developers to train different AI and control tools to solve different parts of more complex problems. In helping develop autonomous forklifts for Toyota Material Handling, for instance, researchers broke the task down into sub-concepts that are simpler to learn and debug: navigating to the load, aligning with the pallet, picking it up, detecting other people and forklifts, delivering the pallet, returning to the charging station.
In these complex scenarios, Kapoor said, it may make sense to use reinforcement learning to train a forklift on basic control tasks, like picking up a pallet. Machine teaching helps the system learn in progressively more difficult steps, such as aligning the lift horizontally and then finding the proper angles.
But other parts of the problem might be better solved by entirely different tools like obstacle detection and avoidance algorithms, robotics path planning or classical control techniques. Decomposing the larger task into smaller ones allows developers to select and deploy the best tool for that particular job.
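The decomposition idea — route each sub-task to the best tool, whether learned or classical — can be sketched as a simple dispatcher. Every name below is hypothetical, not Toyota Material Handling's or Microsoft's code:

```python
# Hypothetical controller that routes each phase of a forklift task to a
# different component: classical planners for navigation, detection/avoidance
# algorithms for safety, and an RL-trained skill for fine pallet alignment.

def plan_path(state):
    return "path-planner"      # classical robotics path planning

def avoid_obstacle(state):
    return "avoidance"         # obstacle detection and avoidance algorithm

def align_with_pallet(state):
    return "rl-policy"         # skill trained with reinforcement learning

PHASES = {
    "navigate": plan_path,
    "obstacle": avoid_obstacle,
    "pickup": align_with_pallet,
}

def controller(state):
    """Dispatch the current phase to the best tool for that sub-problem."""
    return PHASES[state["phase"]](state)
```

Each component can then be developed, simulated and debugged on its own, which is what makes the sub-concepts "simpler to learn and debug."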
“We are working to provide a comprehensive platform for customers who want to build intelligent autonomous systems, covering development, operation and end-to-end lifecycle management,” Hammond said.
Top image: An experimental version of the Sarcos Guardian S, a visual inspection robot that can be used in disaster recovery or for industrial inspections, has learned to avoid obstacles and climb stairs on its own using Microsoft’s autonomous systems platform. Photo by Dan DeLong for Microsoft.
Microsoft Build 2019 — related autonomous systems links:
- Watch: Microsoft Build 2019: Vision Keynote Highlights
- Visit: Microsoft Build 2019
- Explore: Microsoft Autonomous Systems
- Read: Machine teaching: How people’s expertise makes AI even more powerful
- Read: Microsoft to acquire Bonsai in move to build “brains” for autonomous systems
Jennifer Langston writes about Microsoft research and innovation. Follow her on Twitter