Satya Nadella: Microsoft Ignite 2016

Remarks by Satya Nadella, chief executive officer, in Atlanta, Georgia, Sept. 26, 2016.

ANNOUNCER:  Please welcome back Julia White.

(Applause.)

JULIA WHITE:  Welcome back.  I hope you had a fantastic day so far.  Now this morning and through the general sessions, you’ve heard a lot about technology that’s in market today or coming soon.  Well, now we shift to talk about the next horizon of innovation and transformation.

Now it’s these areas that will collectively shape our future in this industry.  So as you hear from Satya, first, enjoy it; you’re going to hear about and see some amazing things.  But I also encourage you to think about how you’re going to incorporate these technologies, ideas and possibilities into your IT strategies today, so you’ll have a map for how you can take advantage of them in the future.

So without further ado, please welcome Satya Nadella.

(Applause.)

SATYA NADELLA:  Thank you.

Good afternoon.  It’s fantastic to be back here this afternoon at the Ignite keynote.  We’re going to have some fun this afternoon.  We’re going to see some real magical technologies.  And I’m going to talk in particular about how we have set ourselves the goal of democratizing AI.  The central thesis and goal we have with AI starts with our mission: to empower every person and every organization on the planet to achieve more.

We are not pursuing AI to beat humans at games.  We are pursuing AI so that we can empower every person and every institution that people build with tools of AI, so that they can go on to solve the most pressing problems of our society and our economy.  That’s the pursuit.

And to build perspective on this, let’s go back to what we’ve been talking about, mobile-first, cloud-first.  In fact, this morning Scott talked about how we are living in this world, how customers are achieving digital transformation where it’s about the mobility of the human experience across all of the computing in our lives.  That’s what the cloud enables.  Even the cloud is not a single destination but a distributed fabric.  That’s what’s driving all of these ambitions that we have in terms of technology.

But what is at the intersection of our three ambitions is AI: the ability to reason over large amounts of data and convert that into intelligence.  That intelligence shows up as handwriting recognition on Windows 10, or the Windows Hello feature, the ability to recognize your face, or even magical new devices like the holographic computer, where you have the ability to digitally reconstruct and recognize everything that you see and then to superimpose objects on that world.  It’s how we infuse every application, Cortana, Office 365, Dynamics 365, with intelligence.  And it’s the building blocks that constitute intelligence that are available as developer services in Azure.  That’s what we’re doing.  That’s the approach we are taking.

But to truly understand and perhaps build even more of a perspective, let’s step way back to perhaps the first machine that democratized access to information: the printing press.  In 1450 or so, when the printing press came out, the Gutenberg Bible got published and movable type became prevalent.  Before that we had something like 30,000 books in the world.  And 50 years after the printing press we had 12 million books.  It changed how humans both created information and used information.  You can, in fact, trace back everything in the modern era to our ability to create and diffuse information and learn.

The next inflection point perhaps in this information explosion was 1989 and the birth of the World Wide Web.  And it’s pretty stunning to see the amount of data, the amount of information, that we are generating.  I was just reading this weekend a report from IDC which says in 2015 we generated close to 10 zettabytes of data.  What’s fascinating to me is what that report projects we will generate in 2025: something like 180 zettabytes.  I mean, we’re getting to a point where we don’t even know what to name things.  We’re getting to a point where that march from peta to exa to zetta, what comes next, we don’t even know.

So in all of this information explosion what has remained scarce is something that I’ve talked about in the past: It’s human attention and time.  Our ability to make sense of all of this information.  So that’s really what we all need to turn our attention to.  We’ve used technology very successfully to democratize both creation of information and the distribution and access of information.  And now we need to turn to technology to democratize creation and access to intelligence.  That’s the approach that Microsoft is taking with our AI efforts.

We have four core pillars to what we’re going to do: agents, applications, services, and infrastructure.  When we talk about agents, in our case Cortana, I think of it as the third runtime.  What do I mean by the third runtime?  Just like the PC operating system, or the mobile phone operating system, or the web and the browser, it’s the new organizing layer.  It’s what helps mediate the human-computer interaction, your ability to get to applications and information.

This new category of the personal digital assistant is a run time, a new interface.  It can take text input, it can take speech input, it knows you deeply.  It knows your context, your family, your work.  It knows the world.  It is unbounded.  In other words, it’s about you, it’s not about any one device.  It goes wherever you go.  It’s available on any phone, iOS, Android, Windows, doesn’t matter.  It is available across all of the applications that you will use in your life.

So we are well on our way here with Cortana.  In fact, we have 133 million monthly active users of Cortana across 116 countries, and they’ve already asked 12 billion questions.  And that is what’s driving the skills ecosystem of Cortana; the fact that we have these SDKs that allow developers to infuse Cortana with more intelligence is what makes Cortana even more relevant for our everyday use.

And so the first demo we want to show you is where we are with Cortana and the Cortana skills so that you can get more out of every moment of your life.  To do that, welcome Laura on stage from the Cortana team.

Laura.

LAURA JONES:  Thanks, Satya.

(Applause.)

Hi, guys.  We’re building Cortana to be an indispensable personal assistant.  Cortana learns about me, my organization, and the world around me to better assist me throughout my day.  I can start interacting with Cortana from the moment I get in front of my PC, with my voice, above the lock screen.

Hey, Cortana, what’s my next meeting?

CORTANA:  Coming up next you have Show Off My Skills at Ignite, 5:00 to 6:00 p.m.

LAURA JONES:  Right.  So as my calendar said, today I’m going to be showing off some of Cortana’s skills.  Many of these are already available today in Windows 10, and some show our vision for the future of artificial intelligence.  Cortana is more than just a voice assistant.  By learning about me, she can help me keep on top of the things that matter most to me.  Cortana works with Office 365 to get to know my organization and can tell me things like my next meeting, or when I have a conflicting meeting, so I know which one I should choose.

And Cortana can also let me see who I’m meeting with.  So here’s Will, if I want to know more about Will, Cortana brings in information from places like LinkedIn, or my upcoming meetings with him.  I can even see our communication history and this way I’m always ready for what’s next on my schedule.

Now, one skill that we’re bringing to Cortana that I’m really excited about is that soon Cortana is going to be able to help me make sure I keep on top of the things that I’ve promised to do.  Using machine learning, we’re picking out commitments that have been made in email, like this one that I sent to Collin telling him I was going to send a recap of this event.  Cortana is proactively reminding me about that commitment.  So this way I don’t have to worry about missing the deadline, and I can rest assured that Cortana always has my back.

Cortana can also keep on top of the things I’m passionate about, like University of Florida football.  Kind of a rough game on Saturday, but you know, it’s still great to be a Florida Gator.  (Applause.)  Thank you.  Cortana can also help me keep on top of my fitness routine, with a new skill that we’re bringing called Health Insights.  Cortana is taking information from the health cloud, things like my physical activity and sleep patterns, and combining that with what she already knows about me, like my schedule and my daily routine, and she’s bringing me these proactive insights using machine learning, things I wouldn’t have necessarily known or maybe don’t want to know, like that I eat too much fast food when I travel.  And she can even tell me that I’m going to miss a workout at the gym this week, because I’m traveling to New York.

Not only do I get that alert, but I get a recommendation for what time I can reschedule my workout.  So this way I stay on top of my fitness routine.  Now Cortana also works across devices and in this way can bring me information to whatever device I’m on.  So here’s an urgent text message that my boss sent to me.  Of course it was originally sent to my phone.  But here I am on my PC and I’m getting that text message here.  I could go ahead and reply inline, but because he’s asking me to do something urgent that I don’t want to forget after this demo I’m going to add it to my to-do list.  And you know what, Cortana can help me there, too.

Cortana works with many apps and services to augment her skills.  One of those is Wunderlist.  So now I can easily add something to my list.  Add monthly report to my work list.

CORTANA:  OK, I added this to your work list.

LAURA JONES:  You can see it’s showing up right in the Cortana canvas, for when I need it, and I’ve also got it here on my Wunderlist app.

Now let me talk about this notification that I just got.  We’re thinking about ways that we can use proactive information so that you can keep on top of your business metrics.  So being on the Cortana team one of the metrics that’s really important for me to track is monthly active users.  What if Cortana, using information from Power BI, could alert me when I’ve hit an important milestone that I care about, or when things are trending up and down in those monthly active users?

I can take that notification and get more information and here Cortana is bringing in a data visualization from Power BI of those monthly active users.  Now this is monthly active users, but you can think about how this could apply to any business metric that really matters to you.  It’s all about getting the information you need just when you need it, so you can make business decisions.

The last thing I want to show you is how Cortana works with Sticky Notes.  So here’s a Sticky Note that I wrote earlier today.  Because the Sticky Note has intelligence behind it, I can click on this phone number and make a phone call straight from Skype, or I can click here and add a reminder straight to Cortana.  And since Cortana is across all my devices, that inked text has now turned into text on my phone, and here on my Android device that same reminder appears.  So it goes off on whatever device is closest to me.

Sticky Notes also works with Bing to bring in information.  So if I were to write an address, I could get a map; here I’m going to write a flight number.  There we go.  And it brings details of that flight directly into this Sticky Note, and those are real-time details about the status of the flight.

So those are just some of the ways that we’re using intelligence to bring new skills to Cortana, and we look forward to showing you more in the future.  But for now I’m going to hand it back over to Satya.

Thank you.  (Applause.)

SATYA NADELLA:  Thank you, Laura.

So now let’s switch to talk about applications, and how we can take the same approach of infusing intelligence into Cortana as a personal assistant and apply it to every application.  In fact, I want to start with SwiftKey.  SwiftKey is one of the more popular keyboard apps on Android and iOS, a third-party keyboard app.  It has over 300 million users today.  In fact, it’s already taken over a trillion-plus keystrokes, and it’s saved people something like 100,000 years of keyboard entry time.  And the technique we used to do all of that was an n-gram-based approach, where we were able to predict the next word based on the previous n words.

Just last week we made a giant leap: we’ve switched to deploying a neural network.  Think about this.  That means every one of us will have a neural net that learns how we type.  It goes beyond the previous three or four words that we entered to predict from; it goes to the semantic meaning of what we are trying to communicate, across all our devices.  So it’s no longer that a keyboard is attached to a device.  The keyboard is attached to you and has a neural network that’s constantly learning and helping you get those magical typing skills that we always wanted.
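
To make that concrete, here is a minimal sketch of the n-gram technique SwiftKey started from, predicting the next word from the previous two.  This is illustrative Python, not SwiftKey’s code; the tiny corpus and the trigram order are arbitrary choices for the example.

```python
# Minimal n-gram next-word prediction: count which word follows each
# (n-1)-word context, then suggest the most frequent follower.
from collections import Counter, defaultdict

def train_ngram(tokens, n=3):
    """Count how often each word follows each (n-1)-word context."""
    model = defaultdict(Counter)
    for i in range(len(tokens) - n + 1):
        context = tuple(tokens[i:i + n - 1])
        model[context][tokens[i + n - 1]] += 1
    return model

def predict_next(model, context):
    """Most likely next word for the context, or None if unseen."""
    counts = model.get(tuple(context))
    return counts.most_common(1)[0][0] if counts else None

tokens = "see you at the meeting see you at the game".split()
model = train_ngram(tokens)
print(predict_next(model, ["you", "at"]))  # -> 'the'
```

The neural-network approach Satya describes replaces those raw context counts with a learned model, which is why it can generalize to word sequences it has never seen before.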

And another application is MileIQ.  MileIQ is fascinating in the sense that it’s a context-aware, location-aware intelligent app.  It is, in fact, the No. 1 app in the finance category on both iOS and Android today.  There are over 60 million Americans who are road warriors, who have to keep track of their mileage, especially for work, so that they can get a tax deduction from the IRS.  And, in fact, this MileIQ application has already returned $1.2 billion to its users.

And the way it goes about that is, again, by taking all of the signal, in this case your driving and location information, and converting it into intelligent action.  That approach is what we now want to take into our very mainstream products, like Office 365.  In fact, for me, when we talk about Office 365 it’s not simply that we’re moving to the cloud and that this is a new way to deliver some of the same technology as a service.  The most profound shift is in the fact that the data underneath the applications of Office 365 is exposed in a graph structure, and in a trusted, privacy-preserving way we can reason over that data and create intelligence.  That’s the really profound shift in Office 365.

And you see this in many, many ways.  You see it in the focused inbox.  I mean, think about e-mail triage, e-mail triage on your phone or on your PC.  If you want to talk about that scarcity we have of human attention and time, here is the ability to deploy a custom neural net model that understands your inbox.  It’s not a generic model.  It understands the type of mail, the people that you’re corresponding with, the content, the semantic content of your inbox, and it’s able to focus your attention on the things that matter the most.
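
As a sketch of what a per-user triage model can look like, here is a toy scorer that learns which senders and subject words matter to one particular user.  It is an illustration only, not Microsoft’s model; the addresses, subjects, and add-one smoothing are invented for the example.

```python
# Toy per-user "focused inbox" scorer -- illustrative only.
from collections import Counter

class FocusedInboxScorer:
    def __init__(self):
        self.focused = Counter()  # feature counts from mail the user cared about
        self.other = Counter()    # feature counts from everything else

    def _features(self, sender, subject):
        return [f"from:{sender}"] + [f"word:{w}" for w in subject.lower().split()]

    def train(self, sender, subject, is_focused):
        bucket = self.focused if is_focused else self.other
        bucket.update(self._features(sender, subject))

    def score(self, sender, subject):
        """Higher means more likely to belong in the Focused tab."""
        feats = self._features(sender, subject)
        # Add-one smoothing keeps unseen features neutral.
        return sum((self.focused[f] + 1) / (self.other[f] + 1) for f in feats) / len(feats)

scorer = FocusedInboxScorer()
scorer.train("boss@contoso.com", "Quarterly review prep", is_focused=True)
scorer.train("deals@example.com", "50 percent off everything", is_focused=False)
print(scorer.score("boss@contoso.com", "Review agenda"))     # scores high
print(scorer.score("deals@example.com", "Flash sale today"))  # scores low
```

A production model would be a neural net trained on far richer signals, but the key property is the same: it is fit to one user’s mail, not a generic corpus.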

Skype Translator is something, again, that’s fairly magical.  Three different strands of research came together to make translation happen: there was speech recognition, there was speech synthesis, there was machine translation.  And then there’s the Skype data.  So you take those three technologies, apply deep reinforcement learning and neural networks along with the Skype data, and magic happens.  We already have eight languages.  We see emerging phenomena like transfer learning, where when you teach it one language, it learns others, really solving that human language barrier.
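
The pipeline shape is easy to picture in code.  In this sketch the three stage functions are stubs standing in for the real models, and the sample strings are invented; only the composition of recognition, translation, and synthesis is the point.

```python
# The three research strands composed into one translation pipeline.

def recognize_speech(audio: bytes) -> str:
    """Stub: a speech-recognition model would return a transcript."""
    return "hello, how are you?"

def translate_text(text: str, src: str, dst: str) -> str:
    """Stub: a neural machine-translation model would go here."""
    return {"hello, how are you?": "hola, como estas?"}.get(text, text)

def synthesize_speech(text: str) -> bytes:
    """Stub: a text-to-speech model would render audio here."""
    return text.encode("utf-8")

def translate_call_segment(audio: bytes, src: str = "en", dst: str = "es") -> bytes:
    """Recognition -> translation -> synthesis, one conversational turn."""
    transcript = recognize_speech(audio)
    translated = translate_text(transcript, src, dst)
    return synthesize_speech(translated)

print(translate_call_segment(b"\x00\x01"))  # stubbed end-to-end run
```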

Even inside of Word or Outlook, when you’re writing a document, we no longer have simple spell correction; we have complete computational linguistic understanding of what you’re writing.  And so that means we can, of course, correct spelling.  In fact, I think that I would be unemployable but for the red squiggly.  (Laughter.)  And now I even have the capability to become a better writer in terms of style and grammar, with an understanding of what it is that I want to write about and communicate.

There are also learning tools that we’ve built into Word and OneNote to help with dyslexia, to help students with dyslexia improve their reading rate.  Another new application that we just launched this week is Tap; it understands the content you’re writing and, just imagine, within that context brings all the other content that was created in your organization just a tap away.

And lastly, MyAnalytics.  Going back to that notion of scarce time: just like the fitness tracker gives me all this information about what I need to do to keep my calorie intake and output in balance, what if I had that same feedback loop informing me about the time I’m spending, who I’m spending it with, and whether I’m spending it on the topics I should?  That’s what MyAnalytics is about.

So Office 365, to me, beyond the traditional applications and workloads, is about infusing this next layer of intelligence.  And we’re not stopping there.  In fact, we’re expanding that to Dynamics 365.  Take something like sales: in any business application, you always explicitly model the world.  When it comes to sales, you have modeled salespeople, their accounts, customers, leads, prospects, the opportunity pipeline; it’s all modeled.  There’s lots of data that’s captured.  But there’s one real problem, which is that most of the sales activity happens outside the CRM system.

And so the goal of intelligence is to be able to reason about your sales data model not just inside your CRM system, but outside it.  So we are building the relationship assistant, which is going to ship in November as part of Dynamics CRM, to truly transform our CRM application from the inside out.  So when you log in to a CRM system, what you’re going to see are these cards, cards that allow you to take action inside the system based on activity that is happening outside.

So, for example, it’ll know, because of its ability to crawl the web, about changes that are happening with your customers, changes that are happening on LinkedIn to one of your prospects’ job titles.  And so now you can go change the information in the context of things that are happening on the outside.  So the web graph informs your CRM actions.

Similarly with the Microsoft Graph: say you get an e-mail from one of your customers.  Instead of you having to triage, copy/paste, and re-enter it into a CRM system, what if, when you log in to the CRM system, you got to triage your email actions?  A new RFQ request comes in, a new lead comes in, a new opportunity; or one of your colleagues on the sales account team sends you a mail and it gets flagged as an opportunity-risk item.  And lastly, of course, we’ll apply the intelligence to the CRM data model itself, so that you can get alerts around when an opportunity is going to close, or monitor account activity.
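
As an illustration of the card idea, here is a sketch that turns outside signals, web news, LinkedIn changes, and inbound email into action cards.  The rules, sources, and names are all hypothetical; a real relationship assistant would use learned models rather than keyword rules.

```python
# Hypothetical sketch: outside signals -> CRM action cards.
from dataclasses import dataclass

@dataclass
class Card:
    kind: str    # e.g. "news", "title-change", "rfq"
    text: str
    action: str  # the suggested action inside the CRM system

def cards_from_signals(signals):
    cards = []
    for s in signals:
        if s["source"] == "web" and "acquisition" in s["text"].lower():
            cards.append(Card("news", s["text"], "Review account plan"))
        elif s["source"] == "linkedin" and s.get("field") == "title":
            cards.append(Card("title-change", s["text"], "Update contact record"))
        elif s["source"] == "email" and "rfq" in s["text"].lower():
            cards.append(Card("rfq", s["text"], "Create opportunity"))
    return cards

signals = [
    {"source": "web", "text": "Contoso announces acquisition of Fabrikam"},
    {"source": "linkedin", "field": "title", "text": "Lee is now VP of Procurement"},
    {"source": "email", "text": "RFQ: 500 units, need quote by Friday"},
]
for card in cards_from_signals(signals):
    print(card.kind, "->", card.action)
```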

So this is a complete revamp of how one even goes about thinking about a CRM sales module.

A similar approach is what we’re taking with customer service.  Traditionally, what we have done is to build a model of what a customer service agent does: how they open a case, how they escalate a case, how they keep track of all of the workflows that happen within the customer service department.

But really, customer service starts with the customer contact.  So at Microsoft today we have a virtual assistant at support.microsoft.com.  This is live in U.S. English today, and we’re going to expand it to all countries.  So customers come in and interact with the agent.  They ask it questions, and this virtual agent answers those questions.  But, of course, it also runs out of steam and needs to escalate to a real customer service rep from time to time.

And that’s when the real magic starts.  If you go behind the scenes, this is the interface that our customer service reps are using today.  What you have on the left-hand side is the conversational canvas where the customer service rep is interacting with the customers, solving their problems.  But the bot or the assistant is on the right-hand side.  It is, in fact, helping the customer service rep get better.

So this virtual assistant, through a mechanism called reinforcement learning, is not only helping the customer service rep get better; it is itself getting better.  This phenomenon of applying AI to customer service will make your customer service outcomes more efficient and improve your customer satisfaction.
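
A minimal sketch of that feedback loop, with an epsilon-greedy bandit standing in for the real learner: the assistant suggests an answer, the human agent accepts or rejects it, and the acceptance signal updates the policy.  The canned answers and the simulated agent behavior are invented for the example.

```python
# Epsilon-greedy suggestion policy learning from agent feedback.
import random

class SuggestionPolicy:
    def __init__(self, answers, epsilon=0.1):
        self.answers = answers
        self.epsilon = epsilon
        self.value = {a: 0.0 for a in answers}  # running reward estimate
        self.count = {a: 0 for a in answers}

    def suggest(self):
        if random.random() < self.epsilon:            # explore occasionally
            return random.choice(self.answers)
        return max(self.answers, key=self.value.get)  # otherwise exploit

    def feedback(self, answer, accepted):
        """Agent accepted the suggestion -> reward 1, else 0."""
        reward = 1.0 if accepted else 0.0
        self.count[answer] += 1
        self.value[answer] += (reward - self.value[answer]) / self.count[answer]

policy = SuggestionPolicy(["reset password", "check license", "escalate"])
for _ in range(100):
    suggestion = policy.suggest()
    accepted = suggestion == "reset password"  # simulated agent behavior
    policy.feedback(suggestion, accepted)
print(policy.value)  # "reset password" converges toward 1.0
```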

These are simple but profound examples of how AI, in sales and in support, is going to transform Dynamics 365.

So I want to then move to services.  The capability that you see underneath Office 365, Dynamics 365, and SwiftKey is what we want to expose as services, building blocks, so that you can build the same kind of intelligent applications.

The Cortana Intelligence Suite today already is transformative.  Ecolab is using it for water management; Schneider Electric is using it for energy distribution in Nigeria.  LV Prasad Eye Institute is using it to bring affordable eye care in India.  Rolls-Royce is using it for fuel efficiency.  This notion of using machine learning on large amounts of data is already having that transformative effect across every industry, across every country.

And now we’re adding new capabilities to the Cortana Intelligence Suite.  The first is the Bot Framework.  Bots are like applications: just like you build a website or a mobile app today, every business, for every business process, is going to build a bot interface, because it’s a convenient way for users to interact with your information, your data, your processes.

But in order to build a bot, you need these building-block services that have conversational understanding, that know how to parse natural language and how to have a dialogue.  That’s what we’ve now encapsulated in the Bot Framework, so that you can build a bot that is available on Skype, available online, available on Facebook.
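
To make the building blocks concrete, here is a toy dialogue turn, with regex-based intent matching standing in for real conversational understanding.  This is not the Bot Framework SDK; the intents and utterances are invented for the example.

```python
# Toy bot turn: intent recognition + entity extraction + a response.
import re

INTENTS = {
    "get_roster": re.compile(r"\b(show|see|get)\b.*\broster\b", re.I),
    "compare": re.compile(
        r"\bcompare\b\s+(?P<a>\w+\s\w+)\s+(and|vs)\s+(?P<b>\w+\s\w+)", re.I),
}

def handle_turn(utterance: str) -> str:
    for intent, pattern in INTENTS.items():
        match = pattern.search(utterance)
        if not match:
            continue
        if intent == "get_roster":
            return "Here's your roster."
        if intent == "compare":
            return f"Comparing {match['a']} and {match['b']}..."
    return "Sorry, I didn't catch that."

print(handle_turn("Can I see my roster?"))
print(handle_turn("Compare Matt Ryan and Drew Brees"))
```

The point of the framework is that the hard parts, language understanding and dialogue state, come as services, and the same bot logic can then be surfaced on Skype, the web, or Facebook without rewriting it per canvas.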

So, again, we’re taking an approach where any bot you build is not captive to any one conversational canvas.  It is available everywhere.  And since Build, which is when we launched it for the first time, we have had 45,000 developers building these bots, Hipmunk in travel, Star Trek, StubHub, Getty Images, so many, many developers taking advantage of the bot framework already.

So we are now partnering with the NFL on a fun bot.  It turns out the NFL, as you can imagine, has lots of data.  And one of the applications they create is for fantasy football.  It’s the most data-driven application I’ve come across.  So we started experimenting; this is something that we are in the early stages of building, and I hope that by next season we’ll have this bot, which will allow each one of us to engage in a very different way with what all of us, at least in the United States, are obsessed with: fantasy football.

And so what I thought is to really showcase this I’ll invite up on stage someone who knows a thing or two about football, Deion Sanders.  Please help me welcome Deion on stage.  (Cheers, applause.)

DEION SANDERS:  I made it.  Forget the Super Bowl and the World Series, I made it.

SATYA NADELLA:  Deion, we’re going to go to this station.  I know that you’re the only person who was in the Super Bowl and the World Series.  And have you ever tried a real game of cricket, though?

DEION SANDERS:  Cricket, you mean the kind that you fish with?

SATYA NADELLA:  I’ll teach you that.  That’s the one sport you’ve got to learn.  You need to learn it.  So we’re logged in here to your Skype account, and I look at your friends list.  Wow, one of these days I’ll have those kinds of friends.

DEION SANDERS:  No, your friends are doing pretty good.

SATYA NADELLA:  So this is a bot that we built, or we are in the process of building with NFL, that obviously you know a lot about.  And the idea is how can we change that fantasy football interface and make it more fun and more data-driven.  So what it allows you to do is to really compare player profiles, change your roster, or improve your roster.  So maybe what we should do is look at your roster.  Should we do that?

DEION SANDERS:  Yes.

SATYA NADELLA:  Let’s go, let’s click that.

DEION SANDERS:  I like when they talk to you like they know you, welcome back, Deion.

SATYA NADELLA:  That’s conversational understanding.

DEION SANDERS:  I love that.

SATYA NADELLA:  And so here is your roster.  I mean it’s a pretty ‑‑

DEION SANDERS:  Pretty good.

SATYA NADELLA:  Pretty good, I don’t know what 105 points means, is it good?

DEION SANDERS:  It’s OK.

SATYA NADELLA:  So how about we ask for some recommendations.  Let’s see what the bot says.

DEION SANDERS:  Wow, Matt Ryan with Drew Brees, they’re playing tonight.  I like Matt Ryan, but I love Drew Brees, because Atlanta’s secondary isn’t good, but when you think about Drew Brees if they’re losing he’s going to get more opportunities to throw, so he may receive more fantasy points.

SATYA NADELLA:  Do you think the bot is wrong?

DEION SANDERS:  It’s all right.

SATYA NADELLA:  How about let’s compare the players.  Let’s see what the bot knows that we may not.

DEION SANDERS:  Now that’s good.

SATYA NADELLA:  It’s almost saying that Bing predicts the Saints are going to win, and Drew Brees will score more fantasy points.  He’s playing at home and the weather will not be a factor, so you should bench Matt Ryan.

DEION SANDERS:  Well, you know, I played for Atlanta and it’s kind of hard for me to pick Drew Brees over Matt Ryan and I played sports right here.  But I like what they’re saying.  That’s a pretty good summary.

SATYA NADELLA:  All right, let’s do it.  All right, so here is your dream team for this week, I guess.  So that is really the beginning of what can hopefully transform even how a Deion Sanders interacts with a bot and manages a fantasy football league.

DEION SANDERS:  They had more knowledge than I did, because I didn’t even consider that in a dome the weather would not be a factor.

SATYA NADELLA:  There you go.  Thank you so much, Deion, for being here.  It is such a pleasure.

DEION SANDERS:  Thank you.  I appreciate it. (Applause.)

SATYA NADELLA:  So with the Bot Framework, behind these simple conversational interfaces is some of the most sophisticated AI capability for understanding human language.  And that means we will democratize application usage for everyone and everything.  That’s what’s exciting about this new interface.

I want to now talk about another aspect of Cortana Intelligence that is available to developers: the Cognitive APIs.  Again, we launched these at Build, and since then we have had more than a billion API calls to these Cognitive APIs.  First, to put this in perspective, the capability that is on tap, that is one API call away, is some of the world’s leading technology.  Microsoft today holds the world record in speech recognition; the way you keep track of it is the word error rate.  Just two weeks ago we published a new world record, a 6.3 percent word error rate on what is called the Switchboard test.  That’s the speech API that is now part of the Cognitive APIs.
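
Word error rate is just the word-level edit distance between what the recognizer produced and the reference transcript, divided by the reference length.  A small sketch of the standard computation, with invented example strings:

```python
# Word error rate: edit distance over words / reference length.

def word_error_rate(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.split(), hypothesis.split()
    # Standard dynamic-programming edit distance over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[len(ref)][len(hyp)] / len(ref)

print(word_error_rate("switch the board test", "switch the bored test"))  # 0.25
```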

Microsoft also has the world record when it comes to image recognition.  In the ImageNet competition, we deployed a 152-layer deep neural network using a new technique called residual learning, which again has been pretty transformative in terms of its object-recognition capabilities.  That is available as a service.  It’s part of Cognitive Services.

So imagine what you as developers can start doing with this tech.  And developers everywhere are starting to do exactly that.  The first developer I want to talk about is Uber.  Just this week they launched a new app.  They now take selfies of their drivers and recognize and identify them, based on image recognition.  And the reason for that is simple: it’s driver safety and passenger safety.

Let’s roll the video.

(Video segment.)

SATYA NADELLA:  (Applause.)  Sticking with the automotive theme, I want to talk a little bit about the work we’re doing with Volvo.  I mean, Volvo stands for safety.  That’s their brand.

And one of the big issues, long before fully autonomous cars, is distracted driving.  In fact, the devices that we now take into cars are probably going to be a source of a lot of accidents.

And so Volvo has been working to sort of understand how they can recognize distracted driving, and then give driver feedback because they want to be able to design cars that can help drivers not be distracted.

So to do that, they built a simulator using, again, Cognitive Services.  It’s the ability not just to recognize people, but to even recognize emotions and distractions.  Let’s roll the video.

(Video:  Connected Car.)

SATYA NADELLA:  (Applause.)  This next example, perhaps, personifies what is possible now by bringing together the two most-magical technologies of these times — cognitive services and mixed reality.

We’re working with Lowe’s, in partnership with Pinterest, to completely reimagine what retailing could look like.

A home remodel, which I just went through, is a fascinating process.  You go to the store, you get samples, you go back home, you look at them, you go back to the store, and you kind of repeat that process what feels like an infinite number of times.  (Laughter.)

But what if we could, in fact, use the combination of the social signal from your Pinterest board — because, really, the idea for the remodel starts long before you even visit the store.  It’s in your boards on Pinterest.  What if we can take that signal, mix it with the ability to see the remodel before it’s done at the store?  That’s what we’re working on with Lowe’s.

And to really show you this in action, let me invite up on stage Jennifer Stevens from our HoloLens team.  Jennifer?  (Applause.)

JENNIFER STEVENS:  Thank you, Satya.  Microsoft and Lowe’s have partnered to deliver an innovative new approach to the home remodel experience, and it’s in pilot in stores today.

Remodels are big projects and big investments too.  So Lowe’s wants to ensure they deliver a highly personalized experience for all of their customers.

Interestingly, most remodels start before a customer ever walks into the store.  Many of us begin by pinning pictures of our favorite designs on social apps like Pinterest.  These images can be super valuable because they’re a window into our unique style preferences.  But until now, it’s been hard for a retailer like Lowe’s to extract that data.

Let’s see how we can change that with machine learning and the work we’re doing with Pinterest.

As you can see here in the Lowe’s in-store app, there’s a couple, Peter and Senja, who are working on a remodel and they’ve shared their Pinterest images with us.

Now, this looks like a simple app, but on the back end there’s a Cortana Intelligence deep neural network that’s been trained on millions of pictures of kitchens.  When I hit “analyze,” that DNN extracts Peter and Senja’s unique style preferences and matches them to the best fits in the Lowe’s product catalog.

As you can see, I got back a handful of recommendations.  It looks like the strongest match is the Cornerstone, one of Lowe’s featured designs, with 83 percent confidence.
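
The matching step Jennifer describes can be sketched as follows: embed each pinned image with a trained DNN, average the embeddings into one style vector, and rank catalog designs by similarity.  The four-dimensional vectors and the second design name below are stand-ins; real image embeddings would have hundreds of dimensions.

```python
# Sketch: rank catalog designs by similarity to a customer's style vector.
import math

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def average(vectors):
    return [sum(col) / len(vectors) for col in zip(*vectors)]

# Pretend DNN embeddings of the customer's pinned kitchen photos.
pins = [[0.9, 0.1, 0.3, 0.7], [0.8, 0.2, 0.4, 0.6]]

# Pretend embeddings for two catalog designs.
catalog = {
    "Cornerstone": [0.85, 0.15, 0.35, 0.65],
    "Harbor": [0.10, 0.90, 0.80, 0.20],
}

style = average(pins)  # one vector summarizing the customer's taste
ranked = sorted(catalog.items(), key=lambda kv: cosine(style, kv[1]), reverse=True)
for name, vec in ranked:
    print(f"{name}: {cosine(style, vec):.0%} similarity")
```

Cosine similarity is not the same thing as the 83 percent confidence figure in the demo, which would come from the classifier itself, but the ranking idea is the same.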

If I wanted to, I could continue to customize; I’ve got some additional recommended fits here.  I think this is a good start for what I know about Peter and Senja.  But a great customer experience isn’t just about better style matching.  Swatches and printouts can’t show how that new kitchen will really look in someone’s home.

But with HoloLens and the interactive product recommendations from Cortana Intelligence, we can offer an immersive, life-sized experience and help make that no-regrets remodel possible.

Let me invite Peter and Senja to the stage to design their future kitchen using HoloLens.  Now, as they enter the stage, you’re going to see a camera.  And that’s to bring the holograms to everyone in the audience.

Hi, Peter.  Hi, Senja.

SENJA:  Hi.

PETER:  Hi, Jennifer.

JENNIFER STEVENS:  Before we get started, is it OK if we record this session to improve our customer experiences?

PETER:  Yeah, sure.

JENNIFER STEVENS:  Great.  I ran the Pinterest images you shared with me through our style app, came back with a recommendation.  It’s the Cornerstone.  I’d like you to put your HoloLens on now, and let me know what you think.

PETER:  Oh.

SENJA:  Cool.

PETER:  Do you see this?  Those are the cabinets you really like.

SENJA:  They are.  Look at the window, Peter, it’s so real.

PETER:  Oh, yeah.

SENJA:  This is really close to the pictures we picked.

PETER:  This is really close.  Check out the counter.  I like the sample we looked at earlier, but now that I see it full size, I think it might be too dark for our kitchen.

SENJA:  Maybe.

PETER:  Can we make changes?  Maybe try something lighter?

JENNIFER STEVENS:  Sure.  Let me pull up some recommendations right now.  You’re going to see cards hovering throughout the kitchen.  Just tap the ones that you want to change.

PETER:  OK.  I can try matching the counter with the cabinets.

SENJA:  OK.  But I think with this light countertop, we need a different paint color.  This one?

PETER:  That looks great.  I love this color.  What about those tiles?  Can we change them and make them a little higher?

JENNIFER STEVENS:  Sure, that’s the backsplash.  Let me pull up an option that’s been popular with that same cabinet and countertop.

PETER:  Yeah, I definitely like this better.  What do you think, Senja?

SENJA:  I like it too.  This is so much easier than just trying to imagine how things would look and fit together.

PETER:  Totally.  Jennifer, we want to add an island to our kitchen.  Can we try something like that too?

JENNIFER STEVENS:  Sure.  Let me put one up right now.

SENJA:  Sweet.

JENNIFER STEVENS:  Now go ahead and resize it to how you think it would fit best in your kitchen.

PETER:  Well, we like to entertain a lot.  Why don’t we try this at bar height?

SENJA:  I don’t know about this height.  We mostly cook for ourselves.  Let me try to lower it one more time and see.  Maybe here?

PETER:  Yeah.  I think you’re right.  This will give us the extra space that we need.  And I can see myself making cookies here.

SENJA:  Yeah.

JENNIFER STEVENS:  How do you both feel about the rest of the kitchen?

SENJA:  Jennifer, I love it.  I think for me, this is it.  And I can’t believe how much we got done in just one store visit.

PETER:  Yeah, this was great.  So what’s next?

JENNIFER STEVENS:  I’ll e-mail you a recording of this session and your design, you guys can take it home, review it, make any changes that you’d like, and we can finalize from there.

SENJA:  Sounds good.

PETER:  Great.

JENNIFER STEVENS:  Thank you.  (Applause.)

With HoloLens and the power of machine learning from Cortana Intelligence, retailers like Lowe’s can transform their business.  But there’s more we can do.  Do you remember when I asked Peter and Senja if we could record their session?  When we aggregate that data and anonymize it across all of the other customers who have been through similar experiences, we can create a powerful feedback loop.

Here’s the Cornerstone kitchen that we were just visualizing.  Now, when I add the heat map, I’m layering in the HoloLens telemetry data, and I’m getting instant insight into where customers spent the most time looking.

You see the countertops, the cabinets, and even the kitchen sink.  And using a Cognitive Services API, I can add another layer.  These are customers’ verbal sentiments, the actual reactions they had, which give a sense of how well they liked or didn’t like the products they saw in the kitchen.

With this particular kitchen, as I kind of glance through it, it looks like customers really liked it.  But there’s one item, those cabinet knobs, that didn’t seem to be resonating as well.

I can investigate to learn a bit more.  When I click on the cabinet knobs, the word cloud to the top right of the dashboard is updated with the most frequently used words to describe those knobs.  With Cortana Intelligence, finding customer insights like this is easy.

So what you’ve seen is how we’re able to leverage the intelligence and power of the Microsoft Cloud and seamlessly integrate it into a mixed-reality world.  This has been transformative for retailers like Lowe’s and for their customers.  Thank you.  (Applause.)

SATYA NADELLA:  That’s awesome.  I mean, it’s the most harmonious, amicable remodel that you can imagine.  (Laughter.)

And when you think about technology that can change industries like retail, it’s cognitive services, mixed reality, and signals from the web and social that make this possible now, and that’s really what I look forward to seeing in the years to come.

And I want to end with the last pillar, which is infrastructure.  Now we’re talking about the infrastructure that allows you to create intelligence.  Scott talked earlier this morning about Azure: how we now have 34 global regions, how it’s the most-trusted cloud with its compliance, and then the intelligence capabilities.

But the most fascinating thing for me is how we’re able to support, in Azure, the CPU compute fabric at scale, and the ability for developers to use any framework for creating AI — Caffe, Torch or CNTK, which is where we’re innovating, building one of the best-in-class frameworks for creating intelligence.

But it’s not just limited to CPUs.  We now have best-in-class GPU virtual machine support in Azure.  You already see developers like JellyFish using it for computational rendering and Virginia Tech using it for genome sequencing.

But we’re not stopping there.  We are now taking those neural nets, deep neural nets, convolutional neural nets, and asking ourselves, what if we can run them not just on CPUs or GPUs, but on silicon?

And that’s what has led us to build out the FPGAs.  We now have FPGA support across every compute node of Azure.  That means we have the ability, through the magic of the fabric that we’ve built, to distribute your machine learning tasks, your deep neural nets to all of the silicon that is available so that you can get that performance, that scale.

And to show you this, what I believe is the first AI supercomputer in action, I wanted to invite up on stage Doug Burger from Microsoft Research.  Doug?  (Applause.)

DOUG BURGER:  Thank you, Satya.  I have to say, I’m really excited to share this with you today.

As a company, we’ve been on a journey to develop the world’s most-intelligent cloud.  Now, we already have industry-leading capabilities with our Azure GPU offering, which is fantastic for training AI models offline.

But to support live AI services with very low response times at large scale with great efficiency, better than CPUs, we’ve made a major investment in FPGAs.

Now, FPGAs are programmable hardware.  What that means is that you get the efficiency of hardware, but you also get flexibility because you can change their functionality on the fly.

And this new architecture that we’ve built effectively embeds an FPGA-based AI supercomputer into our global hyperscale cloud.  We get awesome speed, scale and efficiency.  It will change what’s possible for AI.

Now, over the past two years, quietly, we’ve deployed it across our global hyperscale datacenters in 15 countries spanning five continents.

So let’s start with a visual demo of what happens when you add this FPGA to one of our cloud servers.

We’re using a special type of neural network called a convolutional neural net to recognize the content of a collection of images.

On the left of the screen, what you see is how fast we can classify a set of images using a powerful cloud-based server running on CPUs.

On the right, you see what happens when we add a single 30-watt, Microsoft-designed FPGA board to the server.

This single board turbocharges the server, allowing it to recognize the images significantly faster.  It gives the server a huge boost for AI tasks.

Now, let’s try something a little harder, using a more sophisticated neural network to translate languages.  The deep-neural-network-based approach we’re using here is computationally much harder and requires much more compute, but it’s achieving record-setting accuracy in language translation.

So to test the system, let’s see how quickly we can translate a book from one language to another.  Now, I picked a nice, small book for this demo, “War and Peace,” about 1,440 pages.

And we’ll go over to the monitor here.  And using 24 high-end CPU cores, we will start translating the book from Russian to English.  OK.

Now, we’ll throw four boards from our FPGA-based supercomputer at the same problem, which uses a fifth less total power.

As you can see — thank you.  (Applause.)  We’re not done.

As you can see, our accelerated cognitive services run blazingly fast, eight times faster, while using less power.  These four boards can translate “War and Peace” in just two and a half seconds.

But even more importantly, we can now do accelerated AI on a global scale, at hyperscale.

Now, Satya is a big reader with a big personal library.  So we thought about translating his library.  But the thought of sneaking into my CEO’s house, scanning all of his books, and getting out without being caught wasn’t so attractive.

So we thought about, what’s a different store of texts that we can translate?  We decided to go to Wikipedia.

Now, English is the largest language in Wikipedia.  It has five million articles, about three billion words.  If printed on paper, it would be a stack a quarter of a mile high.

So if we threw the same four nodes running around eight trillion operations per second at the problem, as you can see, it would take nearly four hours to grind through that quarter-mile stack.

Now, of course, we have this fabric, which is global at hyperscale, so we could easily throw 50 nodes, 50 FPGA boards at the problem, which would take us up to about 100 trillion operations per second, and of course bring the time down further.

But to show you the raw power of this hyperscale AI supercomputer that we’ve embedded in our cloud, in our global cloud, I’ll show you what would happen if we decided to throw most of our existing global deployment at it.

OK.  Let’s go all the way over here and see how fast we could go.  Less than a tenth of a second.  (Cheers, applause.)

We could translate all three billion words into another language in less than a tenth of a second.

Now, some of you may have blinked in surprise when you saw that result.  And you may or may not know that it takes a human being about two-tenths of a second to blink.  So we can actually translate those three billion words, if we threw our whole deployment at it, in less time than it takes to blink once.  That’s hyperscale acceleration.

That crazy speed shows the raw power of what we’ve deployed in our intelligent cloud.

OK, so now what does it mean — you may have noticed on the screen that we’re running at over an exa-op, again, if we threw our whole deployment at it.

What does it mean to be at exa scale?  That means we can run a billion-billion operations per second.  It means we have 10 times the AI capability of the world’s largest existing supercomputer.
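
The arithmetic behind that claim can be checked from the figures quoted on stage: four boards at roughly eight trillion operations per second need about four hours for Wikipedia, and the same work at an exa-op per second collapses to about a tenth of a second.  A back-of-the-envelope sketch:

```python
# Sanity-check the demo's numbers, using only the figures quoted on stage.
four_node_rate = 8e12        # ops/sec for the 4-board run
four_node_time = 4 * 3600    # seconds (~4 hours for Wikipedia, per the demo)
total_ops = four_node_rate * four_node_time  # work to translate Wikipedia

exa_rate = 1e18              # "a billion-billion operations per second"
print(f"{total_ops / exa_rate:.3f} seconds")  # ~0.115 s, under one ~0.2 s blink
```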

It allows us to do things on a scale that hasn’t been done before.  We can solve problems with AI that haven’t been solved before, things that weren’t even possible just a few years ago.

We’re already porting some of our cognitive services to run on this fabric.

Now, this FPGA-based fabric that we’ve built is very flexible.  Azure announced today that they’re using it to provide accelerated networking, an industry-leading cloud network running at 25 gigabits per second with a 10x reduction in latency.

So this fabric that’s been deployed in between our servers and our network can be used to run the world’s most-powerful AI, can be used to run the world’s fastest-cloud network, or both.

So it’s providing leadership for cloud networking and potentially for AI.

When you do a Bing search, you’re already touching this fabric.  Bing ranking is running on it.

While many companies are experimenting with much smaller-scale, bolt-on systems, Microsoft is the first to have its global hyperscale cloud enhanced with post-CPU technology, in this case FPGAs.  It gives us the most-powerful cloud, the most-flexible cloud, and the most-intelligent cloud.  And we’re committed to using it to support and empower you, our partners and customers, as we move forward into the age of machine intelligence together.

Thank you.  (Applause.)

SATYA NADELLA:  Thank you very much, Doug.  Hopefully that gives you a feel for our ambition to democratize AI.  From what we’re doing with Cortana and its skills, infusing Office 365 and Dynamics 365 with intelligence, the cognitive services, the bot framework, the machine learning analytics for every developer, and then building out this first AI supercomputer.

But I want to close where I started.  It’s never about our technology.  It is really, to me, about your passion, your imagination, and what you can do with technologies that we create.

What societal problem, what industry will you reshape?  That is what we dream about, what we’re passionate about.

We want to pursue democratizing AI just like we pursued information at your fingertips.  But this time around, we want to bring intelligence to everything, to everywhere, and for everyone.

Thank you all very, very much.  Thank you.  (Cheers, applause.)

END