Scott Guthrie: Build 2015

Remarks by Scott Guthrie, executive vice president, Cloud & Enterprise, on April 29, 2015.

SATYA NADELLA:  To kick things off, let me have Scott Guthrie come up to stage and talk about our cloud platform.  Thank you very much.  (Cheers, applause.)

SCOTT GUTHRIE:  (Applause.)  Thank you.  Well, good morning, everyone, and welcome to Build.

Satya just talked about our three core ambitions as a company.  I’m going to now continue the conversation and talk about the work we’re doing to build the intelligent cloud.

This is a broad ambition that spans a wide range of cloud services at Microsoft including Office 365 and Dynamics.  Today, though, I’m going to talk about the work we’re doing to power this ambition using Azure.

Azure is Microsoft’s cloud computing platform and enables you to move faster and do more.  Azure is a hyper-scale cloud platform, and over the last several years, we’ve built out Azure to run all over the world.

The circles on this map indicate Azure regions, which are made up of clusters of datacenters where you can run and deploy your code.

Today, we have 19 unique Azure regions open for business.  That’s more locations in more countries than both AWS and the Google Cloud combined.  And this enables you to run your applications closer to your customers and employees than ever before, and compete in even more geographic markets.

We continue to invest billions of dollars each year building out new infrastructure, and our cloud platform now manages more than 1 million servers.  This enables you to build apps without having to worry about your cloud platform’s capacity, and enables you to scale your solutions to any size.

Azure offers the choice and flexibility of a full-spectrum cloud.  It enables you to build apps that scale from personal projects up to global-scale applications.  You can start from scratch for new greenfield apps, or you can easily leverage your existing investments and skills.

It’s open: it supports targeting multiple devices and using multiple operating systems, programming languages, frameworks, data services and tools.

We allow you to put all of your app in Azure, or support hybrid deployments that span on-premises or other clouds.  You can choose to use Azure just for infrastructure, just base compute, storage and networking, but you can also take advantage of a coherent set of highly engineered services to build your apps even faster.  And the combination of our cloud platform and tools delivers unmatched productivity and enables you to move faster and be even more successful.

So it’s been a busy 12 months for the Azure team since our last Build Conference a year ago.  We delivered more than 500 new Azure services and features in that time and greatly expanded the footprint and capabilities of what Azure delivers.

What we’re seeing is that as we’ve expanded our feature set and our footprint around the world, the usage of Azure continues to rapidly grow.

We have more than 90,000 new Azure customer subscriptions now being created every month.  And those customers are creating some really amazing applications.

More than 1.4 million SQL databases are now being used by applications hosted inside Azure.  We have more than 50 trillion objects now stored in the Azure storage system, 425 million users in our Azure Active Directory system, 3 million developers registered with our Visual Studio Online Services, which is our online suite of developer SaaS offerings.

And we’re seeing the growth and traction of Azure happen not just in the enterprise and business user space, but more than 40 percent of our Azure revenue and usage now comes actually from startups and from ISVs.

Behind these large momentum numbers are some truly amazing customer stories.  This slide contains some of the logos of just a few of the companies doing great things in Azure.  And throughout today’s keynote, you’re going to hear me talk more about some of their specific stories and some of the reasons why they’re using Azure.

One of the things we know is that a cloud platform must offer the flexibility of choice.  Azure enables you to use the best of the Windows ecosystem and the best of the Linux ecosystem together.  Azure enables you to easily reuse the skills you already have, regardless of the programming languages you use.  And we allow you to take advantage of both core infrastructure capabilities and higher-level platform services.

Microsoft is unique in that it’s the only hyper-scale cloud provider that architects those higher-level services and programming-model capabilities to run in multiple environments, including both on-premises datacenters that you already have, as well as other clouds besides our own.

From the beginning of Azure, we’ve embraced open source in a very deep way.  And over this past year in particular, we’ve taken a number of significant steps to use open source even more broadly.  Last year, we announced our plans to deeply embrace Docker and the Docker ecosystem with both Azure and Windows Server and make containers a fundamental part of our application platform.

We’ve also open-sourced our core .NET runtime technology and announced our plans to support it on multiple platforms, as well as, obviously, make it first-class when running within containers.  And this combination is really powerful and enables even more developer flexibility and choice.

What I’d like to do to kick things off is to actually invite Ben Golub and Mark Russinovich on stage here to talk about this work and show off a pretty cool demo of them in action.  So here’s Ben.  (Applause.)

BEN GOLUB:  Thank you, Scott.  So if any of you are not aware, Docker has just celebrated its second birthday.  So frequently we’re used to thinking of Docker kind of like a human 2-year-old.  Occasionally stumbles, occasionally spits up and keeps those of us who are closest to it up at night.

And so it’s sort of surprising to be up here on stage at Build.  And, in fact, I’d like to talk about the five big surprises that we’ve had in working with Microsoft that all in many ways relate to this notion of how we can take openness, portability and flexibility and use it to empower developers.

Now, if you’re not familiar with Docker, it was started two years ago by Solomon Hykes, who felt that developers fundamentally are spending far too much of their time doing rework, worrying about dependencies, worrying about servers, instead of just building awesome apps.

And the initial innovation was, let’s make it possible to take any application and its dependencies, package them up in the digital equivalent of a shipping container, and enable any application that’s dockerized to interact well with any other dockerized application and run on any server.

Now, last October when we went up to meet with Microsoft, we were pretty proud of the fact that we’d gotten that working for Linux.  So you could take any Linux application, dockerize it and run it on any Linux server.

Quite frankly, we thought we’d leave Redmond with an agreement to make Docker for Linux run well on Hyper-V or run well in Azure.  But then we had the first of those big five surprises.

And the first big surprise was that we were not only going to work on Docker for Linux, we were going to work on Docker for Windows.  And so we’re going to enable the 4 million Docker Linux developers to join all of the millions of Windows developers and make it possible using standard Docker, using open source, to take any Windows application, dockerize it and run it on any server.

Second great surprise we had was this was not only about code, it was also about content and collaboration.  Now, if you’re familiar with Docker, you know we have a service called Docker Hub that has about 120,000 dockerized apps.

Microsoft has not only integrated Docker Hub into all of its developer platforms, but it’s contributing actively to Docker Hub: not just dockerized Windows components, but all of these great new open-source .NET and .NET-for-Linux components, delivered, again, as Docker containers through the hub.

Third great surprise we had was that Microsoft embraced the notion of what you need to do to make multi-container apps work.  And they embraced the notion that if you want to make multi-container apps truly portable, you have to be open about how you do orchestration as well.

And so we’re so thrilled that Microsoft has embraced Docker’s open orchestration initiatives, Compose, Machine and Swarm.

Fourth great surprise — and yes, yes, there’s more — you combine those three things, you’re not only going to be able to make great multi-container Windows applications, you’ll be able to mix and match Windows containers and Linux containers.  Which means you’ll be able to use a Linux back end and a Windows front end or vice versa, basically choose the best damn tools for the best damn application that you want to build and run it on any server.

And the fifth, and perhaps greatest, surprise that we’ve had working with Microsoft is that this happened quickly.  And everything that I’ve spoken about is either live or working well in the lab right now.

And so without further ado, I’d like to bring Mark Russinovich, CTO of Azure, on board to show you what happens when you combine these great platforms together and, again, empower developers.  Thank you.  (Applause.)

MARK RUSSINOVICH:  Thanks, Ben.

BEN GOLUB:  And, Mark, I love the shirt.

MARK RUSSINOVICH:  Thank you.  Great shirt.  Good morning, everybody.

All right, so what I’ve got right in front of you here is an ASP.NET 5 application, Fabrikam.  And it’s an e-commerce site.  It’s kind of a standard site, and I’m going to be doing some dev-test on it.

The first thing I’d like to do is deploy it to a Docker container running on a Windows Server machine.

On this dev-test box, I’ve got a virtual machine running Windows Server with Docker container support.  And I’m going to take a look at the containers that I’ve already got running in that virtual machine by executing the Docker PS command, which lists the active containers.

You can see that I’ve got two containers active.  One is a Minecraft server, and the other one is a node.js application, which shows that Docker containers on Windows support many different runtimes and languages that you might be using.

Now, what I’m going to do is package up the Fabrikam application so that I can also deploy it into a container, and I’m going to use several Docker commands here, one called Docker build, which will package it up.

I’m going to tag the image that’s produced with the name Fabrikam so it’s easy to remember.  And then go ahead and package that.

And this is going to be executing a series of steps which take that ASP.NET 5 application and package it up in a dockerized image.  And the next step is to run that container with that image.  So I’ll type “docker run -it” to make it interactive, and you’ll see why I’m doing this in a second.  Oops, I forgot the docker command.  Docker run and Fabrikam.
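For readers following along, the build-and-run sequence Mark describes might be sketched with a Dockerfile along these lines.  The base image, file layout and entry point here are illustrative assumptions from the ASP.NET 5 preview era, not taken from the demo itself:

```dockerfile
# Hypothetical Dockerfile for the Fabrikam ASP.NET 5 app.
# Base image and entry point are assumptions for illustration.
FROM microsoft/aspnet
COPY . /app
WORKDIR /app
ENTRYPOINT ["dnx", "-p", "project.json", "web"]
```

Building and tagging the image would then correspond to `docker build -t fabrikam .`, and running it interactively to `docker run -it fabrikam`, matching the commands shown on stage.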

Now, the reason that I’ve had it be interactive is so that we can print out the IP address so I can see the IP address of the website that’s going to launch that Fabrikam application.

If I go back and do a Docker PS at this point, you’ll see that that Fabrikam image has been launched into a container here.

Now, one of the really fun parts about using the Docker client is that if you don’t specify a name for your container, it comes up with a random one.  So let’s just see which names it’s come up with.  It called this one Tender Ilion (ph.).  Oh, and look at that.  What are the odds of that coming up like that?  (Laughter.)

So let’s go take a look at that website now, which is going to be active.  I’ll launch IE, enter that IP address, and there we go, .NET running in a Windows Server container, executed with Docker.

But I’m showing you a lot of technology that some of you might not be familiar with.  So just a quick raise of hands.  How many people have heard of Linux?  (Laughter.)  OK, for those people watching online, about a third of the audience raised their hands.  (Laughter.)  Just kidding, everybody raised their hands, of course.

But what I’m going to show you is, Scott mentioned that we’ve got .NET support on Linux and we’ve integrated it with Docker containers as well.

So here I’ve got the same exact application.  And this time, I’m going to SSH into a Linux virtual machine and run the Docker PS command.  You can see I’ve got no containers running in that virtual machine.

Instead of using Docker run to deploy it, I’m going to deploy it right from Visual Studio using the publish command.  Right here, I’ve configured it to launch to that Docker host.  Here’s the IP address of that host.  When I press “publish” here, it’s going to package it up and execute the same Docker client commands to deploy that Fabrikam application onto Linux.

If I go back and type “Docker PS” again, you’ll see there’s Fabrikam latest, and there’s the name Prickly Wozniak, so it’s always fun to look at the names Docker comes up with.  (Laughter.)

Let’s go see if that website has come up.  And, sure enough, there it is.  And you can see that that’s the same IP address as the SSH command, so I’m looking at the Docker container running on that virtual machine.

But now there’s more.  I’m doing dev-test of this Fabrikam application.  And I’d like to continue to use Visual Studio to do that, and I can, now that container debugging is integrated into Visual Studio.

So I’m going to go to Server Explorer, go to virtual machines.  Here’s a Docker VM, and there’s the Prickly Wozniak container.  I attach the debugger and just as a normal Visual Studio debugging experience, you can see, sure enough, there’s the CoreCLR for Linux underneath this application that I’ve just attached to.

Now I want to set a break point here on the function that I’m debugging.  I go back to the website, hit refresh, and I’m at the break point.  Debugging .NET in Linux deployed into a Docker container.  Thank you very much.  (Applause.)

SCOTT GUTHRIE:  Thanks, Mark.  So we’ve been developing the cross-platform version of the .NET core runtime in the open on GitHub the last several months.  And I’m excited today to announce that we’re also releasing now a preview edition of the runtime in pre-compiled binary format that you can download and start using for Windows, for Linux, as well as for Mac.

Now, it’s an incredibly exciting time in the world today.  Disruption is changing the economy in a fundamental way, not just in technology, but in traditional service industries as well.  You know, who would have predicted five years ago that both the taxi and the hospitality businesses would be turned upside down?  And yet, innovative companies like Uber and Airbnb are doing just that.

Every organization is looking for ways to engage customers better, empower their employees more and ultimately transform the value they deliver.  And developers and the applications and solutions that they build play an absolutely integral role in enabling this.

In fact, it’s increasingly digital apps and technology that enable customers to delight their customers and differentiate from one another.  And it’s never been a better time, I think, than right now to be a developer.

Azure enables you to embrace this transformation and make you and your teams even more successful.  With Azure, we deliver a rich set of higher-level services that provide fully engineered solutions that enable you to move faster and achieve more.  And the combination of our cloud platform and tools really delivers unparalleled developer productivity as you do this.

You know, 3M is a customer who provides an excellent example of what Azure productivity can really deliver.  3M is one of the largest companies in the world.  They have 88,000 employees and literally deliver tens of thousands of products.  And 3M is using Azure to build the back ends for many of these products now.  And they love the agility that Azure provides them to move quickly and deliver even more value.

And one of the initial projects they hosted on Azure was an internal mobile app that they wanted to provide to their sales force to sell a new offering that they’d just actually come out with.

And a small development team that actually had never used Azure before signed up on a Friday.  Basically spent the weekend building the app.  And by the end of the weekend, had a fully production-ready version deployed on Azure ready to roll out to their sales force on Monday.

And this ability to quickly create robust, enterprise-ready applications with super-fast development cycles has really provided 3M with incredible business agility and helped them move even faster as an overall organization.

One of the great productivity solutions Azure delivers is our Azure App Service.  This is a really powerful offering that we released just last month and provides a suite of capabilities that enable you to quickly build and scale both Web and mobile applications anywhere in the world.

App Service allows you to write back-end code logic using .NET, Java, Node.js, PHP, and Python.  You can easily enable continuous integration workflows to it from online source repositories like Visual Studio Online, GitHub and Bitbucket.  And that enables it so that every time a developer checks in code into one of those services, Azure can automatically build it, test it, deploy it to a private production slot.  And if everything is looking good, then basically turn it on to production and do full monitoring and analytics on top of it.

Using Azure’s built-in auto scale capability, you can set up rules so that once you deploy the application, your application can automatically scale up based on incoming traffic.  In fact, if you get a massive influx of traffic, you can basically spin up a huge number of instances and handle basically any amount of load.  And when your traffic drops, Azure can basically automatically shut down those resources to help you save money and pay only for what you actually need.
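The auto-scale behavior described above can be pictured as a declarative rule.  The fragment below is loosely modeled on Azure's autoscale settings of that period; the property names, thresholds and instance counts are illustrative assumptions, not the literal schema:

```json
{
  "profiles": [{
    "name": "cpu-based-scale",
    "capacity": { "minimum": "2", "maximum": "10", "default": "2" },
    "rules": [{
      "metricTrigger": {
        "metricName": "CpuPercentage",
        "timeWindow": "PT10M",
        "operator": "GreaterThan",
        "threshold": 70
      },
      "scaleAction": {
        "direction": "Increase",
        "type": "ChangeCount",
        "value": "1",
        "cooldown": "PT5M"
      }
    }]
  }]
}
```

The idea is exactly what Scott describes: when average CPU exceeds the threshold over the time window, instances are added; a mirror-image rule scales back down when traffic drops, so you pay only for what you need.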

One of the cool features we’ve added recently to the Azure App Service is enterprise connectivity.  This allows you to set up a virtual private network that connects back to an on-premises enterprise network, so you can access apps and data that you wish to make available in the cloud.

This makes it really easy to integrate with existing enterprise solutions you already have.  And Azure App Service also now provides support for easily integrating data and functionality from popular SaaS solutions like Office 365, Dynamics and Salesforce.  And you can use the Logic Apps functionality within App Service to create long-running workflows to automate business processes.

What I’d like to do is invite Scott Hanselman here on stage to show off our Azure App Service and what you can build with it.  (Applause.)

SCOTT HANSELMAN:  Hi, friends.  You know, we’re really excited about the maker movement and things like Raspberry Pi and 3-D printers.  You really should pick up one of these, these are great.

And one of the applications that we’ve brought here, the Fabrikam application, is a maker’s space.  And they sell things like Printrbot 3-D printers and Netduino Wi-Fi and things like that.  But they also offer a service where I can go and upload a 3-D print.  I’ve got a 3-D object here in 3-D Builder, which you can go and get in the Windows Store.

So I’m going to upload him into this application that is running in Azure App Service.

And we bring him up and then hit upload.  This modern application, there we go, you’ve actually got a little JavaScript three.js there showing me the model.  It’s going to tell me where I can pick it up and when.  And this is now going to go through a process.  It’s talking to a back-end order service, the one that we saw.

And let’s switch over into Visual Studio and see how we’ve built that.  Now, this is that ASP.NET 5 application using the .NET CoreCLR and using modern techniques that is talking to the back end.

This order details back end is running within the context of Azure.  And we’re using HTML5, CSS3, all the kinds of modern things that you’d expect.  And we’re actually bringing in Bower and NPM inside of Visual Studio.

And because I’m a Web developer, I want to use my Web development skills to make other applications.  So we also have the back end for the administrator.

So when someone at the store gets that 3-D print, we can click on the order.  They’ve got access to that print.  So they’re using this application on a Surface to talk to the same order API that the website does.

We’ve got built into Visual Studio the tools for Apache Cordova that are also using HTML and JavaScript, and we’re reusing some of that JavaScript as well.  And I can go and generate mobile applications.  I can certainly use C++ or tools on Xamarin.  In this case, tools for Apache Cordova have allowed me to create a mobile app so I can go and make purchases from this store and also check on my order.

You maybe have seen the Windows emulator that runs on Hyper-V that comes with Visual Studio.  But you may not be familiar with the Android emulator.  This is an Android emulator that comes with Visual Studio, and I can go and check on my order from that Android emulator running under Hyper-V at full speed.  It’s all part of the Apache Cordova tools that are built in here, which also include debugging and a really amazing experience for people making mobile applications on any platform.

Let’s switch over to Visual Studio Online and see how we’re managing this code.  Now, in this case, our team is working on a Git repository that’s running in Visual Studio Online.  You can see John Galloway is making some changes here.

But not only are we doing source control within VSO, but we’re also managing our backlog.  So we can keep track of what’s working and what’s not working on this Kanban board.  We’ve got full support for work items.  But more interesting to me is the workflow that the team is going through.

We’ve got the build running in Visual Studio Online as well.  So this ASP.NET 5, .NET CoreCLR application is built in VSO and then deployed out to the applications in Azure App Service automatically.

Now, if I switch over to Azure, I can see that application in the API host here.  And when I move that into production, we can click within Azure itself and see the deployments here as well.  So I can see a deployment that John put out on Sunday, click on that, and, through that integration with Visual Studio Online, see his commits and whether that build succeeded.

So I check into Git, Visual Studio Online runs the build, runs the tests, makes sure that everything is OK, and sends it out into Azure.  Once it’s in Azure in production, I can also scale it.  I can automatically scale it.  Coming in here to scale by CPU, I can set the number of instances and a target CPU range.  So if the CPU goes above a certain amount, the application will scale.

So I’ve got the full life cycle from Visual Studio all up through production.  And now this application in production scaling, I’m going to want to get insights into what’s going on.

Now, you would expect to have great Web insights.  We’ve got Application Insights showing us all sorts of details.  We have dependency tracking, all the way down to a single line of code.

So from Visual Studio and Azure, you’ve got the best DevOps tools to allow teams to create both Web and mobile apps, and with our recent integration of HockeyApp, I’ve got the same deep insights for mobile applications.

So with those insights integrated into Azure, you can see here the mobile app crashes, events, and sessions by country.  I can go all the way down to a single line of code.  And here’s actually the iOS version of this application.  And I can go and see specifically what worked and what didn’t down to an exact line, in this case of Swift code.  So it’s a 360-degree view of any platform, from the highest level all the way down to the lowest level, to a single line of code.

So this is a lot to take in.  Let’s step back for a second and get an understanding about how this fits in.  We’ve got Web apps, we’ve got mobile applications, those are talking to APIs in the back end.  When I submitted my order, it went to this order API written in ASP.NET 5 and the CoreCLR.

Those API apps can be written by me, or they can be written by a third party or brought in from our marketplace.  But a really interesting thing is this concept of logic applications, because there are business processes to automate: when that order gets completed, for example, I want to get notified as a customer about that.

So let’s take a look at what a logic application looks like.

So let me switch over.  Here we go.  So this is an application here where we’ve got individual API apps.  The SQL Connector is going to be looking at the data for the orders, and then it flows through this workflow from one API up to another using a technology called Swagger, where that metadata is going through the system into Salesforce.

And then in this case, we’re posting a message to Slack.  I can bring in that order API that we created in ASP.NET and then get the details for that order.  You see that that actually got populated using the Swagger details.  And then it knows about previous things within this workflow.

This is a reliable, durable workflow in the cloud.  And now I can click on Twilio and bring in the Twilio connector and then send a message to let me know when this order is completed.

So here I can access any aspects of what is flowing through the system.  In this case, I’ll look at the phone number, and we will have the “from” number, the “to” number, and then under text here, we’re going to have “Hi [Salesforce first name], your order is ready.”  This is the kind of common business process that you don’t necessarily want to write in code because it changes a lot.

And this is all happening in the cloud in this business orchestration service called Logic Apps.  And you can integrate this with any existing application running on any language, not just .NET, Node or Python; anything that can create a Swagger description can do this.
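To make the Swagger role concrete: the metadata Logic Apps reads is just an API description document.  A minimal sketch for the hypothetical Fabrikam order API might look roughly like this (the path, operation and fields are invented for illustration):

```json
{
  "swagger": "2.0",
  "info": { "title": "Fabrikam Order API", "version": "1.0" },
  "paths": {
    "/orders/{id}": {
      "get": {
        "operationId": "GetOrderDetails",
        "parameters": [
          { "name": "id", "in": "path", "required": true, "type": "string" }
        ],
        "responses": {
          "200": { "description": "Order details, including status and customer contact info" }
        }
      }
    }
  }
}
```

Because the operations, parameters and responses are machine-readable, the Logic Apps designer can surface them as workflow steps and pass outputs from one API as inputs to the next, which is what the on-stage demo shows with the Salesforce, Slack and Twilio connectors.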

Now, I think this is pretty amazing.  But I feel like, perhaps, I haven’t sufficiently blown your minds, and I promised a number of people that I would do that.

So let’s take a moment and then switch over to our Macintosh here.  Now, on this Mac, you can see that I can load up Visual Studio Online, I can manage my code and write my cross-platform .NET applications.

And I can do that using any one of the many editors that one would use on a Mac, like Emacs, Sublime and things like that.  But, inevitably, I find myself going into Spotlight on a Mac and just wishing that there was a member of the Visual Studio family that I could somehow run.  And now there is.  (Cheers, applause.)

I’m really, really happy to announce Visual Studio Code.  This is a code-optimized development tool that runs natively on Windows, Mac and Linux.  It supports dozens of languages out of the box, and it’s a great application for all kinds of code-focused things that you’re going to do.

But it’s not just a simple editor.  It has deep insights into what’s going on.  This is the same Fabrikam order details application.  Here is a code peek, I can see the references.  I’ll go into order details.  From here, I can hover over individual objects, get information about them, and help.  And I get real IntelliSense, not just auto complete.  In this case, powered by the open-source tools of Roslyn and OmniSharp.  So this is pretty freaking amazing.

I can also do things like this.  We’ll bring in a little refactoring here, make a change.  Save this, and I’ve got Git and diffs and I can push it to Visual Studio Online where it will then run through the process, be built, tested, deployed and scaled to Azure, which is pretty fantastic.

That’s Visual Studio Code, but this is still not really blowing my mind.  I feel I can do even better.  So why don’t we take a moment and switch over to Ubuntu?  (Laughter, applause.)

So now I’m in this application.  In this case, I’m using a Mono application, and big thanks to the folks at Mono who helped us with Mono 4.0.1, which this demo is running on.  I can now click on debug and launch an interactive debugging session.  So now I’m on Ubuntu in a .NET app doing a debug session.  I’ve got local variables, I’ve got a call stack, I’ve got break points, all in this great, lightweight code editor, Visual Studio Code.

This, I think, makes the point that Visual Studio is now a family of tools for every developer and Azure is the cloud back end for everyone.

And I just got a text message on my Band that my 3-D print is being delivered.  Oh, how nice.  Thank you.

SCOTT GUTHRIE:  There you go.

SCOTT HANSELMAN:  That’s great.  Now my Scott Gu action figure has a friend.  (Laughter, applause.)  Thank you, sir.  (Applause.)

SCOTT GUTHRIE:  Well, thanks, Scott.  That was kind of a great example of the type of end-to-end productivity that the combination of Azure and our Visual Studio family now provides.  You can see, as Scott mentioned, really a 360-degree view from coding to deployment to testing to source control management to running and scaling your application.  It does it all.

You also saw our newest addition to the Visual Studio family, which is Visual Studio Code.  Visual Studio Code is a code-optimized editing environment.  It’s small, fast and provides both IntelliSense and debugging support.  And as you saw, it has full integration for Git.

And one of the great things is that it can be used on Windows and Mac, as well as on Linux development machines.  And I’m really excited to announce that we’re going to be making it available for free.  (Applause.)

Even better, we’re pleased to announce that later today the first download of it will be available on all those platforms for you to start using.  (Applause.)

Now, our mission with Visual Studio is to provide best-in-class tools for every developer.  And we now have tools for developers who like lightweight, code-optimized editors as well as for those looking for a full enterprise development IDE.

And with Visual Studio Online, as Scott showed, we now provide a full suite of developer services that run in the cloud that deliver a full application life-cycle management experience including source control, bug and work item tracking, continuous integration builds, load testing, performance, crash analytics, and a whole bunch more.

What’s great is you can take advantage of all these services directly from any of our Visual Studio development tools, or you can also use them from any other development tool as well.  You know, this really enables you to build amazing applications in this mobile-first, cloud-first world.  And we’re really excited to talk more about this in the breakout talks and all the great features that are coming in Visual Studio this year.

Mark Andreessen wrote an article a few years ago where he talked about how software is eating the world and about how companies will increasingly deliver their value through online services.

Today, more than 90 percent of all technology companies are now delivering or building online SaaS services to deliver more value and better connect with their customers.

And many of our biggest customers using Azure today are using it to build and deliver SaaS services to their customers.  This includes both born-in-the-cloud companies as well as enterprises that are reinventing themselves by becoming technology firms.

AccuWeather is a great example of one of those companies.  They’re one of the largest weather forecasting services in the world.  They were originally actually an AWS customer and then switched to Azure to power their weather service.  They process more than 6 billion data requests each day through their APIs for temperature and weather forecasts.  And they use Azure for weather prediction as well as to power all of their iOS, Android and Windows apps.

DocuSign is another great example.  They provide an electronic signature SaaS service.  They’ve been growing at 300 percent year over year for seven years now, and they now service more than 120,000 enterprise customers.  And they’re a fantastic example of a successful startup, and their valuation is now in the billions.

GE Healthcare is working to provide their customers, who are doctors and hospitals, easier ways to collect, analyze and report healthcare data.  And they deployed multiple SaaS applications on Azure while meeting the most stringent regulatory environments.

This week at Build, we’re going to talk about a number of new technologies that we’re delivering that enable you to build great high-scale SaaS services like these.  One of the technologies we’re going to talk about is something we call Azure Service Fabric.

Service Fabric is a high-control, distributed computing framework.  We created it to power our own high-scale cloud services, and we’ve battle-hardened it over the last several years under extreme loads and super-demanding requirements.

It supports the ability to create cloud services composed of both stateless and stateful microservices.  And it has support for hyper scale-out deployments, self-healing and core management, as well as the orchestration of code updates.

And we’re going to release the Service Fabric SDK to make it available on both Windows as well as Linux systems.  The first download of that will be available this week.

And in addition to supporting Azure, you’ll also be able to use it to build great solutions that run in a multi-cloud environment.  We’re going to have a great set of talks later today that will go into more detail on it.

Now, when you think about building great SaaS applications, one of the things that’s sort of fundamental to building those is to be able to do great data management.  And Azure supports a wide variety of data storage and management solutions.

You can use pretty much every commercial or open source data management offering on Azure.  And Microsoft also optimizes and delivers both relational and NoSQL data solutions as highly available managed services that you can basically deploy and use.

Our SQL Database offering is one of our most popular offerings.  It’s a cloud-native database as a service.  It’s highly available, durable and fault tolerant, and it enables you to store and manage data without having to worry about infrastructure, patching, software updates or backups.  That’s all built in as part of the service.

You can provision a SQL Database in just seconds and run it literally anywhere around the world.  And you can set up a SQL Database to actively geo-replicate transactions across multiple Azure regions while you’re doing that, enabling automatic failover support within your applications as well as geographic scale-out.

We’ve added some great new features to SQL Database over the last few months like elastic scale support, which enables you to shard and scale out SQL databases to be petabytes in size, and in-memory columnstore support, which gives you 100X performance improvements for many core data scenarios.

And because, again, SQL Database is offered as a managed service as opposed to just a database inside a VM, developers are able to host, scale, and use databases in a true cloud-native way.

In fact, up to 160,000 SQL databases are now created or dropped every single day using Azure for a wide variety of different scenarios.

What I thought I’d do is actually just talk about and show a nice video of one of the customers that’s using SQL Database.  They actually have over 10,000 SQL databases running on Azure today.  And this is ESRI, who is the leading provider of geographic information services software.  So what I’d like to do is roll the video and talk about how they’re taking advantage of it.

(Video segment:  ESRI)

SCOTT GUTHRIE:  (Applause.)  I’m excited to announce today a number of great new enhancements to our SQL Database Service that you can start using immediately.  These include TDE support, which provides the ability for you to ensure and control encryption-at-rest policies within your databases.  Full text search support, which enables you to run text queries against your data.  And a new capability I’m particularly excited about called Elastic Database Pool, which enables you to better manage lots of databases at even lower cost, and is perfect for SaaS solutions.

One of the common challenges for developers who are building SaaS-based applications that serve lots of B2B customers is architecting the right data model.  One approach is to use a shared database where you have lots of customers sharing the same tables of data.  This is good from a resource cost perspective, but it introduces high operational complexity into your application.

Now, another approach that many SaaS apps use is to maintain a separate database per B2B customer.  This provides great data isolation and ensures that no two companies’ data is ever stored in the same database, but in the past, it’s typically been more expensive since each database uses a separate set of resources.

Today, we’re really excited to introduce our new Elastic Database Pool support, which gives you the best of both worlds.  Elastic Database Pool enables you to maintain completely isolated databases for each of your B2B customers, but then allows you to aggregate all the resources necessary to run those databases into a common resource pool.

This allows you to smooth out the peaks and dips of different databases within your SaaS app and allows you to end up using far fewer resources to run your application.  And the end result is that it saves you lots of money while also reducing the overall operational complexity of running your SaaS solution.
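
To make that resource math concrete, here’s a small illustrative Python sketch (not the Azure API; the workload numbers are hypothetical) of why pooling helps: databases that peak at different times can share a pool sized for the combined peak rather than the sum of the individual peaks.

```python
# Illustrative only: why an elastic pool needs far less capacity
# than provisioning every tenant database for its own peak.

def provisioned_isolated(workloads):
    """Capacity needed if every database is sized for its own peak."""
    return sum(max(usage) for usage in workloads)

def provisioned_pooled(workloads):
    """Capacity needed if all databases share one elastic pool:
    the peak of the combined load across all time steps."""
    combined = [sum(step) for step in zip(*workloads)]
    return max(combined)

# Hourly usage for three tenant databases that spike at
# different hours of the day (made-up numbers).
tenants = [
    [5, 90, 5, 5],   # tenant A peaks in hour 1
    [5, 5, 90, 5],   # tenant B peaks in hour 2
    [5, 5, 5, 90],   # tenant C peaks in hour 3
]

print(provisioned_isolated(tenants))  # 270: sum of individual peaks
print(provisioned_pooled(tenants))    # 100: peak of the combined load
```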

What I’d like to do is invite Lara on stage to show this off in action.  (Applause.)

LARA RUBBELKE:  Thank you.  Hi.  We’re going to walk through some great new updates with Azure SQL Database.  And SaaS developers are going to love this stuff.

As Scott just talked about, you can let Azure dynamically manage your database resources using an elastic database pool.  And the best part is, this can also save you money.

When you first create a pool, one of the decisions you need to make is what databases should you even put in the pool?  We’ve removed a lot of this guesswork.  We’re using machine learning behind the scenes to produce a list of recommended databases to go into a pool.

We’re going to take that list of databases and we’ll just add them.  We want these databases to dynamically use a pool of resources.  So we have these visualizations that help us properly size what that pool should look like based on the databases that we’ve selected.

But what’s also very important is that you can specify the minimum and maximum performance characteristics of all of your databases.  Basically, what this means is any database can have a spike in performance, but they’re not going to end up being a noisy neighbor to any of the other databases in the pool.
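
A rough sketch of how such min/max guarantees work conceptually (purely illustrative; `allocate` and its parameters are made up, not the actual pool scheduler): each database’s share of the pool is clamped, so one tenant’s spike can’t starve the others.

```python
# Illustrative sketch: per-database min/max caps prevent a
# "noisy neighbor" from consuming the whole shared pool.

def allocate(demand, pool_capacity, per_db_min, per_db_max):
    capped = [min(d, per_db_max) for d in demand]   # enforce max cap
    capped = [max(c, per_db_min) for c in capped]   # enforce min guarantee
    total = sum(capped)
    # If capped demand still exceeds the pool, scale everyone down.
    scale = min(1.0, pool_capacity / total) if total else 1.0
    return [round(c * scale, 1) for c in capped]

# One noisy tenant demands 500 units; a max cap of 80 protects the rest.
print(allocate([500, 10, 10], pool_capacity=100, per_db_min=5, per_db_max=80))
# [80.0, 10.0, 10.0]
```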

Now you don’t have to worry about what size you’re going to give your database — S0, S3, S1 — just let Azure dynamically figure that out for you.  You’re going to have the best cost for your company and you get the best application performance for your customer.

Let’s go through some developer stuff.  We have this report.  And this is a report for the Fabrikam company.  And it allows franchise owners to see how their stores are performing.

The really cool thing about this report is that it’s aggregating live data from all of the databases in our pool.  We have 30 databases.  And it’s using a single connection string and a single query.

In a few weeks, we’re going to deliver in preview Elastic Database Query.  And creating a report with a single connection string against a group of databases will be as simple as a SELECT *.
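
Conceptually, an elastic query fans the same query out to every database in the group and merges the results.  A toy Python sketch of that fan-out (plain dicts stand in for the tenant databases; this is not the Elastic Database Query API):

```python
# Illustrative only: run one query against every shard and
# merge the rows, as if it were a single database.

def elastic_query(databases, query):
    rows = []
    for db in databases:
        rows.extend(query(db))
    return rows

franchise_dbs = [
    {"store": "Seattle",  "sales": [120, 80]},
    {"store": "Portland", "sales": [50, 60]},
]

# Per-shard query: total sales per store.
totals = elastic_query(
    franchise_dbs,
    lambda db: [(db["store"], sum(db["sales"]))],
)
print(totals)  # [('Seattle', 200), ('Portland', 110)]
```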

Finally, how do you manage updates and changes when you’re in a SaaS world?  You have tens, hundreds, thousands of databases.  How do you make updates to all of those databases?

For example, in our world, we may want to restrict the rows of data so that each franchise owner only sees the data for the stores that they own.  Well, Azure SQL Database delivered row-level security some time ago.  We want to implement that, but that’s going to require a change to all of our databases.

To make any database change, any schema change, maybe you want to do a maintenance update, we’ll use the new elastic database job.  We’ll take our standard T-SQL that we know and love and we’re going to just submit this as a job to our pool.  We’ll call this RLS, for row-level security.

So once I click run, my work is done.  I let Azure do all the rest of the work.  I don’t have to write my custom script that loops through all the databases, I don’t have to manage retry logic, I don’t have to think about how I’m going to scale this to tens, hundreds and thousands of databases.

Azure takes care of all of that for me.  I can use this for maintenance jobs like rebuilding indexes.  I can use this for other standard statistics updates or just one-time updates, any kind of update like that.
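
The boilerplate an elastic database job removes looks roughly like this sketch (hypothetical names, not the real elastic-jobs API): the same change applied to every database, with retry logic for transient failures.

```python
# Illustrative sketch of what the elastic database job automates:
# run one change against every database, retrying transient errors.

def run_job_everywhere(databases, apply_change, max_retries=3):
    results = {}
    for name, db in databases.items():
        for attempt in range(max_retries):
            try:
                apply_change(db)
                results[name] = "succeeded"
                break
            except ConnectionError:
                # Transient failure: record it and retry.
                results[name] = "failed"
    return results

# Hypothetical tenant databases; the "change" enables row-level security.
dbs = {"tenant1": {}, "tenant2": {}}
report = run_job_everywhere(dbs, lambda db: db.setdefault("rls_enabled", True))
print(report)  # {'tenant1': 'succeeded', 'tenant2': 'succeeded'}
```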

And after this completes running, we can see it’s already almost complete against all of our databases.  And if we just give it a second and we refresh our report, hopefully Martina only sees the rows of data for the stores that she owns.

So now you’ve got these fantastic new tools that will help you build really robust SaaS applications using Azure SQL Database.  Thank you.  (Applause.)

SCOTT GUTHRIE:  So as we can see, we’re adding lots of great enhancements to our overall data story in terms of enabling you to store operational data, whether it’s in a single database or across thousands of them.

I want to switch gears now and talk about how you can use analytics on top of that data.  You know, we’re in a unique world right now in terms of the expectations people have around intelligent apps.  And companies that embrace analytics will be much more competitive in this disruptive economy than those who don’t.  Your data, combined with analytics, enables you to answer business-critical questions and build much better apps.

Questions like what other products would a customer who’s already purchased something from you want?  You know, what’s the likelihood of a customer of yours churning?  What are the warning signs you should watch out for before they do so?

What is the price of elasticity of your products and services?  How do you optimize for that?  Do you have sufficient product availability for the upcoming holiday or for a special promotion?

Being able to answer these questions accurately and use that insight to improve your applications can transform your business.

Now, in the past, answering questions like this was painful.  Data warehouses took months to set up and integrating analytics systems was hard.  With Azure, we now provide a suite of highly engineered analytics services that make answering these types of questions easy.  And it enables you to basically maximize the value of each byte of data that your application stores and processes.

Each of these services on this slide can be provisioned in minutes, and composed together they provide complete solutions.  And today I’m excited to announce two new Azure services that we’re launching that complement this existing offering even further and make it even easier to unlock insights from data.

Our new SQL Data Warehouse Service makes it easy for anyone to set up a data warehouse to aggregate and store data in just minutes.  You can elastically scale the SQL Data Warehouse Service to store petabytes of data, and independently scale compute to match whatever resources you need to process it.

Our Data Warehouse offering makes it easy to interactively query and visualize all of the data stored within it, as well as operationalize machine learning on top of it.

What I’d like to do is show a quick video of it in action.

(Video segment:  SQL Data Warehouse.)

SCOTT GUTHRIE:  (Applause.)  Using our new SQL Data Warehouse Service, you can easily stand up and aggregate data from any source.  You can archive data from your operational SQL or NoSQL databases, you can also then aggregate that data from on-premises data like an SAP system or a Dynamics system.

You can also easily aggregate data from a Hadoop cluster and be able to import any type of data from that.

Now, once your data is in the data warehouse, you can then easily visualize and interact with it using our new Power BI service.  And this provides a rich way you can visualize your data and ask questions of it.

You can also run reports against it, as well as stand up and use machine learning to create predictive models on top of that data.

One of the things that makes our machine learning service so powerful is that once you create a predictive model with it, you can then expose it as an API that your apps can now call and use.  And this basically means the more data you get, the smarter your apps get, and creates a virtuous cycle that really allows you to build some truly amazing systems.
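
That “model as an API” pattern can be sketched as: training produces a callable that apps invoke like any other endpoint.  The trivial linear “model” below is purely illustrative (the names and numbers are made up), not the Azure Machine Learning service itself.

```python
# Illustrative sketch: training returns a scoring function that an
# application can call like an API endpoint.

def train(history):
    """Fit a per-unit rate from (units, spend) history pairs."""
    total_units = sum(units for units, _ in history)
    total_spend = sum(spend for _, spend in history)
    rate = total_spend / total_units
    # The returned callable plays the role of the published API.
    return lambda units: rate * units

# Hypothetical training data: 30 units cost 150.0 in total.
predict = train([(10, 50.0), (20, 100.0)])
print(predict(4))  # 20.0
```

As more history arrives, you retrain and republish the scorer, which is the virtuous cycle described above: more data, smarter predictions, without the calling apps changing at all.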

Now, there are other data warehouse offerings in the market today, and AWS has seen a good uplift with its Redshift offering.

I thought I’d just spend a little bit of time talking about how Azure is even better.  Unlike Redshift, you can independently adjust the amount of compute and storage that you use with a SQL Data Warehouse.  This allows you to reduce costs and pay only for what you actually need.

You can automatically scale up your data warehouse in seconds.  This is a big difference versus Redshift, where typically it takes hours or even days to rescale your data warehouse, and your data warehouse goes into a read-only mode with perf degradation.  This allows you to basically increase and decrease pretty much at will.

Redshift doesn’t have the ability to pause a cluster.  With SQL Data Warehouse, by contrast, you can pause the warehouse when you no longer need it and pay only for storage.

So a great example might be when you need a data warehouse at the end of the month or end of the quarter.  When you no longer need it, just turn it off, and you pay only for the storage.
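
A back-of-the-envelope sketch of those economics (the rates below are made up, purely for illustration): compute is billed only while the warehouse is running, while storage is billed for the whole month.

```python
# Illustrative cost arithmetic for pause/resume: compute is billed
# per running hour, storage for the full month. Rates are invented.

HOURS_IN_MONTH = 730

def monthly_cost(compute_rate, storage_rate, hours_running):
    return compute_rate * hours_running + storage_rate * HOURS_IN_MONTH

always_on    = monthly_cost(10.0, 0.5, HOURS_IN_MONTH)  # never paused
end_of_month = monthly_cost(10.0, 0.5, 40)              # ~40 hrs of reporting

print(round(always_on))     # 7665
print(round(end_of_month))  # 765
```

With these (hypothetical) rates, pausing outside a 40-hour month-end reporting window cuts the bill by roughly 90 percent.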

Our SQL Data Warehouse runs not just in Azure, but can also be deployed and run in an on-premises environment, unlike others.

From a compatibility and feature-set perspective, SQL Data Warehouse provides a full SQL experience and allows you to build super-powerful applications with ease.

So we’re really excited about what SQL Data Warehouse delivers.  We think this is really a perfect opportunity for you to build and analyze data in a really rich way, and it’s going to allow every developer to get more insights from the data they have.

Now, one of the most exciting opportunities, I think, for every business out there is to use the Internet of Things to transform how they build products and how they engage with customers.  And we have some fascinating use cases of IoT solutions built now on Azure.

Ford is adding IoT sensors to all of their vehicles to improve the overall car owner experience.  And they’ve built an IoT back-end solution on Azure to collect and process all of this data, and then they’re connecting it with the customer information that they already store within apps they own inside their own on-premises datacenters, providing a full, complete solution.

Rockwell Automation has partnered with one of the biggest energy super-majors out there, and is basically now running unmanned, Internet-connected gas dispensers.  Each dispenser emits real-time management information, which they can basically then use machine learning on top of to detect anomalies and predict when proactive maintenance needs to be done on the machines.

My favorite example is probably the NFL.  Most teams in the NFL now use the XOS ThunderCloud platform to record, analyze, and share game and player data in near real time during the game.

This ThunderCloud solution runs on Azure.  And in addition to providing real-time statistics and analysis on plays, it also records, encodes and uploads video that can be used for post-game analysis.

Let’s walk through some of the canonical services that you can use in Azure to build these types of IoT back ends.  Using the Azure Event Hub Service, which we released just last fall, you can now ingest and process tens of millions of IoT messages per second.

You can use the Azure Stream Analytics Service, which we made generally available this month, to perform near-real-time analytics on top of these IoT messages.  And you can operationalize it using a combination of Power BI to create rich data visualization dashboards of the stream data.
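
The core of such a streaming query is windowed aggregation.  Here is a minimal Python sketch of a tumbling window (fixed, non-overlapping time intervals), standing in for what Stream Analytics expresses declaratively in SQL; the event data is invented.

```python
# Illustrative sketch of tumbling-window aggregation over a stream
# of timestamped device readings (hypothetical data).

from collections import defaultdict

def tumbling_window(events, window_seconds):
    """events: (timestamp_seconds, device_id, value) tuples.
    Returns the average value per fixed, non-overlapping window."""
    windows = defaultdict(list)
    for ts, device, value in events:
        window_start = ts // window_seconds * window_seconds
        windows[window_start].append(value)
    return {start: sum(vals) / len(vals)
            for start, vals in sorted(windows.items())}

readings = [(1, "d1", 20.0), (4, "d2", 22.0), (12, "d1", 30.0)]
print(tumbling_window(readings, 10))  # {0: 21.0, 10: 30.0}
```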

And you can optionally store the raw data, or more commonly a subset of that data, inside any of these databases that we run inside Azure, and you can now store the raw data within our new Data Lake service that we’re making available and announcing this week.

Now, this new Data Lake service that we’re announcing today is really a revolutionary way to store and process data.  We built the new Data Lake service using more than a decade of real-world learnings, managing exabytes of data for our own Microsoft Services.

And the Data Lake service enables you to store literally an infinite amount of data.  You can create accounts that store exabytes of content, with individual files that are up to a petabyte in size.

And this infinite capacity allows you to keep data in its original form, whether it’s IoT data, log files or any other non-relational structure.

You can then run high-throughput, low-latency analytic jobs to analyze and process this data at any scale.  You don’t have to worry about setting up or managing your own clusters to do this.  Instead, you can focus on the task at hand that you want to get accomplished, and the service will automatically scale out for you and apply the appropriate resources to get it done.

And the service provides built-in, enterprise-grade security support, enabling you to securely lock down control and access to any data that you store within it.

Now, one of the great things about the Data Lake service is that it exposes data using the standard Hadoop HDFS API.  And this enables you to actually leverage any analytics software already out there that supports HDFS.

This allows you, for example, to run standard Hadoop jobs and Hadoop workloads on top of anything stored inside the Data Lake.  You can also write R applications using our new Revolution R distribution, and have those scale out and query against literally exabytes of content.

You can also stand up and use VM-hosted Cloudera as well as Hortonworks clusters to access and run analytics on data stored inside the Data Lake as well.  This gives you maximum flexibility and enables you to derive even more insight from the data that you ultimately store.

So Azure now has, we think, the richest analytic capabilities of any cloud and enables you to build intelligent apps that are highly differentiated for your customers.

This morning, I’ve talked about some of the types of applications that you can build with this, applications that you can use to drive your business, disrupt businesses or grow profits.

The technology, though, can also be used to really change the world in non-business ways as well.  And one of the things I’d like to do is invite on stage a customer of Azure who is doing just that.  And it’s JustGiving, which optimizes philanthropic giving campaigns and helps funnel money into good causes.

I’d like to invite Mike Bugembe on stage to talk about what they do and how they’re doing it with Azure.  (Applause.)

MIKE BUGEMBE:  Thank you, Scott.  Great, good morning.  This is what I’m going to talk to you about today.  This is what we’ve built on the platform.  We call it the Give Graph.

It’s a set of complex algorithms that will fundamentally change the way you and I interact with causes that we care about.  It’s going to change the way we give.

But to give you some context, let me tell you a little bit about our company first.  We’re based in the U.K. and we’re the largest social giving platform.  We enable people to raise money and do amazing things for the causes and for the charities that they care about.

Some people decide to do the marathon.  Obviously, recently, people were throwing buckets of cold water on top of their head.  All in the name of charity.

And we’ve been fortunate in having around 23 million people from over 160 countries in the world do this.  And they’ve raised in excess of $3 billion for charity and for good causes.

And this has taken us a lot closer to our goal, which is to ensure that every great cause gets the funding that it requires.

Now, we live in a world where we have billions of people who have that natural desire to give.  And if everyone did that, we would eradicate poverty.  We would put cancer out of business.  Now, our team had the task of trying to understand why that isn’t happening using analytics, using machine learning and getting a real understanding as to how we think and how we make that decision to give.

And our analytics have identified three key things that our product needed to use data for.  The first is that it needed to change giving from being a transactional activity to something that’s far more engaging.  It also needed to be personal, and of course it needed to be social.  After all, we’ve solved some of the biggest problems as the human species by connecting with each other and collaborating.

But at the heart of all of this is that we needed to make giving personal.  Which for us means we need to understand what you specifically care about.  Traditionally, to address personalization, we would use things like look-alikes or collaborative filtering.

But in this space, that doesn’t work.  What you care about is specific to you.  If you care about cancer or beating cancer, I can’t find somebody who looks like you and say that they also care about cancer.  Everybody has something specific that they care about.

However, using machine learning and graph theory on this platform, we’ve managed to crack that problem.  In fact, what we’ve done is we’ve built a machine that does three specific things.  First of all, it understands how you specifically want to give.  Giving isn’t just about money.  It understands whether you want to give time or whether you want to give effort.

Secondly, the algorithm can work out specifically what you care about.  And it also understands that what you care about changes.  It’s a moving data piece.  It’s not static.  It changes as your circumstances change, as your associations change and the more interactive you become.

Lastly, my favorite part of the algorithm is that it understands how to interact with you, specifically with you, to make giving more engaging and remove those barriers that might exist to stop you from giving.

Now, we couldn’t have done this without the Azure platform.  We’re a small team of scientists and engineers.  And what the Azure platform really allowed us to do was focus intensely on our goal.  We don’t have a team of operations experts who can manage a huge cluster of machines in the cloud.  But this platform was ideal for us.

And these were the components that we used.  And this is how we architected them.  We essentially had three core processes.  We had a batch process which runs nightly, a real-time process, and a process that allowed us to put our APIs in front of the product.

And the batch process is the backbone for everything that we do.  And at the center of it is HDInsight.  This is where we do all of our large calculations.  This is where we built the Give Graph that I showed you earlier.

So, essentially, we take all of our data, we upload it into blob storage, we spin up the clusters and do all of our calculations.  And once that’s done, the clusters are torn down and the data is moved onto Azure table storage.

And at the same time, we have a real-time service that helps us keep this product interactive.  It helps provide data to the machine so that it can learn and continue to become more effective.  We use F# mailboxes and Azure Service Bus for that.

And, of course, we have Azure Websites, or Azure Web Apps, which is where we put our APIs and where the product has access to them.

And all of this manifests itself on the product itself.  This is what our feed looks like.  This is what that platform is powering.  And what’s exciting is that this is where we are seeing giving move from a transactional activity to a more engaging one.  We’re seeing people be far more interactive and far more social, connecting, collaborating about the causes that they care about, the charities that are important to them.

And as I conclude, I can’t emphasize enough how much of an impact this is going to have.  And we’re really in a fantastic position because this platform has allowed us to focus exclusively on our goal to ensure that every great cause gets the funding that it requires.  Thank you.  (Applause.)

SCOTT GUTHRIE:  The opportunity to build applications that can change the world has never been so great.  Each of you now has access to cloud resources that were unimaginable just a few years ago.

There’s never been a better time to be a developer, and I’m really looking forward to seeing what you build.  Thank you very much.

END