Remarks by Ray Ozzie, chief software architect, and Bob Muglia, president of Server and Tools Business
Professional Developers Conference 2009
Los Angeles, Calif.
November 17, 2009
Editor’s Note, Dec. 16, 2009
– References to IPv6 and ASP.NET MVC 2 were revised to correct errors in transcription.
ANNOUNCER: Ladies and gentlemen, please welcome chief software architect, Microsoft, Ray Ozzie. (Applause.)
RAY OZZIE: Good morning! Welcome to PDC ’09. It’s great to be here. Thanks so much for your investment of time and attention.
It was four years ago this month that we started talking about services and the fundamental services-related shifts that are happening across the technology landscape, and what those shifts enable for users, developers, and IT pros, and how our software plus services strategy would impact all the customers in all the markets that we serve.
And after three years of building, last year’s PDC was a very important event for Microsoft, because for the first time in all those years I was able to lay out not just the strategy, but also the very concrete and specific ways that our offerings would be transformed: Windows Azure, Windows 7, IE and Silverlight, Windows Live and Live Services, SharePoint Online, Office Web Apps, and so much more.
Here this week, you’re going to see how we’ve continued to build on that same software plus services strategy end-to-end, and you’ll see that we’ve executed on what we’ve promised. You’ll hear about the tremendous progress that we’ve made over the past year. You’ll see that we’re really more excited, more reinvigorated than ever in recent memory, reinvigorated by the broad range of new ways that we can deliver exciting experiences and value to all our customers by taking advantage of the amazing capabilities of the Windows 7 PC, the net-connected phone and TV, the Web, IE, and Silverlight, and things on the back end such as SharePoint and the cloud.
Part of this excitement that we’re all feeling is driven by the significant and unprecedented technological diversity and choice that’s now before us in the market. What incredible innovations we’re seeing right now in the PC space! Who could have really imagined just a few years ago the kind of choice we’re now seeing in terms of fashion, form and function in laptops, netbooks, and desktops, gaming PCs, PC TVs and more?
And I think over the past few weeks, we’ve all begun to have this same, simple realization that Windows 7 has the real potential to sweep through and reinvigorate for developers the currently fragmented installed base of a billion PCs worldwide.
And so with all that hardware innovation, what a great time for new ideas, an opportunity to envision new scenarios for millions upon millions of cloud-connected laptops. What a great time for innovation on the client.
And on a parallel track, at the back-end, who could have ever imagined the explosion of interest in cloud computing? And again, what a great time to be a developer, to bring solutions to the cloud, saving customers money and opening up whole new markets.
Of course, whether you’re developing for servers or services or PCs, phones or browsers, all of this diversity means that it’s a very complex software environment out there. Many platforms mean many choices. And whether we’re users or enterprise IT, we all want everything just to work together very, very simply, interoperating as one.
And so across PCs, phones, TV, Web, and cloud, across our many products and platforms, across products serving both consumers and business, we at Microsoft have but one simple strategy, and that is to focus on leverage and seamlessness in everything that we do.
We aspire to deliver compelling, seamless, multi-screen experiences for users, and to enable skills leverage and investment leverage for developers and IT.
Of course, this is the very same vision that I laid out at last year’s PDC right here on this stage, and it’s a strategy embodied in a very simple picture we’d like you to keep in the back of your minds, and that is of three screens and a cloud, three primary classes of screens, surrounding the Web as the hub of most everything we do.
With this picture you can visualize our fundamental belief in a world of Web-centric experiences that are also extended richly through apps on your desktop, through apps on the phone in your pocket, and delivered on the inherently shared big screen on your wall; experiences delivered from clouds in private datacenters or from the worldwide public cloud.
Over the course of this week, you’re going to see some of the tremendous work that we’ve done to realize this three screens and a cloud vision. We’ll do it in three parts. Today, we’re going to focus primarily on the back-end: servers, tools and cloud computing. Tomorrow, we’re going to focus more on the experience, things at the top of the stack: Windows, IE, Silverlight, Office and SharePoint.
And then in a few months, this coming spring, you’re going to hear about the progress we’ve made with Windows Live and the Windows Live platform, and again also this spring at MIX 2010 you’ll hear about developing apps for the next generation of our Windows phones.
This week is going to be a very, very busy one. And after all that you hear and see this week, it’s our hope that one thing is going to stand out very, very clearly, and that is when it comes to developing apps for a vision of three screens and a cloud, there’s a single, coherent development strategy moving forward that you can bet on across all of our platforms.
First, in terms of tools, what brings it all together is Visual Studio for developers, and Expression Studio for designers.
Second, in terms of runtimes, our strategic runtimes for delivering experiences across all three screens, across three screens and a cloud, are Internet Explorer and Silverlight. We’ll work to ensure that Internet Explorer is the best browser for Windows without compromise, a standards-based, modern browser to its core.
Through IE we aspire to give Web app developers access to all the performance and all the functions that the Windows OS and the PC’s hardware will serve up.
You’ll hear tomorrow from Steven Sinofsky, and he’ll discuss Windows, IE, and much, much more.
And because Silverlight is .NET, it’s also the premier high-level runtime for the development of and the Web-based deployment of line-of-business apps that typically need data-bound controls and workflow and more.
And Silverlight has exploded in popularity. It’s come a long, long way over the past year. And tomorrow, Scott Guthrie is going to have some incredible demos to show you, and a few great surprises about what comes next for Silverlight.
What excites me the most is to see what developers are doing with the capabilities of Silverlight 3, and in particular how they’re using it to bridge Web apps into the world of apps that are installed on your desktop.
And to give you a sense of those possibilities, I’d like to introduce Loic Le Meur, founder and CEO of Seesmic. Loic? (Applause.)
LOIC LE MEUR: Thank you, Ray.
Good morning, everyone. My name is Loic, as Ray just said, and I represent Seesmic. We’re a small company. We are one of the most popular Twitter applications. We help people share and read what their friends are saying and what people are saying about their brands on the Internet. We’ve had 3 million downloads, and we launched only six months ago.
And Ray, the three screens and a cloud vision really speaks to what we want to do, because we really want Seesmic to be available on every single device, on the PC, on mobile, on the Web, and we want to synch all the data in the cloud for our users.
It’s a nightmare for a small company like ours, because we have to create different branches of code, have different teams. And so the Silverlight vision really fits in our plan.
So what we did is we wanted to go beyond our current application, and we created a prototype on Silverlight, and we were really surprised at how fast we could put it together. And I’m going to show you this right now.
So, here is the same popular application that we have on the desktop, but now built on Silverlight. As you can see, you have all the features. We support multiple accounts, multiple columns. So, I can first post from PDC, obviously on those two accounts. You can add any search. So, the best way to follow what people are saying about PDC is probably to follow PDC 09.
And we created a new UI, which you can see on the left. We have tabs, and we have this notion of spaces, where you can create columns, arrange them as you want, and save them as a space, which you will keep and find again very easily.
You can obviously use everything we have on Seesmic desktop right now, so see the profile, and reply interactively.
This is Seesmic for Silverlight, and we are hoping we will ship a Silverlight version of Seesmic very soon.
But what we wanted is to have the best UI and the best experience today for the large majority, the very, very vast majority of our users. So, I’m very happy and proud to announce that we will ship today Seesmic for Windows, which I’m going to show you right now as well. (Applause.) Thank you.
And Seesmic for Windows is an application entirely built on Windows, and it’s a very smooth, very fast, very different experience. As you can see here with the demo, I can move the columns around. I hope you can see how fast it is. It’s really, really nice to use. And we obviously have the very best of Twitter already available. So, in Seesmic for Windows you already have user lists, as you can see here on the left. So, here are my investors, here are some French friends, here are some partners, and obviously you can arrange them all very easily.
I can also very quickly create a new user list. So, here is a new list which I am creating. And look at what we did. This is something new. You can drag and drop users directly from one column to another in a way which is extremely fast and smooth, which we could not do before.
We will also take full advantage of the Windows platform. For example, on Windows there is a sensor for location, and as you know, Twitter is actually launching location right now. We will have that available only if you want to share your location through Seesmic for Windows. As you see, we will not do it without your approval.
What we will have as well is the No. 1 request right now, which we’re getting by e-mail: hundreds of developers want to integrate their services into Seesmic. And we simply can’t do it; we’re too small a company. So we wanted to provide a plug-in architecture. So, I’m happy to announce that Seesmic for Windows will be a development platform, where you will be able to create secure plug-ins for Seesmic for Windows.
Just to give you a few examples, we could have TweetMeme tell you how a Tweet is being retweeted or how it’s spreading. You could have Mr. Tweet give you some information about the user, and you will be able to create your own plug-ins. So I’m really hoping we can see many of your plug-ins being created very soon.
So, you can go to Seesmic.com right now, sign up, and we will send it to you today.
As far as Silverlight is concerned, we used our code base for Seesmic for Windows to put together the prototype I showed you earlier in about two weeks. So we will keep working on Silverlight, and we will hopefully ship something very soon as well. Thank you very much. (Applause.)
RAY OZZIE: Thank you, Loic. Great work.
So, I hope this gives you a sense of the power of using our three screens and a cloud programming model, based on HTML and Silverlight, to develop compelling and seamless experiences that extend the power of Windows to the Web.
You’ll hear much more about this tomorrow, and a lot about the front-end of our strategy as Steven Sinofsky and Scott Guthrie talk about and demo innovations across Windows and IE, Silverlight, and .NET.
So now let me shift gears to the back-end of our platform, toward the world of cloud computing.
Earlier, when I talked about the three screens and a cloud environment, I talked fairly abstractly about this cloud computing back-end. Sometimes I might have referred to this back-end as a server or sometimes as a service. And for customers it really doesn’t matter, and that’s entirely the point, because our software plus services strategy is centered on the notion of technology convergence and skills leverage across both.
Windows Azure, which we introduced right here on this stage last year, is our cloud computing operating environment, designed from the outset to holistically manage extremely large pools of computation, storage and networking, all as a single, dynamic, seamless whole, as a service. It’s a cloud OS designed for the future, but made familiar for today.
Windows Azure at its core is Windows. It’s Windows Server. You should think of it as a vast, homogeneous array of Windows Server hardware and virtualized Windows Server instances, and all these servers are under the control of a sophisticated, highly parallel management system called the Azure Fabric Controller, which you can kind of think of as an extension of System Center’s management capabilities in the enterprise.
With Windows Azure, Windows Server, and System Center, there’s one coherent model of managing this infrastructure as a service across Microsoft’s public cloud to private cloud to clouds of our partners who host.
To most developers, to developers like you, Windows Azure appears as a model-based extension to Visual Studio, enabling you to build apps that leverage your skills in SQL, IIS, ASP.NET, and the .NET Framework.
Alternatively, and of course it’s your choice, you might leverage your skills by using MySQL and PHP within Azure, or you might instead take advantage of our new Azure tools for Java and Eclipse.
Reaching all developers is incredibly important to us, and by working closely with the community, developing for Windows Azure has come a long, long way this year.
It was only one year ago at PDC ’08 that we launched Azure by inviting you as PDC participants to our Community Technology Preview. We committed to spending the year engaged with you, listening and learning, and reshaping Azure before we took it live.
Well, we’ve done that, and we’ll continue this Community Technology Preview program through the end of the year, so that you’re going to have a chance to get your feet wet with some of the amazing new features that I’m going to introduce to you in just a few minutes.
Then on January 1st, for the first time, I’m pleased to announce that Windows Azure will switch to a production service for paying customers. For the first month, the month of January, we’re going to be exercising our production provisioning systems and validating our billing systems for accuracy and completeness. So, during that first month, the month of January, you still won’t accrue any actual charges. And then on February 1st, customer billing will begin.
As we wind down the Azure technology preview, I just want to thank you and to give you a sense of just how successful the CTP has been. Tens of thousands of developers have actively participated in the CTP, and you’ve made a tremendous, tremendous impact on the product.
Let me give you a few examples by talking about just a few of the new features that go live today in Windows Azure.
The first thing to note is how your feedback has improved Azure’s out-of-the-box experience. There’s now a one-stop shop with single sign-on for Windows Azure, SQL Azure and more.
In Visual Studio 2010, Azure’s cloud application templates are just part of the product, and moving code back and forth between cloud and non-cloud projects is a snap.
We’ve provided a complete set of REST-based service management APIs, enabling you to automate the deployment and management of your application in many, many different ways.
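Because those management APIs are plain HTTP, scripting against them is straightforward in any language. As a hedged sketch, composing a “list hosted services” call might look like the following; the endpoint shape and the required version header follow the service management conventions of the time, and the subscription ID below is a placeholder:

```python
# Hedged sketch: composing a call to the Windows Azure Service Management
# REST API. The endpoint shape and x-ms-version header follow the service
# management conventions of the time; the subscription ID is a placeholder.
BASE = "https://management.core.windows.net"

def list_hosted_services_request(subscription_id, api_version="2009-10-01"):
    """Return the (url, headers) pair for a 'list hosted services' call."""
    url = f"{BASE}/{subscription_id}/services/hostedservices"
    headers = {"x-ms-version": api_version}  # required on every management call
    return url, headers

url, headers = list_hosted_services_request("00000000-0000-0000-0000-000000000000")
```

In practice the request would also carry a client certificate for authentication; only the URL and header composition is shown here.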
We offer multiple sizes of VMs now, and we also do a number of things that will make it far, far easier for you to leverage your investments in existing apps and infrastructure that you’ve written.
Because you wanted it, we’ve broadened far beyond just the .NET programming model, and the Web role, worker role service design pattern. We now support any kind of Windows code and programming model, and any kind of multi-role, multi-tier service design pattern, supporting extremely flexible binding and arbitrary relationships between roles.
We added support for FastCGI, enabling high-scale Web apps to be written in any of a variety of programming languages.
And in sessions this week you’re going to see the Windows Azure team quickly building and deploying Java apps, running under Tomcat. You’ll see PHP apps with MySQL and memcached.
Let’s move on to the subject of data. There have been many, many improvements in the realm of Windows Azure storage, so many that I can’t even scratch the surface this morning. It starts with the automatic georeplication of storage across pairs of Azure datacenters, three sets of which are going live and into production starting in January: Chicago and San Antonio, if you happen to pick North America as the location of your storage instances; Dublin and Amsterdam if they’re in Europe; Singapore and Hong Kong if they’re in Asia.
As you’ll see, we’re leading the industry in datacenter innovation with our partners, having moved to a highly modular, all-container design, so as to drive the latency within our supply chain from months down to mere weeks.
And if you have a chance, you should check out the container that we’ve brought here to PDC. It’s quite a sight to see, and take a tour inside if you’d like to. It’s over on the show floor.
And stay tuned in the months ahead as we take this innovation, this container innovation from our cloud, and bring it, dedicated, with our partners into yours.
So, back to Azure. At the feature level, in Azure storage we’ve added entity group transactions in Azure tables, page blobs and blob snapshots, custom blob storage domain names, and integrated CDN support so that your hottest downloads can be pushed far, far closer to your customers.
Perhaps most significant is a new storage type that we call XDrive. Azure XDrives are Azure storage blobs that are mountable as regular NTFS volumes: a drive-mapped, cached, durable volume, accessible by mapping an Azure page blob into an NTFS VHD.
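To make the page-blob idea behind XDrive concrete, here is a toy, in-memory model of the alignment rule page blobs impose. It is purely illustrative and does not touch the real storage API; the 512-byte page size matches the page-blob design, while `ToyPageBlob` and its methods are invented for this sketch:

```python
# Hedged sketch of page-blob write alignment, the storage primitive behind
# an XDrive: a page blob is addressed in fixed 512-byte pages, so a write
# must start and end on page boundaries. This toy in-memory model only
# illustrates the alignment rule, not the real service API.
PAGE = 512

class ToyPageBlob:
    def __init__(self, size):
        assert size % PAGE == 0, "blob size must be page-aligned"
        self.data = bytearray(size)

    def write_pages(self, offset, payload):
        if offset % PAGE or len(payload) % PAGE:
            raise ValueError("writes must start and end on 512-byte boundaries")
        self.data[offset:offset + len(payload)] = payload

blob = ToyPageBlob(4096)
blob.write_pages(512, b"x" * 512)  # fine: one aligned page
```

Fixed-size, randomly writable pages are what make it possible to back an NTFS VHD with a blob, since a file system issues exactly this kind of sector-aligned write.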
So, that’s the basics of Azure storage, but who could talk about data without discussing the huge improvements in SQL Azure?
Between PDC ’08 and this PDC, SQL Azure has truly been transformed. SQL Azure isn’t just instances of existing database servers that are configured to run within VMs in the cloud. It’s not just a veneer over database servers that were designed for a far different deployment and management environment. Rather, SQL Azure is a true database as a service. It’s a database in the cloud.
With SQL Azure you can now simply create a new database whenever you need one. There’s no physical administration, no need to think about memory or virtual machines. It’s got automatic replication. No need to think about disaster recovery; it’s all automatic.
You made it clear that you wanted your existing tools to work against Azure-based SQL, and so we delivered. We delivered T-SQL stored procedures, support for ADO.NET and ODBC, and support for PHP and JDBC. SQL Azure even works unmodified with things like Excel. Just great progress with all things data, and with all things Azure.
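Because ordinary drivers just work, connecting is mostly a matter of building a standard connection string. A hedged sketch in Python follows; the server, database, and credential values are placeholders, and the `user@server` login form reflects how SQL Azure logins were commonly written:

```python
# Hedged sketch: building an ODBC connection string for a SQL Azure
# database. All values are placeholders. The key point is that ordinary
# SQL Server drivers connect over a standard connection string; no
# special cloud API is involved.
def sql_azure_odbc_conn_str(server, database, user, password):
    parts = {
        "Driver": "{SQL Server Native Client 10.0}",
        "Server": f"tcp:{server}.database.windows.net",
        "Database": database,
        "Uid": f"{user}@{server}",  # SQL Azure logins take the user@server form
        "Pwd": password,
        "Encrypt": "yes",           # connections are encrypted
    }
    return ";".join(f"{k}={v}" for k, v in parts.items())

conn = sql_azure_odbc_conn_str("myserver", "mydb", "admin", "secret")
```

From there, any ODBC-capable tool, or a library such as pyodbc, could open the connection as it would against an on-premises SQL Server.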
As I said, it’s been a very, very busy year, and I couldn’t be happier with our momentum and with customer demand around Azure.
Some of you have been itching for us to take Azure out of the preview phase and into production, and by special arrangement a handful of customers will actually go into production as of today.
One company I’d like to highlight is Automattic, developer of WordPress. WordPress is one of the most successful and pervasive blogging systems in existence today, used by tens of millions of bloggers worldwide.
WordPress is also a tremendous ISV who’s been working extensively with Windows Azure during the CTP.
So, in order to talk a bit about their experiences, I’d like to invite to the stage Matt Mullenweg, founder of Automattic, and founding developer of WordPress. Matt? (Applause.)
MATT MULLENWEG: Good morning, everybody!
Do we have any WordPress users here in the audience? Nice! Thank you, thank you.
I’m very excited to be here. Just to give you a little bit about my timeline: about six years ago, as a 19-year-old poli-sci student, I started working on an open source GPL, PHP and MySQL project named WordPress. About four years ago, I founded a company called Automattic to bring WordPress to the masses, which we’ve now done for about 200 million people with WordPress.com.
Then about a month ago, I get a phone call from a guy named Jeff Sandquist, and he says, Matt, remember that thing I told you would never happen, and I said, what’s that? And he said, we’re going to have MySQL, PHP, and Apache support on Windows Azure.
So, I looked outside, peeked out the window, made sure there were no pigs, and I said, well, get me out there, I’d love to show this. So, that’s what we’re going to be showing you right now.
As you can see, right here on the Azure back-end we have a production WordPress blog. So, I can click on it, and you will see the beautiful big blue header, everything that you’ve come to know and love about WordPress blogs.
But as you know, blogs are no longer just about personal publishing; they’re being used for big news sites, they’re being used to cover everything. And so sometimes you get varied traffic.
So, as you can see, we have a MySQL and an Apache instance here. Let’s say my blog gets on Slashdot or Channel 9 or Digg or something like that, and we need to scale it up. We go right here in this beautiful XML file and change it from one Apache instance to — how many should I go to, a hundred, a thousand? I don’t know — let’s do a thousand.
So, you can set it however you like. You just click the button, and it will reload, deploy the instances, bring up all the machines, deploy the virtual machines, everything like that, and instantly add them to the load balancer, and you have a fully scaled WordPress.
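The “beautiful XML file” in this demo is the service configuration, where each role carries an instance count. A hedged sketch of that edit follows; the element names mirror the service configuration (.cscfg) layout of the time, simplified here without XML namespaces, and the role name is invented:

```python
import xml.etree.ElementTree as ET

# Hedged sketch of the configuration edit described in the demo: bumping
# the instance count for a role in a service configuration file. Element
# names follow the .cscfg layout, simplified without namespaces; the
# "Apache" role name is invented for this example.
CSCFG = """<ServiceConfiguration serviceName="wordpress">
  <Role name="Apache">
    <Instances count="1"/>
  </Role>
</ServiceConfiguration>"""

def scale_role(cscfg_xml, role_name, count):
    """Return a copy of the config with the named role scaled to `count`."""
    root = ET.fromstring(cscfg_xml)
    for role in root.iter("Role"):
        if role.get("name") == role_name:
            role.find("Instances").set("count", str(count))
    return ET.tostring(root, encoding="unicode")

scaled = scale_role(CSCFG, "Apache", 100)
```

Redeploying the updated configuration is what triggers the fabric to provision the extra virtual machines and wire them into the load balancer.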
Now, what was interesting: a few months ago we had the election cycle in the United States, and we hosted about 10 million blogs at the time. So, we were seeing some of the biggest traffic we’d ever seen to blogs.
There were two blogs that were at the very top. One was CNN Political Ticker. It had deep, insightful analysis, really talking about how the future of the free world hung in the balance in this election. And then on the other side we had a blog with pictures of cats and funny captions, battling every day for top traffic. I’m not joking.
So, to show you one of the engineers behind the other biggest WordPress blog in the world, I wanted to invite out Martin Cron, who is one of the engineers behind I Can Has Cheezburger. Hey, Martin.
MARTIN CRON: Hi. Thank you. Nice intro. (Applause.)
Hi, everyone. My name is Martin Cron. I work for the Cheezburger Network. You may recognize our sites, including I Can Has Cheezburger and FAIL Blog. You like that one? Good. And Pundit Kitchen, which is our news and politics blog.
We have a network of about 40 blogs. They’re all humor Web sites, they’re all running on the WordPress platform.
We get about 8 million page views a day. All of our content is user-generated. We have about 10,000 submissions to our platform every day. And all of our content is peer-rated by users. So, we have about 100,000 votes every day that we process to know which things are good enough to feature on the front page. So, across all these funny pictures, we look at the crap so you don’t have to.
We’ve been launching more and more new sites. So, the strategy is: we don’t know exactly what’s going to be funny, we don’t know exactly what’s going to work, so we’re going to just throw things out there and see what catches on. We need to be able to throw things out there really cheaply. But we also need them to scale up.
This is some real-life traffic from just a couple of weeks ago, and you can see we’re hovering along at about 50,000 page views, and then all of a sudden we get a spike. And that spike could be a cross-promotion from FAIL Blog, or the site could be showing up on Digg or on Reddit or on Slashdot or whatever.
And then afterwards, the hungry hordes of Internet users move on to the next big thing. But the people who really liked the site, our core audience, the people who are going to comment and send in stuff and come back day after day, the people who are the real revenue for us, they come back and establish what I call the new baseline number.
But this dashed line here shows what would happen if bad things happened on the day of our spike and half of our traffic got turned away. Well, then half of our would-be recurring audience doesn’t even know we exist, so they don’t come back.
So we need to be able to scale up really quickly, but we also need to be able to scale down, because it’s wasteful to support this degree of spike-level traffic all the time. We can’t afford to do that. It’s really easy to buy new servers, but it’s kind of hard to give servers back. It’s like, oh never mind, I don’t need this one anymore.
So what we’re doing is launching a new site today on the Windows Azure platform, and it’s running with SQL Azure and Windows Azure blob storage on the back-end.
So, the new site is called OddlySpecific.com, and it’s a Web site about really strange signage. (Laughter.)
It might not work for everyone. I think this one is cute. (Laughter.) And, see, I get to decide, because we get a lot of signage submissions for FAIL Blog that aren’t really FAILs, they don’t really fit FAIL Blog, but they’re really funny.
So, this site just launched today, and it’s running on SQL Azure at the back-end, Windows Azure for instances, and if it gets really popular, we can just increment the number in the config file, we’ve got more instances to serve.
On the back-end we have a plug-in for WordPress that lets you use the Windows Azure storage platform, which is great because all the Windows Azure instances are stateless. And you can put in a custom URL, a custom domain name for how the images are served, which looks really good, and then we can also use the CDN. So, in addition to this being easier for me, because I don’t want to have to deploy servers any more than anyone else does, we get a better experience for our users.
So, Oddly Specific, all the content is user-generated. If you see a funny sign, please send it to us. Thank you. (Applause.)
RAY OZZIE: Thanks, Martin. Thanks, Matt. That’s great stuff.
So, what’s been tremendously valuable to Martin and to Automattic is the notion of having reliable access to a high-scale, elastic computing utility, with seemingly limitless capacity.
You’ve just heard about this specific use of Windows Azure as infrastructure delivered as a service, but as Bob Muglia will discuss a little bit later this morning, developers are looking for services that solve problems at a variety of levels in the stack.
For example, during the Windows Azure CTP, one of the points repeatedly made by ISVs, by many of you, was that you’d like to ride the online wave, the wave being created across the enterprise market by the tremendous success of Exchange Online and SharePoint Online, our Business Productivity Online Services. Many of you have asked, how can I use the Azure platform to help me sell my products to these same online customers.
So, to answer that question, I’m pleased to announce a new service called Microsoft Pinpoint. Pinpoint is a unified catalogue of business apps and services targeted specifically at developers and IT. It’s an opportunity for you to leverage our platform as a service to grow your business.
Today, you’ll find Pinpoint integrated into the Azure developer portal. You’ll see it also syndicated into the Microsoft Partner Network, and it’s also, of course, available from Pinpoint.com.
And we’re on the path to integrate Pinpoint into our Microsoft Online Portal for IT, the very same place where customers try, license, provision, and manage SharePoint and Exchange Online, and Office Communications Online, and all of our online apps, and it’s the most powerful place to be if you’re trying to bring your services and solutions to IT customers worldwide.
It’s difficult for me to overstate the importance of common catalogues and marketplaces such as Pinpoint. Marketplaces are important not just as a place to generate leads and drive business online, but also as a place where anyone can see some fairly stunning network effects around those things that are found to be most popular and most valuable to the community.
And as we’ve seen in the consumer space with music and video, online catalogues and marketplaces aren’t just about apps. They can also be a place where we discover and drive some amazing network effects around the most popular and useful data.
And today, I have the pleasure of announcing and introducing the Community Technology Preview of what I believe will be a game-changing new subsystem within Windows Azure, something that we’ve code-named Dallas.
Dallas, which is built completely on Windows Azure and SQL Azure, is an open catalogue and marketplace for data, both public data and commercial data.
Dallas makes the whole world of data better than the sum of its parts by creating a uniform discovery mechanism for data, a uniform binding and access mechanism for data, a uniform way of exposing trial datasets for developers, a uniform licensing model so that data from multiple providers can be easily joined and recombined.
By delivering data as a service, our aspirations are that Dallas might catalyze a whole new wave of remixing and experimentation by developers, an opportunity for innovation that’s uniquely unlocked by the cloud.
To tell you a little bit more about Dallas, let me introduce Dave Campbell, a Microsoft technical fellow, who’s leading our efforts to bring SQL and data very broadly to the cloud. Dave? (Applause.)
DAVE CAMPBELL: Well, I’m excited to talk about Dallas, and Ray did a great introduction.
The magic of Dallas is about taking friction out of the process of discovering, exploring and using data so that you can create great applications and experiences. And it’s one of these things where it’s easier to show you than tell you, so let me go through and run you through the whole Dallas process, discovering, exploring and using data in a new way.
As Ray mentioned, Pinpoint is the marketplace in which you’ll discover Dallas data feeds. And we’ve been spotlighted here, so we’ll start right in by taking a look.
In here you can see a set of data sources that we have. Some of them are public data sources, some are commercial. These are our featured data sources for today.
Now, Info USA has some interesting data on businesses, so I’m going to go ahead and look at this. Now, I can look at the class of data and a brief description of it. There may be some reviews from others in the community.
And once I go through here, I’ll decide I want to try the data. So, I’ll click on the trial here. There’s a EULA here. I’m already logged on. Now, I have, in fact, read the EULA earlier in my hotel, so I’m just going to click here, and not describe going through that. (Laughter.)
So, now here what we see are the subscriptions that I have within Dallas. So these are my subscriptions. In the Dallas portal I can also manage my account keys, I can get access reports, and I also have the catalogue of all of the data sources that we’re working with right now for Dallas. You can see AP, Data.gov, Info USA, NASA, so a mixture of commercial and public data sources in here; so a pretty exciting group.
If we go back in here to the subscriptions, that’s the discovery phase. So, to be able to get at the data right now, the next step is to explore it. And how do you understand the data, how do you get your hands into the data and make meaning of it?
So, this is the Dallas service explorer. And I want to show you right here there’s a set of service parameters that corresponds to this particular data feed. This is driven by Dallas’s underlying service model. So, I don’t have to learn how to parse the data. I can look at the different classes of data that are supported in this Info USA data feed, and I can just click on preview, and immediately get a visualization of the data so I can start to make meaning of it.
I can also look at the data in Atom, as in AtomPub feeds. Furthermore, I could just invoke the REST-based service and have the data rendered back out to me here.
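That REST invocation is just a plain HTTP GET against the feed, with your Dallas credentials attached. A minimal Python sketch, assuming a hypothetical feed URL and assuming the account key and usage GUID travel as query parameters; the real endpoint and parameter names come from your own Dallas subscription:

```python
# Hypothetical sketch of composing a GET against a Dallas-style REST feed.
# The URL and the $accountKey/$uniqueUserID parameter names are assumptions.
from urllib.parse import urlencode
from urllib.request import Request

def build_dallas_request(base_url, account_key, user_id, **params):
    """Compose a GET request for a Dallas data feed."""
    query = dict(params)
    query["$accountKey"] = account_key      # credential from the Dallas portal
    query["$uniqueUserID"] = user_id        # GUID surfaced in access reports
    return Request(base_url + "?" + urlencode(query))

req = build_dallas_request(
    "https://example.com/dallas/InfoUsaBusinesses.svc/Search",  # hypothetical
    account_key="YOUR-ACCOUNT-KEY",
    user_id="11111111-2222-3333-4444-555555555555",
    top="100",
)
print(req.get_method())  # GET
# urllib.request.urlopen(req) would then return the AtomPub payload
# that the service explorer renders.
```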
Now, you may have noticed this Analyze button. I’m going to show you that real quickly.
So, one popular tool that a lot of people use is Excel, and with the new PowerPivot add-in, these data feeds can be understood directly within PowerPivot. If you’re familiar with the AtomPub protocol, there’s not a tight description of the things that are within the items. So we have a set of open data conventions that we’ve used to encode the data items, which we call the Open Data Protocol, and that allows tools to be able to render the data and understand it.
So, here within the PowerPivot I can get directly into here with just four mouse clicks, and I have data that I’m able to discover, explore, and then use within this solution. So, pretty cool I think.
So, we’ll take a look at the NASA dataset that I have here, and again I’ll preview that. And again this works through these open data protocols, which is not something new for us; it’s something we’ve actually had going for a while that you may have known as ADO.NET Data Services, or code-name Astoria, and we’re calling that OData right now.
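Those OData conventions are easy to see in the raw feed: each row comes back as an AtomPub entry whose m:properties element carries the typed columns. A Python sketch of pulling rows out of such a feed, using a hand-written sample entry (illustrative only, not actual Dallas output):

```python
# Parse rows out of an OData/AtomPub feed: each <entry> holds an
# <m:properties> element whose children are the columns of that row.
# The sample XML below is hand-written for illustration.
import xml.etree.ElementTree as ET

M = "http://schemas.microsoft.com/ado/2007/08/dataservices/metadata"
D = "http://schemas.microsoft.com/ado/2007/08/dataservices"

sample = f"""
<feed xmlns="http://www.w3.org/2005/Atom"
      xmlns:m="{M}" xmlns:d="{D}">
  <entry>
    <m:properties>
      <d:Site>Gusev Crater</d:Site>
      <d:Sol>123</d:Sol>
    </m:properties>
  </entry>
</feed>"""

def rows(feed_xml):
    """Yield each entry's properties as a plain dict of column -> text."""
    root = ET.fromstring(feed_xml)
    for props in root.iter(f"{{{M}}}properties"):
        yield {child.tag.split("}")[1]: child.text for child in props}

for row in rows(sample):
    print(row)  # {'Site': 'Gusev Crater', 'Sol': '123'}
```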
So, in this particular service I have some structured data that you can see here, and I also have some image names. So, this service combines both structured and unstructured data. And if I zoom in over here, you can see that there are two calls into the service. I can search for the images and get the structured data, or I could actually retrieve an image.
And again this is callable from any platform. But if you’re using Visual Studio and .NET, one of the things the underlying Dallas service model provides is the ability to build a service proxy for you.
So, I can just create this, load the service proxy, and as you can see here, because we understand the format of the data in this feed, I actually get a class generated for me, a partial class that represents the structure of the data.
So, now I’m actually an old-time database kernel guy, and my idea of UI, for some of you who were database hacks, is the 1200-series trace flags in the lock manager for SQL Server. So, yeah, that deadlock message, that’s mine.
But here I am, just to show you how easy it is; we’re going to let an old database kernel guy build a WPF app right in front of you. So, I’ll go forward here, I just pop in my little snippet. I’m going to set a breakpoint here, and show you what this does. F5, away we go. It builds; nice start.
So, I just construct the service. I have an account key that represents my account, a usage ID, which is a GUID that you can supply as a developer to actually monitor usage for your own purposes; that will show up in your service report.
And then I look at the results that have come back. I actually asked for a hundred results here. And I don’t have to go through and parse the XML. I can just get it all right back here and make use of it.
So, in this particular case I just bind the results to the data grid, and if I just go here, so here’s your old database hack showing you a WPF app in just a matter of minutes through the power of the service model in Dallas. So, I think that’s pretty cool. (Applause.)
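For readers outside Visual Studio, the generated proxy’s job is roughly this: turn a typed query into the REST call against the feed and materialize each result row as an instance of that partial class. A hand-rolled Python analogue, with all class and method names illustrative rather than the real generated code:

```python
# Illustrative stand-in for a Dallas service proxy: a typed row class plus
# a service object holding the account key and usage GUID. The real proxy
# issues the HTTP request; this sketch accepts pre-parsed rows to stay
# offline. All names here are hypothetical.
from dataclasses import dataclass

@dataclass
class MarsImage:               # stands in for the generated partial class
    site: str
    sol: int
    image_name: str

class MarsRoverService:
    def __init__(self, account_key, unique_user_id):
        self.account_key = account_key
        self.unique_user_id = unique_user_id   # GUID shown in usage reports

    def get_images(self, top=100, _rows=None):
        # The real proxy would fetch and parse the feed here; we just
        # materialize supplied rows into typed objects.
        return [MarsImage(**r) for r in (_rows or [])][:top]

svc = MarsRoverService("YOUR-ACCOUNT-KEY", "11111111-2222-3333-4444-555555555555")
results = svc.get_images(_rows=[{"site": "Gusev Crater", "sol": 123,
                                 "image_name": "P123.img"}])
print(results[0].sol)  # 123
```

The point of the sketch is the shape: once rows arrive as typed objects, binding them to a grid (as in the WPF demo) is a one-liner.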
Now, we have some people around who actually do know how to write real WPF apps, and so they took this exact same service, the NASA Mars Rover imaging service, which is public data, and built a little application. And that’s why you all have 3D glasses. So, if you’ll join me and pop on your 3D glasses, we’ll actually get this going. It’s a little application that you can go through and explore some of these images with. And we actually have a slide show here in which we can show you some of this. So, this is the power of being able to get at this data. This is a public data source that we have, but how would you find it, and how would you use it? That’s the magic of Dallas.
So, again, all built on Windows Azure: the front end, and the structured data storage in SQL Azure. The blob data in this case is stored in Windows Azure storage. And if you want to learn more about Dallas, we’ll have a session today that is not in the printed materials. It’s going to be at 4:30 today, and we also have folks in the booth. So, if you’d like to participate in the CTP and want to get yourselves an invitation token, I would invite you to either go to the booth or go to the session today at 4:30. So, pretty amazing things you can do with this sort of public data if you can just discover it, explore it, and make use of it to create some great experiences.
So, with that, I would like to invite Ray back out. (Applause.)
RAY OZZIE: Thank you, Dave.
What Dave has shown to be possible with apps around public data is really interesting, and I think it’s pretty evident that one of the most fascinating and as-yet uncharted opportunities is to innovate in that realm of public data: taking advantage of the vast wealth of data that seems open because it’s in the public domain, but that for all practical purposes is still out of reach for many of us.
Some of this being out of reach is because of inconsistencies in data formats. One piece of data that’s posted might be in CSV. Another piece of public data might be embedded in a PDF. Some of this happens because we as taxpayers don’t pay for our public domain data to be served in a high-scale way to all the world’s developers such as you. As you may be aware, the current administration is very interested in using cloud computing technologies to encourage efficiency, openness and transparency in government. And by all indications, Windows Azure and Dallas data as a service might well prove to be an effective catalyst toward achieving those worthwhile objectives.
I’m honored this morning to introduce to you by video a person who would like to share a few thoughts on this topic, Vivek Kundra, our federal chief information officer, and the man responsible for driving the U.S. Government’s move to open data, and its move to embrace the cloud.
VIVEK KUNDRA: Ray, thank you very much for your kind introduction.
I am noticing that all the developers and everyone attending this conference are focusing on the Mars data set. As you know, in the public sector, when the government decides to democratize data, innovation happens. We’ve got two specific examples. Think about what happened when the National Institutes of Health decided to release data and encouraged innovation to happen at a global level, with the discovery of breakthroughs in medicine. In the same way, when the Department of Defense decided to release satellite data, there was an explosion of innovation in the GPS industry, to the point where now anyone can go to the local car rental store and for $10 get a GPS device that can navigate any city in the country.
In the same way, I’m really excited about what NASA is doing in cooperation with Microsoft with the launch of the Pathfinder Innovation Challenge. The president has said that Washington does not have a monopoly on the best ideas, and we need to look outside the four walls of Washington to leverage innovation, and the best thinking that’s happening across the country. And he has charged me to reach out and tap into the ingenuity of the American people across the board.
With the launch of the Pathfinder Innovation Challenge, that’s exactly what NASA is doing. If you go to beamartian.jpl.nasa.gov, anybody can participate and look at the data that has been democratized through NASA on a platform, the Azure platform, that allows people to look around the Red Planet, slice and dice, and cube, and create information, and advance our understanding of the universe.
In the same way, on Planet Earth, one of the things we’ve done is the deployment of Data.gov in May. We started with only 47 data sets. Today, we’ve got over 100,000 data sets on every aspect of government operations, from toxic release data from the Environmental Protection Agency to data around average flight times from the Federal Aviation Administration. And what we’re noticing is that innovation is happening across the board through people like you, who are developing applications and finding interesting ways of intersecting various data sets.
What I would like to do is show you a demonstration of an application that’s been built using the Data.gov platform, in an environment where people are trying to find jobs and trying to figure out what is the most convenient way of leveraging the third screen, being the cell phone.
What I want to show you is a quick application that was created using this data set. This is a career finder application, where anybody can go online and quickly browse by interest on a mobile device. If you wanted to, for example, look for a position as a teacher, you can go on here and click on that, and if you want to focus on post-secondary data, you can get all this information from the Department of Labor. More importantly, you can find jobs near you based on where you’re standing.
Part of what we want to be able to do is encourage greater innovation, and encourage applications to be created rapidly. This application was created in a matter of days. It didn’t take hundreds of millions of dollars, and it didn’t take years to develop. In the same way, we want to shift the broader focus of CIOs within the Federal Government. If you look at what happened just in the last 10 years, we went from 498 datacenters in the Federal Government to over 1,200 datacenters. What we need to do is make sure that CIOs are focusing on service delivery, developing services that have an impact on the American people.
I want to encourage all of you, as you look at innovative approaches of developing applications, that you turn to data about jobs, and that you’re able to create applications for the good of the republic, applications from every aspect of government, jobs, healthcare, education, energy. And using the cloud computing platform there’s an interesting intersection here where for the first time now we have the ability to lift up and really focus on service delivery rather than continually investing billions of dollars on building infrastructure.
We want to make sure that we drive innovation across the broader economy, and I’m looking forward to the creativity of the thousands of developers that are present today in terms of what you’re able to develop and also the outcome of the Pathfinder Innovation Challenge.
Thank you very much for having me, and I look forward to the thousands of applications that are going to be created. (Applause.)
RAY OZZIE: Thank you, Vivek. Thanks so much.
A lot of potential. In the grand scheme of things, these really are the early days. There is so much potential if we bring our ideas forth and do rapid innovation on this platform. It’s early days for cloud computing, and also early days for the seamless, cross-device media experiences that we dream about when we think about three screens and a cloud.
But, as I hope you’re seeing today, we at Microsoft think that these areas have huge potential moving forward, and we’ve been investing in them for years now, because by enabling these new cloud scenarios, and by using software and services to drive leverage and coherence across all our offerings, we’ll have done the right thing for customers and the right thing for you.
And so, as you attend sessions this week I’d really like it if you could remember three things. First, when thinking of the experience component of your apps, bet on Windows, in particular bet on Windows 7. Right now there is a huge wave of excitement that’s sweeping across the world of Windows, new hardware excitement, new software and services excitement, and together we can drive network effects that will benefit users, and benefit us all.
The opportunity for all of us is to innovate, and to find ways to bring all the value of the PC and Windows to all the users of the Web. Second, when thinking of the cloud, bet on our online services and bet on Windows Azure. We’re focused, as always, on providing great software, and we’re also equally focused on providing applications, infrastructure, our entire platform, and now data, all as a service. These services are ready for business now, and they’re the foundation of everything we do.
Finally, take a moment, maybe when traveling home from PDC, or maybe when daydreaming over the holidays, just to think about how much the world around us is in transition and the potential role for things like Azure and Dallas in that change. Our world and our systems are increasingly wired with sensors, physical sensors, virtual sensors, that are recording simply unimaginable volumes of data, data not just about us and what we do, but also data about the physical world around us, data that can help business, government, education, healthcare and the environment.
But this data does no good unless we turn the potential into the kinetics, unless we unlock it, unless we innovate in the realm of apps and solutions that are wrapped around that data. And so, as software people, you and us, it’s up to all of us. Get your invitation code and start using Dallas services on Azure; explore building new apps across three screens and a cloud, weaving together public domain content with your own content. Let’s dream, and then let’s build. The possibilities for your impact, for our impact together, really have no bounds. Thank you.
BOB MUGLIA: Good morning. So, over the past year the industry understanding of the cloud has really evolved. We are learning together. But one thing that has become very clear is that the cloud is more than just about infrastructure. It’s also about an application model. And today what I’m going to do is really talk about that application model from a number of different perspectives.
Looking at some of the industry definitions, there’s a lot of consistency evolving in the way analysts are talking about the cloud. Analysts are talking about the cloud both as a set of attributes, things like being delivered as a service, scalable, and elastic. They’re also talking about it in the context of capabilities that the cloud provides, delivering things through infrastructure, the application platform, and the software itself that runs as a service.
So, these attributes and capabilities are definitions that Microsoft is quite comfortable with, and we also very much see the cloud as a pool of computers, running either in your own datacenter, in a hosted datacenter, or in a public cloud, providing a set of IT delivered as a service.
So, thinking about that definition, and thinking about how the learning of the cloud has evolved, let me step back and give you an understanding of some of the ways Microsoft has learned about how to build the next generation cloud application model. One of our key learnings has been in the online services that we’re delivering ourselves, and Bing is probably the best example of this.
Bing is a service that’s always available, it’s a service that’s highly resilient. It runs across over 100,000 computers in multiple datacenters around the world. And if we were to try and manage this environment through standard means the way we have always managed our IT in the server world, it would be way too expensive. The people cost of doing that would be out of bounds. So, instead, what the Bing team did is they built an infrastructure, and an application platform that they call AutoPilot. And what AutoPilot does is it provides a foundation that is used to manage the Bing service, with a very small amount of human intervention. When things fail they just go offline, and then as the hardware is replaced they come back online automatically.
Now, AutoPilot is a great prototype for the attributes that are required in the cloud application model. But it wasn’t built as a platform that could be generalized. It wasn’t built as something that could be delivered in a generalized way as a service, or brought back into Windows Server. And that’s really where Windows Azure comes in: to take these ideas and generalize them in the form of an application platform that can be broadly used.
And speaking about that in the context of the application platform, we’ve watched over time an evolution of the application platform world. I remember many, many years ago when I was in school, learning on a mainframe and using the mainframe application model. In the late ’80s the industry introduced client-server programming, and that became the usage model, the application model, for many years. In 1995 or so the Web application model was introduced, and the Web paradigm has become something that’s very well established. Each of these builds on the other.
In about 2000, service orientation, or SOA, emerged: Web services became a way for programs to interact using common Web protocols. And now as we move forward, as this decade is coming to an end, we see the introduction of the next-generation application model, which is really all about the cloud and some of the key attributes that the cloud provides.
Now, in thinking about the cloud as more than infrastructure, as an application model, there are some key attributes. I talked about the cloud as a pool of computers that provide IT as a service. And service-orientation, the way applications are compartmentalized is a key attribute of the cloud. But there are many other attributes that are being introduced in the cloud model that are very important for the next generation of applications.
Building scale-out applications means applications that are elastic and take on the loads that are put on them, based on what’s happening in the environment. The cloud environment should respond to that on its own, so people intervention isn’t really necessary. The scale-out should just happen. The elasticity should just happen. Things like staged production become very important, and they’re part of what is required to deliver some very difficult new capabilities, like always-available, the idea that resources and an application are always available.
I think about this sometimes and I see that even important applications for companies don’t have that always-available attribute. I noticed that when I do my banking on Sunday, for some reason my bank has a tendency to upgrade the consumer Web application on Sunday morning, which is just when I want to do my banking. And that’s not an always-available application. The cloud would provide an infrastructure as a part of the application platform, and the application model that will enable some very complicated things, like staging of applications through upgrades, to enable applications to evolve on a continuous basis, and be always available.
One of the key attributes that sits behind the cloud, and that we’ll see become pervasive into the next decade, is really this idea of connecting everything together in a model. If you want to have a set of services, a set of resources that work together to provide common services, you need to have a definition of the model of the application. That’s something that will become very core to the cloud application model, and it’s something Microsoft has invested in very heavily for quite a number of years. And it will be needed all the way through our product line, this idea of being model-driven: from the point where requirements are generated and developers and architects define an application, through the stages of testing, into operations and production, and through the evolution of the application.
So these are all important attributes of the cloud application model. And as we see what’s evolving, as we see the application model evolving, and we see applications that you have evolving forward, one of the key things is the ability to take the investments that you’ve made in applications that you run within your own environment, and be able to take those applications forward into the cloud. And so, a lot of what we’re doing is learning from you, and understanding how we can structure the cloud application model so that it will enable you to take your existing applications forward.
Now, some applications are really legacy. And the benefit that the cloud provides is an inexpensive set of computing resources that are shared. So, it makes sense, perhaps, to use virtualization and infrastructure technology to simply move those applications into the cloud. And that will be something that I think will become very pervasive as private clouds become established within organizations, and may also be important as private clouds are established within hosters.
That’s a good thing to do, but it’s really not taking advantage of the cloud application model. The cost savings and the benefits that come from delivering applications structured in the form of the cloud application model cannot be realized through infrastructure and virtualization technology alone. It’s important to begin to evolve the applications, to have them take advantage of some of the services and the attributes that the cloud application model delivers.
So, taking advantage of things, adding new capabilities such as scale-out, maybe incorporating models within the applications, those are ways that you can take your existing applications and enhance them. But there are a whole set of other applications, maybe there are new apps, maybe there are very critical business applications that you wish to invest heavily in, in the years to come, where it makes sense to really adopt in a very, very deep way the key attributes of the cloud programming model, and to take that application model and change your applications to really embrace it.
Make those applications fully federated. Take the steps that enable them to be always available. Those are things that you may choose to do some time, but this idea that you have a set of applications that were built on previous generations of application models, and you want to bring them into the cloud, is one that I think is really consistent across all organizations. And that will be true whether you’re using a private cloud, or you’re using a hoster, or you’re using a public cloud, such as Windows Azure, I think that will be true across all those things. And the idea that you will embrace different attributes of the cloud application model based on the needs of the application make a lot of sense.
And so, to look at some of the things that are coming, with this cloud application model, and to really get down and talk about some of these services, what I’d like to do is invite Don and Chris — the Don and Chris Show — to come on up and show us some code.
Don and Chris? (Applause.)
DON BOX: PDC ’09, oh my god. How are you guys doing? Are you guys awake? (Applause.) How many people use Windows? (Applause.) Awesome, I knew we had a few users in the room. Awesome.
Last year Chris and I were up here, we programmed for about an hour showing you all the things that were being done in Azure. You spent a year programming against the same stuff we programmed against for an hour, and you gave us tons of feedback. Thank you very much. We appreciate it.
What we want to do is kind of show you some of the things we’ve done in reaction to that feedback, and let’s start with Windows Azure. With Windows Azure we showed you that we have a managed execution environment where you could take ASP.NET code and run it in a constrained execution environment, and that gave you a reasonable experience for building cloud apps. You told us, yes, we like that, but we also want the ability to bring other runtimes, runtimes like Python, or Apache, or PHP, or maybe I just want to write some low-level C code like I used to do back in the 1980s, because I thought it was fun, and you want to be able to do that.
So, what we’re going to do now is show you how to program at the lowest levels of the system in Windows Azure. So, what Chris has here is a text file that’s going to be a C++ program. How many people remember C or C++, show of hands? (Show of hands.) Awesome, a few people do. Those of you who remember C, we had these things called pointers. They were very sharp objects. What we’re going to do is use some of these things. We’re going to write a CGI app. This isn’t even FastCGI. This is plain old CGI. We’re going to go ahead and get the query string out of the environment variable and then print some stuff to standard out, and that will be our Web site.
So, what we’re going to do is get the pointer, and I recall from when I used to program in C there were all these conventions like BSTRs, or even CLR interop, where we put length prefixes before the string. What we want to do is go play with pointers and see if there’s a length prefix behind our string.
So, to do that, Chris is going to take our pointer, cast it to int star, back up, see the int that’s behind our string, and then assign that to a variable. Then what we’re going to do is take that pointer, de-reference it, and print it out. So, when we hit this Web site we’re going to see the four bytes before the query string. I want to know what’s in those four bytes; I need that level of control in my Web applications.
Great, now let me see what he’s done here. He’s got the pointer arithmetic, he’s done the printf, and he then prints out the query string. Anybody see a bug here? What’s the bug? Code review time, please; this is pair programming writ large. I think I see it. Chris has trashed the data variable, because he’s backed the pointer up and assigned it into the same local variable. So, because we’re low-level programmers, we’re playing cowboy style, Chris, will you do something to make sure we don’t lose that variable. Awesome. (Applause.) Great, the x86-based cloud platform, love it.
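Since Python was one of the alternative runtimes called out above, here is what the same plain-CGI contract looks like there, minus the pointer games: CGI hands the query string to the process in the QUERY_STRING environment variable, and the response, headers then body, goes to standard out. A minimal sketch:

```python
#!/usr/bin/env python
# Minimal plain-CGI responder: read QUERY_STRING from the environment,
# emit the headers, a blank line, then the body on standard out.
import os

def respond(environ=os.environ):
    query = environ.get("QUERY_STRING", "")
    print("Content-Type: text/plain")
    print()                      # blank line ends the CGI headers
    print("You sent:", query)

if __name__ == "__main__":
    respond()
```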
Now, does it compile, Chris?
Yes, it does. Awesome.
So, what we’re going to do is, we have another Visual Studio project that’s set up to be an Azure cloud project, and we’re going to go ahead and build and publish this thing out to the cloud. Now, we’re playing at the lowest levels and we’re highly confident; C++ programmers are the smartest programmers on the planet. If you don’t believe me, ask any one of them.
So, because Chris is a C++ programmer at heart he is going to authenticate against the cloud. Oh my goodness, Chris. Great, so when we did the publish in Visual Studio two things happened, we navigated to the portal where we’re going to go find our service, and we’ve also got a path in the file system. So, Chris probably has the file system path on the clipboard, and we’re going to go now find our production server. So, there’s our production server, because we’re ready to go, we’re not going to do all this staging, and deployment stuff, we’re just going to blast the site right out there, because we’re so confident that program is ready to go. And we had a code review. I can blame you guys if it doesn’t work.
Awesome. So, Chris is pasting in both the packaged-up binaries, and the packaged-up config file, giving it a nice name, and it’s going. Something is happening. Something is happening. Love it, awesome. So, that thing is going to cook for a while. We’re going to come back and try and hit that later on.
So, we just looked at Windows Azure. It turns out that for SQL Azure, which is our cloud-based database, we’ve done some work there, too. If you remember, last year we accessed SQL Server Data Services, that was the name of it back then, using an open HTTP REST-based protocol to do data access. You told us, we love having an open data protocol that’s REST-based for getting to our data, but you know what, I have this investment in SQL, and I have this investment in ODBC, or ADO.NET, or other TDS protocol-based access mechanisms.
And I also kind of like this thing called transactions, so I want that, too. So, with SQL Azure what we’ve done since last year is, we’ve brought forward the TSQL and TDS experience so that I can actually write a normal SQL app that runs in the cloud. So, what we’re going to do is go to the SQL Azure platform. We’re going to create a database, we’re going to call it PDC09, we’ve allocated the database, and now what we’re going to do is go talk to that database using our traditional desktop SQL tools. We’re going to use SQL Server Management Studio, just the 2008 R2 edition. This thing will just speak TDS over the wire, talking to the cloud, which is where our database is running. We’re going to open up an interactive SQL window.
And, Chris, please demonstrate your love of TSQL by writing a SQL table declaration.
This is a fresh database we just created. We’re going to create a table called “Talks.” We’re going to give it an ID with a primary key and an identity. That feels kind of SQL-y. We’re going to use nvarchars, because I want to be able to have internationalized characters in my strings. Thank you. We’ll have a title, a speaker, and a day probably. Awesome. And nvarchar(40) is great. Hit “go.” And now what we want to do is put the rows in there, but we want to make sure either all of the rows go in, or none of the rows go in. So, we’re going to use our good friend again, transactions, and we’re going to do an insert statement or two inside of here. So, we’re going to go ahead and do an insert into Talks. The columns are title, speaker, day. And values, why don’t you use some creativity here, Chris, this is your small window of creativity, so you demonstrate that you’re a creative person to this crowd here. Great. Love it. Often you can use editor inheritance, another feature in SSMS. Who says we don’t have an inheritance-based database system. Love it. Very important. Awesome. Great. Cool.
And so now what we’re going to do is just select that thing and hit execute. And again, we’re cowboys being SQL programmers today, so we’ll use a select star, which I know makes every SQL person in the room cringe, but we’re going to do it anyway. Awesome.
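The point of wrapping those inserts in a transaction is the all-or-nothing behavior. That behavior can be sketched offline with SQLite standing in for SQL Azure (SQL Azure speaks T-SQL over TDS; the column types below are SQLite’s, but the transaction semantics are the same idea):

```python
# Sketch of the BEGIN TRAN / INSERT / COMMIT pattern from the demo,
# using Python's stdlib sqlite3 as an offline stand-in for SQL Azure.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE Talks (
                 ID INTEGER PRIMARY KEY,   -- plays the IDENTITY role
                 Title TEXT, Speaker TEXT, Day TEXT)""")

try:
    with con:  # opens a transaction; commits on success, rolls back on error
        con.execute("INSERT INTO Talks (Title, Speaker, Day) VALUES (?, ?, ?)",
                    ("Windows Azure Today", "Don", "Tuesday"))
        con.execute("INSERT INTO Talks (Title, Speaker, Day) VALUES (?, ?, ?)",
                    ("SQL Azure Today", "Chris", "Tuesday"))
except sqlite3.Error:
    pass  # if either insert failed, neither row would be visible

print(con.execute("SELECT COUNT(*) FROM Talks").fetchone()[0])  # 2
```

If the second insert raised an error, the `with con:` block would roll back the first one too, which is exactly the all-or-nothing guarantee Don asks for.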
So, we have a database. So, Windows Azure and SQL Azure. We also talked about the Access Control Service last year, and the Access Control Service had a SOAP-based WS-Trust and WS-Federation authentication mechanism where I could do enterprise-grade, enterprise-level authentication against a cloud-based authorization service.
It turned out I liked that because I need it for a lot of my enterprise assets. But I also want something more lightweight that’s very Web and Web-browser friendly. So, as you may or may not know, we’ve been doing this work with Google and Yahoo! on this thing called OAuth WRAP, the Web Resource Authorization Protocol. And so we’ve added support for OAuth WRAP to the ACS that’s running in Windows Azure. So, why don’t we go ahead and program against this.
What we’re going to do quickly is change the URI to the Azure-based Access Control Service. We’re going to make a simple call that sends up a request body to go get a token. So, we’re going to give it our name and our password, which we pre-allocated before we came out, and the scope that we’re actually doing the authentication against. And so we’re going to make a call to the Azure cloud service. When we get it out, it will be success, not succession.
Are you planning on replacing me, Chris, is that what that was? A little passive-aggressive coding. I love that. Cool.
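The token request being built here can be sketched like this. The field names follow the OAuth WRAP username/password profile; the URI, name, password, and scope below are made up for illustration (the demo used pre-allocated credentials).

```python
from urllib.parse import urlencode

# Hypothetical endpoint; the real demo pointed at the ACS in Windows Azure.
ACS_TOKEN_URI = "https://example.accesscontrol.windows.net/WRAPv0.9/"

# OAuth WRAP username/password profile: a form-encoded body carrying the
# name, the password, and the scope being requested.
body = urlencode({
    "wrap_name": "demo-issuer",
    "wrap_password": "demo-secret",
    "wrap_scope": "http://example-service/talks",
})

# This body would be POSTed to ACS_TOKEN_URI to get a token back.
print(body)
```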
And so what we’re going to do is, go ahead and grab the token out of the response that we get back. Awesome. And now what we’re going to do is call the function below, which we called Get And Show Talks. And now we’re going to go add a little parameter to it which allows us to pass the token.
So, we make the REST call over to the ACS. We get the token. And then we call the code that we were doing before, and we need to use that token in the header. So, we’re going to say OAuthWrap.SetHttpAuthHeader. Lovely. Request, token. Sweet. That’s pretty good. So, we do the authentication, and now we’re actually going out, our service in the cloud is serving up our SQL Azure data using OData, using OAuth WRAP, and it all kind of works. Great.
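What the SetHttpAuthHeader step amounts to can be sketched as follows: pull the access token out of the form-encoded ACS response and attach it to the next request as a WRAP Authorization header. The response body here is a made-up example.

```python
from urllib.parse import parse_qs

# Made-up example of a form-encoded ACS token response.
acs_response_body = "wrap_access_token=abc123&wrap_access_token_expires_in=3600"

# Extract the token, then set the WRAP Authorization header on the request.
token = parse_qs(acs_response_body)["wrap_access_token"][0]
headers = {"Authorization": 'WRAP access_token="%s"' % token}

print(headers["Authorization"])  # → WRAP access_token="abc123"
```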
Finally, Chris, let’s go test, see if our server works. (Applause.)
While Chris gets this to work, I’ll just remind you, we built a managed platform for you to write ASP.NET apps and .NET code in the cloud. You told us you wanted low-level platform access. We delivered it. We gave you an open data protocol for accessing data over our cloud. You said you wanted low-level T-SQL and TDS, we delivered it. You said you wanted a REST-based authentication protocol and access control service, we delivered it.
I would like to thank you very much, and we want to bring out Bob. (Applause.)
BOB MUGLIA: Thanks a lot, guys.
So, Don and Chris showed you a whole set of changes, and things that have been done to the Windows Azure and the SQL Azure platform over the past year. And we’ve been listening to you. We’ve heard a lot of feedback as we’ve been through the beta tests and the CTPs of our online service, and we’ve really evolved these things.
And we’ve been working with quite a number of companies and organizations around the world in this process of bringing Windows Azure to life, and we’ve had a number of them, as Ray said, that are going production and live right now. And over the next two months, we’ll be enabling everybody to go production and live on this service. And ultimately we’ll begin charging for it in February.
But what I want to do right now is bring out one of our customers who has been working closely with us, Kelley Blue Book, and they have built an application, one of their core applications, their Web site, that utilizes both Windows Azure as well as their on-premises implementation of this inside their own datacenter. So, let’s introduce Andy Lapin to come up and show us what they’ve done with Kelley Blue Book.
ANDY LAPIN: Thank you, Bob.
BOB MUGLIA: Good morning.
ANDY LAPIN: Backstage, Bob took the keys to my datacenter, and was asking me what he could do to get me into the cloud today. So, remind me at the end, Bob, to get my keys back.
BOB MUGLIA: I’ll have to do that.
ANDY LAPIN: So, for anybody that’s bought or sold a car in the United States, you probably already used KBB.com. Let me just quickly highlight some of the data that we provide to consumers.
So, in this Perfect Car Finder Silverlight app, we can filter cars down by things like body style, price, gas mileage. We can zoom in on pictures, pan around. Once we see the car that we like, we can get some more information about it. KBB.com today has a volume of about 14 million unique visitors a month. We do this through two co-located datacenters. Ideally, we’re serving all of our traffic through a single datacenter. That way, the other datacenter becomes a failover. Unfortunately, traffic isn’t evenly distributed around the month, plus we can’t really determine what the peak is, it’s not always predictable. So, we end up using that second datacenter for additional capacity as well.
BOB MUGLIA: So, with this second datacenter, you essentially need the ability to bring on resources on demand. Plus, it’s pretty expensive to run that second datacenter as a failover.
ANDY LAPIN: That’s exactly right, Bob. We’re paying for availability in that second datacenter, not really actual use. We’re actually only using it a couple hours a week. What we were looking for was really a more flexible cost model, like the cloud model. So, we’ll be using Windows Azure instead of that secondary datacenter for our additional capacity and failover.
So, our application was already written in .NET using Visual Studio. That made migration fairly straightforward. We had to make a few changes, but those changes accounted for less than 1 percent of our total code base. Here you see one file that’s a configuration file used to support an Azure deployment. You can see here the application looks just like it does running in IIS when it’s running in the Azure development fabric. This allows us to have a single code base so that our developers can test both on-premises and cloud solutions at the same time.
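The single-code-base idea can be sketched as follows: the application code is identical everywhere, and one configuration value picks the on-premises or cloud back end. Every name in this sketch is an assumption for illustration.

```python
import os

# Hypothetical sketch: one code path, with configuration choosing the target.
def connection_string():
    if os.environ.get("RUNNING_IN_AZURE") == "1":
        # deployed to the cloud: point at the SQL Azure server
        return "Server=tcp:kbb.database.windows.net;Database=Vehicles"
    # otherwise: point at the on-premises SQL Server
    return "Server=onprem-sql01;Database=Vehicles"

os.environ["RUNNING_IN_AZURE"] = "1"  # pretend we are deployed to the cloud
print(connection_string())
```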
This is what the application looks like right now running in the cloud. As you can see, we have 30 instances available. As traffic dictates, we can move that up, we can move that down, and we really only pay for what we use.
BOB MUGLIA: So, you built the application to be scale-out and elastic, and you’re able to change it as you need?
ANDY LAPIN: That’s exactly right.
So, I talked about code a little bit. What I haven’t talked about yet was data. First and foremost, Kelley Blue Book is a data provider. We are heavily reliant on relational data in our on-premises datacenter, and we really can’t run our application in the cloud without being able to port that data. Fortunately, SQL Azure did the trick. Here in SQL Server Management Studio, we can see we have a couple of different databases. These databases have very different deployment models. In one case, our vehicle data is published through an extensive ETL process on regular intervals, and we can actually use the exact same mechanism in the cloud that we do today with our on-premises datacenter.
On the other hand, our community and personalization data is updated by users. It changes minute by minute. We needed a way to make sure that this data stays identical between our on-premises and cloud solutions.
So, here I’m going to start up SQL Azure Data Sync, and I’m going to walk you through how to set up a synchronization relationship.
So, in SQL Server Management Studio we fire up a wizard, we can enter our credentials, we can select whatever tables that we want. In this case I’m just going to select one. This is the message threads from our discussion forums.
When I process this, I’ll set up the synchronization relationship, and now I can update the schedule to make sure that that data is updated as often as I need it.
BOB MUGLIA: So, as changes are made between these servers in different datacenters, the information is kept up to date.
ANDY LAPIN: That’s right, and we control the schedule.
So, Bob, what I’ve talked about and shown today is how the cloud and Windows Azure have really allowed us to solve some of our hosting problems, and to do so in a cost-effective manner. First, we didn’t have to rewrite a lot of code, and we can still maintain a single code base. Second, we can manage our primary datacenter based on an average load, knowing that we can leverage the cloud and Windows Azure for additional capacity and failover, and only pay for what we use.
The last thing I haven’t talked about yet is management. We’ve reduced the time it takes to provision new hardware from six weeks in our local datacenter down to six minutes in the cloud.
BOB MUGLIA: That’s great. Thanks a lot, Andy. We appreciate all that you’ve done. (Applause.)
ANDY LAPIN: Thanks.
BOB MUGLIA: So Kelley Blue Book is one of quite a number of companies we’ve been working with through this beta phase. And what I’d like to do now is show a video to talk about some of the other things customers are doing. Let’s run the video.
(Azure Partner Momentum Video.)
BOB MUGLIA: Building business-class services with a well-defined SLA is what the Windows Azure platform is all about. And we’re working with our partners across the industry, enabling the Microsoft ecosystem to make it simple for you to move your applications forward to the cloud.
There are many partners we’re working with, including systems integrators like Accenture and Avanade, who are really partnering deeply with us to help you move your applications forward.
Now, we showed a way – Andy showed it a couple minutes ago – how we can connect the datacenters that you’re running together with Windows Azure. And in that case, we did it with SQL Server and SQL Azure, and he showed how we could synchronize data there. But there are a lot of ways in which connectivity between on-premises datacenters and the datacenters in the cloud is very important.
The data service is one. Another is application messaging, connecting the message flow between applications. So, one of the things we’ve done, we showed this last year, we built a message bus, a service bus that is able to connect applications that are running in your datacenter together with the Windows Azure cloud as well as with other trading partners that you might work with. So, that’s an important thing because it enables point-to-point connectivity.
Now, all of this really requires identity, identity infrastructure and that concept of federation, that idea of a federated identity that can connect your on-site authentication system, typically Active Directory, together with Windows Azure as well as with your trading partners. And having a common access control service, which is a feature of the Windows Azure platform, is an important way to do this.
Now, these services are really critical and they’re an important part of building the next generation of cloud applications. But sometimes it’s important to be able to get low-level network access back onto an existing datacenter. And today what I’m pleased to announce is with Windows Azure, next year we will be entering into beta with a new project, something we call Project Sydney. And what Project Sydney does is it enables you to connect your existing servers inside your datacenter together with services that are running with Windows Azure.
And what I’d like to do now is show you a demo of Project Sydney. Again, as I said, this is something that will be available next year. And what you have in front of you is an application that Microsoft uses for our giving campaign.
Every year in the fall we run a giving campaign. It runs for about six weeks, and our employees have an opportunity to donate to the charity of their choice, and we also run an auction as a part of that giving campaign so people can bid on items that employees have donated. And this is the actual auction application that we ran during our fall giving campaign, and we actually did run that on Windows Azure.
For the purpose of the beginning of this demonstration, I’m running this within a Microsoft IT datacenter, and I’ve got the Silverlight auction application here, and I can see that one of the items that’s up for bid is some U2 tickets.
Now, I was just down here a few weeks ago for the Rose Bowl and I saw the U2 concert down here, it was a great show. And I immediately discovered that U2 is coming to Seattle in June, I think it’s June 20th, actually. And so maybe getting these tickets here might be a good idea. It’s a charity auction, the tickets are a bit pricey, but hey, it’s for a good cause. And so what I’m going to do now is go ahead and place a bid on that. Little expensive, what the heck.
So this is running within the IT datacenter at Microsoft. And let me now go up and look at Windows Azure and take a look at this. And this is the Windows Azure environment and the portal, and I’m going to go ahead and I’m going to try and connect to this application. But in this case, the connection between the on-premises datacenter and the Azure environment is not there, so I get a SQL Server error. The compute environment that’s running in Windows Azure is unable to reach back into the IT datacenter to the SQL Server, which is running on-premises.
So let me now switch over to that actual SQL Server machine, running in a totally different datacenter environment. And here what I’ve got is the Project Sydney connectivity agent, and what I can now do is just go ahead and just run this on that box. And what it will do is it will create a secure connection between that on-premises server and the Windows Azure environment. And that connection uses a combination of IPv6 to do point-to-point connection, as well as IPsec to fully secure it. And as you can see now, that external resource is now available.
So if I go back over and I take a look at the application I was just running, and I refresh it, the application is loading and here we are at the actual application. We’ve now reached back and connected to that SQL Server box that was in the IT datacenter, and you’ll notice that the tickets are updated and the price of that is updated.
So that’s a feature that we’ve been working on for some time. And, again, our focus is to make it as easy as possible for you to connect the applications that you’re running within your own environment – because you’re going to keep doing that for many years – with new applications and applications and parts of applications that you run in the public cloud environment, and this is just one of the ways we’re doing that.
Now, as we move forward and look at how we can simplify – (applause.) As we move forward and think about ways we can simplify being able to take the investments that you’ve made in the Windows Server environment and move them into Windows Azure, one of the things that we’re doing is allowing you to create your own image. We will do this next year. This is another feature that’ll come in 2010. We’ll allow you to create your own image, which has all of the software configured exactly the way you want it.
So what we’ll have in the Windows Azure environment is a set of pre-defined Windows Server images which have different infrastructures on them, different versions of .NET and things like that loaded on them. And then you’ll be able to grab one of those images, mount it, and remote desktop into it. You’ll have full administrative access to that image, and be able to load whatever software you want on that image, take a virtual machine snapshot of it, and then store it away for future use. And those images become available to you to use as a part of your application infrastructure just as you would use a standard deployed Windows Azure image.
So we think this is just one way where we can take steps forward to make it simpler for you to take environments and applications that you’ve built forward into the Windows Azure environment. You all told us that giving you admin mode and giving you this capability is something that you wanted, and so this is something we’ll be delivering with Windows Azure next year. (Applause.)
So as we think about building scale-out, highly available applications, one of the key things is to have in place an application platform, an app server that can simplify that process. And today what I’m pleased to do is announce a new application server that Microsoft will be making available. It’s in beta immediately on Windows Server starting today, and will be in beta next year on Windows Azure, and we call that application server AppFabric. And what AppFabric will do is it will take and extend the environment that you’re very familiar with in IIS, and provide you with a platform for building scale-out, highly available, middle-tier services such as WCF-based services and Windows Workflow-based services.
And the idea is that this creates an infrastructure, a very easy-to-manage infrastructure where we will do that failover for you, we will keep the system highly available, and we will do balancing between these things so you can build your applications in a straightforward way. In addition to middle-tier services such as workflow and WCF, we’ll also provide a database cache, which is an important part of speeding up and providing better performance for the applications you’re building.
So this is a very easy-to-use service, it’ll be part of Windows Server, and it’ll be part of the Windows Azure environment, the Windows Azure platform, and we’re going to start making the beta of that available for our servers starting today.
Now, all of this, of course, is built on top of the infrastructure of the .NET platform. All of this is taking advantage of and building on the advances we’re making with .NET 4 and, of course, Visual Studio. And we have some amazing new things coming together in these two releases, this combination of .NET 4 and Visual Studio 2010. They’re well into beta right now, they’ll be shipping in the first half of next year, and there’s a whole lot of new features and capabilities that are too numerous to mention. So, rather than trying to go through them, what I thought I’d do is invite Cameron Skinner up to show you a demonstration of both the new AppFabric environment together with .NET 4 and Visual Studio 2010. Cameron. (Applause.)
CAMERON SKINNER: Thanks.
BOB MUGLIA: Good morning.
CAMERON SKINNER: Good to be here. So, this morning I’m going to show you the TailSpin Travel application, which is a Web application built with the .NET Framework 3.5. And I’m going to increase the functionality of this app by using some of those technologies that Bob has just mentioned.
So let me jump right into that app right now. So, let me sign into this thing and nothing like trying to come up with your password with the president of the company and 5,000 of your friends. Here we go.
All right, so now what I need to do is I need to edit my itinerary. I need to update this thing so I can stay in a hotel. So, I’m going to say select a hotel, select a room type, let’s pump that thing up, that’s good. And let me update that. All right, great. OK, so now what happened here is I forgot to select one there. And now let me update that itinerary. So, that’s the essence of this application. It’s just a booking application, update your itinerary.
So now I want to jump into the app and actually explore it inside Visual Studio. So, here I have Visual Studio. You know what? I’m a developer, I really like multi-monitor, I’m going to drag this to the second monitor now. Let me drag this guy over here under this monitor. And then what you see, this is a Web application – (applause.) Feel free to interrupt and madly applaud at any moment, that’s fine.
So here’s – we are leveraging the ASP.NET MVC infrastructure. And here you see my assembly diagram, which is showing me dependencies. As I select this box here, that is my ASP.NET MVC namespace. And as I hover over this thing, I can actually zoom right into the methods, et cetera, that are leveraging that namespace.
So let me jump back in here and start to add some new capabilities to the application. The first thing I want to do, Bob, is I want to add a single sign-on capability to this application. With Windows Identity Foundation, that’s never been easier to do than what I’m going to show you right now.
So what I’m going to do is I’m going to come in, I’m going to right-click on TailSpin Web. I’m going to add an STS reference. STS being a security token service. And I’m going to paste that in here. And then I’m going to use an existing STS –
BOB MUGLIA: So what we’ve done is we’ve built an STS into Active Directory so that you can take and build identity federation between on-premises datacenters and on-premises identity systems together with the cloud and with other partners as well.
CAMERON SKINNER: That’s right. So, now when I control-F5 this thing, I should jump right through that login screen right back into the application without doing anything, just done. So, that’s great. So, now I’ve done that. Now the next thing I want to do, if you remember that (garbled) when I had to select that to indicate one GAP, what happened there is it went back to the server to do the validation and then came back. I need to get client-side validation into this application.
So let me do that now. I’m going to use ASP.NET MVC 2 to do that. And it’s extremely simple. So, let me show you a new dialog in Visual Studio, which is the Navigate To. I simply type edit in here or anything like that and it will bring up anywhere in my solution where edit is found. Like for instance, in this case, I want to go modify the edit.aspx page, and add the one line of code that I need to enable client-side validation.
I’m going to do that now by adding a server-side tag, Html.EnableClientValidation. And at this point, I’m done with that. I’m going to go into the debugger, F5, and I want to show you that client-side validation working in place. And what I’m going to do to do that is I’m going to add a rental car to the itinerary. So, I want to update my itinerary by adding a rental car to it. As soon as this thing spins up. Here we go.
OK, so now what I’m going to do – here we go – is edit that itinerary. Scroll down here a little bit. Add that rental car, and just to show you the client-side validation, I’m just going to hit update, boom. Client side validation.
CAMERON SKINNER: Exactly. So, now I want to focus your attention now on the execution time of this page, 677 milliseconds, not hideous, but I’m anticipating some significant traffic on this application, and I want to make sure that this thing scales up. I have a sneaking suspicion that I’m hitting the database too much, so what would be fantastic is if I could get back into this app and actually see how I’m querying that database.
What I’m going to do is I’m going to go back over to Visual Studio and I’m going to use the new IntelliTrace feature in Visual Studio. So, what I’m going to do is I’m going to hit “break all.” And what you’ll notice is a number of events here. But because I think I’ve got a database issue, I’m going to clear my categories and only focus on ADO.NET. And so what you see here is a number of events. It’s basically showing all my queries that I have used in the application to date. And if I select one of those, it takes me right to the code where that query was being called. (Applause.)
BOB MUGLIA: So let’s be sure we’re clear about what’s really happening here. As Cameron was running the application, a trace was being made of the instruction execution. And Visual Studio keeps track of all of that, and then allows you to actually go back and debug something that has already run, whether that’s something within your development environment or even in a production environment, because the overhead associated with this is very small.
CAMERON SKINNER: That’s right. That’s right. So, what I need to do is I need to – a real common strategy that you can use when you’re building these apps is to introduce some sort of caching strategy. So, ASP.NET has had session and object caching for a while, which has helped tremendously with performance, but I also want to make sure that this thing scales out.
And so what I’m going to do is I’m going to introduce Windows Server AppFabric that Bob mentioned, I want to push this into this application and take advantage of that. The AppFabric, in this case, is going to be this distributed in-memory cache service that I want to take advantage of.
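What a distributed cache like this buys you is the cache-aside pattern: check the cache first, and hit the database only on a miss. A minimal sketch, in which a plain dict stands in for the AppFabric cache and load_from_db is a made-up stand-in for the database call:

```python
cache = {}
db_hits = {"count": 0}

def load_from_db(user):
    db_hits["count"] += 1  # each call here is a round trip to the database
    return "itinerary-for-" + user

def get_itinerary(user):
    if user not in cache:             # miss: fetch once from the database...
        cache[user] = load_from_db(user)
    return cache[user]                # ...then serve repeats from the cache

for _ in range(3):
    get_itinerary("cameron")
print(db_hits["count"])  # → 1
```

Three requests, one database hit; that is where the drop in page time comes from.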
So I’m going to bring up the Web.config to show you what I have to do to get that into this app. I first need to just uncomment one line there and come down here to my session state and undo that. OK? Great.
So now what I want to do to show this off, I’m getting tired of having to go back into the application and hit edit, et cetera. I want to show you another feature in Visual Studio 2010 which is called the coded UI test. And what this is is essentially a unit test that actually drives my UI.
So I’m just going to right-click on this thing and say run selection. And what this is going to do, it’s going to build and it’s going to run my application, stand away from the keyboard, it’s going to run it and then I’m going to get the test results. So, it’s navigating my app for me and then I’ll get the test results which hopefully shows me – there we go – down here, if I zoom in a little bit, you’ll see five milliseconds, down from that 600 milliseconds that we saw. So, a significant improvement in performance.
BOB MUGLIA: So a couple things that happened there. One is Visual Studio now has available a very broad set of tools to enable testing as a key part of this scenario, so making Visual Studio broader in terms of the way you can use it across your development life cycle, adding really a whole broad set of ALM functions. And then also showing that this database cache and the ability to take load off your production database by having these middle-tier caching services as a part of AppFabric, so those two working together.
CAMERON SKINNER: That’s right. Absolutely.
Now when I’m updating this itinerary, what’s happening on the back end there is a Windows Workflow Foundation workflow that’s handling that process. And in 2010, we’ve actually added a new WF4 designer, and we’ve added a flow-charting capability here, which makes modeling business processes very easy and very intuitive.
What we’ve also done is we’ve enabled through the AppFabric, the ability to expose this workflow through a service without requiring custom hosting code or any kind of extra logic that you have to write, AppFabric will manage that for you automatically.
I want to show you what that looks like here. I’m going to go into IIS Manager, the management portal here. I’m going to double-click on dashboard. OK. And what you’ll see here – I’ll zoom in and give you a little close-up there. You’ll see a management console that is showing me all of my WF instances, whether they’re idle or not, the call history and instance history. What I’m going to do is I’m going to scroll down a little bit and I’m going to go into that workflow that I was just showing you the designer for. I’m going to right-click on the activation of that flow and I want to look at some tracked events.
Here, I’m going to select that book hotel reservation that I had done, and I specifically want to look at a tracked variable. Sure enough, there’s a reservation, and there’s Cameron S. had made that reservation. So, one-stop shopping to manage all the services in AppFabric. AppFabric’s doing more than just managing and running services in your app, it’s also managing long-running workflows that are persistent over time, it’s handling fail-over, it’s handling – if you need to add more scale to your application, you can add more nodes to the cluster, that kind of thing.
BOB MUGLIA: So it’s always been hard to build these scale-out, middle-tier services, and with AppFabric, we’re making it easy, and we’re also making it a natural extension from the management paradigm by making it an extension of the IIS management console.
CAMERON SKINNER: That’s right. So, I’ve added a whole lot more capability to this application now. What I need to do now is deploy this to my staging environment. In the past, deploying a Web application has always been kind of a chore. You’ve got to deal with a bunch of things: moving code around, IIS settings, database schemas, et cetera. What we’ve now done in 2010 is we’ve deeply integrated MSDeploy and Visual Studio. So, deploying a Web application is extremely simple.
Let me show you a couple options here. What you can do is you right-click on your Web project and you say “create package.” What that does is it zips up this app and essentially puts it into one simple file that you can then hand off to the rest of your organization to go do what you want to do. But even better than that, we’ve created a publish profile capability so publishing your application to your staging environment is as simple as one click, and I’ll show you that here.
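The “create package” idea can be sketched as follows: bundle the site’s files into one archive that can be handed off for deployment. The file names and contents here are made up, and the real feature (Web Deploy packaging) also captures IIS settings, database scripts, and more.

```python
import io
import zipfile

# Made-up stand-ins for the files of a Web project.
site_files = {
    "web.config": "<configuration />",
    "Views/Home/Index.aspx": "<h1>TailSpin Travel</h1>",
}

# Zip everything into a single in-memory package, one entry per file.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as pkg:
    for path, content in site_files.items():
        pkg.writestr(path, content)

print(sorted(zipfile.ZipFile(buf).namelist()))
```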
BOB MUGLIA: So you can use this to publish into a staging environment either within your own datacenter, or, for example, in a hoster’s datacenter that’s running Windows Server?
CAMERON SKINNER: That’s exactly right. That’s exactly right. So, now that I’ve set my configuration to release, here is my publish profile. I’m simply going to click the “publish to Web” button, and away it goes.
So with that, Bob, we’ve added a lot of new capabilities to what was a .NET Framework 3.5 app, we’ve brought that to the modern day, so let’s go write some good apps.
BOB MUGLIA: Great, thanks a lot, Cameron. (Applause.)
So as you can see, there’s an awful lot to explore in AppFabric and .NET 4.0 and Visual Studio 2010. Obviously, there will be a lot of chance to do that over the next couple of days here at this show. But you should know we’ve been very busy trying to make this as easy as possible for you to write these next-generation applications, including taking advantage of the attributes of the cloud application model.
So these things are real, they’re available now, and you can begin using them. The AppFabric, as I said, is in beta on-premises, it’s in beta for Windows Server. It’ll be available in beta next year, in 2010, on Windows Azure. The ASP.NET MVC 2 framework is also in beta, that’s something you can use now. The Windows Identity Foundation, that’s at RTM level, you can actually deploy that in production, and we now have go-live licenses on Beta 2 for Visual Studio 2010 and the .NET 4 environment. So, these are all things that are ready for you to begin taking advantage of.
So looking at this environment and looking at these different capabilities, you know, what you see is a very broad set of platform application services that Microsoft is focusing on delivering, to make it easy for you to take your existing applications forward into the future as you make them cloud applications, and also to write next-generation cloud applications. One of the key attributes that’s very important is to be able to have a consistent environment that manages this across both the datacenters that you run as well as the cloud datacenters. And Microsoft has been investing for many years in System Center to provide an underlying management infrastructure. We put this at the bottom of the stack because it really sits below the virtual machines, in a sense, and provides a management infrastructure across all of the different virtual machines that are running, both in the private cloud environment that you’re creating, as well as in the virtualized and physical environments you have, in the hosted environment, and in the public cloud. And it will provide, over time, the ability to connect and manage across these different environments.
So with that, what I’d like to do is invite Doug Purdy up to show us how we can put these pieces all together to provide a consistent environment. Doug? (Applause.)
DOUG PURDY: Hey, Bob, how’s it going?
BOB MUGLIA: Good morning.
DOUG PURDY: Good morning. Good morning. Welcome to PDC. What we’re going to show you this morning is an early look into some work we’ve been doing around bringing together our traditional strengths in the on-premises world, as well as our strengths building in the cloud world, into a cohesive developer platform to benefit you. And how we’re going to do that is we’re going to start with the same TailSpin application that we just saw Cameron build a few moments ago. And you’ll see it here on the screen.
And what I’m going to do is I’m going to open up Visual Studio and you’ll see that we, in fact, have the same exact project here, the same solution. We’re leveraging ASP.NET MVC, leveraging Windows Identity Foundation, leveraging all the wonder that we have in terms of the AppFabric which we just announced today.
And what I’m going to do is I want to move this seamlessly and transparently up to Windows Azure. And how I’m going to do that is I’m going to add an application model to this project. And how I’ll do that is I’ll simply come up here, I’ll say add, I’ll say new project, and then we’ll see a dialog come up and I can select application model project. Now I’m going to go ahead and hit OK. For those of you that have been listening to Bob talk about cloud as the new application model, this is a first-class view on top of this. And don’t be shocked or surprised by that word “model,” it’s not anything new. If you’re used to app config, you know what a model is. If you’re used to XAML, you know what a model is. It’s just a higher-level description of a part of an application or the whole of an application that we can do interesting things with.
Furthermore, if you’re familiar with Windows Azure, you already are very familiar with what models are because this is how you configure the environment via roles and other aspects. And, in fact, if I go over here and I click here on the toolbox, let me go ahead and pin that on top. For those of you that are familiar with Windows Azure, you’ll see some familiar things here. You see things like the Web role, the worker role. Furthermore, you see other parts that you’re not used to seeing inside of a model, but we want to capture the totality of the application directly inside of this app model so you see things like the database as well.
So what I’m going to do in order to deploy this application is I’m going to drag out three different roles. So, the first role that I’ll drag out is the Web role. The next role that I will drag out is the AppFabric role. Let me just make this here. And then lastly, I will drag out the database role.
So now I have all three tiers of my application sitting inside of this designer. Now what I want to do is affiliate all the projects that I have over on the right-hand side with that application model. And doing that is very simple: I just drag out the various projects and drop them here. Then I’ll drag out the services onto the AppFabric role, and last I’ll go ahead and pull out my DAC pack, which is just the database model.
BOB MUGLIA: Right. One of the things that’s coming in SQL Server 2008 R2 is the ability to define databases in a model-driven way with something we call a DAC, a data-tier application component, and it really describes all of the different tables and components of a database.
DOUG PURDY: Exactly. And so as we drag the various projects in, you can see that we’re picking up various properties that we have inside the project. We go in and analyze the app.config, we go in and look at all the models that are already in your application, and we pull them up to be first-class parts of the app model itself. And, in fact, if you notice, I’ve clicked here on the Web role, and I can come down here and even change the number of instances. So, I don’t have to muck around with any XML config files or anything like that. And as we’ll see, this data will flow all the way through the application in just a few moments.
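[For readers unfamiliar with Windows Azure, the instance count Doug sets in the designer corresponds to what you would otherwise edit by hand in the Azure service model files. A minimal sketch of those two files follows; the role names are hypothetical, since the transcript doesn’t show the actual TailSpin project layout:

```xml
<!-- ServiceDefinition.csdef: declares the roles that make up the service -->
<ServiceDefinition name="TailSpin"
    xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
  <WebRole name="TailSpinWeb">
    <InputEndpoints>
      <InputEndpoint name="HttpIn" protocol="http" port="80" />
    </InputEndpoints>
  </WebRole>
  <WorkerRole name="TailSpinWorker" />
</ServiceDefinition>

<!-- ServiceConfiguration.cscfg: sets per-role settings such as instance count -->
<ServiceConfiguration serviceName="TailSpin"
    xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
  <Role name="TailSpinWeb">
    <Instances count="2" />
  </Role>
</ServiceConfiguration>
```

The application model designer shown in the demo surfaces these settings as properties, which is why Doug can change the instance count without touching the XML.]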
So I’m going to change this to two and go ahead and hit “save.” And now I’m ready to go. So what I want to do now is test this. I want to take this from on-premises here in Windows Server, move it up to the cloud, and test it. And I want to do it in such a way that I don’t have to talk to anyone in operations. I’m just a developer; I want a self-service experience where I can test this application.
So I’m going to right-click here and select “publish to test environment.” What’s going on here is very interesting. We take that entire application and shred it into a natural SQL database. You may have heard this called the Oslo repository; we’re officially renaming that here at this PDC to SQL Server Modeling Services. We take the application into the database, then we utilize the REST APIs to push it up to Windows Azure, and then we start to run the application. And you can see here that I’ve started to run the application. You’ll notice the URL up here, and you’ll notice some interesting aspects of this application.
First, you’ll notice that it has my identity. So, we have Windows Identity Foundation running up inside of Windows Azure. I didn’t have to change a line of code for that to work.
BOB MUGLIA: So to be clear, you published it to a staging environment for Windows Azure.
DOUG PURDY: Exactly.
BOB MUGLIA: And so it’s now federating back to Active Directory using your identity from your Active Directory environment up in Windows Azure?
DOUG PURDY: Absolutely. And so I took that same app, Cameron had that working in Windows Server, now it runs in Windows Azure, didn’t change a line of code.
Furthermore, if I go ahead and scroll down here, I’m going to hit “edit.” When I hit edit, it’ll make a page request, and we’ll scroll down. If you remember, Cameron was tracking the execution time on the page itself. You can see here that it’s about one second of execution. If I hit F5 and then scroll down again, you will see that it has gone down significantly. And the reason it has gone down is that I have AppFabric running in Windows Azure, fully available for me as well.
BOB MUGLIA: The database cache was caching – the second time you ran, it was in the cache?
DOUG PURDY: Absolutely. So, all those same features that you saw in Windows Server AppFabric are available to you in Windows Azure AppFabric as well.
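[The speed-up described here is the classic cache-aside pattern, which the AppFabric caching API supports. A rough C# sketch, assuming a default cache has been configured; the entity type, key, and data-access helper are hypothetical, as the actual TailSpin code isn’t shown:

```csharp
using Microsoft.ApplicationServer.Caching;

// Create a cache factory (reads cache host settings from configuration)
// and get the default named cache.
DataCacheFactory factory = new DataCacheFactory();
DataCache cache = factory.GetDefaultCache();

// Cache-aside: try the cache first; fall back to the database on a miss.
var listing = (AuctionListing)cache.Get("listing:42");
if (listing == null)
{
    listing = LoadListingFromDatabase(42);   // hypothetical data-access call
    cache.Put("listing:42", listing);        // subsequent requests hit the cache
}
```

The first page request misses the cache and pays the database cost (about a second in the demo); the second request is served from the distributed cache, which is why the execution time drops.]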
So I’ve tested the application and I’ve utilized the model. What I’m going to do now is take that model into production – because with real apps, you don’t often just hit F5 as a developer and deploy into production. I’m going to ask Bob here to help me with that part of the demo. And the way I’m going to do that is I’m going to right-click and say generate an application package. What this does is take the totality of the app, the complete application, including all the models, and bundle it up into one consistent package. I’m going to call this thing TailSpin.app. And as it bundles everything up into a nice little package, you can see it here on my desktop – it’s called TailSpin.app.
Now, Bob, I would love for you, if you could, I know that operations is near and dear to your heart. So, I’d love it if you could help me deploy this app into production.
BOB MUGLIA: Well, I’ll put my operations hat on. And we showed it in the staging environment, but the next thing is to really go ahead and look at that in the production world, and this is, again, in Windows Azure. And so what I’m going to do is run a PowerShell script to first see what will happen if I try and deploy it.
DOUG PURDY: So you don’t trust me, Bob?
BOB MUGLIA: Well, you know, I think there’s an issue with operations and development working together. And what we want to do by connecting them through modeling is make it as easy as possible for developers to create the application and define the environment, hand that off to operations, allow them to run it, and then give developers full visibility into the changes they’ve made and have that really encoded into the operational environment.
In this case, we can see from what you’ve created that there are three different roles: A Web role, an AppFabric role, and a database role. There are two instances that are set to be deployed for the Web role and the AppFabric role and one for the database.
So what I’ll do now – since that looks good – is remove the -WhatIf, and I’ll run it and go ahead and let that get deployed into production.
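[The preview-then-deploy pattern Bob uses is standard PowerShell: cmdlets that support it accept a -WhatIf switch that reports what would happen without making changes. The cmdlet and parameter names below are hypothetical, since the actual script isn’t shown in the transcript:

```powershell
# Preview the deployment without making any changes (-WhatIf):
# reports the roles and instance counts that would be created.
Publish-Application -Package .\TailSpin.app -Slot Production -WhatIf

# The plan looks right, so drop -WhatIf and deploy for real.
Publish-Application -Package .\TailSpin.app -Slot Production
```

Because the application package carries the full model, the script can enumerate the three roles and their instance counts before anything is provisioned.]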
DOUG PURDY: So Bob is deploying this up to Windows Azure right now, but of course it takes a bit longer to deploy. So we’ve got an environment pre-staged out here, and Bob will be switching to that as we speak.
BOB MUGLIA: Yeah. So, as that gets deployed, so it’s now done, and what I’m going to do is take a look at this using System Center Operations Manager. And so this is the Operations Manager console. And what this is now doing is it is monitoring the operational state of the application running on Windows Azure. You’re familiar with System Center running within your own datacenter, here it is running in Windows Azure.
And what we have here is a diagram view of the application. This diagram view was created by Doug as he built the application; it was put into a SQL Server model, stored in the repository, and then I’ve been able to consume that in System Center and display the exact same model. And I’ve added here a watcher process to keep an eye on the SLA associated with this application.
DOUG PURDY: Great. So, what I’m going to do, Bob, is I’m going to apply some load from my development machine here, and let’s see whether or not the SLA actually gets enforced and we see some monitoring that goes on. So, I just kicked off my job, we’ll see what happens here in a few moments.
BOB MUGLIA: So we’ve got two roles that are running, and perhaps if there’s sufficient load put on the application, the SLA could be violated. And we see here that that has come up. I can go ahead and drill down into this and take a look at the different events. I see the alert that was raised and get a little bit more detail on it. And I see here that one of the things I can do as an operations manager is increase the number of instances for that Web role to provide resolution.
We could also do this automatically, so the application would scale on demand, just creating more instances within Windows Azure. But I’ll do it here myself – raise that role from two to four – and go ahead and run it. And as this runs, Operations Manager is communicating with Windows Azure, telling it to spin up those extra roles. If I go back to my diagram view and go to the Web role, we’ll see that we’ve now got four instances. The two new instances will take a couple of minutes to spin up, but they’ll take on the extra work that you put on.
DOUG PURDY: Great. Great. So, what have we seen here? So we’ve seen a single application model that has allowed me to take the same application and move it transparently from Windows Server up to Windows Azure. We’ve seen the same application model that’s allowed Bob, who’s playing operations, and me doing development to work seamlessly together. We see the same application model that allows us to provide services to you like auto scaling and high availability. And lastly, but certainly not least, that same application model is fully available to you in a natural SQL database for you to use and you to extend and use as you will.
And so we look forward to giving you more details on this in the upcoming months. Thank you.
BOB MUGLIA: Thanks a lot. Appreciate it. (Applause.)
I think that demonstration really pulls together the connection between what’s happening within your own datacenter and what’s happening in the cloud. It shows how critical models are to connecting the two, and through that, it speaks to the fact that the cloud really is about the next generation of applications: it is a new application model, and it is something that you will create for your organizations.
So in filling this out and looking at the capabilities, we talked about the low-level infrastructure pieces, we talked about the application environment and the application model, and the last piece is really to think about the software, the applications, that actually run. And I’m going to say this is where you come in, in terms of writing a very broad set of software that you’ll create for your own organizations.
But Microsoft also has quite a few key pieces of business application software that we’re moving forward onto this next-generation application model in the cloud. So, we’re taking the things that we’re doing with SharePoint and with Exchange and Dynamics, building them in the cloud, and incorporating the aspects of the cloud application model.
Now, tomorrow, we’ll show you and talk to you about some of the steps we’re taking with SharePoint and some of the great things that are coming with SharePoint, but thinking about that in a cohesive way makes a lot of sense.
And so we kind of come back to the beginning and the different capabilities that the analysts were describing. You can look at this very holistically. The software piece, with applications such as Exchange, SharePoint, Dynamics, many others – and of course the many that you’ll create – is a key part. The application platform components are a core part of the cloud and what we’re doing. And of course the infrastructure: the underlying management fabric and the management tools, together with the operating system environment, that enable this application platform to run, as well as the software and applications that run across it.
This cohesive offering of capabilities, from the infrastructure at the bottom of the stack all the way up to the application platform and the software, is something that we’re working on. We’re learning a lot through this process – learning by doing it ourselves, by hosting datacenters around the world with hundreds of thousands of computers, and by building this next-generation application platform. I think it’s fair to say no one else in the industry has this breadth: the focus on the cloud, together with the focus on on-premises and servers, allowing you to go from one to the other. And Microsoft is really focused on working together with all of you to drive this.
So in thinking about this cohesively, I’ve talked about all of these different components. And let’s go ahead and take a look at when some of these things will be available. So, if we go forward and say, OK, some of the stuff we’ve shown this morning is available now for you to begin to work with. Certainly Windows Server 2008 R2 is an example of that. That’s something that you can certainly work with today, and Windows Azure, SQL Azure, those are also available for you. And of course the versions of SQL Server, Visual Studio, those things are all available today as well.
In beta right now and going into 2010, there are a set of new services that we’re going to make available: the next generation of SQL Server, SQL Server 2008 R2, with a lot of great new capabilities, particularly things like BI – Dave showed that earlier this morning. The Application Fabric is in beta today and will go into production next year. And of course we’ll make .NET 4 and Visual Studio available within the first half of next calendar year.
We’ll take the work that we’ve done on AppFabric on Windows Server and move that into Windows Azure. Doug showed a demonstration of that just a few minutes ago, but we’ll make that available in beta next year, and we will be taking and continuing to enhance System Center to enable it to build both private cloud as well as to span into the public cloud environment and the hosting environment, and we’ll be entering into beta on that next year. So, lots and lots of exciting things to come.
In thinking of the Windows Azure environment, Ray talked a little bit about the different datacenters that exist around the world and the fact that we are building out datacenters and putting the Windows Azure platform in them. Today, we have our North American datacenters up and running. In 2010, we’ll bring online both our European and our Asian datacenters, and this is one of the key things that enables you to build cloud applications that are fully geo-scaled, letting you reach your customers wherever they are around the world. That’s one of the advantages of public clouds: we’ve been investing in this infrastructure all around the world so you don’t have to.
Now, I talked about this application model. And I talked about a set of attributes associated with this application. We covered a number of those throughout the session this morning. And if you go back and take a look at the different attributes of the cloud application, Don and Chris talked about self service. Andy showed us the elastic attribute. Cameron showed us service-oriented, federated, and scale-out; and Doug showed us both model-driven as well as staged production. So, we covered this morning quite a number of the attributes within the actual demonstration, really showing the investment that we’re making, that it’s all real in code and in platforms and systems that you can use.
If you continue to look at this, there are a set of attributes I didn’t have a chance to cover, always available, multi-tenant, and failure resilient. Those attributes will be covered, as well as all these others in a number of breakout sessions that will be happening over the next few days as a part of this Professional Developer Conference. So, we’re really talking about how Microsoft is implementing the cloud application model within our underlying platform, within our underlying tools so you can take advantage of that, and we’ll show you how we’re doing that across the different breakouts. And just so you know, as you look at the show catalogue online, it’ll tell you which sessions cover these different attributes. So, this is the list, but you can see it online later as well.
So in thinking about the future of the cloud as an application model, Microsoft is focusing on taking the investments that you’ve made in your existing environment with Windows Server and enabling you to bring them forward. At the same time, we’re focusing on enabling you to build the next generation of cloud applications. And whether you’re building these within your own datacenters in private clouds, or implementing this with hosters in public or private clouds that you might use, or using public clouds like Windows Azure, we’re focusing on providing you with a consistent application model across these different environments. There’s no question that the cloud is the next-generation application model, and Microsoft will take you there.
Thank you very much, have a great PDC. (Applause.)