Bob Muglia: Tech•Ed 2007

Transcript of remarks by Bob Muglia, Senior Vice President, Server & Tools, Microsoft Corporation
Tech•Ed 2007
Orlando, Fla.
June 4, 2007

BOB MUGLIA: Well, good morning and welcome to Tech•Ed. The great thing about Tech•Ed 2007 is we have so many great products to talk about that we really don’t have time to go into a lot of future vision speak. What I want to do today is really focus on those products, but before I do, let me put some of this into a little bit of perspective. First of all, in terms of the challenges that business organizations face, there is, of course, as Doc mentioned, the 70/30 split where the majority of the resources within any IT organization goes to maintaining existing systems, and only a small percentage, about 30 percent, is available for new business applications where new business value is generated.

Now, at Microsoft we are focused on first and foremost listening to you to understand exactly what your challenges are, what are the things that cause you problems in the business, what are the opportunities that you seek to achieve, and we see the opportunity to help you by providing a consistent platform, a consistent set of applications to really help you to drive your business forward.

We also do this in partnership with the industry. There’s no way that Microsoft can deliver what we need to do for you without working together with many, many — literally tens of thousands, hundreds of thousands of partners and people around the world to help make your business a success.

So, the first thing I want to say is that we’re listening, and the listening that we do to understand the problems you face really is manifest in the kinds of solutions that we’re able to deliver together with the industry to solve your business problems.

And putting this in context, while there’s a lot of great new software, software services, and software capabilities coming out in the short term to help you with your business, we’ve put this into the context of understanding that we need to partner and work with you to meet the promises that you have in the long run.

Microsoft realizes that as we’ve evolved and become a strategic partner for IT in solving business problems, we need to think about and understand your issues today, tomorrow, and for many years. So, we’re putting in place the foundational elements to allow us to meet your business needs certainly today, lots of great things to learn about this week at Tech•Ed and this year, but then for five years, 10 years, even 20 or 50 years beyond. We hope to and intend to be your business partner for the long run.

Now, in executing on this, we want to help by providing you with what is effectively a roadmap to allow you to take advantage of and consume software and technology to improve your business and get better business results. So, what we did is we worked with the analysts and we worked with many of you to understand what are the attributes of your business and how could we provide you with a roadmap to solving the business problems you have. We call these optimization models, and we’ve defined four steps within this optimization model, starting with basic, going all the way up to dynamic, with the focus at each step where you can take advantage of new software and hardware technologies to solve different kinds of business problems.

The great thing about this is, it’s very tangible. It allows you to look at your organization and compare it against these standardized models, and see where there are opportunities for you to make changes, make improvements to get better business results.

And the results we’ve seen on this are very dramatic. We see tremendous return on investment from customers who take advantage of those optimization models.

Now, what we’ve done is we’ve put this in the context of three different optimization models. We have a core infrastructure optimization model that’s really about the underlying infrastructure that runs your business. We have a business productivity optimization model that’s focused on your information workers and how you can make them more effective in terms of the way they work together to generate the ideas and the intellectual property, all of the things that make your business differentiated.

And there’s an application platform optimization model that’s really focused on your business applications, allowing you to understand how you can optimize those to get better business results more quickly, how can you roll out new solutions to meet your business needs in a faster way, and responding to your different business units to actually allow their products and services to come to market more quickly.

Now, we’ve put this in the context of the fact that for many years we’ve been working on several key initiatives, and we’re continuing to work on those. They’re very, very important today and in the long run. DSI, the Dynamic Systems Initiative, is a journey; we said it was a 10-year journey, and we’re four years into it in terms of enabling dynamic systems. .NET is a programming environment that really provides better results in terms of building business applications. It’s phenomenal how much return on investment you can get by building business applications in the .NET environment.

And all of these things need to be done in the context of the security environment, and thus we’ve had this Trustworthy Computing initiative, which focuses first and foremost on the platform and enabling the platform to be as secure as possible, but then it follows on with best practices and solutions and now software applications that allow you to secure your entire IT environment, and ensure that you’ve implemented policies to ensure that your corporate resources are as secure as they possibly can be.

Now, we’ve thought of all of these things as important initiatives, but over the years we’ve gotten some encouragement from a number of folks, particularly some of the analysts, to help us think about this in a more cohesive way.

So, what we’re doing today is we’re talking about four new technology areas of focus, technology innovations that we think provide a vision for the future. The four areas of focus are unified and virtualized, process-led and model-driven, service-enabled, and user-focused. Now, those are the top-level areas of these four technology initiatives, but they must fit underneath the platform.

Now, I talked about this as a bit of a vision, but really the thing to think about here is that it’s very much focused on real world things. It’s focused on how we can deliver products and services in the short run, but there is a long term plan against these that really will enable very substantive business results.

What I want to do now is really start by talking to you about one customer who has worked with us on this journey towards dynamic IT, and is really focused on implementing these optimization models. This customer, Energizer, has chosen to outsource some of their underlying services, and in this case Microsoft has been the company providing those services for Energizer. It’s allowed Energizer to really focus in on the business problems that they think are most important. Let’s roll the video.

(Video segment.)

BOB MUGLIA: So, that’s how Energizer has really been focusing on their journey towards dynamic IT. And I think if you look at these optimization models, and there are a lot of tracks here at Tech•Ed where you can really learn more about them, you’ll see ways where the software and the services can really help improve the business results that you have within your organization.

So, I mentioned at the beginning that as we started thinking about dynamic IT, this really fell in the context of conversations we’ve been having with a number of customers, and in particular a number of industry analysts. And this really sort of began about three years ago with some conversations that I and my group had with Tom Bittman at Gartner where Tom really encouraged Microsoft to think a little bit broader. He told us that the issues we had were really important, but we could think a little bit more holistically.

So, I thought it would be useful to start today by hearing a little bit of Tom’s perspective on what Gartner has to say about the journey towards dynamic. Tom, good morning.

TOM BITTMAN: Good morning.

BOB MUGLIA: Thanks.

TOM BITTMAN: Thank you. (Applause.)

Good morning. So, why is there all this talk about being dynamic, about being agile, being real time, being on-demand? We think technology is driving this change. Technology is creating an opportunity for business to truly differentiate, because of the speed of technology.

Now, let me talk about a couple of things in particular. Connections are becoming more pervasive. So, the ability for a customer to connect with a supplier, and for a partner to connect with another partner, those connections are there. Those connections are much more available than they have been in the past, and they’re enabled by the Web.

Also response time expectations are shrinking. Now, let me give you an example. When we were all kids, we know how kids operate, kids want everything right now, they want everything immediately. And we grew up learning that that just didn’t quite work out that way. We learned that you couldn’t quite get everything when you wanted it.

The experience that our kids are having, those experiences are very, very different. Our kids are instantly communicating. They’re on the Web finding what they need immediately. They’re buying things on the Web. They’re downloading music instantly. It’s a very, very different experience. Our kids have no tolerance for delays.

Now, why is that important? Because our kids are becoming the new workforce, they’re becoming the new business partners, they’re becoming the new executives. They’re bringing those expectations into our industry.

Also relationships: With pervasive connectivity, with expectations and response time, relationships are much more online than they used to be, and they’re much more short-lived. So, these are relationships that are transacted between again a customer and a supplier or business partners. The relationship is discovered, it’s created, transacted, and it’s done.

And what that’s doing is that’s creating a very big change in windows of opportunity. Windows of opportunity are getting smaller. They’re more frequent but they’re smaller. And the businesses that can capture those opportunities are the ones that are going to win.

So, eventually what’s happening here, what we see happening is that agility is becoming a more and more important differentiator in a connected world. Agility is becoming a true business differentiator in this world.

Now, why is it a differentiator? Does IT really matter if we all have equal access? The reality is we don’t all have equal access, because many companies have been able to move forward into better and better agility, and many others are still well behind. In fact, in our experience the gap between agile companies, agile IT, agile development, and those that are less agile, that gap is growing. So, in fact, agility is very hard, and that’s making agility a very important differentiator between businesses.

We believe that it’s very important to harness technology that we have today in order to create a more agile, a more cost-effective, a higher quality of service environment. Agility requires us to take technology and apply it to itself.

So, how do we get there, and what is agility all about? What do we really mean by agility? Well, first, agility we believe is the ability to sense a change, something new in the environment, and to do something about it, but to do something about it not just with speed, but also efficiency. So, in fact, agility, in our view, is a balance between speed and operational efficiency. It’s the appropriate balance between those two things.

There’s a very important aspect here, and that is that agility is not something you do in one element, that you do in IT, for example, or you do in business process or you do in development; it’s system-wide. Any single element that is not agile will bring down agility and make the system more complex and less effective. So, system-wide is very, very important.

In fact, if you look at how business operates today with infrastructure, with development, with IT in general, the way things typically operate is the business says, hey, I need something to do something for the business, I need an application. They throw that requirement over the wall to the developers. The developers build the application or they acquire the application. They throw that over the wall to the infrastructure and operations people, and they do the best they can to manage it, to meet the expectations.

But then things change. The requirements change: scalability, response time. Business requirements change. The changes are hard to handle.

So, actually what you need to make this work, and what agility really requires is breaking down these walls. It’s not just that business processes need to be agile and infrastructure, applications need to be agile; it’s the interfaces that need to be agile between all of these things.

So, there are technologies like service-oriented architecture and virtualization and automation and operationally aware applications. These are all very different trends, but we think they all fit together to make the system agile. We think they’re all important.

And just to be clear, we don’t think it’s all done by technology. We think technology is important, we believe culture is important. We also believe that process is important. But we believe technology is one important element in this equation.

So, how do we measure this? If we’re trying to improve ourselves, what are we really looking at, and what do we measure?

Well, first there’s a damning fact, and that is that 70 percent of IT budgets are spent maintaining what we have, just treading water. And as I mentioned, that’s not true with everybody. And, in fact, we’ve seen companies that have been able to turn that equation around to 50/50, 50 percent maintaining what they have, 50 percent investing in new things, or even 30/70. They’ve been successful in doing that.

And the way they’ve done that is they’ve focused on three different things. The first is they’ve focused on cost. But we believe it’s not just about bringing costs down, it’s actually about changing the equation. We believe that the economics have to change completely from doing more and more with less to a model where you pay for what the business needs. We need a variable cost model, so over time we need to see the economics change to the business paying for every transaction, more of a utility kind of model.

But that’s only one element. Another element is quality of service. Quality of service is clearly important, but every application doesn’t need five 9s or six 9s or sub-second response time. The key here is matching what IT can do with what the business needs. The business should say I need this kind of response time, and IT should be able to respond with that. So, it’s really getting that service level relationship to be much more programmatic and efficient than something thrown over the wall, or a piece of paper that somehow is converted to how many 9s for a server.

The popular 1980s book “In Search of Excellence” postulated that it was all about quality of service, that was the silver bullet, that if you focused on quality of service you would differentiate and you would be successful.

Now, the problem with that theory is that most of the companies that were raised as examples in that book, companies that were high quality and were successful have since failed in one way or another.

Why is that? It’s because they weren’t agile, they weren’t able to change, they weren’t able to adjust quickly. It’s not good enough just to be low cost, it’s not good enough just to have good quality of service; you also have to be able to adjust for change. And if you can’t adjust for change, what does that mean? Well, that means you’re not adding value for the company. That means that you are a cost center, and that means that your company is probably looking at alternatives such as outsourcing. Because by the way, how many times do you outsource in order to improve your agility? Not very often. That’s not what outsourcers are known for. By the way, in the future they’re going to have to be, but they certainly aren’t today.

So, the fundamental message here is there’s a pendulum that swings. We believe that this pendulum over time goes from cost is important to quality is important to, no, it’s all about agility, and we really think it’s all three. Right now we believe that there’s a little more momentum, there’s a little more need, a little more demand in agility, but we think it’s important for you, if you’re improving IT to focus on all three and build metrics on economics, build metrics on quality of service and build metrics on agility.

And by the way, those metrics aren’t that easy. We do a pretty good job of measuring cost. We understand what we’re spending. We don’t do a very good job of measuring quality of service because we don’t really have a good mapping to what the business is doing, and we don’t necessarily have SLAs in place that make sense to the business. But we do something there. We do almost nothing on agility. It’s hard to measure.

So, I often get the question, how do we measure agility? And the answer is ask your customer, ask your business what agility means to them, and take that and translate it into things that you do in IT. If deploying a service quickly in this amount of time is important to them, translate that into the time it takes for you to acquire and deploy the servers and integrate the software and get it up and running. And then, by the way, measure that and improve that, and then you can come back and do a full circle and show the business how you improve something that was critical to them.

So, finally, how do you get from here to there? If we buy all this, and we think that maturity, that improvement in IT is important, we believe at Gartner that a very effective method in doing that is a maturity model. We think the maturity model is a good tool. But we also have some very specific ideas about how a maturity model should work. In a survey we did at one of our conferences we asked about 800 people what they thought about what was holding them back, what’s keeping them from maturing their infrastructure and their operations and their IT: 29 percent said it was technology, 26 percent said process, 25 percent said it was culture and organization.

And this really helps to illustrate my point. The granddaddy of maturity models is CMM, the Capability Maturity Model developed in 1987. This was breakthrough work, and since then thousands of maturity models still today are based on that concept. And the basic concept is it’s all about process. If you focus on process, you will get it right.

Now, that sounds great, but the problem is we live in a high tech world. If you get process absolutely perfect, technology is going to change, and that messes up your process. In other words, process alone is not going to help us. We need more than that. At Gartner we believe that you need to have a focus on process, you need to have a focus on technology, and you need to have a focus on culture. Those are the three elements that you need to focus on in terms of maturity in order to move forward.

And then finally, we believe that you also need to look at a different way of maturing. It’s not about a long term project, we start here, we end here, and that’s where we see a return on investment. As we all know, long term projects in IT inevitably fail. We need to break them up into smaller chunks. And we’ve seen our clients do that successfully. You break it into stages, smaller pieces. Each stage needs to have its own return on investment.

But the difference here is that return on investment can’t just be about costs; again, it’s got to be about costs, quality of service, and agility. And what we’ve found is that the early stages are all about cost recovery and getting the cost to a variable model and getting costs in line. The middle stages tend to be about quality and getting the quality of service aligned with the business requirements. The later stages tend to be more about agility.

And what’s interesting about that is you squeeze the cost out in the early stages, and it’s really a cost recovery kind of mechanism. The later stages require investment. But that’s okay, because what we’re talking about is taking IT from a cost center to a process center; you invest to get more agility. So, we think that’s the right direction to go.

So, bottom line we believe agility is becoming an important differentiator in the business world today. We believe that agility changes need to be system wide, and you need to look at it as a system and make sure that every element and all the interfaces are working effectively. We believe that you need to look at cost, quality of service, and agility, and build metrics for each of them. And we also believe that the maturity model is an effective approach, but don’t just focus on process. Put a focus also on technology and on culture.

Thank you very much for your time and attention. Bob? (Applause.)

BOB MUGLIA: Thanks, Tom, appreciate it.

Well, that’s a great perspective from Tom Bittman at Gartner, and I think it fits in well with the dynamic IT environment that we’re talking about today, and the role that software, as well as process and partnership, plays in returning business results.

Now, in terms of dynamic IT, I want to take a second to talk about some of the foundational components of that, and really there’s a few key areas that we are focused on, a few key technical innovations that we’re focusing on.

The first is federation, and more and more we’re living in a world where we need to work together with others from a cohesive identity perspective. We need to be able to share identities with our business partners, allowing them to work with our systems for downstream or upstream processing, and customers need to have access. And as you begin to think about outsourcing and software as a service, having other service providers, the need to be able to federate identity with multiple service providers is very important, so that’s an area of focus for us, and an evolution of Active Directory.

Security is, of course, the foundation of this, and I already talked about that in the context of Trustworthy Computing and the investments we’re making across all of our products and services to ensure that not just the products are secure, but that the operational procedures can be put in place so that your data and your environment are as secure as they possibly can be.

But interoperability remains a major concern for larger enterprises in particular, who have a wide variety of heterogeneous systems, and Microsoft is very committed to taking the platform and products that we produce, and ensuring that the data that you have can be shared with others, and that we provide services to allow an interchange of information with other vendors’ technology.

And we’ve put this in the context of really four things. We start with products and ensure that our products are built to be interoperable. The data is yours, and it can be pulled out and transformed into whatever form you want. Microsoft products may store your data, but you own the data. So, it’s very key that our products are fundamentally built to be interoperable from the beginning.

The second piece is community, working together with many, many others in different kinds of communities, including open-source communities, to ensure that the best thinking goes into building interoperable solutions. And over the last year, really since the last Tech•Ed, you’ve seen Microsoft take a number of steps that really demonstrate a change of approach for us. You’ve seen us work with XenSource and Novell, you’ve seen us work with JBoss; we’ve had a number of different kinds of partnerships with different companies in the industry that you might not have expected Microsoft to be involved in.

And today one of the great things I get to announce is that another great partner is joining the fray. Xandros has also taken a license to our intellectual property, so they’ll be including all of the Microsoft patents and all of the underlying intellectual property that’s required to run their software as a part of their distribution.

So, Xandros is the second distro after Novell that is taking the position of helping customers to ensure that when they use open source software, that all of the intellectual property that the industry has created is included with that.

That’s the approach that Microsoft is taking: to work with industry partners to ensure that when customers want to use open source software, they can do so with the knowledge and security that that software brings with it all the license rights that are required.

So, welcome Xandros to the community, and we’re glad to partner with them.

Standards are key. We work with standards bodies across the industry. We build new technology, we work with open standards. Whether this is the Open XML document formats, whether this is the underlying WS-* Web Services formats, whether it’s work with HTML and HTTP or underlying networking standards, Microsoft is committed to working across the industry and helping to drive industry standards.

A really great example of that from this last year is SML, the Service Modeling Language. We’d been working on an underlying model-based schema format, and we’ve now worked together with the industry to provide an industry-standardized approach to that, to enable modeling to work across a heterogeneous world, so that’s a key commitment.

And the final one is access, focusing on making sure that when Microsoft builds products and builds technologies, they can be easily licensed and available to others. In many cases these are through royalty free licenses like we’ve done with the open specification promise, and in any case we’re making sure that everything we do, the protocols we build, the formats we build, all of those can be ultimately licensed. And the purpose of this is to ensure that the systems that you build are as interoperable as possible when they’re created on Microsoft software.

So, what I want to do now is — (Horn honks.) Oh, there’s no vision there.

CHRISTOPHER LLOYD: Sorry. Sorry, Bob. I was checking my e-mail and clicked that thing back.

BOB MUGLIA: That’s OK, Doc. That’s OK, Doc. Hey, I really appreciate it.

CHRISTOPHER LLOYD: Oh, no, no, that’s all right. And now back to your regularly scheduled keynote. Goodbye, everybody.

BOB MUGLIA: Goodbye. (Applause.)

CHRISTOPHER LLOYD: Have a great Tech•Ed future!

BOB MUGLIA: Thanks, Doc.

Ladies and gentlemen, Christopher Lloyd. (Applause.) I’m very privileged, and it’s great to have Chris join us today. He’s a great guy, and it’s great to have him.

OK, so back to Tech•Ed.

In terms of unified and virtualized, one of the most important investments that we’re making over the next two years is taking and building upon the integrated platform that Microsoft has created, and enabling it to be virtualized in a number of ways.

Now, we really do see a future, say, five years or so out where, in particular, the vast majority of server services will be run in some form of a virtualized environment, and we’re making a number of key investments at every level of the stack to drive that forward.

Now, when we think of virtualized, clearly we think about hardware virtualization as a key pillar of that, but we’re also looking at it in a more cohesive way with things like application services and our software technology to enable applications to stream down very easily and efficiently, as well as presentation services.

Now, one of the most important contexts against this is that from a management perspective we think that virtualized and physical are cohesive, and need to be managed with a common set of tools. So, our focus with System Center is providing an environment that understands both the physical and virtual worlds and provides a set of tools that manages them in a cohesive way.

And with that, what I’d like to do is invite Jeff Woolsey out to give us a demo of a little bit of Windows Server 2008, but also System Center and some of our new System Center products. Jeff.

JEFF WOOLSEY: Good morning, Bob.

It’s a pleasure to be here to demonstrate Windows Server 2008, and the Microsoft System Center family of products to show how these great technologies complement each other so well.

First, I’m going to get started with Windows Server 2008 and its cool new technology, Server Core. Now, for those of you who haven’t heard, Server Core is a new minimal installation option tailored to provide only what a role requires. For example, if I’m running a file server on a Server Core installation, only the components required for a file server are even installed. For virtualization, Server Core is especially important because Server Core requires fewer system resources, which allows us to increase the number of running virtual machines.

For those of you who have never seen the Server Core UI in all of its glory, let me show you. Pretty exciting, huh?

BOB MUGLIA: Yeah, good old command lines. What we’ve done with Server Core, though, is build a really minimal installation of Windows Server that allows you to run roles such as Active Directory, file serving, DHCP, and DNS, and I’m glad to announce today that we’ve added IIS to the list of roles that can be run in Server Core. So, for Web servers, that’s a great addition. (Applause.)

Now, one of the other things is that in general Windows Server 2008 is really focused on allowing whatever role you install to be installed in as secure and as minimal a way as possible, whether you take advantage of Server Core or not.

JEFF WOOLSEY: Right. And one other important thing to note about Server Core is that since it presents only a command line interface for local administration, it places additional emphasis on the need for tight integration with systems management, which I’ll get to in just a moment.

Next I actually want to bring up the virtualization MMC interface. Now, once you start creating a number of virtual machines it may look something like this. Here I’m running Windows Server 2003, 32-bit, 64-bit, Novell SUSE Linux Enterprise Server 10, Windows Server 2008 as a Server Core installation, and Windows Server 2008 as a full installation, all on the same server.

As you can see, this is a 64-bit virtual machine with 6 gigabytes of memory. In addition, I’m going to bring up task manager to show that this is a quad-core virtual machine. With quad-core support, Windows Server virtualization scales to run the vast majority of enterprise class workloads.

Now, up to now all of the virtualization management I’ve shown has been managing a single virtualization server. However, over time as you create dozens, hundreds, even thousands of virtualized workloads, you’ll quickly run into problems created by virtualization, such as virtual machine sprawl.

And this takes us to Microsoft System Center, and the latest addition to the System Center family, Virtual Machine Manager. Now, Virtual Machine Manager is a centralized console for managing both our existing virtual server and our new Windows Server virtualization.

One very common question we receive is, can I convert physical machines, or VMware virtual machines, to Windows virtualization? Yes. In fact, it’s so quick, let me show you how to do that right now.

Here I have a VMware virtual machine in the library, and I want to convert it to Windows virtualization. I simply right-click and select Convert Virtual Machine to bring up the conversion wizard. Now I select the resource that I want to convert, which is that VMware virtual machine. I select next, and now I have to give it a name. I’m just going to call this Converted VM, click next, and here I’m presented with the Intelligent Placement Wizard. Once I actually convert this VM, I don’t want to just deploy it on any random old virtualization server, I want to deploy it on the best server, the one that has enough memory and enough CPU resources. So, the Intelligent Placement Wizard gives me the best solution and puts it at the top by default. I’m going to go ahead and stick with that answer and move right along.

I’m going to click through the next couple of advanced options so I can get to the last screen, which is the summary screen. And one really important thing to note about Virtual Machine Manager is that it’s built on a layer of PowerShell objects. Everything that you do in Virtual Machine Manager is easily scriptable through PowerShell, like this conversion wizard. For example, if I want to see the PowerShell that was generated to convert this VMware virtual machine to Windows virtualization, I simply click on this script, and there are the PowerShell cmdlets. If I want to drop this, say, into Notepad and paste it, guess what, folks, I’ve got the whole cmdlet right there to do a conversion. I could easily create a loop, drop in a couple of variables, point this at my SAN storage, and easily and quickly convert all my VMware virtual machines to Windows Server virtualization in no time.
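As a rough illustration of the kind of loop described there, a minimal PowerShell sketch might look like the following; the cmdlet and parameter names are assumptions modeled on the Virtual Machine Manager snap-in and the wizard-generated script, not the exact commands from the demo.

```powershell
# Illustrative only: the conversion call below stands in for the line the wizard's
# generated script produces; parameter names are assumptions, not exact VMM syntax.
Add-PSSnapin Microsoft.SystemCenter.VirtualMachineManager -ErrorAction SilentlyContinue

$vmmServer  = Get-VMMServer -ComputerName "vmm01.contoso.com"   # hypothetical VMM server
$targetHost = Get-VMHost    -ComputerName "host2.contoso.com"   # hypothetical target host

# Hypothetical SAN-backed share holding the VMware .vmx files to convert
$sourceVMs = Get-ChildItem "\\san01\vmware-library" -Filter *.vmx -Recurse

foreach ($vmx in $sourceVMs) {
    # Replace this line with the wizard-generated conversion cmdlet for your environment.
    New-V2V -VMMServer $vmmServer -VMHost $targetHost `
            -VmxPath $vmx.FullName -Name ("Converted-" + $vmx.BaseName)
}
```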

Now, I’m going to go ahead and finish up this conversion by clicking on Create, and you can see that the status window, the job window comes up, and you can see the conversion is actually starting, and there’s the process below. And in about 60 seconds this VMware virtual machine will have been converted to Windows virtualization.

Now, while that’s finishing up, let me show you another really cool feature. By far the most commonly asked question is, can I quickly move a virtual machine from one server to another, and the answer is absolutely, positively, unequivocally yes, with Quick Migration. Let me show you.

Here I’ve got a virtual machine, Windows Server 2003 VM, and you can see it’s running on Host 1. I’d like to move it to Host 2 because I need to service Host 1 right now, and add some more memory. I simply right-click and select Move.

Now, what’s happening with Quick Migration is Quick Migration is saving the virtual machine, moving the connectivity of storage to the new host, and restoring the virtual machine. And like that, we’ve moved a virtual machine from one server to another.

Now, you probably think this is actually some super fast, ultra-pricey Fibre Channel array that I used to move a VM that fast. Actually, no. This is all happening through affordable, fast gigabit iSCSI here in my HP rack.

Furthermore, this is not only fast but it’s affordable. Quick Migration is not some multi-thousand dollar add-on. This is built right in, folks. In fact, if you’d like to be able to do this today with Virtual Server, come by the Microsoft booth at the product pavilion, and we’ll tell you exactly how you can do this today with Virtual Server — oh, by the way, for free.
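For reference, a hedged sketch of driving that same move from the VMM PowerShell layer, assuming the snap-in exposes the move as a cmdlet, could look roughly like this; the VM and host names are hypothetical, and the authoritative version is the script the console shows for the move job.

```powershell
# Minimal sketch: find the VM, pick the destination host, and request the move.
# Cmdlet and parameter names are assumptions based on the VMM snap-in.
$vm   = Get-VM | Where-Object { $_.Name -eq "Windows Server 2003 VM" }   # hypothetical VM name
$dest = Get-VMHost -ComputerName "host2.contoso.com"                     # hypothetical host

# Quick Migration saves the VM state, remaps the (iSCSI) storage to the new host,
# and restores the VM there.
Move-VM -VM $vm -VMHost $dest
```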

So, now as Bob mentioned earlier, IIS 7 will soon be available as a role for Server Core, so as we move to the last part of my demo, I thought I’d bring up a diagram view of my Internet Web store. Now, this is System Center Operations Manager 2007, which is monitoring the health and performance of my Internet Web store. This Web store is actually being powered by IIS 7, running in virtual machines, configured with network load balancing to balance the load across my Web servers.

And you can see I’ve got a number of alerts, so I’m going to drill into the dashboard view so I can take a closer look.

So, again here are my hosts above here, and I’ve got two virtual machines that are my Web servers. And I can immediately see from my performance view this blue line is my CPU utilization for my entire Web farm. I can see it’s running really hot. But more importantly I can see that the number of orders that I can process per minute seems to be limited to about 40, and I’d like to be able to move that up, I’d like to be able to add capacity.

No problem. We’ve created a custom action to add a new instance, and what’s happening here is Operations Manager is calling Virtual Machine Manager, and Virtual Machine Manager is deploying a preconfigured virtual machine and adding it to the Web farm. It’s also taking advantage of the new shared configuration file feature in IIS 7, where all of my Web servers are actually sharing the same Web configuration file. You can see I’ve added a third virtual machine to my Web farm, which has brought all of my alerts back to green now. And you’ll see my CPU utilization drop, and my orders processed go up.
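As a sketch of the sort of automation such a custom action could call under the covers, assuming VMM’s template and placement cmdlets, it might look roughly like this; the template, VM, and host names are hypothetical and the parameters are assumptions.

```powershell
# Rough sketch: deploy a preconfigured VM from a VMM template onto the best-rated host.
# Cmdlet parameter names are assumptions for illustration, not verified syntax.
$template = Get-Template | Where-Object { $_.Name -eq "ContosoWebServer" }   # hypothetical template

# Intelligent Placement: pick the host with the best rating for this workload.
$bestHost = Get-VMHostRating -Template $template -VMName "WebVM03" |
            Sort-Object -Property Rating -Descending |
            Select-Object -First 1

# Create the new Web server instance and start it so it joins the load-balanced farm.
New-VM -Template $template -Name "WebVM03" -VMHost $bestHost.VMHost | Start-VM
```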

BOB MUGLIA: That really makes the point that I mentioned earlier about building a cohesive set of tools that lets you manage both physical and virtual environments, as well as the software within a virtual environment through the same set of management services that are all integrated together.

JEFF WOOLSEY: Absolutely, Bob.

The last thing I want to mention is that, as you’ve seen, Windows Server 2008 is awesome, and when you look at the new features, Server Core, IIS 7, PowerShell and virtualization, they all work great together. And when you couple this with the Microsoft System Center family of products and management, we provide our customers a complete solution for managing both physical and virtual workloads.

Thanks for this opportunity, Bob.

BOB MUGLIA: That’s great. Thanks, Jeff. (Applause.)

I’m excited about the virtualization solutions, as well as the management solutions we’re rolling out this year. And I want to really emphasize that Virtual Machine Manager will be shipping later this summer, and at that point you can begin to take advantage of all those features starting with Virtual Server, and prepare for Windows Server 2008 when the “Viridian” technology comes as a part of that next year. Just to be clear, Windows Server 2008 ships late this year, but the “Viridian” technology will ship about six months after that, as we said all along.

Now, in terms of process and model, we think that this is at the center of driving down IT costs, both in terms of being able to do service management as well as being able to think of an entire lifecycle of a business application. And here is one of the areas where I feel like Microsoft has the opportunity to really take a leadership role in helping IT think about business applications from the moment the business analyst defines the business need all the way through the lifecycle of the business application. And we’re looking at this cohesively, both in terms of enabling and creating tools to allow you to define business process as well as IT process, and bring those together in a cohesive way. And really there are three key roles here. There’s the business analyst that defines the business need working together with the developer to create the business application, and then the IT pro to manage and deploy and configure the application on an ongoing basis. A key part of this is the need to bring the knowledge together across all three of these, and to capture that knowledge in a way that’s understood by computer systems.

Now that understanding of business knowledge, and environmental knowledge, and infrastructure knowledge, and application knowledge, all of that today exists in a fairly ad hoc way within IT systems. And our focus is really to build a cohesive solution that allows that knowledge to be captured in models that can begin with the creation of a pictorial view of the application by a business analyst. It’s developed by a developer, and then maintained and operated by a set of IT professionals.

We’ve been working for a number of years on defining a standards-based way of defining models, a meta-model, so to speak, so that models can be created, and that’s what SML, the Service Modeling Language, is all about. We’ve been really excited by the industry-wide support we’ve seen around SML, and that’s kind of the beginning of the process. It’s taken a while to get to the point where we have industry agreement on a standard modeling approach in SML, but it enables the real fun to begin now: the creation of domain-specific models that really define what an application is, or a category of applications. So we’re now working together with partners in the industry, partners like EMC and Cisco, and many, many others, to create models for applications, for hosts, for networks, for storage. And these will be standardized models that can be shared and used across the industry to define the way that the underlying physical and virtualized environments are created.

And from that, the exciting thing that we see is the ability for us, together with the industry, to build best practice models. I talked about the infrastructure optimization models earlier, and the four steps to dynamic. Today, you find out about that by reading Web documents on our Web site. In the future, that information will be captured in models, and be built into best practices to allow you to deploy applications based on a deep understanding of what works, and that’s all been created by the vendors and the people that have actually deployed these applications in the past.

And, of course, this system is flexible enough to allow you to customize these models to meet your business needs. So we see modeling as a real core to a way of driving down IT costs, and making it more effective for you to deploy and build applications. And I’m really glad to say that we’re seeing this, the fruit of that labor appear in products today.

With that, what I would like to do is invite Barry Shilmover up to give us a demonstration of some of the management tools that we’re building that are very model driven.

Barry, good morning.

BARRY SHILMOVER: Good morning. Thanks, Bob. Pleasure to be here.

I want to just take a few minutes today to show you a couple of the model-based applications that we have today. What you’re looking at on the screen is System Center Operations Manager 2007, and I’m taking on the role of an administrator at Contoso Financial. What I’ve done is, I’ve modeled my end-to-end applications, and taken into account all of the different components.

Now, I’ve done this using management packs from Microsoft, from third party vendors, as well as one written within the console for Contoso. You can see I have F5 load balancers, network devices from EMC, Exchange, SQL, and IIS servers. I also have a perspective, and what the perspective does is it gives me the customer experience; it lets me know that the customers can actually log into my applications and perform the tasks that they expect. And, finally, I actually have the application down here, and we’ll focus on this one and give you a better view of it, the expense application.

I’m going to switch for a second into the customer experience, and show you what that would look like. So this here is my expense application, many of you probably have similar ones in your organization. This is actually modeled around the Microsoft one. I’ve got my expenses up top here that I get from my credit card company. I’ll give this a purpose, and I’ll add a new item, come in and give it a date. And today I’ll just do a mileage for the ride to the hotel, and give it a little bit of mileage, and I’ll save it.

Now I’m ready to submit this. Clicking on submit should do that for me, and I see that I’ve got a bit of a problem. Now, as an end user, I’m going to look at this, I’m probably going to go back and resubmit, and it’s going to fail again. I’m going to pick up the phone and call a help desk, or I might just walk away.

So let’s take a look at what the help desk would see. In a couple of seconds here what you’ll see is this diagram will change as Operations Manager watches the different components, and it’s going to go critical for me. What we’ll see is, we’ll see the perspective go critical, and we’ll see the application go critical. Now I have an opportunity to show you another great new feature of Operations Manager. The problem is several levels down, and I’m not quite sure what’s causing it. We have this new feature in here called Problem Path. If I click on Problem Path, what Operations Manager does is, it takes all of the healthy components and drops them to the background, and brings the critical ones up front. As I zoom in and pan, I can see that the problem is actually in the mileage server itself. From here, I’m just going to pivot to the alert view. This will give me a view with the specific alerts that have caused this critical problem, and I’ll look at the knowledge.

Now, I may not know what’s going on, but the knowledge, because it’s all model-based, and we have the health model in there, I know what the problem is. I can see that this is a known issue. I can also see that it’s currently being worked on, and there will be a new service released in the future. Right now, I can just restart the service.

When I run this restart task, what happens is Operations Manager actually goes out to that system, connects to the agent, talks down to the service, restarts it, and makes sure that everything is working properly. What will happen in a second is I get a response here, and if I go back to my Web page and resubmit, I should be successful now. I can see I was successful. Let me go back to the task, and there are my results.

What’s going to happen in a few seconds here is the alert will get auto-resolved, and I’ll go back to a healthy state. While we’re waiting for that, let me show you how we built some of these models. The tools are built into the console. This is the distributed application designer, and this here is the application that I just showed you. There are my four components, including the perspective and the Web application itself. One thing I’m not looking at is the physical machines here on stage that are running this demo, so I’m just going to add that quickly. I’ll add a component to this, physical servers, and right down here is a list of all the objects that Operations Manager has discovered through those model-based management packs. I choose device, then computer, and I’ll choose the Windows computer. Operations Manager will give me a list filtered down to only those objects. I choose the expense server, I simply drag it into that container, and now I just save it.

BOB MUGLIA: So what we’ve done by enabling these model-driven capabilities in Operations Manager is really allow you to focus on service-based modeling, in other words, to be able to define the services as they’re appropriate for your given environment. And these capabilities are built into Operations Manager, and they let you model both the virtual and the physical layout across your entire organization.

BARRY SHILMOVER: Exactly, Bob. So what this application designer actually did for me is, it just recalculated the models for me, recreated all the relationship models, as well as the health model. If I look at the alert now, I see that the alert has been resolved. If I look at diagram view, not only has it gone back to healthy, I now see that there is this fifth container that actually has that server. In a few seconds, that health will roll up, and if anything happens to that server, I’ll be made aware of it.

I want to switch into another application, SQL Server 2008. What we’re doing with SQL is we’re actually building some of the management models right into the product, and the way that we’re doing that is using policies. We have this new list down here for policies, and in here I create policies that define how my databases need to be configured. And in this instance, we have a new DBA who’s gone in and changed the auto-close option to true, and we have a policy against that. So I can see by the icon to the left of the expenses database that we’re now out of compliance. If I actually look at that policy, I can see that, sure enough, one of those databases, the expenses database, is now out of compliance.

Now, it’s great that it’s in this tool, but I just showed you another tool where operators are going to see it, and I’m not convinced that they’ll have access to this tool. If I look at Operations Manager, what we’ve done is, we’ve extended the management pack to pull in that information. This is a dashboard view up at the top. I’ve got the state, and I can see it’s critical. There’s the alert that’s caused it. And, again, I’ve got knowledge. The knowledge tells me what this alert is all about, what this policy affects, and the performance impacts to the database, and I can execute the task. Not only can I run the task from there, but because everything is in context, I can run it from anywhere.

What’s interesting here is, the operator that just executed this task probably doesn’t normally have access to that database. Operations Manager has that information, and it takes care of it for us under the covers. I can see that it has succeeded, and now if I switch over to SQL Server and re-run the check, I can see I’m now back in compliance, and refreshing the view will make that alert go away.
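As an illustration, re-running that kind of compliance check from the SQL Server 2008 PowerShell environment (sqlps) might look roughly like this; the policy and server names are hypothetical, and the parameters should be verified against the installed cmdlet.

```powershell
# Minimal sketch: evaluate a policy-based management policy in check mode.
# Policy name and server name below are hypothetical placeholders.
Invoke-PolicyEvaluation -Policy "Database Auto-Close Disabled" `
                        -TargetServerName "CONTOSO-SQL01" `
                        -AdHocPolicyEvaluationMode Check
```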

BOB MUGLIA: So we’re building modeling into many Microsoft server products. You can see it here in SQL Server, we have models being created as a part of role management in Windows Server 2008, and, of course, it’s all done in a cohesive way using SML as an underlying architecture so that tools like Operations Manager can monitor them.

BARRY SHILMOVER: Thanks, Bob.

BOB MUGLIA: Thanks, appreciate it. (Applause.)

So we’re very excited about the opportunity that a model-driven approach, based on business process, has to really redefine the way business applications are created. And thinking about business applications, service enablement of applications, whether they’re older applications or new applications that are being built, we think is a very, very critical focus for you to create these distributed, heterogeneous systems of the future.

Now, in terms of existing applications: many of you, in fact perhaps all of you, have many, many applications going back a long time, and they run on a wide variety of different platforms. And we know that one of the focuses that businesses have had over the last few years is taking the process and the data that’s encapsulated within those applications and enabling them with Web services, and that provides a great foundation for allowing those applications to interact in a modern world.

Now what we see in the future is that the applications being created will need to have a basis to reach out and work with consumers of many types. In many cases they’re customers, in other cases they’re information workers, and they work across a wide variety of different devices: PCs of many types and mobile devices. And in that kind of an environment, being able to use SOA technology to take the underlying capabilities that have been built inside those business applications, and create overall business applications that wrap those things together, is a great service.

Now we’ve been using .NET as a foundation of this, and we’ve invested for many years in the .NET platform as a way of very rapidly building business solutions, middle-tier applications, Internet-based applications, and we found tremendous results from .NET. The developer productivity improvement in the .NET environment is multiple times faster than using other, more traditional services. So .NET is a great enabler of this. As we move forward, we see services augmenting that environment, allowing all kinds of applications to be built, and reaching customers regardless of where they are. In many cases, services can be used as building blocks to help augment the creation of new SOA-based applications.

And with that, what I would like to do is invite Mike Woods to come up and show how a combination of business process together with service enablement can really allow you to create some new kinds of business applications.

Mike, good morning.

MIKE WOODS: Thanks, Bob.

So what I would like to do is talk about a company called Contoso. Contoso is a middle-tier supplier who chose BizTalk Server 2006 R2 and SQL Server 2008 to build an automated process for inventory procurement. The problem that Contoso wanted to solve was that when they got really large orders coming in, there were too many manual steps in the process, and it was really affecting their ability to meet their service level agreements.

So what we see here on the screen is the BizTalk Server orchestration designer inside of VisualStudio.NET. This is a model-driven tool to build business process. You drag shapes out from the toolbox onto the design surface, and then you configure them appropriately to build out your business process.

In this business process, what we do is, we receive an EDI order. Now, one of the new features that we ship in the box with BizTalk Server 2006 R2 is advanced EDI functionality. So we grab an EDI order, it feeds into our process, and then the next thing we do is we set up a call to Axapta to query our inventory. If we have enough inventory to service the order, then Axapta serves the order as it normally would. If, on the other hand, we do not have enough inventory in-house to serve the order, that’s when the magic kicks in.

What we do is, we go ahead and take that message, and send it out to something called BizTalk Services. Now, BizTalk Services is an incubation project that Microsoft is running. What it is is software in the cloud that makes it very easy for customers to communicate across organizational boundaries. So there’s a federated identity service, and today there’s a firewall-friendly messaging service. We’ll be adding more of these services over time.

What this enables Contoso to do is to multicast a message out to a set of suppliers who are listening on the other end of BizTalk services. So we’ll send the message out. We’ll start to get responses back from suppliers who can help us meet our inventory demand in order to serve the order.

Now, the next step of the project is reporting, but before we get there, let’s go ahead and talk about how you service enable a process. That’s an important thing to do. What BizTalk Server 2006 R2 provides is a wizard to help us do just that. If I click on the Windows Communication Foundation Service Publishing Wizard, I’ll run through a series of questions, and what we’ll do is, we’ll take this particular process and service enable it. There are about five, maybe six screens we need to go through, and then we can go ahead and create that process.

BizTalk Server is inserting this new Web service, this WCF Web service, into IIS, and that’s what it’s just done. Now that we understand the process and how to service enable it, let’s switch gears and look at the reporting side of the application.

Here I’ve got VisualStudio.NET, and what I’m doing is, I’m creating a report based on SQL Server 2008 Reporting Services. This is a pretty vanilla report, not too exciting. But what we’re going to do is, we’re going to add some pop to this report. Recently Microsoft acquired some technology from a company called Dundas, which delivered some data visualization controls for Reporting Services for us. So what I’m going to do is move this control down to the bottom of the screen here. I’m going to add a new map control, let’s go ahead and move that over and resize it. Let’s add a template so that it looks a little bit better. That’s starting to look good. And let’s bind some data to this map control. We’ll take a city name, we’ll take the number of items from the quote that our partners are going to be able to send to us, and then we’ll go ahead and group that by city name.

So what we’re going to do is, we’re going to see the city, and the amount of items of inventory that our trading partners can go ahead and send us so that we can service our order. The final thing I need to do here, though, is set a property on my map control. What I need to do is bind my data to the city symbols instead of the shape itself. Now we should be ready to go. I’m going to go ahead and deploy my map control, click deploy, succeeded.

Now let’s go ahead and actually look at the application running. What we’re going to see, once I get this up on the screen, is a map. We’ve got part of our order sitting in our own inventory warehouse in Olympia, and now our partners are going to start sending in quotes for inventory that they can supply. Carson City has some inventory for us, we can see. These messages are coming across the cloud, and it looks like Raleigh, North Carolina, has also got some inventory for us.

Now, the other thing that Contoso did is they used another new feature of BizTalk Server 2006 R2 called RFID Services, for radio frequency identification. They’ve taken their warehouse and RFID-enabled it, so that they can automatically read inventory as it comes in. As a matter of fact, look at this: John Flanders from John Flanders Delivery Service is bringing a delivery to our dock right now. I’m going to take this GPS-enabled RFID reader, swipe it across the box, and what we should see is, there we go, inventory popping up right here in Orlando.
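The idea behind RFID Services is that tag reads surface as ordinary .NET events that custom code can handle. The demo doesn’t show that code, so the sketch below is purely conceptual; every type and method name is hypothetical rather than the BizTalk RFID Services API.

```csharp
using System;

// Purely illustrative: these types stand in for whatever the RFID event
// pipeline delivers to a custom handler; they are not the BizTalk RFID
// Services API.
public class TagReadEventArgs : EventArgs
{
    public string TagId { get; set; }
    public DateTime ReadTime { get; set; }
}

public class DockDoorListener
{
    // Called whenever the reader at the dock door sees a tag: look up the SKU
    // behind the tag and post an inventory receipt.
    public void OnTagRead(object sender, TagReadEventArgs e)
    {
        string sku = LookupSku(e.TagId);
        PostInventoryReceipt(sku, e.ReadTime);
        Console.WriteLine("Received {0} at {1}", sku, e.ReadTime);
    }

    private string LookupSku(string tagId)
    {
        return "SKU-" + tagId;   // hypothetical tag-to-SKU lookup
    }

    private void PostInventoryReceipt(string sku, DateTime when)
    {
        // hypothetical inventory update, e.g., a call into the line-of-business system
    }
}
```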

So what we’ve done is we’ve used SQL Server 2008 and BizTalk Server 2006 R2 to build out some pretty advanced capabilities. BizTalk was able to take legacy technologies like EDI and current technologies like Reporting Services from SQL Server, and we were even able to integrate the physical world with BizTalk Server RFID Services.

BOB MUGLIA: Great. Thanks, Mike.

MIKE WOODS: Thanks, Bob. (Applause.)

BOB MUGLIA: So although we just shipped BizTalk Server 2006 last year, we have an update, an R2 release, that’s really focused on service-enabling it and connecting it to things like the Windows Communication Foundation, as well as enabling new features like RFID. And you saw some great new things coming in SQL Server 2008 Reporting Services, with a great new set of controls that allow for better visualization. So lots of great things to help you build business applications.

Now, in talking about business applications, I mentioned a second ago that one of the important things is being able to reach out and work with users and consumers, as well as end users within your organization, and being able to work across many different kinds of apps: mobile applications, applications that have to work across any browser on any machine, as well as internal applications. We’ve been putting a lot of focus on enabling you to build very rich client experiences that work across this cohesive set of different kinds of devices that are used by so many different people. And one of the things here is, when you think about building an application, there are different kinds of applications you want to build.

In most cases the underlying data and the business objects that you’re creating will be the same, but in some cases you might want to create a mobile application, and in other cases you want to create a browser-based application for consumers that can be used by anyone. Some applications are internal and will be built within your business, and for others it’s really beneficial to integrate into your information worker and Office environment. And we’ve been focusing on building a cohesive set of tools to make those much, much easier to produce across the whole gamut.

With that, what I’d like to do is have Brian Goldfarb come up and give us a taste of some of the things that you can do with the combination of Visual Studio and Office 2007 to build really integrated Office business applications.

Brian, good morning.

BRIAN GOLDFARB: Thanks, Bob.

Good morning, everybody.

BOB MUGLIA: Good morning.

BRIAN GOLDFARB: So Office business applications are great for providing data to your users in a contextualized way. In this demo we’re going to create an Office business application using Visual Studio 2008. Now, how many people here spend way too much time in Outlook? Do you? No one? There you go. Well, one of the great benefits of Visual Studio 2008 is it includes Visual Studio Tools for Office 3.0, which makes it easier than ever before to customize and extend Word, Excel and Outlook.

Let’s go ahead here and dive in. I’m going to build this application for the Adventure Works Flight Company. And the first thing I want to do is add a new item. And the key to all of this is the Outlook Form Region. So I want to build a sales report. We’ll call this Sales Report.CX. And I get a few options here. Now, I can either create a separate form that’s all by itself, stand-alone, or an adjoining form that extends an existing form. So let’s start by extending the mail pane. We’ll click next, we’ll see I have Mail Message checked, and go ahead and hit finish. It brings up a design surface here that should look very familiar; it’s similar to the Windows Forms Designer that we’re all used to, but now I’m customizing Outlook.

I’ll go ahead and drag this over to make it a little bit bigger. Now, I’m no designer. But we have a great design staff on hand that’s able to leverage the collaboration between Visual Studio and Microsoft Expression, and they went ahead and built a WPF control that I can use here that shows me customer sales information when I get an e-mail from one of my sales people. I’ll drag and drop that into the region, go ahead and dock that, and the next step is to actually write a little bit of code. I jump into the view here, and what I want to do during the initialization sequence is just make sure that I only display this if it’s from a sales person. One line of code to make that check; everything else happens for me.
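A rough sketch of what that one-line check could look like in the form region class; the event and property names follow the usual VSTO pattern but should be treated as assumptions, and the IsSalesPerson helper is invented for illustration.

```csharp
using Outlook = Microsoft.Office.Interop.Outlook;

// Assumed layout: the generated file may place this handler in a nested
// factory class, and the exact signature may differ.
partial class SalesReport
{
    // Cancel the region before it shows unless the mail comes from a sales person.
    private void SalesReportFactory_FormRegionInitializing(
        object sender,
        Microsoft.Office.Tools.Outlook.FormRegionInitializingEventArgs e)
    {
        var mail = e.OutlookItem as Outlook.MailItem;
        if (mail == null || !IsSalesPerson(mail.SenderEmailAddress))
            e.Cancel = true;   // suppress the custom pane for non-sales mail
    }

    // Hypothetical helper; the demo's actual check isn't shown.
    private bool IsSalesPerson(string address)
    {
        return address != null && address.EndsWith("@adventure-works.com");
    }
}
```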

Let’s go ahead and run the application. Visual Studio will automatically fire up Outlook. I can jump into my inbox, and when I get a regular mail nothing special happens, but if I click here on Eric Carter, one of my sales people, I get that great WPF visualization, and data pulled from SQL Server 2008. It’s very, very easy to do, one line of code there. (Applause.)

So we can see Eric had a little trouble in the last quarter of the year. So I want to go ahead and drill in and do an individual sales report just for him. So let’s jump back into Visual Studio, we’ll add another Outlook form here: new item, Outlook Form Region. This time, though, we’ll call it an Individual Sales Report, and this time we’ll do a separate form. Click next, and now instead of extending the mail message I want to go ahead and extend the contact form. We’ll click finish, get back to the same design surface, I’ll drag this out, and again, leveraging my great design team here, I’m going to grab the Sales Scorecard control. This is a grid view; it’s going to go into SQL Server and send me the data I need to populate it. It’s also going to use Excel Services on the server to create graphs and charts that I can use in real time to visualize my data.

So the next step here is just to dock this, so it takes up the full region. Then, of course, we’ve got to go write a little bit of code. Now, this is Visual Studio 2008 beta 1, and there is a small bug, believe it or not, that is fixed in the next version. I’m going to change "note" here, with full IntelliSense, to "contact", and then I’m going to write one line of code here to initialize my control. It just does some initialization work.
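The data side of that initialization is plain ADO.NET. A hedged sketch, with the connection string, table, and column names as placeholders rather than the demo’s actual schema:

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;

public static class SalesData
{
    // Pulls one sales person's quarterly totals from SQL Server. Everything
    // here (connection string, table, columns) is a placeholder.
    public static IList<KeyValuePair<string, decimal>> QuarterlyTotals(string salesPersonEmail)
    {
        var totals = new List<KeyValuePair<string, decimal>>();

        using (var conn = new SqlConnection(
            "Data Source=.;Initial Catalog=AdventureWorks;Integrated Security=True"))
        using (var cmd = new SqlCommand(
            "SELECT Quarter, SUM(Amount) AS Total FROM Sales " +
            "WHERE SellerEmail = @email GROUP BY Quarter", conn))
        {
            cmd.Parameters.AddWithValue("@email", salesPersonEmail);
            conn.Open();

            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                    totals.Add(new KeyValuePair<string, decimal>(
                        reader.GetString(0), reader.GetDecimal(1)));
            }
        }

        return totals;
    }
}
```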

Now, one more step. One of the great features of Microsoft Office 2007 is the Ribbon. With Visual Studio Tools for Office I can now extend the Ribbon using a great visual designer that makes it super easy. We’ll go ahead and add a new item; this time we’re going to choose the Ribbon Designer. We’ll call this the Contact Ribbon, and what we’re going to see here is a great design surface just for the Ribbon. I’m going to click on it, go here to the ID, tell it I want to extend the contact ribbon, and then just by setting some properties we can control the labels here. So we’ll change the group to say Organizational Data, and go ahead and change the tab to say Sales Reports.

Now, I’m just going to drag and drop a button control right onto the Ribbon and customize that a little bit. First I want to set Show Image to true, because I want to change the default image. Instead of saying button one, we’ll call it View Adventure Works Data. I want to take advantage of a resource that already exists in my application, so we’ll jump in here and point at the Adventure Works logo. Last but not least, we’ll go ahead and change this to be large. Double-click on the button, just like you would in any other application, to get to the code behind and generate the event. And I’m going to write one line of code here to go ahead and launch SharePoint, which is going to give me a view of my organization’s data. Not only can I look at my individual, but I can get data across the entire organization.
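That one line of code is essentially a shell-out to the browser. A sketch of what the generated Ribbon click handler might contain; the handler name follows the Ribbon designer’s usual pattern, and the site URL is a placeholder.

```csharp
using System.Diagnostics;
using Microsoft.Office.Tools.Ribbon;

public partial class ContactRibbon
{
    // The one line behind the View Adventure Works Data button: open the
    // organization's SharePoint site in the default browser.
    // The handler name and the URL are placeholders, not the demo's values.
    private void viewDataButton_Click(object sender, RibbonControlEventArgs e)
    {
        Process.Start("http://moss/sites/adventureworks");
    }
}
```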

Hit Control-F5, run the application. Again, it fires up Outlook automatically, and I’m going to go into the contacts region. We’ll go ahead and choose C to make it simpler to find Eric. Double-click, and my individual sales report is integrated right into the Ribbon. Click on that, it fires up my custom pane, gets my data from SQL Server and my reports from Excel Services. I can click through these to analyze data about my company. I can easily change to the customized sales reports tab and fire up the SharePoint site right from there. And the beauty here is, now I can see the individual, I can leverage SharePoint, and I can customize Outlook, all without very much code, using Visual Studio 2008 and the skills that I have today.

Thanks, Bob.

BOB MUGLIA: Thanks, Brian. (Applause.)

I think that’s a great demonstration in terms of the integration across Office and the ease and simplicity of using Visual Studio Tools for Office and .NET, to extend the Office applications, allowing information workers to work in the environment that they’re comfortable with, and at the same time integrate in the business data. You see the ability to connect to a SharePoint Web site, and all the very rich things that can be developed as a company portal that allows you to take business applications and make them easily available to end users. So lots of great stuff there with Visual Studio Tools for Office, and the integration across both Visual Studio and Office.

I mentioned before that sometimes you want to write applications for your company that need to work for consumers who run on a variety of different platforms. They may be running a Macintosh, they may be using Firefox, or Windows and IE, and you want to write something that’s both rich, as well as has the reach that allows you to get to everybody. So a few months ago we introduced a new technology that we think will revolutionize the way rich Internet-based applications, or browser-based applications will be created. That technology is called Silverlight. With that I’d like to introduce Jamie Cool to give us a demonstration of Silverlight.

Good morning, Jamie.

JAMIE COOL: Thanks, Bob. (Applause.)

I’m going to be showing you some examples of rich Web experiences and applications that you can build yourself using Silverlight and the skills you already have. Let’s start with media. Here we have an interactive media player showing clips of Fox’s upcoming summer movies. Silverlight makes it very easy to seamlessly integrate video into your Web page. Silverlight has support for standards-based video playback, and supports a wide range of rich video features. For example, this video is currently playing in 720p, true HD quality in your browser. I can also go right from the browser into a full-screen experience, and also add interactive content overlays directly over the video, in this case in the form of a chapter play list.
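A chapter play list like that amounts to seeking the MediaElement when an overlay item is clicked. A minimal sketch, assuming the player is a MediaElement declared in XAML and using illustrative chapter times:

```csharp
using System;
using System.Windows.Controls;

// Minimal sketch of the chapter-playlist idea: clicking a chapter entry
// jumps the MediaElement to that chapter's start time. The element name
// and chapter times are assumptions, not the demo's values.
public class ChapterPlaylist
{
    private readonly MediaElement player;
    private readonly TimeSpan[] chapterStarts =
    {
        TimeSpan.Zero,
        TimeSpan.FromSeconds(45),
        TimeSpan.FromSeconds(90)
    };

    public ChapterPlaylist(MediaElement trailerPlayer)
    {
        player = trailerPlayer;
    }

    // Called from the overlay UI when a chapter thumbnail is clicked.
    public void JumpToChapter(int index)
    {
        player.Position = chapterStarts[index];
        player.Play();
    }
}
```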

Now, beyond just having a rich feature set in the platform, we wanted to have a highly performant platform. Let me show you what I mean. Because Silverlight is a great cross-browser, cross-platform technology, I’m going to do the next demo in Firefox. Here we have two videos playing simultaneously, one overlaid on top of the other. And we can have not just two videos playing simultaneously, but, in fact, 10 videos playing simultaneously.

BOB MUGLIA: Cool.

JAMIE COOL: Definitely very cool. So this type of platform that enables this kind of performance in the browser opens up a door to all kinds of new experiences in the browser moving forward.

BOB MUGLIA: And in this case we’re showing it in Firefox. It could just as easily be done on a Macintosh as well. The goal is to enable you to build rich Internet applications and/or video-based applications that can reach out and be used by anybody.

JAMIE COOL: That’s right. Let me show you an example of how Microsoft is using Silverlight on some of our consumer-facing properties to enhance the experience. In this case we have an Xbox 360 ad, built using Silverlight. We’ve been running this ad on Microsoft properties for about four weeks now, and it allows us to provide a much more immersive, visually engaging experience for the users of our site. As you can also see from this particular ad, we now have a black version of the Xbox 360, just in time for the countdown to Halo 3.

Now, part of what it means to be a great Web platform is that you have a great, quick installation experience. Let me show you what that looks like for Silverlight. We’re going to switch machines, and we’re going to try to run a Silverlight-based chess application. In this case, though, the machine doesn’t have Silverlight installed on it yet, so the Web page detects that and provides the user with a button that they can use to get Microsoft Silverlight in order to run the application. Simply by clicking on the button, and then going through the standard browser-based security dialog that users are familiar with, Silverlight will download from Microsoft.com and install in seconds, and then the application itself will just run.

BOB MUGLIA: So we’ve made the deployment as easy as possible to ensure that solutions that you build to reach consumers will have the broadest possible reach, because it’s so easy for customers to be able to get Silverlight.

JAMIE COOL: That’s right. And in this case, this application was built with not just Silverlight, but Silverlight and .NET. So I have a chess app, and I can go ahead and make a few moves and play against the computer. We’ve actually implemented the AI for this application twice: once using C# and .NET, and once with the same implementation in JavaScript. And the way that chess AIs work is that the faster they can run, the more moves ahead into the future they can see, and the better they can do. So we can let .NET and JavaScript play each other, and because .NET is so much more performant in the browser, well ... (Applause.)
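To see why runtime speed translates into playing strength, here is a generic negamax search, not the demo’s engine: every extra ply of depth multiplies the number of positions examined, so a faster runtime simply looks further ahead in the same amount of time. The board and move types are stand-ins.

```csharp
using System.Collections.Generic;

// Generic negamax sketch, not the demo's chess engine. Board and Move are
// stand-in types; Evaluate scores a position from the side to move's view.
public readonly struct Move { /* illustrative placeholder */ }

public interface IBoard
{
    IEnumerable<Move> LegalMoves();
    IBoard Apply(Move move);
    int Evaluate();
}

public static class Search
{
    // Searches `depth` plies ahead; deeper searches see more of the future
    // but cost exponentially more positions, which is where raw speed pays off.
    public static int Negamax(IBoard board, int depth)
    {
        if (depth == 0)
            return board.Evaluate();

        int best = int.MinValue + 1;
        foreach (var move in board.LegalMoves())
        {
            int score = -Negamax(board.Apply(move), depth - 1);
            if (score > best)
                best = score;
        }
        return best;
    }
}
```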

BOB MUGLIA: .NET gives you incredible performance for Silverlight applications, but the most important thing to realize is that what we’ve done is we’ve taken the desktop CLR, the one that runs behind ASP.NET as well as all of our .NET applications, and we’ve slimmed it down a bit, so we have a reasonably complete subset of the .NET environment for you to build rich Internet applications. It’s pretty amazing, the kinds of things you can do; in fact, I think it’s a whole new step forward in terms of the kinds of applications that can be delivered to consumers within a browser.

JAMIE COOL: Yes. Let me show you a slightly more interesting application that was built using .NET and Silverlight together. This is an example of a next-generation video editing application we call Top Banana. What it does is give me a design surface that I can add videos to and work with. Let me just toss a couple of videos onto our design surface, go ahead and let them play so we can see what we’re talking about here, and then I can work with these videos. I can break a video apart into its relevant key frames. I can trim it down to a relevant subsection, how about just getting the skateboard section here. I can merge videos together by overlaying them into a common stack, and then preview what that unified video is going to look like. Now, this is obviously a very cool application. One of the coolest things about it is that the total download size of the application is less than 50K.

BOB MUGLIA: So let me emphasize that point. The Silverlight download itself, the one that includes .NET, is on the order of 4 megabytes, and so can be done very, very quickly by anybody. But once that download is on a machine, an application of this kind of richness can be downloaded in less than 50K, smaller than many of the Web pages that exist on the Internet today. The thing that’s amazing about that is not only was the download size so small, but the application itself was written in just a few weeks.

JAMIE COOL: That’s right. It was actually written using Visual Studio, and done in less than three weeks. And more impressively, it was done by folks that had never written a Silverlight application before. That’s because they were able to use skills they already had to build this application. This is the Top Banana project open in Visual Studio. It was built by developers that know .NET. It was built using C#, a language they know, Visual Studio, a tool they know, the .NET framework, an API that they know. The UI was all built using WPF and XAML inside of the Microsoft Expression Design Tool, and then integrated into Visual Studio to provide the interactivity. And all this was done in a matter of three weeks.

BOB MUGLIA: So we’re pretty excited about what Silverlight can do to the Web. We think it’s going to light up the Web in a way that it’s never seen before.

JAMIE COOL: That’s right. And one of the nice things is that everything that I’ve shown you today is available for you to try out in at least alpha or beta form, both the runtime and the tools. You can get them from Silverlight.net. You can also find more information out about Silverlight there, or you can go to any of the many sessions we have on Silverlight here at Tech•Ed.

BOB MUGLIA: Great.

JAMIE COOL: Thank you very much.

BOB MUGLIA: Thanks a lot, Jamie, we appreciate it. (Applause.)

So, lots of great new tools: for creating business process applications within your business and integrating them with services, for connecting to your end users with Office, and for reaching any consumer across the Internet with Silverlight, Visual Studio, and Expression. So there’s lots of great opportunity to build business applications that are easy to manage and easy to service.

So, in terms of looking at it, we think it’s a great year. A lot of great products have shipped over the last six months, and a lot of great products will ship. Over the last six months we’ve seen Office, as well as [Windows] Vista, come out as a base platform for you to build applications on. We have a great new set of server products coming over the next year, and lots of great System Center products: we saw Operations Manager ship, Configuration Manager is shipping, Virtual Machine Manager is coming, many new System Center products.

I’m pleased to announce today that Microsoft is acquiring a company called Engyro, which will provide Operations Manager management connectors to connect to other, heterogeneous systems. I mentioned our commitment to heterogeneity; we’re doing an acquisition to improve our management heterogeneity in terms of connecting to other management tools. So that’s a great thing, and there are lots of great things in System Center.

Forefront, our new security product: we just rolled out Forefront Client Security, as well as our Server Security product, and we’ve got our next generation of those products coming, with an integrated environment there, so lots of great things are happening in the security space. And BizTalk: I mentioned BizTalk R2 is shipping later this year, another great update to BizTalk and business process. But from a server perspective we have three major new products coming: Visual Studio 2008 for developers, SQL Server 2008, and Windows Server 2008. SQL Server is shipping next year, while Visual Studio and Windows Server are shipping late this year, so lots of great products are coming out over the next 6 to 12 months.

We think it’s very exciting, with lots of real-world stuff to do, so it makes for a busy week here at Tech•Ed to learn about these products. So where do you begin? Well, clearly this is the place. You’re in the right place, because we have hundreds of sessions to help you learn about the new products and technology, as well as to network and talk to other people, both Microsoft people and your colleagues across the industry. I mentioned the optimization models; those are a key thing to learn about, and they’re an opportunity for you to take a look at your organization and see how the technology that Microsoft is building, the software that’s out there, and the work that the industry is doing can all apply to your organization.

So this is the right place, and we’re very excited to be here at Tech•Ed. This is our favorite show of the year. It wouldn’t be that way except for you. We love learning from you. We love listening to you. We love working with you. It’s a great show. This is the 15th anniversary of Tech•Ed. I think it will be the best one ever. I thank you for being here. Have a great show. Thank you very much. (Applause.)
