Bob Muglia: Microsoft Virtualization Launch

Remarks by Bob Muglia, Senior Vice President, Server and Tools Business at the Microsoft Virtualization Launch
Sept. 8, 2008, Bellevue, Wash.

ANNOUNCER: Ladies and gentlemen, please welcome the Senior Vice President, Server and Tools Business, Bob Muglia. (Applause.)

BOB MUGLIA: Well, good afternoon. And thanks to those of you who are joining us here in the Seattle area, and those of you who are watching the video around the world. We’re here today to talk about virtualization, and as we know it’s still early days for virtualization. Only about 12 percent of servers are virtualized, and very few desktops are virtualized. We foresee a world in the future where virtualization will become ubiquitous and it will be used across all the desktops and servers that we use in business. So let’s take a few minutes to step back and think about what this world would look like.

We live in a connected world; businesses are connected to each other, to their trading partners, and of course to their customers. But in this connected world, physical location still matters. Why? Well, we can’t overcome the speed of light, so locality is important, and there are solid business reasons to locate data in different places. So data centers wind up being spread across different locations around the world. And despite that spread, we can manage this environment better. Virtualization will play a key role in this, and as servers and data centers become virtualized, businesses will gain the flexibility to run what they need wherever they need it, the flexibility to move data and workloads from one server or data center to another, the flexibility to respond to the needs of their changing business. We think about virtualization predominantly on the server today, but as we move forward virtualization isn’t just about the server; it also affects the desktop, and the many devices that each of us use every day in our jobs and in our lives.

Virtualization will change the way we work with information; the applications and data we work with will all become virtualized. They will be available wherever we are, no matter what device we’re working with. We think about the devices a lot, but it’s not really about the devices; it’s really all about people, and the way they work together. So imagine this world where virtualization becomes ubiquitous.

Five years ago we announced a vision for a dynamic world. We call it Dynamic IT. We said it was a ten-year vision; we’re five years into it now, five years of solid progress. This vision really has two parts: the first is the dynamic data center, which will transform the way businesses create and work with applications within their environment. The second is user-centric computing, which will change the way people work with their information every day. This is our vision for the future of IT, and we think it’s a pretty exciting vision.

The Industry View of Virtualization: Tom Bittman of Gartner

But let me start this afternoon by providing you with an industry view. So please welcome Gartner’s VP and Chief of Research for Infrastructure and Operations, Tom Bittman. Tom.

TOM BITTMAN: Good afternoon. Why is virtualization such a big deal? What is this all about? A lot of people understand, and they think it’s about saving money, it’s about saving power, it’s about green IT, it’s about space, and that’s certainly true. But in our perspective, it’s much, much bigger than that, and people are just now beginning to understand that. We think that there’s a major transformation taking place in IT. This major transformation that’s been taking place for years is starting to accelerate, and we believe that virtualization is a major enabler and catalyst of that transition that’s taking place.

There’s a couple of things here that I want to talk about. One is that we believe virtualization changes architectures, technologies, of course, but it also changes processes, it changes cultures, it changes behaviors, it changes the players in the market and how they compete. There are some massive changes taking place because of virtualization. And it’s not just within IT. In our perspective, because of the agility, because of the speed, the elasticity that’s being enabled by virtualization, virtualization actually can enable a significant business transformation, and we’ve seen this happen with early adopters. We’ve absolutely seen this happen.

Now, that’s great, but it’s actually bigger than that. Let me connect this now to a bigger trend, another trend that’s very, very new, very nascent, just now emerging, of cloud computing. We believe that virtualization actually unlocks cloud computing. Now, what does that really mean? What that means is these changes in processes, the changes in orientation from components to services, the changes in behavior and culture, the changes in how we pay for things, all of that is aligned with what’s happening, and what will be driven by cloud computing. These are very, very congruent trends that are occurring out there. What the cloud is delivering is essentially virtualized services.

Now, on the other hand, we don’t believe that that means we’re all moving to a big change where everything will go out to the cloud. We actually believe that what’s happening is we’re seeing a transformation occur inside of IT, in addition to outside in external service providers, and that change that’s happening inside of IT is moving them to become more, and more cloud-like. So in other words, I like to say that we’re seeing this transformation both inside and outside of IT towards cloud computing, and I’ll be talking about that over the next few minutes.

So what do we mean by this transformation? We believe that we are moving from a world where we manage components, we manage silos, we did resource management, we did capacity planning, we did performance management within silos. That’s how we did things. Sometimes we put automation tools on top of those silos, but they were still silos. And what’s happening is we’re seeing a fairly major change here, as we move toward virtualization, where we’re moving away from components and toward layers, and toward pools, and moving toward this kind of model we’re much more elastic, we can do things a lot quicker than we used to be able to do them, but we are not done yet.

By itself virtualization doesn’t finish the story. Wouldn’t it be interesting if we could take these pools of resources and tie to them the service-level requirements of end users, if we could automate the environment to really meet service-level needs? And that’s where we believe we are going. We are moving toward this vision of service-orientation of everything, and automated environments.

Now, this vision, this transformation, is something we’ve talked about at Gartner for about seven years. What’s new and different now, and what connects this all together, is that we’re starting to see external service providers take these concepts and deliver them on the Internet to end users as services. That’s what’s happening. And we believe that while internal IT is evolving, external service providers are starting to deliver these services in the form of cloud computing.

Cloud Computing and Virtualization

Now, let’s dive a little bit deeper and talk about the technologies. I do want to touch on this. Rather than go through a definition of cloud computing, there are a couple of key attributes that really tie it together with virtualization. One is service-orientation. Another is utility pricing, whether it’s subsidized or paid for based on use. Another is elasticity, perhaps massive elasticity. Finally, it’s delivered over the Internet. Of those attributes of cloud computing, the first three are identical to what’s happening inside of IT. Again, these transformations are taking place in the same direction.

Virtualization, what we’re really talking about with virtualization, is not necessarily consolidation. A lot of users think this is a new form, a new way of consolidating. It’s much bigger than that, and in fact, in many ways it’s the opposite of consolidation. What’s happening here is we’re actually creating a decoupling of things that were previously locked together. These things that were locked together in the past, we’re finding ways to separate out with abstraction layers.

Now, that decoupling, that abstraction layer becomes a services interface. Now, let me give an example. A hypervisor is a service provider providing compute as a service to virtual machines. That’s what’s happening throughout the stack, services orientation throughout. That’s what virtualization is enabling. Now, just creating that interface, and that decoupling isn’t all that’s happening. What’s also happening is we’re having the possibility here of delivering these different services in unique and different ways. We like to call these alternate delivery models.

So for example, what if I have my application, running on my operating system, running on your hardware? That’s what’s now called compute as a service, infrastructure as a service, something like that. Or I could reverse that: what if I take your application, your operating system, and run it on my hardware? That would be something like a software appliance, or maybe on-premises software as a service. The point is, virtualization is creating all these possible models; this decoupling, this abstraction, and this creation of service layers is making them possible. But it’s not just about technology.

We think that there’s quite a bit here that’s taking place in the periphery, process, people, culture, behavior, mindset. In fact, one of the biggest things that I think is changing is that we’re seeing a massive shift in mindset in how we think of IT. We used to think of it as components, we bought components, we put them together, the business would ask for a new server, they’d get a server, they’d have a poster over it, that’s their server. That’s all changing.

What’s happening is we’re moving this mindset from components to services. And that’s significant, because it’s not just about managing services internally; it’s also about consuming services, and being ready to consume services, perhaps in the cloud. This is a cultural change, a behavior change, a mindset change, and it’s important.

The other one is pricing and licensing. Software prices are typically tied to a serial number, a specific server, that box; the software has to be running there, and that doesn’t ever change. That absolutely goes by the wayside as we virtualize and as we move to the cloud. That model must change and become much more usage-oriented.

IT funding: typically we pay for things based on a project. Go buy a bunch of servers, a bunch of storage, and network capacity for this project. Well, now we’re actually fluidly expanding and contracting things, and we’re using IT like hotel space. Funding needs to change, and in fact we’re seeing this from our users who have virtualized.

What they’re starting to see, what they’re starting to feel, is a need to do more usage accounting: who is using what, and how much are they using? In fact, one of the most interesting trends beginning to pick up is an interest in chargeback. Then there’s speed and elasticity, being able to move things much faster. When you virtualize, you can typically deploy a server about 30 times faster. That’s dramatically different, and it actually changes the behavior of your customers. What we’re seeing is that demand for IT resources doubles when those resources are virtualized: when IT is easy to get and the barriers to entry are low, people ask for more.

Management, management needs to go through a speed shift. What that means is, it used to be that things were fairly static, and I had a process that managed that. Well, things just got a lot faster: they’re moving around, they’re changing in size, they’re here with this application today and over there tomorrow; I’m using this service provider today, and now I’ve got an internal one. Management has to change dramatically to handle this. Now, on the other hand, these service layers are creating a new way to manage, new instrumentation that makes it possible to federate things and to manage this environment much more seamlessly. So management is being turned on its head, and being enabled.

And finally, this is all leading toward cloud computing. This is all congruent with the idea of cloud computing, services orientation, decoupling of things, and allowing you to have service access to that, pay-per-use, elasticity and dramatic up and down usage. This is all very congruent with cloud computing. And I’m not saying that what we’re doing here is moving people to an environment where they will go to the cloud, where they’ll outsource everything. Actually, I’m saying the opposite. I’m saying that what’s happening is a transformation in IT, driven by virtualization among other things, that’s allowing IT internally to act more and more like an internal cloud provider.

By the way, at the same time they become a better cloud computing consumer, because they’ve changed their paradigm, their behaviors, and their mindset in order to do that. Major changes are happening with virtualization, and our recommendation is: absolutely do not take this lightly. This is a strategic change, so look ahead to where virtualization is going long-term.

Thank you very much.

BOB MUGLIA: Thanks, Tom.

Thanks for that view of virtualization really at the center of IT. It’s very much in sync with Microsoft’s view of how virtualization plays a key role. What I’d like to do now is really focus in on virtualization’s role on the desktop, and on the set of devices that people work with.

Virtualization’s Role on the Desktop: User-Centric Computing

For a while in the past, data was locked inside the corporate firewall, and that’s where users got information. But over the last two years there’s been a proliferation of portable devices, of the Internet, and of access to data from just about anywhere. We’ve seen portable devices spring up in people’s hands, and users have unlocked that data and brought it out into the wild. So the data is now roaming free in a way that it has never roamed before. And in a lot of senses our IT policies have not kept up with that environment. There are things we need to do to refocus on managing not just the devices, but the way users work with information. Different users have different needs.

Contractors come into an environment, and they’re on the corporate network, but what rights do they have to the applications and the business data available in that environment? Users are at a coffee shop: do they have the same rights to work with business applications there as they do when they’re inside the corporate firewall? That’s a policy decision.

Now for years we’ve thought about managing devices, and the focus of desktop management software, and device management software is all about managing the devices. It’s really the wrong way to look at things. The right way to look at things is to focus on how we can enable users to have access to the information they need when they need it regardless of what device they’re on, and provide users with as seamless an experience as possible as they roam between devices.

A personal experience, something that I saw over the weekend: my wife and I work together in our family office in our house, and I have a wonderful desktop PC there. We just got a brand-new 22-inch monitor, and we’re all excited about that, and it’s a great environment to work in. And I certainly can access my e-mail there. But when I’m on that computer, I don’t have access to all the business applications that Microsoft provides, and it’s an environment in which I’m shut out of a lot of the things I need to do. Now, is that a policy decision from Microsoft, or is it a limitation of the way our IT department has rolled out the technology? The right solution would be to use virtualization technology, perhaps presentation virtualization, to give me access to the corporate applications I should have access to. The right solution is to allow me to have the same environment on my desktop at work as I do on my portable, and the appropriate environment when I’m working on a non-corporate-owned PC, or when I’m accessing data from a mobile device.

That whole seamlessness is what we think of as user-centric computing, and we’re taking steps in the direction of reaching this vision. We’ve seen some important steps taken over the last few weeks. With that, what I would like to do is invite Scott Woodgate up to show us some of those steps, where we are today, and give you an idea of what’s become possible. Scott.

Desktop Virtualization Demo

SCOTT WOODGATE: All right. Let’s talk about desktop virtualization and Windows Vista together creating new business value, including improved business continuity, data protection, and absolutely user-centric computing. Here is my Windows Vista Enterprise laptop, with Application Virtualization 4.5 installed. Now, I’m not a user of Visio, but it is a corporate-approved application, and I just received an urgent e-mail from a customer including a Visio file, so I need to use Visio right now. With the power of App-V, all I need to do is go to my start menu, and even though Visio is not installed on my machine, I can launch the shortcut, and Visio is streamed at runtime down to my system, and I’m good to go.

And here we go: streamed down to my system, and in a few seconds I’ll be good to go. No longer do I need to go grab a setup CD or hand my machine to an IT administrator; I simply launch the application and get on with opening my customer’s file and doing my business.

Now, getting access to corporate-approved applications is one thing. But it’s a whole different story when I want a new application added. Adding a new application requires my IT department to test that new application against every other application on my system. This can take weeks or months.

BOB MUGLIA: One of the things we found when we talked to IT administrators is that the cost of provisioning and deploying PCs is one of the most prohibitive things within their organization, and it very much limits their flexibility. Application Virtualization is a technology that dramatically simplifies this: it allows applications to run side by side, and brings the right applications to end users at the right time. We see this as something that will drive costs down substantially for IT in the coming months and years. Just as we think of server virtualization becoming ubiquitous in the server environment, we see application virtualization becoming ubiquitous on the corporate desktop.

SCOTT WOODGATE: Absolutely, Bob. So I’m already running Office 2007 here, and it turns out I have a legacy Access 97 database that I want to open. Now, Office 2007 conflicts with Office 97: they share some registry keys, so I can’t install both on the same machine. But with the power of Application Virtualization, conflicts are eliminated. So I can simply go ahead, Access 97 is streamed down to my PC, and I can use these two applications that normally conflict with each other just fine to get my job done. This is a big advance, because IT can deliver applications to me much more rapidly so I can do my job.
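The conflict isolation Scott describes can be pictured as each virtualized application getting its own copy-on-write view of shared state such as the registry. Here is a minimal conceptual sketch in Python; the class and key names are hypothetical, and this illustrates only the idea, not how App-V is actually implemented:

```python
# Toy model of application virtualization's conflict isolation:
# each virtualized app sees its own copy-on-write view layered
# over the shared system registry, so two apps can "set" the same
# key to different values without conflicting.
# Conceptual sketch only; not the real App-V mechanism.

class VirtualizedApp:
    def __init__(self, name, system_registry):
        self.name = name
        self._system = system_registry
        self._overlay = {}          # per-app writes land here

    def set_key(self, key, value):
        self._overlay[key] = value  # never touches the shared registry

    def get_key(self, key):
        # Reads prefer the app's own overlay, falling back to the system.
        return self._overlay.get(key, self._system.get(key))

system_registry = {r"HKLM\Software\Office\Version": None}

office_2007 = VirtualizedApp("Office 2007", system_registry)
access_97 = VirtualizedApp("Access 97", system_registry)

# Both apps claim the same key, yet neither sees the other's value.
office_2007.set_key(r"HKLM\Software\Office\Version", "12.0")
access_97.set_key(r"HKLM\Software\Office\Version", "8.0")

print(office_2007.get_key(r"HKLM\Software\Office\Version"))  # 12.0
print(access_97.get_key(r"HKLM\Software\Office\Version"))    # 8.0
```

Because each application writes only to its own overlay, the shared registry is never mutated, which is why the two Office versions can coexist in the demo.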

BOB MUGLIA: So we just shipped this new version of App-V, and we think this version is ready for broad deployment. We’ve localized it into 11 languages, it’s gone through a whole lot of validation testing, and we think it’s going to transform the way corporations roll out desktop applications.

SCOTT WOODGATE: Absolutely. So let’s move beyond the PC. It turns out I was in New York over the weekend, and I was in a taxi, and I don’t know if this has ever happened to you, but as you can see on the screen, I had an unpleasant experience: I left my laptop in a taxi. This is usually a big disaster; corporate proprietary information might be leaked. Fortunately, I was running Windows Vista Enterprise with BitLocker, so my information was safe. Nevertheless, it would still typically take me hours to get a new PC, install my applications, and restore my data, and I’d probably lose some data in the process. This is a big deal, and it happens 75,000 times a year.

Now, let me show you the deal. Let’s go to this laptop. I’m on a brand-new laptop; my IT department has just given it to me. The only thing this laptop has on it is a base image of Windows Vista, and as you can see, I have never used it before. None of my applications are on it, absolutely none of my user data is on it, and it’s certainly not personalized with things like my wallpaper.

As I log into the system, the combination of Windows Vista and the Application Virtualization technology gives me access to my desktop in exactly the same way it was available on my previous PC. And so here I am on a brand-new machine, on which I never installed any of these applications, literally productive within minutes.

BOB MUGLIA: So all your data comes down in your My Documents folder, all the applications are available through virtualization?

SCOTT WOODGATE: It’s all here: my wallpaper, my data, my applications. And the time I used to lose to reduced productivity when I lost my laptop? It’s all recovered; I’m ready to go. This is a big deal for mobile workers.

Virtual Desktop Infrastructure

Let’s change tasks. There are other scenarios for different users in the enterprise with Windows Vista, for example task workers or offshore contractors. Some companies, especially in highly regulated industries, are looking to provide virtualized Windows Vista desktops running on servers. This scenario is called VDI, or Virtual Desktop Infrastructure. Now, VDI isn’t for everybody, and it doesn’t work for laptop users, but it’s certainly an option when deploying Windows Vista. So what I want to show you, in partnership with Citrix, is Microsoft technology for Virtual Desktop Infrastructure.

So here is the login screen for my virtual desktop. I’m using Citrix XenDesktop 2.1, and I’ll log in with my username. When I press login, XenDesktop 2.1 integrates with Microsoft System Center Virtual Machine Manager and, of course, Hyper-V. So my desktops are running virtualized on a server using System Center Virtual Machine Manager, Hyper-V, and XenDesktop.

BOB MUGLIA: And it’s the same environment you have on your portable.

SCOTT WOODGATE: Yes, that’s the amazing thing. Whether I’m on one laptop, another laptop, or even on a server in the data center, I have exactly the same environment.

BOB MUGLIA: We’ve had a great partnership with Citrix over the years, and we recently extended that to cover the VDI or desktop virtualization scenario, and you see the results here.

SCOTT WOODGATE: Absolutely. So, Bob, this is truly the birth of user-centric computing.

BOB MUGLIA: Great. Thanks a lot, Scott. (Applause.)

So that gives you a glimpse of the world we see things moving towards: a world where we manage the user rather than managing sets of devices. And you’ll see lots of changes: System Center will evolve in this way, and all of our product lines will continue to evolve as we head towards this vision.

Virtualization in the Data Center

Now what I want to do is talk about the data center, and the path that we’ve taken with virtualization in the data center environment. We come from a world where virtualization was non-existent, and we’re still at a stage where only a small percentage of servers are virtualized. In this world we see very low utilization in server environments. It’s typical for companies to run servers at less than 15 percent utilization, and it’s actually not unusual to see less than 10 percent, or even 5 percent, utilization on a set of servers.

Now, what’s the first thing that virtualization does? Well, with virtualization, of course, you have the ability to do consolidation. And in a world with consolidation through virtualization, the ability to drive up that utilization is very substantial. A little story here: I was talking to Debra Chrapaty, who runs the Microsoft data centers. We purchased 100,000 computers last year for that environment, so we’re actually the largest commercial purchaser of computers in the world. And Debra was talking to me about how she is trying to save energy, because of the incredible energy usage of these data centers. One of the things she’s done is paint the roofs of the data centers white to reflect the sunlight, so that the cooling requirement goes down. Now, when you look across the Microsoft data centers, we have everything from highly utilized machines that are focused on very special-purpose tasks like search, but a very large percentage of the systems run at the same level of utilization that businesses run at, so 5 percent, 10 percent, 15 percent utilization. And you can imagine the amount of energy savings that’s possible if you’re able to consolidate those servers down and get higher levels of utilization.
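The consolidation arithmetic behind that claim is easy to sketch. Here is a minimal back-of-the-envelope calculation in Python; the 10 percent and 50 percent utilization figures come from the talk, while the 200-watt per-server draw and the 1,000-server fleet are hypothetical round numbers used purely for illustration:

```python
import math

# Back-of-the-envelope server consolidation estimate.
# Utilization figures (10% today, ~50% after consolidation) are from
# the talk; server count and wattage are illustrative assumptions.

def consolidation_ratio(current_util, target_util):
    """How many lightly loaded servers fit on one consolidated host."""
    return target_util / current_util

def servers_after(n_servers, current_util, target_util):
    """Physical hosts needed after consolidation (rounded up)."""
    return math.ceil(n_servers / consolidation_ratio(current_util, target_util))

n = 1000                                 # hypothetical fleet size
before = servers_after(n, 0.10, 0.10)    # no consolidation: 1000 hosts
after = servers_after(n, 0.10, 0.50)     # 5:1 consolidation: 200 hosts

watts_per_server = 200                   # hypothetical average draw
saved_kw = (before - after) * watts_per_server / 1000
print(f"hosts: {before} -> {after}, saving ~{saved_kw:.0f} kW")
```

Going from 10 percent to 50 percent utilization is a 5:1 consolidation ratio, so 1,000 hosts become 200, and under the assumed wattage that removes roughly 160 kW of continuous draw before even counting cooling.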

Well, that’s exactly what we’ve done with our own Web properties: we’re in the process of using Hyper-V to consolidate that entire environment. A very substantial percentage of those sites today are running on Hyper-V, and all of TechNet and MSDN is running on Hyper-V. Those sites are getting over 50 percent utilization, with very substantial management benefits associated with it, so the cost reduction in terms of capital, energy, and management time is very substantial.

So what’s the next step? Consolidation is great, but what’s the next thing people are using virtualization for? Business continuity. Virtualization provides a great, simple way to move workloads from one data center to another in the event of some form of catastrophic loss: global deployment across different data centers provides great provisioning flexibility, and the ability to simply switch over in the event of a catastrophe. Now, one thing we feel is very important: we think business continuity is going to grow in importance for companies as they recognize its criticality to their business and to their shareholders. So we see this as something that needs to be broadly used. We think it needs to be a broad part of the platform available to customers, really regardless of the way they’re deploying information. And with that capability come many, many great things that will drive business forward.

So let’s take a look at how some of today’s virtualization technologies can be used, and what I would like to do is invite Senior Product Manager Edwin Yuen up to show us Hyper-V and virtualization in action. Edwin.

Hyper-V Demo

EDWIN YUEN: We’re trying to make virtualization a seamless part of the IT experience by making Hyper-V a feature of Windows Server 2008. And with the power of Hyper-V, and the management capabilities of System Center, we’re really bringing the management of both physical and virtual resources together from a single management infrastructure.

With Windows Server 2008, we have Hyper-V, the hypervisor-based high-performance virtualization platform built right into Windows. If you know Windows, you know virtualization. So let’s go ahead and take a look.

Here we’re actually using System Center Virtual Machine Manager 2008, our centralized console for managing the entire virtual infrastructure, to take a look at our system. Let’s look at one of our virtual machines here. This is a 64-bit Windows Server 2008 virtual machine; it has four virtual cores and 16 gigabytes of RAM right in the VM. In fact, Hyper-V supports up to 64 gigabytes of RAM per VM, and up to one terabyte of RAM on the physical host. But it’s about much more than virtualizing systems and virtual hardware. What we really care about are the workloads running inside those virtual machines. We recently announced support for many of our most popular server applications running in a virtualized environment, with the same level of support as in a physical environment. So let’s take a look at some of those. Here we have a virtual machine running Microsoft Office SharePoint Server 2007, a virtual machine running Microsoft Exchange Server 2007, and we even have a virtual machine running Microsoft SQL Server 2008. Again, all of these applications are fully supported by Microsoft when running virtualized.

BOB MUGLIA: Now these are very data-intensive workloads, and one of the things customers have told us from their experience with Hyper-V, and certainly our testing shows, is that they get rock-solid performance and capability when they deploy their applications using Hyper-V.

One of the things we’ve been most gratified by with Hyper-V is the incredible performance it provides relative to everything else in the market. In June, QLogic ran a benchmark comparing the I/O performance of native systems and Hyper-V. Hyper-V achieved 90 to 95 percent of native performance on Fibre Channel, and fully 99 percent of native performance on iSCSI, and was actually able to push 180,000 IOPS through that environment, which is the industry record on that class of system.

Since June, they’ve continued running their tests, and what they’ve found is that as they scale out to multiple virtual machines, that performance holds, so it’s possible to get great performance and great consolidation at the same time.

EDWIN YUEN: And our customers have also told us how important it is to run heterogeneous environments. Hyper-V was designed to support most x86 and x64 operating systems, not just Windows and Windows applications. Here we have a machine running SUSE Linux Enterprise 10, 64-bit edition. In fact, we’ve made integration components available for Linux to ensure that Linux not only runs on Hyper-V, but runs well on Hyper-V.

But what if you don’t have Windows Server 2008? Our goal is to make virtualization ubiquitous, and with that in mind, we have Microsoft Hyper-V Server 2008, our standalone bare-metal hypervisor solution. So let’s go ahead and remote into one of these boxes; I’ll log in. Once we’ve logged in, you can see that no resources are devoted to anything other than virtualization. There’s really no GUI, just a simple command-line interface for configuring the server. Once the server has been configured, you manage it remotely through the Hyper-V Manager in Windows Server 2008 and Vista, or through Virtual Machine Manager 2008.

BOB MUGLIA: As we announced today, Hyper-V Server will be available within the next 30 days as a no-cost download on the Web, so customers who have older systems that are not running Windows Server 2008, for example, can use Hyper-V Server to do things like server consolidation and get a great environment.

System Center Virtual Machine Manager 2008

EDWIN YUEN: Let’s not forget about management, for without great management, a lot of the benefits of virtualization get lost in an administration nightmare. That’s where the System Center family of products comes in. System Center brings physical and virtual management together in a single set of tools and consoles. With that in mind, let’s take a look at the latest member of the System Center family, Virtual Machine Manager 2008.

Here we have the Virtual Machine Manager 2008 console, and what we can see is that Virtual Machine Manager supports not only Microsoft systems, but a broad range of virtualization platforms. So if we take a look here, we’ll see that we’re managing a Virtual Server 2005 server running on Windows Server 2003. We’re managing one of the Hyper-V Server 2008 systems that we’ve just talked about. We’re managing several Windows Server 2008 Hyper-V servers, including a cluster. And we’re even managing VMware ESX 3.5 servers and their clusters directly from Virtual Machine Manager, with the full functionality that you’d expect in order to manage VMware.

BOB MUGLIA: Customers told us that they wanted a single console that they could use to manage their different virtualization environments, so we worked really hard to make Virtual Machine Manager not just a good VMware management tool, but a great one.

EDWIN YUEN: And now that we have physical and virtual management, we need to extend those capabilities right into the workloads that we’re running, and we do that with the integration of Virtual Machine Manager and System Center Operations Manager 2007. Here we have Operations Manager 2007 monitoring the health and performance of our entire data center, both physical and virtual. And we have a couple of alerts here, both on a virtual machine and on a physical host. In this case, the CPU utilization is very high. Those alerts are sent directly into Virtual Machine Manager through a feature called PRO, or Performance and Resource Optimization. PRO will not only show us that the alert exists, but give us the full cause and resolution, what the issues are, and a one-button click to implement the fix for that issue. In fact, the issues monitored through Operations Manager and Virtual Machine Manager can flow through extensible PRO packs, which our partners can build to give us additional insight into workloads. And, finally, we can configure these PRO tips to be implemented automatically for the administrator, really leading to that dynamic data center that we’ve talked about.
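[The PRO pattern Edwin describes, an alert that carries its own diagnosis and remediation and can be applied with one click or automatically, can be sketched in a few lines. This is an illustrative sketch only; the class and function names are hypothetical and are not the actual System Center APIs.]

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class ProTip:
    source: str                  # which host or VM raised the alert
    cause: str                   # diagnosis, e.g. high CPU utilization
    resolution: str              # human-readable description of the fix
    action: Callable[[], None]   # the one-click remediation itself

def handle_tip(tip: ProTip, auto_implement: bool) -> str:
    """Apply a PRO-style tip automatically, or surface it for one-click approval."""
    if auto_implement:
        tip.action()
        return f"{tip.source}: auto-applied '{tip.resolution}'"
    return f"{tip.source}: pending admin approval for '{tip.resolution}'"

# Example: a tip suggesting a VM be migrated off an overloaded host.
migrated = []
tip = ProTip(
    source="host-01/vm-web",
    cause="CPU utilization above 90%",
    resolution="migrate vm-web to a less loaded host",
    action=lambda: migrated.append("vm-web"),
)
print(handle_tip(tip, auto_implement=True))
```

[The point of the pattern is that the same tip object serves both modes: surfaced to an administrator for approval, or executed unattended in the fully dynamic case.]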

BOB MUGLIA: Having a complete management solution is really critical, and with System Center, our customers are able to manage the physical environment, the virtual environment, and their applications in a fully integrated way. That’s an integrated offering that nobody else in the industry really provides. Great. Thanks a lot, Edwin.

EDWIN YUEN: Thank you, Bob. (Applause.)

Hyper-V and Windows Server 2008 Previews

BOB MUGLIA: So those are some features that are either available now or available within the next 30 days; both Virtual Machine Manager and Hyper-V Server will be out within the next 30 days. What I wanted to do now was turn to something in the not-too-distant future, and give you a sneak preview of the next release of Hyper-V and the next release of Windows Server 2008, code-named R2. This is the first time we’ve done a public demonstration of the next release of Windows Server.

So I’m going to start by bringing up a video, and you see it in that picture-in-picture there. This video is streaming off of a server, and it is running on a virtual machine; in fact, it’s running on this virtual machine that you see here, called N-1. And so the video is playing continuously on that virtual machine. What I’m going to do is show you one of the new features that’s coming in the next release of Windows Server 2008 and Hyper-V. There are actually quite a few new features, which we’ll talk about both at the upcoming PDC (Professional Developers Conference) in late October, as well as at WinHEC, which is the first week of November. We’ll go into a lot of detail on Windows Server 2008 R2 at that time.

I just wanted to show this one feature that you might be interested in; we call it Live Migration. What I’m going to do now is start this migration from one server in a cluster to the other. You can see it’s started. So why don’t we zoom in on that picture and take a look, and see that the video is still playing smoothly, no skips or anything. And it’s done. So if you take a look here, you can see it has now switched over to N-2, and what you’ve just seen is the first public demonstration of Hyper-V Live Migration. (Applause.)

There is no magic in VMotion; it’s just a feature, and we’ll have that feature in the next release of Hyper-V and Windows Server 2008.
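[Live migration of a running VM is commonly built on an iterative pre-copy technique, sketched below in simplified form. This is a generic illustration of the technique, not Hyper-V’s actual implementation: memory pages are copied while the guest keeps running, pages the guest dirties during a pass are re-copied, and only a small final set is transferred during a brief pause before the VM resumes on the target host.]

```python
def live_migrate(pages, get_dirty, max_passes=5, stop_threshold=8):
    """Iterative pre-copy: transfer memory while the guest runs,
    then pause briefly to copy the last few dirty pages."""
    target = {}
    to_copy = set(pages)            # first pass: copy every page
    for _ in range(max_passes):
        for p in to_copy:
            target[p] = pages[p]    # copy while the guest keeps running
        to_copy = get_dirty()       # pages the guest wrote since the pass
        if len(to_copy) <= stop_threshold:
            break
    # brief pause ("blackout"): copy the final dirty pages, then resume
    for p in to_copy:
        target[p] = pages[p]
    return target                   # guest resumes on the target host

# Simulated run: the dirty set shrinks on each pass until it is empty.
dirty_rounds = iter([{"p1", "p2"}, {"p1"}])
source_pages = {f"p{i}": f"data-{i}" for i in range(4)}
result = live_migrate(source_pages,
                      lambda: next(dirty_rounds, set()),
                      stop_threshold=0)
print(result == source_pages)
```

[The design trade-off is between the number of pre-copy passes and the length of the final pause; workloads that dirty memory faster than it can be copied need the pass limit (`max_passes`) as a backstop.]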

The Evolution of Virtualization

So we’ve talked about what’s available now, or what will be available in the shorter term. Let’s move forward and look at where the future lies: where are we going? Well, we think of the next stage in the evolution of virtualization as the dynamic data center. Think about where we’ve come from, and where we want to go. Where we’ve come from is a world where IT would provision a server based on the needs of a given business, and it would literally be a physical server that ran a given application. Tom talked a little bit about this. We’ve moved to a world with virtualization today where IT buys a set of servers, and then provisions the applications that are associated with the different business needs onto those physical servers, using virtualization.

Well, the next stage in the evolution of virtualization is a world where IT buys tens, hundreds, thousands, or even tens of thousands of computers, utilizing the incredible power that’s available from the current generation of servers, and makes all of those resources available to the businesses within their environment: a big pool of servers, potentially spread between multiple data centers.

There’s a set of business applications that need to be run. And in this world, the world of the dynamic data center, the data center software itself assigns those applications to the servers; it grows the capacity as needed by the applications, and shrinks it when that capacity is no longer required.

So the IT administrator is taken out of that role, and capacity utilization can go up, because the system is using the hardware to the best possible capability. The needs of the business are met, and at the same time the costs, the people costs that are so important, are driven down. That’s really the core of where we see the world going in the future with the dynamic data center. But the data center and on-premises is only one part of this environment.
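[The placement half of that dynamic data center idea, software rather than an administrator deciding which server runs which application, can be made concrete with a toy scheduler. This is an illustrative sketch under assumed, hypothetical names and capacity units, not any actual data center software.]

```python
def place_workloads(servers: dict, demands: dict) -> dict:
    """Greedy placement: assign each workload, largest first, to the
    server with the most free capacity. servers maps name -> free
    capacity units; demands maps workload -> required units."""
    placement = {}
    for work, need in sorted(demands.items(), key=lambda kv: -kv[1]):
        host = max(servers, key=servers.get)   # least-loaded server
        if servers[host] < need:
            raise RuntimeError(f"no capacity for {work}")
        servers[host] -= need                  # shrink the free pool
        placement[work] = host
    return placement

# Example: three servers of equal capacity, three workloads.
pool = {"s1": 10, "s2": 10, "s3": 10}
apps = {"web": 6, "db": 8, "mail": 4}
result = place_workloads(pool, apps)
print(result)
```

[Real placement engines weigh far more than one capacity number (memory, I/O, affinity, failover), but the shape is the same: a pool of resources, a set of demands, and software making the assignment.]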

There’s another important part that will complement on-premises data centers, and that’s services in the cloud. There’s been a lot of talk about cloud services, and how a next-generation platform is emerging that will help drive new capabilities: new ability to provision data, get information, get applications deployed quickly, and connect them to users. There’s no question that the cloud is a very important next step in the evolution of the data center, and an important step in the evolution of virtualization.

As Tom said, we very much believe that there will be a mixture of on-premises resources, together with the cloud, and certainly for some time on-premises will dominate in terms of the total number of resources deployed. But, over time the cloud and service-based computing will emerge as a very important alternative for companies, as they look at how they deploy their resources. So let’s take a look at virtualization’s role in cloud services.

Virtualization plays a key role because it’s a core enabler that allows cloud services to be built. Virtualization enables workloads to move from on-premises into the cloud, or back. It is critical to enabling scale-out within the cloud environment. I mentioned the importance of virtualization to scale-out on premises; that same kind of capability, that same kind of thinking, is critical for scale-out in the cloud environment.

So virtualization is absolutely critical as we move to a cloud-based services environment where this application platform is presented. But virtualization is only one part of the solution. You need a complete platform. Virtualization plays a key role, a foundational role, but it’s only one part. First of all, there is a broad set of services required for people to build next-generation, service-based applications.

Certainly there are the underlying compute services, the programming environment such as .NET, and storage environments; all of those things are foundational and need to be present. On top of those are important services that businesses need to implement their corporate applications: workflow services, communications services, database services. And services based on identity connect the on-premises environment together with the cloud environment, so that when a user leaves a company, they’re deprovisioned not just on premises, but in the cloud at the same time. You can imagine the complexity, in a world where you might have multiple service providers, of ensuring that your corporate data is secure as you have different users coming and going. These things must be connected, and identity services are critical to that.
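[The identity point above, one departure event deprovisioning a user everywhere at once, can be sketched as a fan-out over connected directories. The class and provider names here are hypothetical, purely for illustration, and not any actual identity product’s API.]

```python
class Directory:
    """A toy stand-in for one identity store (on-premises or cloud)."""
    def __init__(self, name: str):
        self.name = name
        self.users = set()

    def provision(self, user: str):
        self.users.add(user)

    def deprovision(self, user: str):
        self.users.discard(user)

def deprovision_everywhere(user: str, providers: list) -> list:
    """Remove the user from every connected directory in one operation,
    returning the names of the directories confirmed clean."""
    for d in providers:
        d.deprovision(user)
    return [d.name for d in providers if user not in d.users]

# Example: one departure event, two connected stores.
onprem = Directory("corp-ad")
cloud = Directory("cloud-svc")
for d in (onprem, cloud):
    d.provision("alice")
print(deprovision_everywhere("alice", [onprem, cloud]))
```

[The complexity Bob mentions comes from doing this reliably across providers you don’t control, which is why federated identity services, rather than per-provider scripts, are the critical piece.]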

Now, investing across that broad set of services is absolutely critical and essential. Virtualization is just a piece; all of these other services are important too. And when you think about either the dynamic data center on premises or this dynamic services-based environment, the ability to understand what components make up an application, how they’re connected together, and what is necessary as you expand the resources that an application requires, or deallocate those resources, is also critical. We’re investing very heavily here; we think a very key part of this is modeling and modeling technology, to describe what an application is all about.

Now, I was talking to an industry analyst this morning about modeling, and I could just see their face glaze over as I talked about it. I think they had this vision of 20 years of failed modeling projects. So I realize I say this with a bit of trepidation, because people’s expectations about modeling may be skewed by the past. But one of the things we see is that you can flip this whole thing upside down: instead of operating in the abstract, as modeling has always done, we’re focusing on how modeling can be a key, concrete component that describes an application through its entire lifecycle, from application definition and requirements, through architecture, development, deployment, and ongoing operations, with a description of the system that accurately defines the components of the application and is augmented through each stage of this process. That’s a critical component. And another critical component is to take that model, that blueprint that defines what an application does, and use it to describe the ongoing operation of the application.
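[One way to make this concrete use of modeling tangible, as an illustrative sketch rather than the actual modeling technology being described: a single declarative model names an application’s components and connections, deployment validates against it, and operations drives scaling through the very same blueprint. All names and numbers here are hypothetical.]

```python
# A declarative blueprint: components, their tiers, and their wiring.
model = {
    "app": "order-service",
    "components": {
        "web": {"tier": "frontend", "instances": 2},
        "api": {"tier": "middle", "instances": 2},
        "db":  {"tier": "data", "instances": 1},
    },
    "connections": [("web", "api"), ("api", "db")],
}

def validate(model: dict) -> bool:
    """Deployment-time check: every connection refers to a declared component."""
    names = set(model["components"])
    return all(a in names and b in names for a, b in model["connections"])

def scale(model: dict, component: str, instances: int) -> dict:
    """Operations uses the same model to grow or shrink a component."""
    model["components"][component]["instances"] = instances
    return model

assert validate(model)
scale(model, "web", 4)
print(model["components"]["web"]["instances"])  # 4
```

[The key property is that there is one description shared across the lifecycle, so what was deployed, what is running, and what should be scaled are never three separate, drifting documents.]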

So all of these things are required as we go forward into the future: virtualization; a complete platform with features like identity and database; an environment that describes the complete application infrastructure; and the modeling environment that describes how these components fit together. You’ll hear much more about all of this at the upcoming Professional Developers Conference, on October 27th in Los Angeles, where Microsoft will talk in much more detail about our services platform and how all of these components fit together.

So as we move forward into the future: we’ve talked about a 10-year vision for dynamic IT, and we’re about five years into that 10-year vision. We’ve got a lot of great work to do, Microsoft as a platform provider and the entire industry, bringing together the combination of the on-premises world and the future world of services computing, transforming the way data centers are built, and changing the experience that end users have.

There are many pieces to this, and virtualization plays a very key role in it. There is more, but it’s the kind of thing that I think we all have a lot to look forward to. It’s something that we’ll all do together: the platform providers and the broad set of partners in the industry that will deliver, like the many partners who are here with us today, helping to make this a reality for all of our end customers around the world. Ultimately it’s all about helping those customers to be as successful as they can possibly be.