Bob Muglia Keynote: Microsoft Tech•Ed 2008 – IT Professionals

Remarks by Bob Muglia, Senior Vice President, Server and Tools Business
Microsoft Tech•Ed 2008 – IT Professionals
Orlando, Fla.
June 10, 2008

BOB MUGLIA: Good morning. Good morning. Welcome, welcome to Tech-Ed 2008. It’s great to see everybody here today. It’s really great because this is a week where we will all get together as IT pro heroes, and understand more about your companies, understand more about ways that you can drive things forward in your business.

Now for the last three months or so, we have been going all across this country and around the world celebrating. There’s a good reason to celebrate, there are some good products to talk about, Windows Server 2008, SQL Server 2008, Visual Studio 2008. But those products really just set the context for what we were really doing, which is going out and talking to our customers, the IT pro heroes around the world that really drive forward their business.

You see at Microsoft we see ourselves as software providers, software platform providers that are enablers for the industry. We work together with thousands, tens of thousands of companies across the industry to give you the tools and services, and capabilities you need, so that you can take that technology and bring it into your business, and differentiate what you’re doing, really drive forward and create business advantage. We truly believe that the IT pros are the unsung heroes of business, and that’s what this celebration has been all about.

Now we’ve been celebrating for three months, and this week is a continuation of that celebration. It’s a chance to get together with other heroes across this country and across the world and talk about what you’re doing, what you want to do, what you plan to do, and to learn, and to drive forward and understand how you can drive your business forward. And one of the great things about this is looking at heroes that have done phenomenal things in this country and around the world. And I want to start today, and start the week by talking about and showing you a real true hero. So let’s run the video, and let’s talk about Hunter.

(Video segment.)

BOB MUGLIA: So what an incredible way to start. Let’s invite Hunter on stage to talk to us about his experiences with Hurricane Katrina. Hunter. (Applause.)

Good morning. Welcome. It’s great to have you here. Tell us what happened after the Hurricane hit, and you had to work to set up a system.

HUNTER ELY: Well, one of the first things that we did was, we went to go help the hospital with Catherine Marsh and they were checking people in and out on paper, and they were having a hard time keeping track of who was where, and what supplies were coming in and out. And one of the first things we did was take a bunch of laptops that were sent to us and put Groove on them, so that we could have this kind of database that would grow and keep itself up to date on or off the network. And it’s worked great.

BOB MUGLIA: The laptop and the Groove software tracks the people that were coming in, the patients, and it let the doctors and nurses know where they were, as well as family members, too, right?

HUNTER ELY: Yes. It was very easy to keep track of what was going on instead of moving from this paper system that nobody had any idea of how to use, or how to keep track of.

BOB MUGLIA: This is not something you had had preexisting. How long did it take you to set this system up?

HUNTER ELY: We had this set up over the course of a day. We had people trained up over the course of two or three days, and then those people just kind of disseminated that knowledge, and just handed out laptops.

BOB MUGLIA: That’s really stepping up and just making it happen really quickly.

HUNTER ELY: Oh, they worked great on it.

BOB MUGLIA: Now SharePoint was involved in this as well, right?

HUNTER ELY: We used SharePoint as the back end for people that didn’t have these laptops, so that they could at least log in and get some of that data, get some of those documents, whatever they needed.

BOB MUGLIA: So they could get the information. And one of the things was helping to connect families with the patients that were there, right?

HUNTER ELY: Right. You know, people would come in and out, and they had to connect them with a person, otherwise we didn’t need them there, they were just taking up space. So we would connect family members with people that were actually in the hospital.

BOB MUGLIA: That’s great. It’s an amazing thing to see people stepping up in incredible circumstances like this, and using technology to help people. You’re an incredible hero, we really appreciate it.

HUNTER ELY: The people at LSU that all worked together and made it really work well.

BOB MUGLIA: Great. Thank you so much. Thanks for being with us today. (Applause.)

That’s really what it’s all about. And while fortunately we don’t have natural disasters happening all the time, there are issues that all of our businesses and organizations face, and it really is IT pros that step up and solve those problems for our companies and our organizations. This is just one of many great examples, thousands and thousands of examples that happen all the time.

So let me start this morning by putting some of this in context, and really how we think about how Microsoft can help to provide that underlying infrastructure for your businesses. About five years ago, we focused on a mission that today we call Dynamic IT. It is a very broad idea. It is an idea that says that technology can help to transform your business, that we can work through the lifecycle of a business application from the point of its definition all the way through to its architecture, development, implementation into production, and ongoing change management and operations. And all of those things can be connected together in a seamless way.

It’s an idea that across all of our organizations there are many aspects where technology can be applied effectively to help drive down costs and increase business value. The fundamental goal, of course, is to reduce the ongoing daily maintenance costs, so that more of the focus, the dollars that are spent and the focus of IT can be placed on developing new applications that drive business advantage, and allow your users to really have access to the information they need to make the best business decisions.

We started this journey five years ago, and we said it was a 10-year vision. We’re five years into it, and we really feel like there have been five good years of work that we’ve done together with all of you, and together with the industry. A lot of progress has happened in that period of time. We still have five more good years of work to do to fully realize the vision, to allow IT to be truly dynamic within organizations, but step-by-step this work is helping organizations get benefit every day. One of the most important aspects to really drive benefit within IT is the infrastructure optimization models, which provide a context for how you can use technology within your business.

We’ve seen thousands and thousands of organizations around the world apply these ideas, understanding whether they are using technology as effectively as they can, and what steps they can take to increase the value they get out of the software and the systems they’re deploying, step-by-step moving those organizations from situations where they might start, where IT is a call center, to a place where IT really becomes a strategic asset for their business. These steps are well-defined, they can be applied to customers, and we do this all the time. When customers have issues, we often can go in and look at what they’re doing and say, here’s how you can make it better by applying these infrastructure optimization models. So I encourage you to learn more about that, because it really does provide a roadmap for the adoption of technology within your business to allow you to get the most benefit out of it.

So this is all part of a long-term strategic vision. We’ll talk a little bit more about some of the ways that it’ll play out later today, but there’s very much things that can be done now. Now, when we think about the context of Dynamic IT, there are many sets of systems that are very important. Really business applications are critical, desktop management is critical, there are many things that are very critical. One of the most critical aspects for all businesses is managing their identities within their organization, and all the connections of their users, the security, the credential management. And a lot of focus needs to be done within the industry all up to help you to manage those identities.

As we move forward into a world where you begin to use services to run part of your business, managing your identity across multiple service providers becomes a really challenging problem, and federation is a really key issue. Making sure that the credentials of your users are always kept up to date is critical. Allowing and empowering users to work within the organization, and actually add themselves to groups, or get a password reset done if they’ve forgotten their password, that’s part of the cost. And then, of course, connecting all of those pieces and processes together into a single workflow allows you, starting from your HR system, to take an identity through its lifecycle and manage it automatically without having to write a lot of code.
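The HR-driven lifecycle described here can be sketched in outline. This is a minimal illustrative Python sketch of the pattern, not any Microsoft API; every name in it is hypothetical.

```python
# Illustrative sketch of HR-driven identity lifecycle automation.
# All system and function names are hypothetical, not an ILM API.

def process_hr_event(event, directory):
    """Route an HR change to the right lifecycle action."""
    user = event["employee_id"]
    if event["action"] == "hire":
        directory[user] = {"enabled": True, "groups": ["all-staff"]}
    elif event["action"] == "transfer":
        directory[user]["groups"] = ["all-staff", event["department"]]
    elif event["action"] == "terminate":
        directory[user]["enabled"] = False  # disable, don't delete
    return directory[user]

directory = {}
process_hr_event({"action": "hire", "employee_id": "mmeyers"}, directory)
state = process_hr_event(
    {"action": "transfer", "employee_id": "mmeyers", "department": "finance"},
    directory)
```

The point of the pattern is that hire, transfer, and termination events flowing from the system of record drive account state automatically, with no per-event scripting.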

A lot of challenges with identities, a lot of foundations in place with Windows Server and Active Directory. That’s a foundation to build upon, but it’s really not the whole problem by any means. So one of the things we’re trying to do is think about how we can help you solve the sets of challenges you face in the identity space, and I’m pleased to announce today that a new product, Identity Lifecycle Manager 2, is entering its public beta right now. This is a product that can really help you drive down your costs of identity management, and we think it has some great capabilities. It will be very easy to deploy in your organization.

So, with that, what I would like to do is invite Fred Delombaerde up to show us Identity Lifecycle Manager 2. This is the first time we’ve really shown it publicly in a thing like this.

Identity Lifecycle Manager “2” Demo


BOB MUGLIA: Good morning, Fred.

FRED DELOMBAERDE: What I’m going to show you today is Identity Lifecycle Manager 2 Beta 3, and Beta 3 is the first public release of Identity Lifecycle Manager 2, and we’re announcing availability here today at Tech-Ed.

What I want to focus on specifically is some of the new features we’re introducing as part of this release, which are targeted at empowering IT pros for managing identities across the entire enterprise. And so let’s jump right into the demo here. I’m going to log in as a full system administrator into the Identity Lifecycle Manager portal. I’m going to jump down here to the processes section, and what we’re going to do is manage the business processes that drive the IT management of identity in a day-to-day manner.

So let’s go ahead and focus on creating new full-time employees. A specific challenge with creating new full-time employees is ensuring they have access to, and have been provisioned for, the appropriate applications on their first day of work, as well as provisioned into the appropriate distribution lists and security groups when they first log in.

So I’m going to go ahead and jump over to the activities tab here. What I want to start doing is building my business process for creating a new full-time employee. What I’m going to do is go ahead and click add activity, and I’m going to select synchronization rule activity, and we’re going to start building the workflow process for creating a new full-time employee within our organization. So I’m going to go ahead and select Active Directory user account, and so we want to ensure that users have access to Windows, as well as Exchange mailboxes on their first day of employment.

Let’s go ahead and add a second activity, again, we’re going to select the synchronization rule activity, so we’re going to provision an additional application. And in this case what we’re going to do is select a third party application. So you can see from my list of drop down applications here that I can actually go ahead and add an SAP role. So not only could we add Microsoft technology, such as Active Directory, but we can also tie in third party applications, such as SAP, as well.

Let’s go ahead and click okay, and I’m going to submit these changes. Now, as that submission process is taking place, I want to jump over here to a custom HR application that’s been developed for Fabricam, and this custom HR application is very similar to something that we would see with a tool from SAP, as well as PeopleSoft, in that it’s built on top of a SQL Server backend. So what I want to do is create a new employee called Melissa Meyers. Historically, the challenge with these types of HR applications has been that in order to provision somebody within the SQL data store and then tie into the connected directories and connected systems, we would need system administrators, as well as developers, to write custom scripts to then provision those people into Active Directory, SAP, and other connected systems.

BOB MUGLIA: So one of the key things we’re doing with Identity Lifecycle Manager is automating the workflow creation process for you, with the ability to connect to a wide variety of third party applications. So in this case we have a custom HR application, but it could just as easily have been PeopleSoft or SAP.

FRED DELOMBAERDE: Absolutely. So let’s go ahead and click finish. Now a couple of key things are happening. ILM is actually monitoring the SQL database for this custom HR application, and we’re pulling that data directly within ILM and then firing off the appropriate business processes. So ILM has now detected those changes, and we can see we’ve now received an e-mail, without writing any code, that Melissa Meyers has actually been provisioned for both Active Directory, as well as SAP.
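The synchronization flow described here, a new row in the HR store triggering provisioning into each connected system plus a notification, can be sketched as follows. This is an illustrative Python sketch under hypothetical names; ILM itself expresses this declaratively through synchronization rules rather than code.

```python
# Sketch of the sync pattern: detect a new row in the HR store,
# then provision the user into each connected system and notify.
# All names are hypothetical, for illustration only.

def sync(hr_rows, connected_systems, provisioned, notifications):
    for row in hr_rows:
        uid = row["id"]
        if uid in provisioned:
            continue                       # already synchronized
        for system in connected_systems:   # e.g. AD, then SAP
            provisioned.setdefault(uid, []).append(system)
        notifications.append(row["name"] + " provisioned for "
                             + ", ".join(connected_systems))

provisioned, notes = {}, []
sync([{"id": 7, "name": "Melissa Meyers"}],
     ["Active Directory", "SAP"], provisioned, notes)
```

Running the sync a second time over the same rows is a no-op, which is the property that lets the monitor poll the source of record continuously.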

So let’s jump back here into our custom HR application. What we’re going to do is edit Melissa Meyers’ profile, and change her from HR to finance manager. Now, in this case, Fabricam has decided to throw in an additional challenge: any changes to job title actually require an additional approval process. Historically the challenge with doing that is that we would need a developer to develop the custom approval process directly within our HR application.

So, again, now that we’ve tied in ILM to the business process of our custom HR app, we can go ahead and now submit these changes, and see how ILM offloads the requirement for development, and actually will process the additional approval directly within Outlook. So you can see, I’ve now received an e-mail notifying me that the job title attribute for Melissa Meyers is requesting a change from HR to finance manager. What I’m going to do is go ahead and actually approve this, and we can see some of the nice integration that we have with Outlook. Now, had I actually rejected this change, ILM would have pushed that data back to the custom HR application, ensuring that we have a compliant state.

The other key point to make here is that since we’re surfacing this concept of approval, and authorization directly within Outlook, we now open up the possibility of enabling IT pros, and information workers to be part of the overall business process. So as we wait for the approval message just to be surfaced here, Melissa Meyers has already been provisioned for Active Directory, as well as SAP. You can now see some of the mails coming in, so please welcome Melissa Meyers to the financial managers team. So ILM has now processed this request, and we now know that it’s within compliance.
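The approval gate can be sketched the same way: a pending attribute change is applied only on approval, and a rejection leaves the source value in place so the connected systems stay in a compliant state. Again, this is an illustrative Python sketch with hypothetical helper names, not ILM’s object model.

```python
# Sketch of the attribute-change approval gate: a job-title change is
# held until an approver decides; a rejection keeps the old value so
# the source and connected systems remain consistent and compliant.
# Hypothetical names, for illustration only.

def request_change(record, attr, new_value, approve):
    old = record[attr]
    if approve(attr, old, new_value):
        record[attr] = new_value  # propagate to connected systems
    # on rejection the source keeps the old value: still compliant
    return record[attr]

profile = {"title": "HR"}
request_change(profile, "title", "Finance Manager",
               approve=lambda attr, old, new: True)
```

In the demo the `approve` decision is surfaced as an accept/reject message in Outlook rather than a function call, but the control flow is the same.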

BOB MUGLIA: So one of the key things we’re trying to do is make it straightforward for IT administrators to set this up, so that the business managers can be a part of the workflow process. And not a lot of code, there’s no code that really needs to be written to make that happen.

FRED DELOMBAERDE: Absolutely. So another key point is that now that we’ve provisioned Melissa Meyers for these applications, one of the other things that has been going on behind the scenes here is that ILM has also been managing the distribution group, as well as security groups for Melissa Meyers. So if I come into the global address list, and take a look at Melissa Meyers, you’ll notice that not only has she been provisioned for baseline distribution lists, such as Fabricam training, but she’s also been given specific distribution lists, which are calculated dynamically within ILM. So she now has access to the financial management team, as well as the SAP finance role.

BOB MUGLIA: When it comes to distribution lists, one of the things that ILM is really good at is enabling users to set up their own distribution lists, to manage those themselves, so that administrators don’t have to get involved in that process. You can take owners across the business, and have them be owners of the distribution lists, so they can control all the management associated with members of that distribution list. Another really great thing, from a perspective of saving cost, is that ILM supports password reset, so that administrators and help desk personnel don’t need to get involved in that very time-consuming, and frequently occurring activity.

FRED DELOMBAERDE: Absolutely. So what we can see in this very short demo is some of the new features that we’ve introduced in ILM 2 Beta 3, and how we can empower the IT pro to manage identities end-to-end within the enterprise.

BOB MUGLIA: Great. Thank you.

FRED DELOMBAERDE: Thanks, Bob. (Applause.)

BOB MUGLIA: This is a product that’s focused on solving a problem that, effectively, all organizations have, and we’re really going to try to make it very easy for you to acquire, and very easy to incorporate within your organization, a real opportunity for you to bring cost out of your business. And we’re pretty excited about that.


So now, moving on, let me talk a little bit about a concern that all organizations have because of the natural heterogeneity of the enterprise environment, and that’s interoperability.

We’ve been focused over the last few years on making Microsoft systems the most interoperable in the industry. We’ve now published over 50,000 pages of our protocol and format documentation. We’re driving standards forward, and establishing standards with many, many other organizations in the industry. I think the most important thing is that we’re focused on connecting with our customers to understand where they need us to interoperate. For the last two years I’ve chaired a council of IT pro executives, CIOs, who have come in and told us about their interoperability concerns, and we’ve had a wide variety of work streams with teams from industry. Many, many organizations, government, education, many businesses around the world have been engaging with us to understand what we should do together with the rest of the industry to make systems as interoperable as possible.

There are hundreds and hundreds of opportunities for us to improve this together, listening to our customers and working with others in the industry. Whether it’s in the identity space or the management space, where because of the feedback we’ve received as part of this interoperability focus we announced heterogeneous management support for Linux and UNIX environments with our Operations Manager product a few months ago, or whether it’s looking at how we interoperate on document formats and import-export, all of these are great examples. Of course, an important one is how business systems interoperate: how you can connect existing systems in your organization together with Windows-based systems, and how to make that really easy and as transparent as possible.

Web services are a mechanism that can help do that, and we see some pretty impressive things coming from the work that’s been happening all across the industry in making Web services the truly interoperable way of connecting business applications.

With that, what I’d like to do is introduce Greg Leake, who has been a director at Microsoft for many, many years, and Jonathan Marsh from WSO2, one of our great partners working together with us to help make these systems interoperable.

Greg, good morning.

GREG LEAKE: Good to be here.

BOB MUGLIA: Jon, good morning.


BOB MUGLIA: Thank you for joining us.


GREG LEAKE: So to get started, I’m here with Jonathan Marsh from WSO2, where he’s the director of architecture. Why don’t you tell everybody a little bit about WSO2.

JONATHAN MARSH: Well, WSO2 is an open source company, and we build a complete SOA platform on top of the Apache Web Services Project.

GREG LEAKE: Great. So what we’re going to show is an end-to-end service oriented app, built with .NET 3.5 and Windows Communication Foundation. Okay. And to get started, I’m going to show you kind of an all .NET implementation, just to give you a sense of the application. It’s a stock trading scenario, and the app is called .NET Stock Trader. Interestingly, if you like this app it’s all downloadable off of MSDN as sample code, and template code. You can take it, your developers can work with it, et cetera.

This is a Windows Presentation Foundation front end to the application. So I’m going to just log in as myself here. And we’re going to see that I get a market summary, what’s going on in the market. I can look at my account information, go over here to the portfolio page. We get a little 3D modeling going on from the smart client. But here’s the really key point: all of the data, everything you see here, is actually not being sourced within the presentation client, it’s actually flowing from a remote backend service that we’ll call the business service layer. That’s a WCF service built with advanced Web service standards, namely the WS-* standards, including reliable messaging, transactions, and full message-level security, using certificates to encrypt the whole payload. So it’s very secure.

In fact, let’s continue on here. Let’s go ahead and buy a stock. I think I’m going to buy S2 here, and I’ll buy 333 shares, so you can see this order from the others I’ve got in my portfolio. So we’ll place that order and what’s happening here is a service-to-service interaction. So we’ve gone from the client to the middle-tier business service, and yet that’s going to invoke another service, which is the order processing service, also over advanced message security and Web services. And we can see we got the order alert that the order processing service has, indeed, placed that order, and we’ll see that in our portfolio.

So let’s bring up a quick slide to just recap what we’ve seen so far. We’ve seen kind of the all .NET implementation of this composite SOA application. The Windows Presentation Client, the middle-tier business services talking to the order processor. But what we’re going to see next is a couple of things. We can swap in at any layer of this application an alternate non-Microsoft platform technology, because we adopt the same Web service advanced standards. The first thing we’re going to see is PHP seamlessly connecting as an alternate front-end to the business services done in .NET. The second thing we’re going to see is, we’re going to swap out that order processor component from the .NET implementation to a Java-based implementation running in the WSO2 Java Application Server.

With that, I think I’m going to turn it over to Jonathan to show you some of this stuff.

JONATHAN MARSH: Thank you. So I’m going to show first connecting to the .NET Business Service using the PHP application. So let me just bring that up. I can login, it’s a very similar kind of functionality to what Gregory has in his smart client. In fact, I’ll login as Greg so I can see the transaction that he just performed. I can go to my portfolio, or to Greg’s portfolio here, and see that, yes, the 333 shares are there. In fact, I can place an order here, I can sell, I’m going to sell those 333 shares that Greg just bought.

GREG LEAKE: Thank you. Did I make any money?

JONATHAN MARSH: And we can see in the portfolio that those shares are now gone. So what we’ve really shown here is a PHP application, developed on the WSO2 Web Services Framework for PHP, which is based on the Apache Axis2 project. It’s the C implementation, so it’s unmanaged code, talking to the .NET managed code business service layer.

And then what I’d also like to show is, talking from the .NET business service layer to another managed stack, which is a Java implementation, and we’ve written a Java order processing back-end service that’s hosted in the WSO2 Web Services Application Server, which, again, is based on the Apache Axis2 project, but this time the Java implementation.

So, Greg, can you configure your .NET middle tier to talk to my Java back-end?

GREG LEAKE: Yes. Let me do just that. So what we’re going to do is, we’re just going to do a quick reconfiguration so the business service, that middle tier service, is now going to talk to an alternate order processor to place and process these orders. So I’m going to bring up the configuration page here for the business service layer. And right now it’s just connected to the .NET order processor. I’m going to change the order mode, there are some pre-set modes here we’ve built into the sample, and I’m going to change this from talking to the .NET order processor to actually talking to the WSO2 Java order processor. So I’ll update the configuration.

Now, of course, what I next need to do is tell the business service where to find the Java order processor. So I need to give it a description or a location for that order processor, and I can do that here in the connection step, and I’m just going to add a new connection here over to the WSO2 Application Server, and that’s on my local server, so I just have to type that in. Whoops, a little keyboard issue there. There we go, local host ought to work, okay. Add the connection, and now what we’re going to see in the connections tab is that we’re connected not only at the top level to the .NET order processor, but the active one is now the WSO2 Java Application Server. And, in fact, that has a console here that’s going to display the decrypted contents of the order message, because remember we’re going over full message security with full digital certificate encryption of the message payload. So I think we’re all configured, and I’ll just turn it back over to Jonathan to go ahead and place an order here.
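The reconfiguration walked through here comes down to selecting an implementation by configuration, which works only because both back-ends honor the same contract; in this illustrative Python sketch the shared WS-* contract is mirrored by a common method signature. All names are hypothetical.

```python
# Sketch of the reconfiguration step: the middle tier picks an order
# processor by configuration. Because both back-ends honor the same
# contract, swapping .NET for the WSO2 Java service needs no code
# change in the business service. Names are illustrative only.

class DotNetOrderProcessor:
    def place(self, symbol, qty):
        return ".NET processed %d x %s" % (qty, symbol)

class Wso2JavaOrderProcessor:
    def place(self, symbol, qty):
        return "WSO2 Java processed %d x %s" % (qty, symbol)

PROCESSORS = {".net": DotNetOrderProcessor, "wso2": Wso2JavaOrderProcessor}

def business_service(config, symbol, qty):
    processor = PROCESSORS[config["order_mode"]]()  # chosen by config only
    return processor.place(symbol, qty)

config = {"order_mode": "wso2", "endpoint": "http://localhost"}
result = business_service(config, "S2", 111)
```

Flipping `order_mode` back to `".net"` reroutes the same call with no other change, which is the essence of what the connections tab in the demo is doing.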

JONATHAN MARSH: Okay. Let’s bring back up the PHP, and we can go to our quote. Here we go. I’m going to buy shares of this one, let’s buy 111 shares. Now what happened here is, the PHP front-end talked with the .NET business process layer, it went back here to the Java order processing layer, and the confirmation is coming back up actually into the smart client. We can see here that the smart client has received the order confirmation because it’s still logged into that.

GREG LEAKE: We’re logged into the same user, same session, so it flowed all the way back up. So we went PHP, .NET, Java, and it flowed all the way back up through the order alert to the WPF smart client front-end.

JONATHAN MARSH: So this is really quite impressive. What we’re really showing here is Microsoft and open source solutions working together to solve the needs of today’s diverse enterprises, and we’re going from .NET through advanced message security, and advanced Web services, to Java solutions, and to unmanaged code, like the Axis2/C project, which is driving the PHP. So the breadth of this interoperability is really quite impressive, and it’s really thanks to Microsoft’s industry leading support for the open standards around Web services that really makes this possible.

GREG LEAKE: Thank you very much for your time. (Applause.)

BOB MUGLIA: Great. Thank you, Jonathan, for joining us.


BOB MUGLIA: Thanks, Greg.

GREG LEAKE: Thank you.


BOB MUGLIA: So interoperability is one of those opportunities that we all have to help make our business systems work together. There are many other great opportunities that exist for us to incorporate technology into our business, to help drive down costs, and make our business more effective. One of the most obvious, and broadest of those technologies is virtualization, and at Microsoft we see a variety of forms of virtualization that can be incorporated into your business systems to help you manage your systems more effectively, and to drive down costs.

Now when people talk about virtualization, they are often referring to hardware or server virtualization, and there is no question that this is one of the best opportunities that exist in the marketplace to help consolidate servers, and simplify management overall. So you see great adoption, and great interest, and it’s an area where certainly Microsoft is partnering across the industry to help you incorporate that into your business, and it’s a really important piece.

Another opportunity, which is in a sense quite different, although the technology is a virtualization technology, is application virtualization. Here there’s the opportunity to separate applications from the underlying operating system, and allow those applications to be delivered much more effectively without going through a complex installation process. The costs associated with packaging and deploying applications are quite high. If applications can instead be streamed down on demand, or brought down by IT policy, they can be delivered much more effectively, and the system, whether the desktop or ultimately the server, can be managed much more effectively.

In some organizations that have a large number of desktop users, and in particular in areas like financial services, where security of information and centralizing that information within a data center is really important, desktop virtualization, or VDI, is also a great opportunity: running copies of Vista in virtual machines on a server, on a set of blades within a data center, for example, and then bringing the user experience down to dumb terminals on the desktops in corporations. That’s a great opportunity that exists as well.

And Microsoft is working together with partners in the industry, particularly Citrix, in helping to make this a very viable and cost-effective solution for people. And for many years we’ve worked with Citrix on presentation virtualization with our Terminal Services product, and now Citrix’s XenDesktop to help you take and run applications on a server, and then deliver them to end users so you don’t have to go through the installation process.

All of these are forms of virtualization, and each provides a different set of benefits; collectively they can be used to help drive down costs. So let’s start by taking a look at how one of our customers is applying Microsoft virtualization within their organization today to get some really effective cost reduction. Let’s run the video.

(Video segment.)

So I was fortunate enough yesterday morning to have breakfast with Chris, and had a chance to talk to him about his experiences with Virtual Machine Manager, but also with Hyper-V, and they’ve now deployed Hyper-V within their production systems, and they are seeing tremendous advantage with that product. They’ve seen great density and great performance, and rock solid stability. And that’s an experience that most of our customers, in fact, almost all of our customers who have begun to trial and work with Hyper-V have seen.

We’re bringing out Hyper-V as a part of Windows Server 2008. The response to Windows Server 2008 has been incredible from all of you, and the feedback has been great, and we’ve appreciated that. I think that’s because we tried to build the product you asked us to build, and with Hyper-V we’re doing exactly the same thing. What we’ve done here is build a rock-solid OS component that just works, and does so with great performance and stability. This is a truly production-ready virtualization system that is very simple and inexpensive to incorporate into your environment. It can be used for great consolidation. We’re seeing amazing performance out of Hyper-V.

And, frankly, I’m very, very pleased by how quickly we’ve been able to make this a system that performs at the same levels as the systems that have been in the industry for a long time. When Hyper-V stabilized last fall and we were able to run our first performance benchmarks on it, one of the things we did, of course, was compare it to VMware ESX, and honestly my expectation was that we were going to be pretty far shy of the standard that ESX has established. What we found was that it performed at, and in some ways above, the levels that ESX performed, particularly with things like I/O, which is an incredibly important part of so many workloads. The performance of Hyper-V is just outstanding, and the reliability is outstanding.

In comparison, I remember in the early days of Windows NT we worked for really three releases to match the performance that NetWare was able to achieve with file serving. In this case, we’re coming out with a release, right out of the gate, that is rock solid, production ready, and absolutely performs at or better than anything in the industry. So this is technology that you can really adopt within your organization.

And, of course, a major part of that is incorporating it into an overall manageability solution, and with Virtual Machine Manager and System Center, we provide the context for world-class management of virtualization.

Now, that’s true today. These are all things that are available either now or shortly. Hyper-V is in its final stages of release candidate, and will be available certainly within this summer. We’ve committed to do that by late August, and we’re going to beat that date. So things are looking very, very solid. We’re ready to go on this, and ready for you to begin applying this within your organization.

Now, as we look forward we see all sorts of ways in which virtualization will help to change the dynamic data center, and help propel dynamic IT forward. Today we see the use of virtualization for consolidation, we see the use of virtualization to drive down cost, and in some ways improve management. We think we’re really at the beginning of a multi-year roadmap to help use virtualization as a key enabling technology within your organization to change and transform the data center.

The first step is a step that so many of you are taking right now, which is server virtualization: taking the data centers you have, the physical hardware, and creating a set of operating system images that can then be run on one physical machine or another, and moved around as appropriate. That’s a key first step, and it really does open up a lot of possibilities.

Over the next few years we see the software transforming to the point where it can manage this in a much more dynamic way automatically, but there’s a set of steps that need to be done, and a set of investments that need to be made to enable this for your organizations. One of the next steps that we’re working on is to take and use application virtualization so that we can separate the applications that you’re running, the business applications, from the underlying operating system components.

The current structure is sub-optimal today. If you have a couple thousand business applications, you need to have a couple thousand operating system images that bind the applications and the OS together. And that’s a set of images that needs to be maintained, and patched, and it’s not the way we really want to do it. The better way to do it is to have the application separate from the operating system. You have a much smaller number of operating system images, maybe 10 or 20, within your organization that need to be maintained. OS plus middleware, small number of images, keep those maintained, and then you bring those together at runtime, so that they then get placed in your corporate machines.

In order to do that you can’t just do this willy-nilly. You need to have an environment that can describe the relationship of the application components with the different operating systems and the different environments. That relationship has to be defined in a concrete model that is a part of the runtime environment.
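
The runtime composition being described here can be sketched in a few lines. This is purely illustrative: the class names, the middleware labels, and the matching rule are all assumptions, not any real Microsoft API; the point is that a small set of maintained OS images can serve many separately packaged applications, with a model deciding which image satisfies which app at placement time.

```python
# Hypothetical sketch: compose an app package with a shared OS image at
# deployment time, instead of maintaining one bound image per application.
# All names here are illustrative, not a real Microsoft API.

class OsImage:
    def __init__(self, name, middleware):
        self.name = name
        self.middleware = set(middleware)

class AppPackage:
    def __init__(self, name, requires):
        self.name = name
        self.requires = set(requires)  # middleware the app depends on

def place(app, images):
    """Pick the first maintained OS image that satisfies the app's model."""
    for img in images:
        if app.requires <= img.middleware:
            return f"{app.name} on {img.name}"
    raise LookupError(f"no compatible image for {app.name}")

images = [OsImage("ws2008-web", {"iis7", ".net35"}),
          OsImage("ws2008-db", {"sql2008"})]
print(place(AppPackage("order-tracking", {"iis7"}), images))
```

With 10 or 20 images like these, a couple thousand application packages resolve against the same small, patchable set rather than each carrying its own copy of the OS.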

Application Virtualization

So Microsoft is making investments not just in hardware virtualization and server virtualization; we’re also making significant investments in application virtualization to separate the application from the operating system image, and then a very significant investment in modeling technology to allow all of the different connections and components of the business applications to be described, so that they can be managed much more effectively and then brought together at runtime. What you really want is to be able to take that model, and at runtime, at the time that it’s time to place that application, have the model define how to take the application and the OS component, place them on one or more pieces of hardware, and then manage that in a self-contained, dynamic way.

That’s the vision of where we’re going, and you’ll see that coming over the next few years. That’s what really dynamic IT is all about. There’s so much that can be done now, there are so many ways that products like System Center can be applied, together with Hyper-V, within your organization to get extreme benefits of driving down costs right now.

So let’s see a demo of how Hyper-V and System Center Virtual Machine Manager can be used today. I’d like to invite Rakesh Malhotra up to give us a demo.

Hyper-V and System Center Virtual Machine Manager Demos

Rakesh, good morning.

RAKESH MALHOTRA: Good morning, Bob.

All right. One of our goals with System Center is really to combine physical and virtual machine management in an integrated solution. What you’re seeing here is the beta of Virtual Machine Manager 2008. This is our centralized console for managing your virtual data center. In the 2007 version of the product we incorporated features that let you drive scenarios like server consolidation, rapid provisioning, and just overall agility using Virtual Server. We’ve improved on every single one of those scenarios by taking advantage of the new functionality available in Hyper-V, built right into Windows Server 2008.

In addition, our customers are telling us that they want to use a single console to manage their entire virtual infrastructure, regardless of the hypervisor they happen to use. So with VMM 2008 you can manage Virtual Server, Hyper-V, and even VMware-based environments.

So to demonstrate that last point, I’m actually going to go ahead and bring up the VMware virtual infrastructure client. This is VMware’s software for managing VMware ESX hosts and VMware-based virtual machines. What you see here is that I have a New York data center, I have a production ESX cluster with three nodes in the cluster, and several virtual machines running on it. I can get basic power information and status information directly from this console. If I swap back to the Virtual Machine Manager console and take a closer look, you’ll see that I also have this New York data center represented here. There’s that three-node ESX cluster that we saw in the virtual infrastructure client, with online VMs, but right alongside it I have a brand new cluster running Hyper-V and Windows Server 2008. Our goal is really to blend the experience together, really make it seamless, really integrate it.

BOB MUGLIA: This is an example of the interop feedback that our customers have really given us. They were very clear that many customers have VMware inside their environment, and they’re interested in Hyper-V, but they wanted to have a single management solution that really was able to combine these two, and provide a great runtime experience for both. So there’s only one console for the administrator to work with.

RAKESH MALHOTRA: Exactly. One of the new features in Hyper-V is a feature we call quick migration. It allows you to very rapidly move virtual machines between physical pieces of hardware. Let me show you how that works. I’ve selected a virtual machine running on one of my Hyper-V hosts, and I right click, select the migrate option, and confirm that I want to take this action, and the first thing Virtual Machine Manager does is help me find a new home for this VM. It looks at the requirements of the virtual machine, and it compares that against the available capacity in my data center. It crunches numbers like CPU, network, disk I/O, and memory, all that stuff together, and gives me a pretty simple-to-consume star rating system. The physical host with the highest star rating ensures that I get the best possible match of resources for my VM.
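
The placement logic Rakesh describes can be approximated in a short sketch. The weighting, the resource dimensions, and the 0-to-5 scale below are assumptions for illustration; VMM’s actual formula is not public in this transcript. The idea is simply: score each host by the headroom left after placing the VM, and pick the highest star rating.

```python
# Illustrative sketch of "intelligent placement": rank physical hosts by
# free capacity against a VM's requirements and express it as star ratings.
# The equal weighting and the 0-5 scale are assumptions, not VMM's formula.

def star_rating(host, vm):
    dims = ("cpu", "memory", "network", "disk_io")
    # Fraction of each resource left over after placing the VM (floor at 0).
    headroom = [max(0.0, (host[d] - vm[d]) / host[d]) for d in dims]
    return round(5 * sum(headroom) / len(dims), 1)

def best_host(hosts, vm):
    return max(hosts, key=lambda name: star_rating(hosts[name], vm))

hosts = {
    "hyperv01": {"cpu": 8,  "memory": 32, "network": 10, "disk_io": 500},
    "hyperv03": {"cpu": 16, "memory": 64, "network": 10, "disk_io": 900},
}
vm = {"cpu": 4, "memory": 8, "network": 1, "disk_io": 200}
print(best_host(hosts, vm))  # hyperv03 has the most headroom
```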

The next thing I’m going to do before I actually kick off the quick migration is point out that at the end of every one of the wizards in Virtual Machine Manager is the ability to generate the PowerShell script that’s the equivalent of what the wizard would do. If you don’t want to continuously click through wizards and you want to automate this, you can just cut and paste that PowerShell script. In fact, the entire user interface that I’m showing you today is built completely on top of PowerShell. So our API is your API.

BOB MUGLIA: One of the things we’re doing across the board is investing broadly in making PowerShell commands much more accessible for all of our products. Virtual Machine Manager was one of the first to really deliver on this. In this case, everything you can do in Virtual Machine Manager can be done, and is done, through PowerShell, and the user interface is actually built on top of that, as Rakesh said.

RAKESH MALHOTRA: Exactly. I’m going to go ahead and kick off this quick migration, and it will take a few seconds to complete. What you’re going to see is this virtual machine that I’ve quick migrated goes into the under migration state, and it will pop from this Hyper-V host, Hyper-V 01, down to Hyper-V 03, which was the physical machine that intelligent placement had selected for me.

BOB MUGLIA: Now, a migration like this is a user-initiated migration, which is generally done because of things like hardware maintenance, or management of upgrades to the underlying virtualization subsystem. And with quick migration what we do is actually save the VM to disk and then re-instantiate it on the other physical machine. The speed of it is really based on the size of the VM together with the speed of the underlying SAN.

RAKESH MALHOTRA: Exactly, in this case I have a virtual machine with about a gigabyte of memory, exactly a gigabyte of memory, and it took roughly 10 seconds to migrate it between physical hardware, so pretty agile.
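
Bob’s point that the duration scales with VM memory over SAN speed makes for a simple back-of-the-envelope calculation. The 100 MB/s throughput figure below is an assumed number, chosen only because it makes the arithmetic line up with the roughly 10 seconds observed in the demo for a 1 GB VM.

```python
# Back-of-the-envelope sketch: quick migration saves the VM's memory state
# to shared storage and restores it on the target host, so its duration
# scales with VM memory size over SAN throughput. 100 MB/s is assumed.

def quick_migration_seconds(vm_memory_mb, san_mb_per_s=100.0):
    return vm_memory_mb / san_mb_per_s

print(round(quick_migration_seconds(1024)))  # ~10 s, matching the demo
```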

As some of you may know, in the VMware infrastructure, in their environment, they actually support a feature called VMotion, and that allows me to migrate running virtual machines between physical hardware without any downtime at all, as perceived by the end user. You can drive that with Virtual Machine Manager 2008, as well. So just like the Hyper-V experience, I’m going to click on a VMware VM, and I’m going to choose the migrate option. Again, we’re going to get intelligent placement, looking at our infrastructure, giving us recommendations on where to move the VM. If you pay attention to this transfer pipe here, you’ll see that it’s listed as live, and live means that I’m going to employ VMotion, and users won’t notice the downtime.

I’ll confirm that selection, and just like in Hyper-V, I can output the PowerShell script. I’ll kick off the live migration with VMotion, and again you’ll see a very similar experience to what you saw in Hyper-V; this virtual machine will now move from ESX 01 down here to ESX 03. Since this is happening without any user interruption, you can do it during regular business hours, as opposed to during a scheduled maintenance window.

BOB MUGLIA: Now, live migration is a great feature, and it’s something we’ll add in the next version of Hyper-V; it’s certainly something we have up and running right now, and it will be present in the future. But one of the things we wanted to do, again, with this idea of having a single console that manages both, is recognize that this is clearly an important feature for VMware, and we wanted to make sure that all of the capabilities of VMware that users want were present in Virtual Machine Manager, so that, again, you would have one environment that really took advantage of the best features of both, and you could use it across both environments.

RAKESH MALHOTRA: Exactly, and as you see here, the VMotion is completed. And again, in just a few seconds, in a very seamless way, between Hyper-V and VMware, we were able to migrate virtual machines around in the environment.

Now, this is great, I mean, it’s a really important feature, but the real power of that in a data center is in having the system take these actions for me, on my behalf, automatically, based on the changing needs and requirements of my data center. So let’s walk over to System Center Operations Manager, where I’m managing a service that happens to be an order tracking system, and it’s a pretty typical three-tier application. I’ll go ahead and expand the different tiers of the app.

I’ve got two Web servers on the front end, an app server in the middle, and a SQL Server on the backend. The SQL Server happens to be physical, it’s got this gray icon, and the darker icons indicate that they’re virtual machines, so a mix between virtual and physical machines.

Now, Ops Manager has this deep and rich knowledge about what’s running inside my virtual machines; it knows the apps, and it also knows how the servers are connected to create a service that my end users can consume. What we’ve done is taken this deep and rich application knowledge that Ops Manager has (understanding the applications in your environment is critical before you make any type of change in configuration or resource assignment) and combined it with the agility that you get out of Virtual Machine Manager. We’ve created a feature that we call Performance and Resource Optimization, or PRO for short.

I’m going to go ahead and go back to the VMM console, and bring up Pro Tips, and Pro Tips is just advice that the system is giving me on how better to align the resource allocation to the demands of my users. Ops Manager is looking at my services, watching the applications, it’s constantly looking for ways to better optimize my environment. And when it finds one it generates a Pro Tip.

In this particular case I can see that the order tracking system that we were just looking at is getting slammed pretty hard; that’s why we had those warnings in the diagram view. And it’s prompting me to add another IIS server to the Web farm. I’m going to go ahead and implement this PRO Tip. When I click implement, behind the scenes Virtual Machine Manager runs a bunch of PowerShell automation to provision a new IIS server, using a template that it has already stored in its library, customize it, and add it to the Web farm, which should get my application back into a state of health.
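
The PRO loop just described (monitor watches a service, raises a tip when load crosses a threshold, implementing the tip provisions a new server from a template) can be sketched as follows. The data shapes, thresholds, and naming scheme are invented for illustration; the real pipeline runs through Operations Manager management packs and VMM PowerShell automation.

```python
# Hypothetical sketch of the PRO loop: a monitor watches a service tier,
# raises a "tip" when per-server load crosses a threshold, and implementing
# the tip provisions a new server (cloned from a stored template).

def generate_pro_tips(service):
    tips = []
    per_server = service["requests_per_s"] / len(service["web_servers"])
    if per_server > service["max_per_server"]:
        tips.append({"action": "add_web_server", "service": service})
    return tips

def implement(tip):
    svc = tip["service"]
    if tip["action"] == "add_web_server":
        name = f"iis{len(svc['web_servers']) + 1:02d}"  # clone from template
        svc["web_servers"].append(name)
    return svc

order_tracking = {"web_servers": ["iis01", "iis02"],
                  "requests_per_s": 900, "max_per_server": 400}
for tip in generate_pro_tips(order_tracking):
    implement(tip)
print(order_tracking["web_servers"])  # a third front end was added
```

After the tip is implemented, load per server drops below the threshold and no further tips fire, which is the "back to a state of health" outcome shown in the demo.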

BOB MUGLIA: So one of the most important things that we see is that management is not just one part of the system. Virtualization management is very important, but IT pros in the real world have to manage the physical machines, they have to manage the virtual environments, and they need to manage all of the applications that exist on top of that. With System Center we have a complete set of applications and management tools that all work together to manage your environment. In this case it really shows the power of thinking cohesively like that. If you want to move a VM from one machine to another based on conditions, the virtualization system can only know so much. It can’t know what’s really going on within the applications, for example how many transactions are being processed. Well, the application can know that, and it’s straightforward to write an Operations Manager management pack that determines that. With PRO Tips what we’ve done is a natural extension of the Operations Manager management packs, to allow the industry overall, and yourselves, to configure things that can drive Virtual Machine Manager based on what’s actually happening within the applications.

RAKESH MALHOTRA: And a great example of that is that we’re working with HP in this particular case to generate PRO Tips about power consumption. In this case the PRO Tip prompted me to migrate a virtual machine out of an HP enclosure to get me back into compliance with my power threshold policy.

The last thing I’ll point out about PRO is that you can also have the system auto-implement a subset of the tips, based on policies that you can define. So, in effect, the system becomes self-healing, and self-correcting.
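
The auto-implement behavior amounts to a policy filter over the generated tips. The allow-list representation below is an assumption for illustration; the point is that only administrator-approved action types are applied without confirmation, which is what makes the system self-correcting.

```python
# Sketch of the auto-implement policy: only tips whose action appears in an
# administrator-defined allow list are applied without confirmation; the
# rest wait for a human. The policy format here is invented.

AUTO_IMPLEMENT = {"add_web_server"}  # assumed policy: safe to self-heal

def route(tips):
    auto = [t for t in tips if t["action"] in AUTO_IMPLEMENT]
    manual = [t for t in tips if t["action"] not in AUTO_IMPLEMENT]
    return auto, manual

tips = [{"action": "add_web_server"}, {"action": "migrate_vm"}]
auto, manual = route(tips)
print(len(auto), len(manual))  # 1 1
```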

BOB MUGLIA: We’re seeing customers do some pretty amazing things with Pro Tips, in terms of automating their environment, and one of the key things, of course, is this is just a feature of System Center. It’s not an extra-cost item. It’s something that’s just built in, and it allows you, as well as our partners across the industry to customize it to meet your needs.

RAKESH MALHOTRA: Absolutely, so let’s go back to our Ops Manager console, and blow this up into a full window, and see if we’re back into compliance. We see the warnings have now gone away, and I’ve got three Web servers on the front end, rather than the two that I had previously. There’s my app server and my database. So with System Center you’re getting this deep, knowledge-driven, dynamic IT, and you can manage physical and virtual together.

I encourage you all to download the beta of Virtual Machine Manager 2008, and send us your feedback.

BOB MUGLIA: Great. Thanks so much, Rakesh, I appreciate it. (Applause.)

This is a real-world application of great technology being able to help drive down costs and allow you to apply it in your business. And one of the great things, of course, is that the combination of Hyper-V together with System Center is really a fraction of the cost of what you’re paying today for virtualization, and the overall solution is much more complete, much more integrated, and easier to set up. So we’re pretty excited about how this can be transformative within your organization.

Let me talk a little bit about application virtualization now, about the challenges you face in rolling out applications to the desktops in your environment. There are a lot of situations where you have older applications that are not compatible with new operating systems and new browsers, and where you really want to be able to roll those out quickly. One of the key technologies, which I talked a little bit about earlier, is application streaming and application virtualization. SoftGrid is really the premier technology across the industry; it’s a company we acquired a few years ago, and it’s now been incorporated into our Desktop Optimization Pack, which is available now.

And this summer we have an update of that coming, version 4.5, which will really enable enterprises to deploy applications much, much more effectively. The cost of creating a streamed application package is much less than the cost of creating a traditional install package, and the flexibility of deployment that this provides, the ability to separate the application from the operating system, is key. So we’re really excited about what that will do to help you drive down your costs of deploying desktops, because we know that application deployment is one of the big obstacles you face.

Now, compatibility issues with applications are also a challenge, as you have older applications that you need to bring forward into the future. There are a number of ways companies deal with this today; sometimes it’s Terminal Server and running the application on the server, and sometimes that solves the problem for them. In other cases, people want the application running on the desktop, or even on a portable in a disconnected way, and the browser out there is incompatible with the business app. Well, virtualization is a technology that can help solve this by running a virtual machine with Virtual PC on the desktop. So you have your Vista desktop, as an example, and XP running inside a virtual machine, with seamless window connectivity, so the user gets what appears to be a seamless experience, but in fact you’re running the application in a separate environment.

That’s the technology that we acquired recently with Kidaro, and we’ll be bringing that into MDOP early next year. So with that what I would like to do is invite Jameel Khalfan up to show us a demo of Kidaro. It’s pretty cool stuff.


BOB MUGLIA: Good morning.

JAMEEL KHALFAN: Good morning. We’ve acquired Kidaro, and they’re all about desktop virtualization. We’re calling this product Microsoft Enterprise Desktop Virtualization, and it will allow IT administrators to manage and deploy Virtual PCs out to their end users’ desktops.

To start off, I want to show you the administration console and give you an idea of what you can do. We have an area where you can monitor the health of all your virtual machines. You can run reports. We can look at all the images that you have. And over here you can set policies for your different user groups. I want to draw your attention to the middle, in the clipboard area. Here we can define copying and pasting between the Virtual PC and the host machine. In the example I’m going to show you, an end user is doing some order entry; you might not want them to copy information from the application on the Virtual PC back up to their host system, but it’s probably okay for them to copy information into their virtual machine. So we’ve reflected that in these settings. You can also do the same for file transfer.
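
The asymmetric clipboard rule being set up here boils down to a direction-aware policy check. The policy representation below is invented for illustration; the actual product stores this per user group in the administration console.

```python
# Sketch of the asymmetric clipboard policy described above: paste into the
# virtual machine is allowed, paste from the VM back to the host is blocked.
# The policy table format is invented for illustration.

POLICY = {("host", "vm"): True, ("vm", "host"): False}

def can_paste(src, dst):
    # Default-deny: directions not listed in the policy are blocked.
    return POLICY.get((src, dst), False)

print(can_paste("host", "vm"))  # True
print(can_paste("vm", "host"))  # False
```

The same shape of check applies to file transfer, which the console configures alongside the clipboard settings.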

The next one I want to show you is the applications area. You might want to have an easy way for your end users to launch those applications that are on your virtual machine. Here you’re able to publish those applications from the virtual machine into the start menu of your host system.

BOB MUGLIA: So in this case you might have a business application which is not compatible with Vista, and you want to run that in a virtual machine, and we can put some administrator-defined policies on how information can be shared back and forth, so there are some good features there. Or you might have something that runs in an older version of the browser and won’t run in IE7, and again you can run that older browser in the XP virtual machine.

JAMEEL KHALFAN: Exactly, that’s right. The next tab I want to show you is, in fact, the Web tab. This is where you’re able to redirect certain Web sites, using either a white list or a black list, to use a Web browser on the virtual machine. So, for example, here I’ve entered the MS TechEd Web site as one of the sites we’re going to redirect to IE6 on our virtual machine. Now let me show you the end user experience. Some of you may have applications that are incompatible with certain OSes; they only run on one particular OS. I’m going to go here to my start menu and launch an application that will only run on XP. And there you go. You don’t see any of the UI from Virtual PC; all you see is the application, and you’re able to use it here just like a normal application.
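
The Web-tab rule is essentially a URL dispatch decision: sites on the list open in the VM’s IE6, everything else stays in the host’s IE7. The sketch below assumes a simple hostname white list; the entry shown is illustrative, standing in for whatever the administrator configures.

```python
# Sketch of the Web redirection rule: white-listed sites open in the VM's
# IE6; everything else stays in the host's IE7. Hostname matching and the
# list entry are assumptions for illustration.

from urllib.parse import urlparse

REDIRECT_TO_VM = {"www.msteched.com"}  # illustrative white-list entry

def browser_for(url):
    host = urlparse(url).hostname
    if host in REDIRECT_TO_VM:
        return "IE6 (XP virtual machine)"
    return "IE7 (host)"

print(browser_for("http://www.msteched.com/sessions"))
print(browser_for("http://www.example.com/"))
```

A black-list configuration would simply invert the membership test.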

BOB MUGLIA: So let’s be sure the context is understood. What we’ve got is a Vista desktop that’s running Virtual PC on top of it. There’s an XP virtual machine with the application actually running in it, and the Kidaro technology forwards all the user experience over to the Vista desktop. So your users really have no idea they’re running virtualization. It’s something that you can set up and configure to provide the application compatibility you need to deploy your applications.

JAMEEL KHALFAN: Exactly. And what we have done here is added the ability to add a red window around the application. So if you want to distinguish a virtual versus a regular application, you’re able to do that.

Now, I want to highlight the copy and paste functionality, and copy some important information to the virtual machine here. You can see I’m able to paste it into the application in the virtual machine. But if I try to take information off the virtual machine and paste it onto my host system, I get a popup saying that, according to the policy, this is not allowed. So that’s how we’ve added the ability to use applications that only run on one particular OS, or applications that you want to run on your virtual machine.

Now I want to show you the Web redirection feature. So I open up Internet Explorer, go to my favorite search engine, and then I decide I want to go over to a Web site over here.

BOB MUGLIA: I’m sure all of you use it all the time.

JAMEEL KHALFAN: All the time. So I open up the MS TechEd site, and it launches in an IE6 window. So over here you can see, just like that, you’re able to use only those Web sites you specified in this IE6 window.

BOB MUGLIA: And, again, that IE6 browser is running in the XP virtual machine, but the user has no idea about that.

JAMEEL KHALFAN: Exactly. So now I’m using my Web site over here. Let’s say I want to go to another Web site; well, that wasn’t on the list of Web sites that we can use in IE6, so you can see it goes back and opens up in IE7. So we’ve shown you how enterprise desktop virtualization can help you with those application compatibility challenges as well as Web site compatibility challenges, and, as Bob mentioned before, I think the best part here is that this will be included in the Desktop Optimization Pack when it’s released next year.

So for more information feel free to stop by our booth afterwards, and I hope to see all of you there.

BOB MUGLIA: Great. Thanks, Jameel, I appreciate it. (Applause.)

So, we know that deployment of applications is one of your biggest costs, and we’re doing a number of things to work together with the industry to drive those costs down, and make it a lot simpler for you, while having a great experience for your end users.

So, one of the opportunities that we see coming is the ability to use services to help offload a set of functions that today you run on-premises within your organization.

Let me be clear: Services are not necessarily the right choice for everybody, but we do think that offering a set of services capabilities is a choice that should be available to you.

So, Microsoft is building a set of services. We’re starting with Exchange, SharePoint, and Live Meeting, a new set of services that we’re making available to customers who wish to run these things in a services way; so the ability to take and offload that work from you, freeing up some of your time to work on other things, such as business intelligence or other areas where you can help to drive business value for your company.

Now, again this is about the power of choice. We’ll still drive forward and build the best versions of Exchange and SharePoint, and make those available for you to deploy within your organization. We’ll take the learnings we have from our services work and apply them to our on-premises versions, so you’ll get all the benefit of that if you want to work with on-premises software. But you also have the choice to have some of your users consume these as services: say by geography, maybe your branch office users using hosted Exchange as part of their daily business; or by workload, using some features like e-mail but not others; or by the role of people within your organization, having a subset of your people use the service. Again, it’s about providing that power of choice, the combination of the best on-premises software together with the best service offerings. Those things provide you with unparalleled capability in choosing how to drive your environment forward.

We’re pretty excited about the way this technology can be applied to help drive down your costs, and again to free up some time.

One of the key challenges that we know you’ll face is how you connect your service providers with your on-premises software.

I talked earlier about identity and how important it is, when you have one or more service providers, to connect the identity systems you have today with those that are managed by the service providers. And of course, there are the issues of migrating data.

So, with that, what I’d like to do is invite David Chow up to give us a demo of Microsoft Online Services. David, good morning.

DAVID CHOW: Good morning, Bob.

BOB MUGLIA: Great, good to see you.

DAVID CHOW: Microsoft Online Services is a set of enterprise applications that Microsoft hosts that we deliver to you as a subscription service. As Bob said, the initial set of services includes Exchange Online, SharePoint Online and Live Meeting.

Underlying all these services is a fundamental belief that it is about a choice, a choice for an IT pro to decide which applications they want to continue to run on-premise and which of those they want to run online.

The example I’ll give you today is Exchange Online. In this example I’ll decide that a certain percentage of my users will continue to run in the on-premise environment, on Exchange Servers that I host. On the other hand, I want to move a certain set of users, say, my branch offices, to the online environment.

To do that, the key component is making these two environments connect with each other. This is what we call coexistence.

The first thing we need to do is a thing called directory synchronization, followed by mailbox content migration.

Directory synchronization is a tool that we provide that allows you to keep your AD environment in sync with your online environment. Once those are in sync, we’ll be able to interoperate between those environments.
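
One-way directory synchronization of this kind can be sketched as a simple reconciliation: the on-premises directory is the source of truth, and adds, deletes, and attribute changes are pushed to the online copy. The dictionary format and the user records below are invented for illustration; the real tool works against Active Directory.

```python
# Minimal sketch of one-way directory sync: the on-premises directory (ad)
# is authoritative, and the online directory is reconciled against it.
# Data shapes here are invented for illustration.

def sync(ad, online):
    for user, attrs in ad.items():
        online[user] = dict(attrs)   # create new users, update attributes
    for user in list(online):
        if user not in ad:
            del online[user]         # user was removed on-premises
    return online

ad = {"alice": {"office": "Orlando"}, "bob": {"office": "Redmond"}}
online = {"bob": {"office": "Seattle"}, "carol": {"office": "Chicago"}}
print(sync(ad, online))
```

Run on a schedule, a reconciliation like this keeps the two directories converged no matter which kind of change (create, delete, or attribute update) happened on-premises.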

So, let me go through and show you the tools that allow you to do this.

The process is very simple. All we have to do is first log in with the online credentials, followed by your Active Directory logon credentials. Once we have joined those two areas, we’re going to be able to trigger a process that synchronizes all your Active Directory changes into the online environment. This could be changes such as creating a new user, deleting a user, or changing attributes of any user.

Now, let me go back to the admin environment where I can show you how these users are being managed.

First I’ll go to a list of users, where what you’ll see is the set of users that we have online. As you can see from this green button, these accounts are now synchronized with your Active Directory environment. Any changes you make to Active Directory are automatically shared. You are in control of all the data.

On the other side of this is a list of what we call disabled accounts. These are accounts that I’ve decided to stay within my on-premise environment. Depending on my time or my plans, I can select to activate any of these users any time that I want by simply clicking this one button.

BOB MUGLIA: So, one of the key things we know is that as you move to an online service, keeping your directories in synch on an ongoing basis is clearly critical, as is having a migration process, because some of your users will remain on your Exchange systems on-premises while others run inside the service.

DAVID CHOW: That’s correct.

Once you activate the users, the next thing is figuring out whether you want to actually migrate the data. So, let me take you to the migration tools we have.

In the migration tool we have the ability to look into all of your Active Directory as well as all the mailboxes on your existing e-mail systems, and check the list of all those users. By selecting any of the users, you can decide whether to migrate their data over by simply clicking Migrate Mailbox. A few clicks away, we can decide whether we just want to forward e-mail or actually copy the entire contents, select the type of content we want to copy, whether that is just e-mail or all the other content, and the date range we want to copy. In the final step, you can trigger this process, which migrates all the data into the online environment.
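The migration choices described above, forward versus full copy, which content types, and a date range, amount to a simple filter over mailbox items. This is a hedged Python sketch with a hypothetical item format; the real tool operates directly on Exchange mailboxes.

```python
from datetime import date

# Illustrative sketch of the migration scope the tool exposes:
# which content types to copy, and over which date range.

def select_items(mailbox, content_types, start, end):
    """Pick the mailbox items that fall inside the migration scope."""
    return [item for item in mailbox
            if item["type"] in content_types
            and start <= item["date"] <= end]

mailbox = [
    {"type": "email",    "date": date(2008, 1, 15), "subject": "Q1 plan"},
    {"type": "calendar", "date": date(2008, 3, 2),  "subject": "Review"},
    {"type": "email",    "date": date(2006, 7, 9),  "subject": "Old thread"},
]

# Copy only e-mail from 2008.
to_copy = select_items(mailbox, {"email"}, date(2008, 1, 1), date(2008, 12, 31))
print([item["subject"] for item in to_copy])  # ['Q1 plan']
```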

So, let me go back to an Outlook view of this account. As you can see, all the data has now been migrated into this Outlook mailbox, including e-mails, calendars, and tasks. And if I want to create a new e-mail, all I have to do is hit the To button. As you can see, here’s a directory of not just the online users but all the users on your existing system as well.

BOB MUGLIA: So, this is a key thing. From your user’s experience it’s the full Exchange, Outlook environment, everything is there, it looks exactly the same to them, but the actual ongoing running of the messaging system of Exchange is happening inside a Microsoft datacenter, and we’re providing you with high availability and a whole broad set of capabilities there.

DAVID CHOW: That’s right.

So, to wrap this up, Microsoft Online Services is about giving IT pros yet another way to deliver the latest and greatest features to end users, but do it in a way where it’s easy to use, simple, and with you in control. Thank you.

BOB MUGLIA: Great, thanks a lot, David, appreciate it. It’s great. (Applause.)

Online services are just a choice that we want to offer to you as you think about running your business. We think that over time businesses of all sizes will begin to consider this as a part of their ongoing IT operation, and Exchange and SharePoint are really just the beginning of the set of services we’ll provide. But again, we’ll continue to offer great products that you can incorporate into your environment today and into the future as well. We’ll do that in parallel.

So, now let’s turn and talk about managing data inside the database, which is, of course, a critical part of your organization, and SQL Server and running sets of databases is key.

We’ve seen great adoption of SQL Server 2005 over the last two years, and we’re on the verge of having the next version of SQL Server out, and we see a lot of great opportunities to take and simplify the administration and provide a whole broad set of new capabilities for your development organization to take advantage of as they deploy these new features in SQL Server 2008.

So, the database is a key part of the virtualized datacenter of the future, and so we’re thinking really hard about how SQL Server can be simplified and easier to install. With 2008 we’ve added a whole broad set of features to the product to make it easy for you to deploy it within your organization, easy for you to manage it on an ongoing basis, and to get some great cost-savings advantage as you run your ongoing operation.

In fact, that focus on cost savings, reducing the amount of data storage, simplifying the way you administer has been a key part of the design process for SQL Server 2008.

What I’d like to do to start is show you how customers are using SQL Server, and incorporating that into their environments. So, let’s run the San Diego Zoo video.

(Video segment.)

What a great place to work, huh, the San Diego Zoo, all the great things with all those animals there, and it’s great to see how IT technology can help to improve the experience of visitors to that zoo, and help them to understand their customers better, and it’s just one of so many examples of how IT pro heroes within organizations can really make a difference for their organization and for their customers.

SQL Server is a big part of how the San Diego Zoo is applying technology, and it’s great to see how they’re using 2008 today as a part of the beta process.

We are very close to shipping this product. It’s in great shape. Many, many customers are running it; there are thousands and thousands of SQL Servers in production. Microsoft runs our entire SAP system on SQL Server 2008, and has for many months. This product is also incredibly solid, and I’m pleased to announce today that it has gone to the RC process, and that software is available to all of you. It’s in our booth, and you can go pick it up if you want to start working with SQL Server as it nears final release.

So, with that, what I’d like to do is invite Val Fontama up to show us some of the administrative features that are included in SQL Server 2008. Val, good morning.

VAL FONTAMA: Good morning, Bob. Thank you very much.

Good morning, everyone. There are so many new and exciting features in SQL Server 2008. Today, I want to show you two of them that demonstrate the power of this new release, first data compression that helps you reduce your storage costs while improving performance, and second, policy-based management that dramatically simplifies database management for you. And the best thing about it is you get both of these features without modifying your existing applications.

Let’s start with data compression. As your data grows, the new data compression feature will help you shrink your data, and therefore reduce your storage costs, and improve performance.

I’m going to demonstrate by running this simple query here on these two tables. I’ll show you the results before and after compression, just so you see how it works.

Okay, let me go ahead and run the first query. This is before compression. Next I’ll compress those two tables using page compression. You can actually compress the whole database if it makes sense for you, but in the interest of time let’s just do a table compression. So, we’ve now compressed the table, and now I’ll go ahead and run the same query. It’s exactly the same query, but this time I’ll run it on the two compressed tables. So, we’re done.

Let’s switch to profiler, and see what I just did.

So, here are the results before compression, and those are the results after compression. You’ll notice that compression reduced the number of page reads almost fourfold, from 454 down to only 118 page reads. That’s because compression in SQL Server reduces the number of pages we hold on disk, so it places less demand on your whole system. And overall, notice that the total elapsed time also drops by more than 50 percent, in fact in this case about 75 percent, from 188 down to only 46 milliseconds.
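The figures quoted in the demo check out arithmetically; a quick calculation confirms the ratios:

```python
# Checking the numbers quoted in the demo: page reads drop from 454
# to 118 (almost fourfold), and elapsed time from 188 ms to 46 ms
# (roughly a 75 percent reduction).

reads_before, reads_after = 454, 118
time_before_ms, time_after_ms = 188, 46

read_ratio = reads_before / reads_after          # about 3.85x
time_reduction = 1 - time_after_ms / time_before_ms  # about 0.76

print(f"page-read ratio: {read_ratio:.2f}x")
print(f"elapsed-time reduction: {time_reduction:.0%}")
```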

BOB MUGLIA: So, this is an example of a feature within SQL Server 2008 that you can bring into your existing environments. Just upgrade the server, it’s a very straightforward process, and then compress your database environments. What you get is better performance and lower disk storage, so you can reduce your SAN costs, your disk storage costs, and at the same time get the benefit of better performance because there are fewer pages to bring into memory.

VAL FONTAMA: Absolutely, Bob.

And, in fact, we also have further results, validated by ISV benchmarks from Comstar, Siemens, and Microsoft Dynamics. Those benchmarks show storage savings of up to 80 percent using the new compression feature in SQL Server 2008.

Next let’s talk about database management, because SQL Server 2008 now ships with new policy-based management that dramatically simplifies database management for you. Through policies you can now run all your SQL Servers consistently, keep them in compliance with corporate regulations, and automate a lot of routine administrative tasks.

Let’s see what those policies look like in SQL Server 2008.

So, under the management node we now have a set of best practice policies that ship out of the box with the product. In fact, this is only a subset; there’s more in the product, as I’ll show in a minute.

Let’s click on the auto-shrink policy to see more details. This policy is actually a simple performance best practice that spans all your online user databases, and for each one it checks to make sure that you’ve disabled the property called auto-shrink, the reason being that enabling that property has performance implications.

You can run this policy on-demand, as I’m doing here, or on a schedule. So, you can set it to run once every day or once every month.
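The auto-shrink check can be sketched as a generic policy evaluation: a policy pairs a target condition (which databases it applies to) with an expected property value. This is illustrative Python with hypothetical objects, not the real facet-based engine, which evaluates properties of live SQL Server objects.

```python
# Illustrative sketch of policy-based management: evaluate a policy
# against a set of databases and report the violators.

def evaluate(policy, databases):
    """Return the names of databases that violate the policy."""
    return [db["name"] for db in databases
            if policy["applies_to"](db)
            and db.get(policy["property"]) != policy["expected"]]

auto_shrink_policy = {
    "property": "auto_shrink",
    "expected": False,              # auto-shrink should be disabled
    "applies_to": lambda db: db["online"] and db["user_db"],
}

databases = [
    {"name": "Sales",  "online": True,  "user_db": True,  "auto_shrink": False},
    {"name": "HR",     "online": True,  "user_db": True,  "auto_shrink": True},
    {"name": "master", "online": True,  "user_db": False, "auto_shrink": True},
]

# 'master' is skipped (not a user database); 'HR' fails the check.
print(evaluate(auto_shrink_policy, databases))  # ['HR']
```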

The other thing is that you can run all your policies inside Management Studio, as I’m doing now, but if you prefer to manage your servers from black screens, you can equally do so from the comfort of PowerShell, thanks to the deep integration we now have between PowerShell and SQL Server 2008.

BOB MUGLIA: I said earlier that across the board Microsoft is using PowerShell as a great command-based management tool, and incorporating it in our products. With SQL Server 2008 we’ve built a broad set of cmdlets that allow you to control the entire server environment, so basically anything you can do from an administrative perspective, you can do with PowerShell.

VAL FONTAMA: Correct, Bob.

Let’s switch back to Management Studio.

Now let me show you how easily you can apply a policy across multiple SQL Servers in your organization. To do that, I simply click on my server group there, choose Evaluate Policy, and first load my policy. Here is the complete list of the performance and security best practices that ship out of the box with SQL Server 2008.

I’ll go ahead and select the auto-shrink policy we saw earlier, and now I’ll apply it to all my servers by simply clicking on that button.

So, notice that most of the servers pass this policy, but one fails. I wonder why. If I click View, we see that the reason it failed is that its auto-shrink property was set to true, which is bad for performance, and so it violates the policy.

Now let me show you how easily we can fix this problem now in SQL Server 2008.

With one click of a button, the policy-based management has now brought back all my servers into compliance. So, this really demonstrates how the policy-based management simplifies database management for you, makes it so easy to bring your servers into compliance, and to run all of them consistently.
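The "one click" remediation can be sketched the same way: check the property on every server in the group, and set any violator back to the expected value. As before, the structures here are hypothetical Python stand-ins for the real server objects.

```python
# Illustrative sketch of one-click remediation across a server group:
# fix every server whose property violates the policy.

def bring_into_compliance(servers, prop, expected):
    """Reset the violating property on each server; return the names fixed."""
    fixed = []
    for server in servers:
        if server[prop] != expected:
            server[prop] = expected
            fixed.append(server["name"])
    return fixed

group = [
    {"name": "SQL01", "auto_shrink": False},
    {"name": "SQL02", "auto_shrink": True},   # the failing server
    {"name": "SQL03", "auto_shrink": False},
]

print(bring_into_compliance(group, "auto_shrink", False))  # ['SQL02']
# After remediation, every server in the group is compliant again.
```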

BOB MUGLIA: Policy management is certainly a critical thing for a database and many enterprise systems, and so building this in, and building this broad set of policies into SQL Server, is really there to simplify the lives of IT administrators.

VAL FONTAMA: Absolutely, Bob.

So, I’ve just shown you very quickly two features that demonstrate the power of SQL Server 2008: first, data compression, which helps you reduce your storage and backup costs while improving performance; and second, the new policy-based management, which really simplifies database management for you. And the beauty is you get both of these features without modifying any of your applications.

Thanks very much.

BOB MUGLIA: Great, thanks a lot, Val, appreciate it.

VAL FONTAMA: Thanks, Bob. (Applause.)

BOB MUGLIA: Now, we only had time to show a couple of the IT administrative features of SQL Server 2008. There are so many more, and there’s a broad set of features for developers as well: support for spatial data, great new enhancements to the BI subsystem, and great new enhancements in the reporting environment, Reporting Services. So, SQL Server 2008 is a really major upgrade. As I say, we’re now in the final development process, with our release candidate out the door. The experience of our customers that have been using this product has been really tremendous.

So, this is sort of the third of the trifecta of Windows Server, Visual Studio, and SQL Server that is becoming available. We’re pretty excited to see all of that come together, and we’re pretty excited about all of the interactions we’ve had with customers on this.

So, one of the ways we have an ongoing interaction with customers is the conversations we have and the information we publish on TechNet. TechNet is an important resource for IT pros to get information, and it’s an incredibly widely used Web site at Microsoft. We’ve done lots of work to continue to improve it, and at the same time we’re improving our own cost of running it. TechNet is all running on Hyper-V right now. TechNet and MSDN are fully virtualized, and we’re in the process of going through and using Hyper-V to virtualize more of our infrastructure. So, one of the world’s leading Web sites is running virtualized today.

But in the process we’re also doing things to improve the experience of TechNet and changing it from just a publishing-based Web site where you go to get information to a Web site where you can connect and work with the community that’s out there. We’ve seen thousands and thousands of IT pros, heroes across the world connecting and having ongoing conversations.

On the screen they’re showing a ticker, which is a new feature that’s launching right now. It shows ongoing conversations that are happening on TechNet. You can click on this on the Web site in Communities, and whatever is going on right now, you can click on it and go straight to that active community.

We’re also adding all sorts of features to bookmark and let you specify what you really care about within TechNet, and to help bring articles and other information into the community to be shared with others.

Connecting and working together, that’s really what it’s all about: helping you, the IT pros, the heroes of our industry, drive your business forward.

This week is the continuation of the celebration that we’ve been having of you, the celebration of all of the great work that you do to help drive your businesses forward. We have a role in it, but it’s really all about you. This is going to be a great week for all of you. Thank you very much for coming. I’m really glad to see you today. Thanks a lot. (Applause.)

