Microsoft’s Vision to Help IT Customers Reduce Complexity and Deliver Greater Business Value

ORLANDO, Fla. — June 7, 2005 — This week at Tech·Ed 2005, Microsoft’s largest annual conference for IT professionals and developers, Microsoft outlined key company investments to help organizations streamline IT and build new systems that boost competitive advantage. At the event, Bob Muglia, senior vice president for the Windows Server Division, presented his long-term vision for the Windows Server platform. To learn more about Microsoft’s future strategy for its server operating system, PressPass spoke with Muglia.

PressPass: Can you put the Windows Server strategy into context?

Muglia: One fundamental problem we’re looking to help customers solve has to do with the effectiveness of IT departments and the cost structures they face. A number of surveys show that about 70 percent of IT costs go into maintaining existing systems. That leaves only 30 percent of IT budgets for new solutions that add value to the business. Our goal, very simply, is to reduce the cost of maintaining existing systems, so that a larger share of the budget can go into building new ones.

PressPass: Please describe the Windows Server vision.

Muglia: The overall vision consists of five areas where we can make a fundamental difference over the next five to 10 years in customers’ infrastructure, in the partner ecosystem and in the applications that are built on Windows Server. The five elements are focused on making sure we’re building the right products to help IT administrators be more efficient and deliver value to their businesses. The focus areas are the .NET distributed applications platform, intelligent distributed storage, integrating the edge, the Dynamic Systems Initiative (DSI) and, finally, what we call “the right server for the right job.”

PressPass: What do you mean by the right server for the right job?

Muglia: “The right server for the right job” is job one in the sense that it covers the basics — focusing on workloads and simplifying administration and management. Customers run many different kinds of workloads in the Windows Server environment, including networking, security, terminal services, business applications, databases, Web servers and e-mail servers. We track about 20 of these workloads — in some areas we are very strong, and in others we have room for improvement. We look at each workload individually to determine whether we are meeting our customers’ needs in that specific area.

Our goal is to be best of breed across every workload and to be consistent in the way we offer each workload to customers. For example, branch offices are an area of increased customer focus, and one where Microsoft is investing across key workloads to offer cost-effective solutions. We have made several announcements today related to our branch-office strategy.

The Windows Server System is a key part of “the right server for the right job” and integrates key functionality — such as identity management — across all the workloads to give administrators a consistent approach to each one. In terms of simplifying administration and management, we have made great progress but still have much to do. For example, we are changing the command-line environment in Windows with a new object-oriented command-line technology, code-named “Monad,” that will surpass what Linux and UNIX have delivered over many years. It will take three to five years to fully develop and deliver. We’re also building a next-generation user interface, taking our existing Microsoft Management Console (MMC) technology to the next level in terms of usability.
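As a rough illustration of what an object-oriented pipeline means (sketched here in Python with made-up process records — this is a conceptual analogy, not Monad syntax), each stage passes structured objects instead of text, so downstream stages can filter and sort on named properties:

```python
from dataclasses import dataclass

# Hypothetical process records standing in for real system objects.
@dataclass
class Process:
    name: str
    handles: int

processes = [
    Process("notepad", 48),
    Process("svchost", 612),
    Process("sqlservr", 1204),
]

# Conceptually like a shell pipeline: list processes, keep the busy
# ones, sort by handle count. No text parsing anywhere.
busy = sorted((p for p in processes if p.handles > 400),
              key=lambda p: p.handles)

for p in busy:
    print(f"{p.name}: {p.handles} handles")
```

The point is that the shell, not the user, understands the structure of the data flowing between commands.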

Small Business Server 2003 (SBS) is a great example of a “right server for the right job.” SBS fully integrates our knowledge about the small-business customer and offers a complete solution that is very easy to deploy and use. We see SBS as a great model that can be applied in many other scenarios.

If we execute on the plans we have now, and continue to provide the best total cost of ownership (TCO), I believe we will gain market share in spaces where UNIX and Linux have traditionally enjoyed success.

PressPass: Tell us more about Windows Server as a platform for .NET distributed applications.

Muglia: The world has moved from standalone, Web-based applications served by traditional application servers to connected systems that are distributed within an organization and, often, with business partners. Web services have emerged as a standard way to support these distributed applications. Microsoft has invested heavily in Web services, working with industry partners to build a common set of specifications, often referred to as WS-*. This is important because customers want heterogeneous interoperability with Web-service applications, and Microsoft is leading the way in supporting heterogeneous Web services.
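As a rough sketch of what a Web-service call looks like on the wire, a client posts an XML envelope over HTTP and the service replies in kind. The host, service path, operation and order number below are invented for illustration:

```python
import http.client

# A minimal SOAP 1.1 envelope; the service and operation are hypothetical.
SOAP_BODY = """<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetOrderStatus xmlns="http://example.com/orders">
      <OrderId>12345</OrderId>
    </GetOrderStatus>
  </soap:Body>
</soap:Envelope>"""

# This will only succeed against a real service; the endpoint is a placeholder.
conn = http.client.HTTPConnection("orders.example.com")
conn.request("POST", "/OrderService.asmx", body=SOAP_BODY, headers={
    "Content-Type": "text/xml; charset=utf-8",
    "SOAPAction": '"http://example.com/orders/GetOrderStatus"',
})
print(conn.getresponse().status)
```

Because the envelope is plain XML over HTTP, any platform that speaks the same specifications can consume or expose the service, which is the heterogeneous interoperability customers are asking for.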

Looking ahead, “Indigo” will make building these Web-services applications an order of magnitude simpler — especially in combination with Visual Studio 2005. “Indigo” is the code name for a secure, reliable and transacted messaging infrastructure for building and running connected systems. It will be integrated into Windows Server in the “Longhorn” timeframe. We’re very confident that as Web-services adoption grows, our Web-services strategy with Microsoft .NET will be unambiguously the best and easiest place to build these business applications.
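One way to see the value of platform-level reliable messaging is to look at the retry boilerplate applications otherwise write for themselves. A minimal sketch, where `send` is a stand-in for any unreliable transport:

```python
import time

def send_reliably(send, message, retries=5, base_delay=0.5):
    """Resend until the receiver acknowledges, backing off between attempts.

    Infrastructure of the kind "Indigo" describes aims to take exactly
    this burden off the application code.
    """
    for attempt in range(retries):
        if send(message):                  # send() returns True on acknowledgment
            return True
        time.sleep(base_delay * (2 ** attempt))  # exponential backoff
    return False
```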

We have taken a fundamentally different approach by integrating Web services into Windows Server, so that developers and customers can count on the services being there and don’t have to deal with the complexity of integrating disparate technologies. We think that’s critical because these services are fundamental and something every application requires. Customers can reduce the cost of developing new business applications by choosing the Microsoft platform.

PressPass: What are you doing to help customers better manage data-storage requirements?

Muglia: The exponential growth of stored data in the enterprise, on portable computers and even on mobile devices without hard disks is rapidly changing the way people work. It’s also presenting real challenges for IT. Delivering large amounts of data to remote workers is difficult, so we have to change the way information gets moved around the organization.

One solution we’re working on is the ability to send information over a wide area network in a way that is smart about conserving bandwidth and absorbing network latency. This fundamentally shifts us to a model where most applications work with data that is accessed locally at the remote site but is a cache or replica of data stored at a central location. This means useful work can be done quickly in the remote office, network usage is minimized through smart compression techniques, and the data is protected at the centralized storage location in the data center.
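A minimal sketch of that caching model, with a plain dictionary standing in for data-center storage and zlib standing in for smarter wire compression:

```python
import zlib

class BranchReplica:
    """Read-through cache: reads are served locally; only misses cross
    the WAN, and only in compressed form. Writes land in central storage
    too, so the data stays protected in the data center."""

    def __init__(self, central_store: dict):
        self.central = central_store   # stand-in for the data center
        self.local = {}                # cache/replica held at the branch

    def read(self, key: str) -> bytes:
        if key not in self.local:
            wire = zlib.compress(self.central[key])  # shrink before the WAN hop
            self.local[key] = zlib.decompress(wire)
        return self.local[key]

    def write(self, key: str, data: bytes) -> None:
        self.local[key] = data
        self.central[key] = data       # protected at the central location
```

Once the replica is warm, routine work in the branch never waits on the WAN at all.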

The other side of this is what has happened in the central storage space, where we’re seeing a trend toward disk-based data protection. For the first time, it is cheaper to back up data to disk than to tape — that is a big shift that will have far-reaching consequences in the backup and recovery space. This year, we’re introducing Microsoft System Center Data Protection Manager (DPM), built from the ground up to optimize disk-based backup and to provide customers with rapid and reliable data protection. DPM uses efficient byte-level replication to deliver faster backups and less potential data loss, while lowering the total cost of a company’s data-protection environment through operational efficiency.
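The gain over full backups can be sketched briefly. Assuming two equal-size volumes, only regions that changed since the last pass need to travel to the protection server. This illustrates the general technique, not DPM’s actual algorithm (which works at a finer granularity):

```python
BLOCK = 4096  # comparison granularity for this sketch

def changed_blocks(previous: bytes, current: bytes, block: int = BLOCK):
    """Yield (offset, data) only for blocks that differ.

    Assumes equal-size volumes; only these deltas cross the wire.
    """
    for off in range(0, len(current), block):
        if previous[off:off + block] != current[off:off + block]:
            yield off, current[off:off + block]

def apply_changes(replica: bytearray, deltas) -> None:
    """Bring the replica up to date by patching only the changed regions."""
    for off, data in deltas:
        replica[off:off + len(data)] = data
```

If one percent of a volume changes in a day, roughly one percent of it moves, which is where the faster backups and smaller loss windows come from.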

PressPass: Today you talked about “integrating the edge.” What does that mean, exactly?

Muglia: Today, with the increased connectivity and proliferation of devices, firewalls no longer represent a wall that separates an organization’s intranet from the Internet. If you take your mobile device to a Wi-Fi hotspot in a coffee shop and connect to your corporate information, is your machine on the Internet or part of your corporate intranet? I would argue it’s on both, but access from the coffee shop may not be as secure as desired. This example illustrates that the way we typically measure the boundary between the Internet and the intranet is fundamentally flawed, and it has to change. To date, the physical boundary has determined access, and we need to move to a world where access is no longer constrained by physical boundaries — by topology — but is determined by policy.

From a user perspective, the experience will be much better, moving from one constrained by physical boundaries to one defined simply by whether the user is online or offline. Administrators should be able to define different policies for different types of resources. For example, you might want your e-mail server to have one policy, where users can access their e-mail from an Internet terminal, but apply a completely different set of policies for access to secure documents or secure business information such as source code.
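At its simplest, that kind of policy reduces to a table keyed by resource class rather than by network location. The classes and connection contexts below are hypothetical:

```python
# Access is decided by policy per resource class, not by network topology.
POLICY = {
    "email":       {"corpnet", "vpn", "internet"},  # reachable from a kiosk
    "documents":   {"corpnet", "vpn"},
    "source_code": {"corpnet"},                     # most tightly held
}

def is_allowed(resource_class: str, connection_context: str) -> bool:
    return connection_context in POLICY.get(resource_class, set())

assert is_allowed("email", "internet")          # mail from anywhere
assert not is_allowed("source_code", "internet")  # source stays inside
```

Where the user happens to be sitting no longer appears in the decision; only the resource class and the context of the connection do.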

So how do we do that? Several technologies will enable this, but they’re going to take time to come together. First, there’s federated identity, which enables companies to work with business partners and have a consistent identity story. Our answer for this is clearly Active Directory, which customers are adopting broadly, and the functionality for federated services with Active Directory is built into Microsoft Windows Server 2003 R2, scheduled for release in the second half of this calendar year.

Second, there’s authentication and authorization. We need to move to a world of two-factor authentication or biometrics, such as a thumbprint. That sort of technology has to become ubiquitous to change the way we do authentication. We also need to make it easy for consumers to exchange credentials, and we have a lot of work going on there that’s generating a fair amount of excitement. Third, there is the issue of addressability. There already are billions of devices, and there will be many more. Four billion IP addresses are not enough to address all of this, so Internet Protocol version 6 (IPv6) is a critical component.
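The arithmetic behind that point is stark; a quick back-of-envelope comparison:

```python
# IPv4 uses 32-bit addresses, IPv6 uses 128-bit addresses.
ipv4_space = 2 ** 32     # about 4.3 billion addresses
ipv6_space = 2 ** 128    # about 3.4e38 addresses

print(f"IPv4: {ipv4_space:,} addresses")
print(f"IPv6: {ipv6_space:.1e} addresses")
print(f"IPv6 is {ipv6_space // ipv4_space:.1e} times larger")
```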

Fourth is the definition of the boundary, and we believe the technology to do this is IPsec — an Internet security standard protocol that provides for confidentiality and authentication of individual IP packets. You can already use IPsec in Windows Server 2003 to define the boundaries and the security rights users have to reach machines, but we’re making it a lot easier to use in “Longhorn Server.”

Once defined, the boundaries need to be secure. We need to make sure machines don’t have viruses or other malware. That’s where our anti-malware products and Network Access Protection — another feature of “Longhorn Server” — come in, to make sure machines are fully up to date before they are granted access to any resources.

Finally, policy dictates that some applications should be available from anywhere. Yet today, users either have full access to all resources or none at all. With per-application access, each application will allow access only through a defined protocol, and only that protocol will be allowed through the firewall and given access to corporate resources. Outlook 2003 and Exchange 2003 are a great example of giving users access to corporate resources from outside the network — the user does not have to specify whether he or she is connected to the corporate network or the Internet to have an “online” experience. Microsoft is expanding support for per-application anywhere access to Terminal Server, intranet applications and SharePoint Web sites.
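In a per-application model, the edge admits each published application only over its one defined protocol. The application names and protocol labels below are illustrative:

```python
# Hypothetical per-application edge rules: one protocol per published app.
APP_RULES = {
    "exchange-outlook": "rpc-over-https",   # the Outlook 2003/Exchange 2003 model
    "terminal-server":  "rdp",
    "sharepoint":       "https",
}

def admit(application: str, protocol: str) -> bool:
    """Allow traffic only if it matches the application's defined protocol."""
    return APP_RULES.get(application) == protocol

assert admit("exchange-outlook", "rpc-over-https")
assert not admit("sharepoint", "smb")   # anything else is refused at the edge
```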

PressPass: How does this all fit into Microsoft’s Dynamic Systems Initiative?

Muglia: Underlying all of these efforts is a recognition that this is fundamentally about helping people be more effective by allowing them to focus on adding new value, not just managing existing systems. We have done a lot of analysis to understand where IT costs go, and one of the biggest issues is how people spend their time managing and running systems throughout the lifecycle of a product or business application. This is the core of Microsoft’s Dynamic Systems Initiative (DSI), launched in March 2003 — lowering the cost of running systems by enabling IT teams to capture and transfer knowledge in models throughout the system lifecycle and across the organization of developers, IT professionals and information workers.

We’re moving into a world where workloads will be moved dynamically across different machines, so improving the utilization of data-center equipment is a critical opportunity that can come from a model-based environment. The enabling technologies of DSI are the System Definition Model (SDM) and WS-Management. We’re building the SDM to provide a common contract between the people who develop applications and those who deploy applications and manage infrastructure.

Visual Studio 2005 Team System actually allows developers to create SDMs when building applications. That SDM will be consumed in Windows, starting with “Longhorn Server,” as well as in our management products such as Microsoft Operations Manager and Systems Management Server. Some of our systems use model-based management today, such as Microsoft Operations Manager 2005, Windows Server Update Services and Microsoft Update.

Another important component of DSI is a transition to virtualization as a more standard element in IT. We currently offer a server virtualization solution with Virtual Server 2005, and we plan to make virtualization part of the Windows platform in the “Longhorn” wave.
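A toy rendering of that model-as-contract idea (the field names below are invented; the real SDM is a much richer schema): the developer declares what the application requires, and the deployment side checks a candidate host against that declaration before anything is installed:

```python
from dataclasses import dataclass, field

@dataclass
class AppModel:
    """What the developer declares the application requires."""
    min_ram_gb: int
    required_services: set = field(default_factory=set)

@dataclass
class Host:
    """What operations knows about a candidate machine."""
    ram_gb: int
    services: set = field(default_factory=set)

def deployment_violations(app: AppModel, host: Host) -> list:
    """Return a list of violations; an empty list means the host fits the model."""
    problems = []
    if host.ram_gb < app.min_ram_gb:
        problems.append(f"needs {app.min_ram_gb} GB RAM; host has {host.ram_gb}")
    missing = app.required_services - host.services
    if missing:
        problems.append(f"missing services: {sorted(missing)}")
    return problems

app = AppModel(min_ram_gb=4, required_services={"iis", "msmq"})
host = Host(ram_gb=2, services={"iis"})
print(deployment_violations(app, host))  # both mismatches surface before deployment
```

Because the check is mechanical, the same model can be evaluated again and again as the application moves between machines over its lifecycle, which is where the operational savings come from.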
