Enterprise IT: Where It’s Been, Where It’s Going

REDMOND, Wash. – Dec. 15, 2009 – Ten years ago, the world was bracing for a technical nightmare as Y2K loomed. A few weeks into the new year, the IT industry could take a bow (and breathe a collective sigh of relief), having spared governments, businesses and consumers from a threat that was real, even if it had been over-hyped in the late ’90s.

Bob Muglia, president of Microsoft’s Server and Tools Business, says cloud computing will drive tremendous technological change over the next decade.

At the time, Bob Muglia was running Microsoft’s business productivity group, in charge of Office applications. Since then, he has been responsible for everything from server infrastructure products and developer tools to enterprise management solutions and cloud services. That puts him in an ideal position from which to reflect on the evolution of the IT industry over the past decade.

As the decade draws to a close, PressPass asked Muglia about the trajectory the industry is on as it hurtles into the second decade of the millennium.

PressPass: Which of the IT developments of the past decade would you say have the staying power to carry us into the next?

Muglia: One was the explosion in the number of sources and devices people use to get information. I believe this trend will continue, with even more variety of information appliances and a proliferation in the types of information that interest users.

A second development was the shift that the Web brought about in the way people do computing. Organizations started looking to the Web as a means to tie together and integrate the various systems running within their own businesses, and to share and integrate information, data and systems externally. In 2000, Microsoft released the first beta versions of the .NET Framework, paving the way for service-oriented architectures and the Web-services programming model. This was also the first step in our longer-term vision for application development, in which developers build an application once and access it anywhere: on the desktop, on mobile devices and in the cloud.
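To make the Web-services idea concrete, here is a minimal, purely illustrative sketch (in Python rather than .NET, with a made-up endpoint path and payload): a service exposes a single HTTP/JSON contract, and any client — a desktop application, a mobile device or another service running in the cloud — consumes it in exactly the same way.

    # Illustrative sketch of a web-service contract: one HTTP/JSON endpoint
    # that any kind of client can call the same way. The path and payload
    # below are hypothetical, not any real API.
    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class OrderStatusHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path == "/orders/42/status":
                body = json.dumps({"order": 42, "status": "shipped"}).encode("utf-8")
                self.send_response(200)
                self.send_header("Content-Type", "application/json")
                self.send_header("Content-Length", str(len(body)))
                self.end_headers()
                self.wfile.write(body)
            else:
                self.send_error(404)

    if __name__ == "__main__":
        # Serve locally for demonstration; a real service would run behind
        # the organization's own hosting or a cloud platform.
        HTTPServer(("127.0.0.1", 8080), OrderStatusHandler).serve_forever()

The point of the sketch is the contract, not the implementation: because every client speaks the same protocol, the same application logic can be reached from the desktop, a phone or the cloud.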

Another key trend is the concept of autonomic computing, or self-managing systems, which emerged in 2002 in response to the rapidly growing complexity of enterprise computing environments. Microsoft’s version of this was our Dynamic Systems Initiative, which transformed how IT departments manage enterprise software. Although I may be biased, I believe Microsoft is the only company in the industry that is turning the concepts of autonomic computing into a reality. Our customers are clearly seeing the benefits of what we now call Dynamic IT — an application platform that aligns computing resources with an enterprise’s shifting business conditions and technology needs.

And in the context of trends that have longevity, we really have to include virtualization. Virtualization offered relatively quick returns on investment, reducing capital cost, lowering the amount of hardware businesses needed to buy, and boosting efficiency.

Now people are looking to take the next step beyond virtualization: cloud computing, which essentially couldn’t have happened in any significant way without it. Virtualization enables workloads to move between on-premises software and the cloud, though of course it is only one enabler for cloud computing.

PressPass: How important will cloud computing be in the next decade?

Muglia: The cloud, and the next-generation application model it brings with it, will be one of the most important influences of the coming decade. There’s a long list of advantages to cloud computing for businesses and their customers, including reliability, scalability, and low or no capital expenditures. These advantages are driven by the cloud application model and the changes in the way applications will be written. At Microsoft, we will focus on protecting customers’ investments in existing applications as they are moved to the cloud. More important, we will also enable customers to modify those applications and to easily write new ones that take full advantage of the capabilities the cloud can deliver. Our goal is to enable every customer to rapidly build applications that can reach anywhere in the world, are always available and can scale as needed.

There’s more than one flavor of cloud computing, including private clouds that run on a business’s on-site servers. And it needn’t be an all-or-nothing proposition; we expect customers to want to integrate on-premises datacenters with an external cloud.
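As a hedged sketch of what that hybrid arrangement can look like from an application’s point of view — with entirely hypothetical endpoint URLs and a made-up routing rule (sensitive records stay on the in-house systems, everything else goes to the external cloud) — the calling code simply targets whichever environment the policy dictates, because both expose the same contract:

    # Illustrative only: routing work between an on-premises service and an
    # external cloud endpoint. The URLs and the "sensitive" flag are
    # hypothetical placeholders, not a description of any specific product.
    import json
    import urllib.request

    ON_PREMISES_URL = "http://datacenter.internal.example/api/orders"
    CLOUD_URL = "https://cloud-service.example.com/api/orders"

    def submit(record: dict) -> bytes:
        # Keep records flagged as sensitive in-house; send the rest to the
        # external cloud. Callers never need to know where the work ran.
        target = ON_PREMISES_URL if record.get("sensitive") else CLOUD_URL
        request = urllib.request.Request(
            target,
            data=json.dumps(record).encode("utf-8"),
            headers={"Content-Type": "application/json"},
            method="POST",
        )
        with urllib.request.urlopen(request) as response:
            return response.read()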

I’m convinced that years from now we will look back on the advent of the cloud as a major transition point. I can’t begin to predict what implications it holds for commerce, science, engineering or global economic development, but I have a fundamental belief that society will do wondrous things with computing power over the next decade, and the cloud will unleash that power.

Still, there’s a lot of hype about cloud computing today, and the reality is that the market is still nascent. Recent industry analyst reports say only about 4 percent of small businesses and enterprises in the United States and Europe are actually taking advantage of the cloud.

A useful way to think about the cloud is as an analogue to yesterday’s mainframes, in the sense that it has the ability to run tier-one, mission-critical applications at massive scale with a high degree of availability. The difference is that the cloud can scale way beyond anything that you could previously implement.

A year ago, our customers were mainly just curious about the cloud. But today many are already dipping their toes in the water and starting to experiment with it. Over the next two to three years, we’ll see serious implementations being done in the cloud. This is one technology that will catch on quickly because it is consistent with the existing environments people have in their organizations, and it’s inexpensive to deploy. Over the next 12 to 18 months, we expect to see one out of five companies either using cloud services or planning to implement them soon.

PressPass: You mentioned the growing variety of devices that are available for accessing information. Do you see this trend leveling off anytime soon, perhaps because we’ll reach some hardware limitations?

Muglia: No, I expect the variety of devices, and of the kinds of information people care about, to keep growing. What will change is the user experience across those devices. Right now, things are still pretty “siloed,” but it will become easier to move information from one device to another. If you’re in the middle of a browsing session on one device, for example, you should be able to pick up at that same point on another device. We’ll start to see this sort of convergence in the next decade.

I have a healthy faith in Moore’s Law, and I’m confident that as the chip industry continues to deliver more computing capacity, there will be applications that require that capacity. Or, to put it another way, I’m convinced that the needs of the business community will consume all of the computing capacity that becomes available in the future, and then some.

As computing power increases at the edge of the network, in the various clients and devices, it’s going to be important to pull the right information together so that people, and business users in particular, get the information they actually need.

PressPass: What will be the most significant changes in enterprise IT over the next five years?

Muglia: The two big changes I see happening are the shift in focus to user-centric computing and the evolution of the datacenter. These are both massive transformations. User-centric computing is really about empowering end users within organizations to do their jobs effectively regardless of where they are, while still operating under the policies and controls that matter to the business.

And, in the case of the datacenter, we’ve moved from a world where a business unit has an application they need to run and then IT procures a server, to a world that treats computing as a pool of resources that is effectively available on demand and can be managed at a fraction of the cost. There is a central theme to both of these: flexibility.

Businesses and workers — including IT professionals and developers — expect more out of technology than previous generations did. These expectations are driving a paradigm shift in technology and profoundly changing the way technology will be consumed, delivered and experienced.
