

Microsoft Azure pulls out all the stops to support soaring demand and enterprise expectations for mission critical computing

Demand for cloud is soaring, driven by enterprise appetite for scalability and flexibility with no compromise on performance, latency, security or resilience. Mission critical computing is now also transitioning to public, private and hybrid clouds spurred by the rapid maturing of the underlying technology and the promise of digital transformation.

As a result, Microsoft has, for the last three to four years, experienced 100 per cent year-on-year growth in demand for Azure services, with no signs of that slowing.

Speaking at an event in Sydney, Mark Russinovich, chief technology officer for Microsoft Azure, revealed the scale and depth of the company’s investment and research into next-generation hyperscale cloud that will help future-proof current enterprise investments and open the door to new classes of computing services and applications.

His presentation coincided with the announcement of general availability of two Azure Australia Central regions, which have been expressly designed for government and national critical infrastructure applications.

The regions are housed in Canberra Data Centres facilities. Microsoft has also been awarded protected status by the Australian Signals Directorate for 25 Azure services and 10 Office 365 services, meaning that 85-90 per cent of all government data can now be held in Azure – the first hyperscale global cloud to achieve protected-level status in Australia.

The technical foundations of Azure are already impeccable and demonstrably suited to mission critical computing but, as Russinovich explained, the emerging demands for real-time artificial intelligence, global reach and scale, and affordable high-speed storage require continual investment and research from Microsoft and its partners.

Data centre design

Russinovich explained that Microsoft takes a defence-in-depth approach to data centres: “Your data is the most precious thing we manage, so we take a very multi-tiered approach to security of data in our data centres – that includes everything from access approval to the physical location.”

In terms of data centre architecture, Microsoft enters a country with two regions for mission critical applications, allowing users to build applications that are resilient for disaster recovery while keeping data in that geography – with the data centres separated by long distances to maximise the chance that one survives a disaster affecting the other.

Russinovich explained that while those long distances suit asynchronous replication, a failover between regions could involve some minor data loss.

To support the synchronous data replication that much mission critical computing requires, Microsoft is rolling out availability zones globally. Each set of zones comprises three Azure data centres that are physically isolated and close – but not too close – to optimise local disaster survival while minimising latency and addressing the risk of data loss.

The latencies targeted are about 600 microseconds between centres and two milliseconds across a region. Having three data centres also allows an organisation to develop an application that supports quorum: data can be replicated across all three and majority voting used to determine which copy is authoritative.
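To illustrate the idea, the sketch below shows how an application might layer a simple majority quorum over three replicas, one per zone. It is a minimal illustration only, not an Azure API; the Replica and QuorumStore names are invented for the example.

from dataclasses import dataclass, field


@dataclass
class Replica:
    """One copy of the data, for example hosted in one availability zone."""
    store: dict = field(default_factory=dict)       # key -> (version, value)

    def write(self, key, version, value):
        self.store[key] = (version, value)

    def read(self, key):
        return self.store.get(key, (0, None))


class QuorumStore:
    """Writes and reads succeed once a majority (two of three) of zones agree."""

    def __init__(self, replicas):
        self.replicas = replicas
        self.quorum = len(replicas) // 2 + 1        # 2 when there are 3 zones

    def write(self, key, version, value):
        acks = 0
        for replica in self.replicas:
            try:
                replica.write(key, version, value)
                acks += 1
            except Exception:
                pass                                # a zone may be unreachable
        return acks >= self.quorum                  # committed only with a majority

    def read(self, key):
        votes = []
        for replica in self.replicas:
            try:
                votes.append(replica.read(key))
            except Exception:
                pass
        if len(votes) < self.quorum:
            raise RuntimeError("not enough zones reachable for a quorum read")
        return max(votes, key=lambda v: v[0])[1]    # the newest version wins


zones = [Replica(), Replica(), Replica()]
store = QuorumStore(zones)
store.write("order-42", 1, {"status": "paid"})
print(store.read("order-42"))                       # {'status': 'paid'}

In this pattern a write is considered committed once two of the three zones acknowledge it, and a read returns the newest version seen by a majority – so losing any single zone costs neither availability nor data.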

Russinovich also reaffirmed Microsoft’s commitment to sustainability in its data centre design and operation. In 2012 Microsoft achieved carbon neutrality with its data centres and has a long-term goal to move to 100 per cent renewable energy. Russinovich said it is on target to achieve 60 per cent renewables use by 2020.

Mark Russinovich, Chief Technology Officer for Microsoft Azure

Future state

Russinovich explained that besides harnessing today’s cutting-edge technologies, Microsoft is investing heavily in future state technology and in simulators that allow customers to prepare for the future.

Quantum computing is a prime example of Microsoft “pushing forward the envelope of computing,” though he acknowledged that could take 5-20 years to materialise commercially.

Right now, he said, “there is a quantum arms race going on. The kinds of breakthrough you can get with computation efficiency with quantum computing is orders of magnitude. Problems that would take billions of years to solve on conventional computing you can solve literally in hundreds of seconds on a quantum computer.”

He added that the focus is on making quantum computing accessible, bringing it out of research laboratories and into production despite the significant technical issues associated with the need to operate at very low temperatures to minimise noise. Microsoft’s quantum efforts focus on developing more stable “topological qubits”, which promise much higher efficiency than rival designs.

Microsoft has already released a quantum computing software development kit for Windows and Linux, along with a 16-qubit simulator, to allow organisations to start experimenting with quantum application design. Larger simulators are available on Azure.
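At its core, a simulator of this kind tracks the 2^n amplitudes of an n-qubit register and applies gates as matrix multiplications. The toy sketch below, in plain Python and NumPy, shows that idea on a two-qubit register; it is purely illustrative and is not Microsoft’s development kit, which has its own Q# language and tooling.

import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I = np.eye(2)

# Two-qubit register initialised to |00>: a vector of 2^2 = 4 amplitudes.
state = np.zeros(4)
state[0] = 1.0

# Apply H to the first qubit by tensoring it with the identity on the second.
state = np.kron(H, I) @ state

# Probability of measuring each basis state |00>, |01>, |10>, |11>.
probs = np.abs(state) ** 2
print({label: round(float(p), 3) for label, p in zip(["00", "01", "10", "11"], probs)})
# {'00': 0.5, '01': 0.0, '10': 0.5, '11': 0.0}

Because the state vector doubles with every added qubit, a 16-qubit register already holds 65,536 amplitudes, which is why the larger simulations are hosted on Azure rather than run locally.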

Sydney is also the location of one of Microsoft’s international Station Q research labs in association with the University of Sydney.

Physical network

Microsoft’s Azure regions are today linked by one of the largest networks on the planet, with its intra-region and regional gateway services connecting into 30,000 miles of fibre backbone. Every region has at least two paths for redundancy, and Russinovich said that Microsoft continued to invest – often with partners – in the rollout of high-speed, highly resilient global networks.

At the enterprise networking level, Russinovich said that a transition was underway as organisations moved from software defined networking to hyperscale software defined networking (SDN), and that network performance will escalate, first to 50 and then to 100 gigabit networks, over the coming two years.

That said, Russinovich acknowledged a series of technical challenges associated with the high processor consumption of software-based networking. To address the issue, Microsoft is using field programmable gate arrays (FPGAs) to offload that work with no compromise to SDN flexibility.

The accelerated networking afforded by that approach is currently available to enterprises on request, and Russinovich said that the plans were to make it the default approach “in a little bit.”

Scalable computing

As the adage goes, “much wants more”. When Azure launched in 2010 it was based on units of 12 cores and 32 GB of RAM. More recently it has made available units featuring four sets of 18-core processors and 4 TB of RAM.

Microsoft’s Project Olympus looks over the horizon and will open source technologies that it believes will be essential to handle even larger emerging cloud workloads. Russinovich said these new architectures would support 50 gigabit per second networking.

A joint project with Nvidia and Ingrasys, meanwhile, is developing a high-density graphics processing unit (GPU) system that can be used as a foundational building block for neural network training applications.

That work complements Project Brainwave, an initiative to leverage field programmable gate arrays to handle the inference and scoring necessary for real-time artificial intelligence applications.

Future storage

Russinovich explained that two factors are ratcheting up the scale of enterprise data collections: “Cloud is driving storage and the retention of more data…AI is all about having large amounts of data and processing that data.”

Our goal is to make it possible to store all the data you might need in a very cost-efficient way.

– Mark Russinovich

He said that Microsoft was working to improve the performance of flash, and had also introduced Azure archival storage on tape. “One enclosure [has] up to 72k tapes in it,” he said.

Tape, however, has limitations, and Microsoft’s Pelican project is leveraging hard disks to deliver 11 petabytes of data storage – using a scheduler to spin disks up and down on demand.
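The scheduling idea can be sketched in a few lines of Python. This is an illustration only – the class and the numbers below are invented for the example, not taken from Pelican: only a power-limited subset of disks spins at any time, a disk is woken when a read targets it, and the least recently used disk is parked to stay within the budget.

import time


class SpinScheduler:
    """Keep most cold-storage disks spun down; wake one only when its data is read."""

    def __init__(self, disk_count, max_active):
        self.disk_count = disk_count
        self.max_active = max_active                # power and cooling budget
        self.active = set()                         # disks currently spinning
        self.last_used = {}                         # disk -> last access time

    def request(self, disk):
        """Ensure the disk is spinning before a read is issued against it."""
        if disk not in self.active:
            if len(self.active) >= self.max_active:
                # Spin down the least recently used disk to stay within budget.
                idle = min(self.active, key=lambda d: self.last_used.get(d, 0))
                self.active.remove(idle)
            self.active.add(disk)                   # spin-up takes seconds on real hardware
        self.last_used[disk] = time.time()


scheduler = SpinScheduler(disk_count=1000, max_active=80)   # illustrative numbers
scheduler.request(disk=7)    # the first read wakes disk 7
scheduler.request(disk=7)    # later reads find it already spinning

The trade-off is the one cold storage always makes: a first access pays a spin-up delay, in exchange for a rack that draws only a fraction of the power of fully spinning disks.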

Project Silica, meanwhile, is a joint effort by Microsoft Research and the University of Southampton that uses femtosecond lasers to write data into glass, which does not decay over time and needs only light to be read.

In theory a 25mm square of glass could store 50 TB of data in layers of “voxels” etched by the laser.

Project Palix, in conjunction with the University of Washington, is also delving into how DNA could store data, in research that Russinovich says could potentially allow 12 ZB of data to be stored in a data centre rack sometime in the future.

Trusted computing

Security and privacy are the watchwords of successful mission critical cloud computing. An initiative with Intel to create what Russinovich described as “confidential computing” is well underway. This allows data to be stored and processed in Azure – without Microsoft having any access to the data.

A specially designed “trusted enclave” protects everything inside it from outside access and tampering.

“It is protected from rogue administrators and malware. If law enforcement wants access to the data… we simply cannot, from a technical perspective, give access to that data,” said Russinovich.

He said one of the key applications for the solution was in emerging blockchain solutions as it could rapidly, securely and privately manage distributed ledger processing without the costly and non-confidential proof of work that has plagued blockchain applications such as Bitcoin.

Russinovich said that at present the proof of work required for a single Bitcoin transaction consumed the energy equivalent of a US household over ten days.
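As a rough back-of-envelope reading of that comparison – the household figure below is an approximation for illustration, not a number from the presentation – the arithmetic looks like this:

# Approximate check of the Bitcoin energy comparison. The ~29 kWh/day
# average for a US household is an assumption used only for illustration.
household_kwh_per_day = 29
days = 10
print(household_kwh_per_day * days)   # roughly 290 kWh per transaction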

Instead, “the intersection of blockchain and confidential computing” meant that “if we all agree on [a] piece of code and trust the enclave technology, when that code executes a transaction – we trust the code and the enclave technology.

“You can create a network where any node can perform the transaction by trusting the code in the enclave.”
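As a loose sketch of that trust model – not the Coco framework itself, and with hardware attestation stubbed out by a shared key standing in for the enclave – the example below shows nodes accepting a transaction purely on the strength of a signature produced by the trusted code, with no mining at all.

import hashlib
import hmac

ENCLAVE_KEY = b"shared-only-inside-attested-enclaves"   # stand-in for attestation


def enclave_sign(payload: bytes) -> bytes:
    """Stands in for the agreed code running inside the trusted enclave."""
    return hmac.new(ENCLAVE_KEY, payload, hashlib.sha256).digest()


class Ledger:
    def __init__(self):
        self.blocks = [b"genesis"]

    def append(self, transaction: bytes, signature: bytes) -> bool:
        # Any node accepts the transaction if the enclave signed it; no mining.
        if hmac.compare_digest(signature, enclave_sign(transaction)):
            self.blocks.append(transaction)
            return True
        return False


node_a, node_b = Ledger(), Ledger()
tx = b"transfer 10 units from alice to bob"
sig = enclave_sign(tx)                                   # produced inside the enclave
print(node_a.append(tx, sig), node_b.append(tx, sig))    # True True

Real deployments rely on hardware attestation rather than a pre-shared key, but the economics shift in the way Russinovich described: agreement costs a signature check rather than energy-hungry proof of work.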

Microsoft’s Project Coco is now taking distributed ledgers or blockchains and “infusing them with the characteristics of confidential computing,” ready for release as Azure-based blockchain-as-a-service solutions. Russinovich said that Microsoft intends to open source the Coco framework and will also integrate it with Ethereum.

That, he said, would allow database level throughput and efficiency, with no compromise of confidentiality or governance.

Russinovich shared with attendees the vision that Microsoft has for the ongoing development of Azure and its applicability for mission critical computing – but he did not sugar-coat the extent of that challenge.

“Azure is doubling year over year – it may grow ten times, 100 times – trying to stay ahead of that is no mean feat at this point in time.”