Datacenters are in the midst of a rapid transition into something more orderly, and more efficient in cost, power and human effort – unless the germs of chaos infect the future datacenter, according to research firm Gartner.
In a pitch for a big datacenter conference Nov. 25-26 in London, Gartner addressed datacenter managers eager to keep their facilities relevant and effective, outlining what not to do over the next few years if they want to avoid having to re-orient and start all over again. “Historically, data centers have been viewed solely as service delivery centers in which cost and risk must be balanced,” according to Rakesh Kumar, VP and analyst at Gartner.
As the long-running global recession continues to ebb, and the effect of multiple disruptive technologies (mobile, cloud, social media, information/analytics) changes expectations of what a datacenter should be able to deliver, datacenters will have to be quicker to respond to changes in the business environment, and not simply keep an eye on technological evolution, Kumar said in a statement announcing the outline.
“Over the next five to 10 years most organizations will need to change their approach to previous data center strategies used in the last five to seven years,” he wrote. “Agility, a critical third variable, will become increasingly important in future.”
Much of the advice is unsurprising. Power efficiency has become so important a topic during the past decade that most datacenters have already considered or are in the process of adopting energy efficient processors, memory, storage and computing systems.
Most datacenter managers will also have heard recommendations that they invest in better tools to allow enterprise datacenter managers to do more in less time, make disaster recovery a part of the core datacenter strategy, modernize datacenter facilities and transform consolidation and efficiency optimization into a continuing priority rather than a one-time effort.
One suggestion – managing capacity by vastly increasing the volume of usage data being collected and analyzing the trends within it more deeply – is notable because it is one of the rare instances in which technology that IT normally recommends for the benefit of business units can be turned around to help IT itself make better decisions.
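The kind of analysis Gartner describes can be as simple as fitting a trend line to historical utilization samples and projecting when a resource pool will hit a ceiling. The sketch below is purely illustrative – the function names, sample data and 80% threshold are assumptions, not from Gartner's report:

```python
# Hypothetical sketch: projecting when a server pool will exhaust its
# capacity, given historical utilization samples. All names, numbers and
# thresholds here are illustrative assumptions.

def linear_trend(samples):
    """Least-squares slope and intercept for (day, utilization%) pairs."""
    n = len(samples)
    mean_x = sum(x for x, _ in samples) / n
    mean_y = sum(y for _, y in samples) / n
    sxx = sum((x - mean_x) ** 2 for x, _ in samples)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in samples)
    slope = sxy / sxx
    return slope, mean_y - slope * mean_x

def days_until(samples, ceiling):
    """Days from the last sample until projected utilization hits the ceiling."""
    slope, intercept = linear_trend(samples)
    if slope <= 0:
        return None  # usage flat or shrinking; no projected exhaustion
    last_day = max(x for x, _ in samples)
    return (ceiling - intercept) / slope - last_day

# Weekly average CPU utilization (%) over six weeks, rising steadily.
usage = [(0, 40.0), (7, 43.0), (14, 45.5), (21, 49.0), (28, 51.0), (35, 54.5)]
print(f"Projected days until 80% utilization: {days_until(usage, 80.0):.0f}")
```

A real deployment would pull the samples from a monitoring system and use more robust models than a straight line, but the principle – turning raw usage data into a forward-looking capacity decision – is the same.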
The hard point of data on which many of the other recommendations depend, however, comes from Gartner surveys projecting the kinds of fundamental platform changes that datacenters are going through right now.
During the next five years, according to Gartner, the mix of operating systems in the average datacenter will change to include fewer Linux systems and far fewer UNIX systems, which are already on the way out. Instead, Windows servers will become the norm, and mainframish boxes running IBM’s z/OS will expand, at least in some areas, as high-volume transaction processors.
In fact, Gartner recommends users start projects by 2014 to move all of a company’s applications off UNIX to other operating systems, except those with “extreme uptime, latency and compliance requirements.” Those can wait until 2017 to begin their migration.
Gartner still recommends that datacenters use a variety of architectures and delivery models, but the definition of variety has changed. Rather than keeping pools of all the major hardware platforms and operating systems inside the building, diversity comes from the cloud.
Using a range of outsourcers, single-application service providers, and infrastructure, platform or management services from public cloud providers gives a datacenter a presence on all the hardware and operating systems its service providers support, without the cost or trouble of maintaining everything itself.
During the next decade, all the various XaaS micro-markets will converge, however, into a single, diverse cloud-computing platform that can provide a good chunk of the infrastructure even for large companies, according to Kumar.
The year 2016 will be a pivotal one for datacenters, as hybrid clouds replace private clouds in large companies and more than half of the average IT budget goes to services provided by datacenters owned by someone else; by the end of 2017, half of all large companies will rely on hybrid cloud models rather than private ones.
“There is a flawed perception of cloud computing as one large phenomenon,” according to Gartner analyst Chris Howard, who wrote the report. “Cloud computing is actually a spectrum of things complementing one another and building on a foundation of sharing. Inherent dualities in the cloud computing phenomenon are spawning divergent strategies for cloud computing success.”
That doesn’t mean it will be possible to build a successful enterprise datacenter operation by calling for take-out services, however. No matter whose cloud hosts the machines that actually deliver them, the IT services on which a company depends have to be chosen, tailored, managed and developed by specialists who understand how to support the priorities of their own company, rather than the wide customer base most cloud providers serve.
“These eight critical forces are the major factors to consider when developing a data center strategy,” Kumar said in the announcement of his report. “Individually and taken together, they will determine the appropriate level of risk, cost and agility that data centers will carry and provide for the business.”
Image: Shutterstock.com/suphakit73