Almost every process associated with the data center these days is either a manual one, or else reliant on some sort of undocumented script that doesn’t scale particularly well. That drives up the costs associated with managing the data center, and creates more opportunities for human error.
In order to address these issues, there’s a movement afoot to create the “programmable data center,” where an IT administrator can more holistically manage servers, storage, and networking components. While still in its relative infancy, a number of vendors have expressed interest in the movement’s underlying concepts, all but ensuring its growth in coming years.
VMware has been particularly active in the drive to make the data center more programmable. The company is promoting the notion of a software-defined data center, which it hopes to make a reality via the combination of virtualization and software-defined networking (SDN) technologies. “We want to extend the benefits of virtualization well beyond compute,” said Bogomil Balkansky, VMware senior vice president of cloud infrastructure products.
While VMware pushes a programmable data model based on its technologies, vendors such as Puppet Labs are making the case for a more platform-neutral approach.
Puppet Labs has developed a declarative language for configuring systems that can be extended across the data center. The company recently announced the creation of an open source project in conjunction with EMC, called Razor, to accomplish that goal.
There’s already an open source project known as Chef, created by Opscode, with a similar set of goals.
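Tools like Puppet and Chef share a core idea: instead of scripting the steps to change a system, an administrator declares the desired end state, and an engine repeatedly converges the system toward it. The rough sketch below illustrates that declarative, idempotent model in Python; it is not Puppet or Chef code, and the resource names and file paths are hypothetical.

```python
import os

# Hypothetical desired state: a list of resources the system should converge to.
desired_state = [
    {"type": "directory", "path": "/tmp/demo_app"},
    {"type": "file", "path": "/tmp/demo_app/app.conf", "content": "port=8080\n"},
]

def converge(resources):
    """Apply each resource only if the live system differs from the desired state."""
    changed = []
    for res in resources:
        if res["type"] == "directory":
            if not os.path.isdir(res["path"]):
                os.makedirs(res["path"])
                changed.append(res["path"])
        elif res["type"] == "file":
            current = None
            if os.path.exists(res["path"]):
                with open(res["path"]) as f:
                    current = f.read()
            if current != res["content"]:
                with open(res["path"], "w") as f:
                    f.write(res["content"])
                changed.append(res["path"])
    return changed

# Running converge() twice demonstrates idempotence:
# the second run finds nothing out of compliance and changes nothing.
first = converge(desired_state)
second = converge(desired_state)
```

The payoff of this model over ad hoc scripts is that the same declaration can be applied safely and repeatedly across thousands of machines, with only drifted systems being touched.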
In a similar vein, Reflex Systems, a provider of virtualization management tools, is trying to drum up interest in VQL, a query language that the company specifically developed for IT pros. According to Reflex Systems CTO Aaron Bawcom, the next step for the company likely involves open-sourcing portions of the VQL technology, while also making it easier for others to embed a run-time engine that supports VQL within their products. “What we provide is a way to gain real-time awareness of any changes to the IT environment,” he said.
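VQL’s actual syntax is proprietary and not shown here, but the underlying idea Bawcom describes can be sketched generically: treat changes to the IT environment as a stream of events, and run declarative queries over that stream. The Python fragment below is a hypothetical illustration of that pattern only; the event fields and object names are invented for the example.

```python
# A hypothetical stream of change events from a virtualized environment.
events = [
    {"object": "vm-web01", "attribute": "memory_mb", "old": 4096, "new": 8192},
    {"object": "vswitch-2", "attribute": "vlan", "old": 10, "new": 20},
    {"object": "vm-db01", "attribute": "power_state", "old": "on", "new": "off"},
]

def query(stream, predicate):
    """Return the change events matching a predicate, analogous to running a query."""
    return [e for e in stream if predicate(e)]

# "Show me every VM whose power state changed," expressed as a predicate.
powered_off = query(
    events,
    lambda e: e["object"].startswith("vm-") and e["attribute"] == "power_state",
)
```

Centralizing queries like this is what enables the root-cause analysis Miller describes below: one place to ask what changed, and when.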
The Reflex approach is attractive, suggests Cory Miller, vice president of engineering services for Bremer Financial Services, because it helps concentrate the management of IT in one place: “That helps avoid the blame game, because we can do more root-cause analysis on the impact an issue might have on a particular business service.”
The Pace of Change
But the pace at which the programmable data center will evolve is still anybody’s guess. Rather than trying to boil the data-center ocean, most of the leading proponents advocate that organizations take a more measured approach to modernizing data-center management.
“There are a lot of organizations that either fear change or simply are not good at change,” said Puppet Labs CEO Luke Kanies. “Organizations should start a small project and then build on the compound interest provided by that initial success.”
“It really is a journey,” added Floyd Strimling, technical evangelist for Zenoss, a provider of IT monitoring tools. “There’s no such thing as a cookie cutter data center, so the introduction of new systems and processes can actually wind up increasing complexity, at least in the short term.”
Things get even more complicated when trying to manage IT processes across heterogeneous systems and networks. The end result is scenarios where organizations might have a state-of-the-art server from Cisco, IBM or Hewlett-Packard being deployed alongside ten-year-old Unix servers with completely different management paradigms.
Driving to Nirvana
How long it will take for this data center nirvana to become a reality is also a matter of debate. Sunil Potti, vice president and general manager of the cloud networking group at Citrix Systems, believes it’ll take until at least 2014 to 2015 before the programmable data center concept becomes a set of widely deployed technologies.
For Potti, a key missing element is the “gateway” technologies that will bridge the divide between applications running across heterogeneous networks and multiple instances of SDNs; without those gateways, organizations would have to deploy forklift upgrades to their networks—a very unlikely scenario.
In the meantime, the network will continue to act as a bottleneck, not only in terms of performance but also in the rate at which changes can be made. “Most networks today are bifurcated for multi-tenant and security reasons. But it’s also what prevents them from being agile,” Potti added. “And once you solve that problem, you’ll still have to take on all the storage management issues.”
Mike Matchett, senior analyst and consultant with the Taneja Group, thinks that the ability to integrate IT management functions with some type of programming language will gain momentum, delivering that brave new world of data center management much faster than most people currently believe.
“We’ll still need expertise to set up the data center,” he said. “But in the not-too-distant future data centers are going to come and go as just another object to the application.”
Ultimately, the issue affects the way applications are deployed. “We need to think about this in terms of layers of IT processes,” noted Dave Roberts, vice president of strategy and evangelism for ServiceMesh, a provider of tools for managing application deployments. “Organizations ultimately want to move to a straight-through approach to IT processes where human intervention becomes the exception.”
Given the scale of the challenge facing IT organizations these days, Edward Haletky, CEO of the IT consulting firm The Virtualization Practice, LLC, suggests there really is no choice but to embrace higher levels of IT automation. Most data centers, he added, are heavily dependent on custom scripts that automate tasks but tend to be poorly documented and don’t scale well. Without some advancements, IT organizations will be incapable of keeping pace with the current demands of virtual-machine sprawl, never mind actual cloud computing.
“I can create a virtual data center inside a data center today. It just takes an awful lot of work to make it happen,” he said. “We’re at that stage now where we had better start automating everything.”