Matching Data-Center Costs With Actual Benefits

Datacenter performance-analysis software developer Romonet has launched a new version of its cloud-based suite of applications, designed to keep datacenters at peak energy- and cost-efficiency.

The original version of Romonet’s SaaS-based Portal datacenter infrastructure management (DCIM) application relied on predictive analytics for capacity planning and cost predictions. Portal 2.0 adds analytics that make datacenter performance metrics both more granular and more current; the goal is to show current performance broken down by workload or sub-system and to compare that real-world performance with the numbers datacenter managers projected.

Portal 2.0 does not manage the hardware itself, nor does it collect the performance data it analyzes. Instead, it is designed to pull data from an existing DCIM or metering application and identify places or functions whose costs run unusually high or low compared with projections.
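The variance check described above can be sketched in a few lines. This is an illustrative sketch only; Portal's actual data model and thresholds are not public, and the function name, cost figures, and tolerance below are assumptions.

```python
# Hypothetical sketch: flag sub-systems whose measured cost deviates
# from projection by more than a chosen tolerance. Portal's real
# algorithm is not public; names and numbers here are invented.

def flag_variances(projected, measured, tolerance=0.15):
    """Return {name: relative_deviation} for costs outside tolerance."""
    flagged = {}
    for name, expected in projected.items():
        actual = measured.get(name)
        if actual is None or expected == 0:
            continue  # no reading, or no baseline to compare against
        deviation = (actual - expected) / expected
        if abs(deviation) > tolerance:
            flagged[name] = round(deviation, 3)
    return flagged

projected = {"cooling": 10_000, "ups": 4_000, "storage": 6_000}
measured = {"cooling": 12_500, "ups": 4_100, "storage": 6_200}
print(flag_variances(projected, measured))  # cooling is 25% over projection
```

Only cooling exceeds the 15% tolerance here; UPS and storage land close enough to projection that they would not be surfaced.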

Managing complex datacenters—especially heavily virtualized datacenters in which not all the servers or workloads are always visible—is more like a game of Tetris than it is like a straightforward set of systems whose performance can be measured and calculated using a spreadsheet, according to Andrew Hillier, CTO and co-founder of datacenter-analysis software maker CiRBA.

Most companies treat datacenters like apartments—a single unit for which they buy a defined set of equipment and from which they expect a certain number of workloads, according to Dan Kusnetzky, founder of consultancy Kusnetzky Group LLC.

A better model might be a hotel room—workloads come in, use the facilities for a while and then leave. The goal of management is to move as many workloads into and out of a room as possible, rather than getting one workload into a single room with a static set of resources and keeping it there. “If resources aren’t reclaimed and used to support a different virtual workload, datacenter efficiency and overall datacenter performance suffer,” Kusnetzky wrote for ZDNet. In heavily virtualized datacenters, however, workloads tend to room surf more than capacity planners really expect.

Workloads running in virtual servers share resources by running on the same physical servers and using the same storage and networks as other VMs and other workloads. That makes it very hard to determine how much of a particular sub-system’s capacity should be attributed to Business Process A versus Business Process B when estimating each one’s cost and efficiency.
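One common way to handle that attribution problem is to split a shared sub-system's cost in proportion to each process's measured demand. This is a minimal sketch of that idea, not how any particular vendor does it; the function name and demand figures are assumptions.

```python
# Hypothetical sketch: apportion a shared sub-system's cost across
# business processes in proportion to each one's measured demand
# (e.g. IOPS, GB stored, CPU-hours). All figures are invented.

def apportion_cost(total_cost, demand_by_process):
    """Split total_cost proportionally to each process's demand."""
    total_demand = sum(demand_by_process.values())
    if total_demand == 0:
        return {p: 0.0 for p in demand_by_process}
    return {p: total_cost * d / total_demand
            for p, d in demand_by_process.items()}

# Process A drives twice the storage demand of Process B,
# so it carries two thirds of the storage sub-system's cost.
storage_cost = apportion_cost(9_000, {"Process A": 2.0, "Process B": 1.0})
print(storage_cost)  # {'Process A': 6000.0, 'Process B': 3000.0}
```

The hard part in practice is the input, not the arithmetic: in a heavily virtualized datacenter, per-process demand figures are exactly what is difficult to measure.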

Analytics from companies such as Romonet and CiRBA can only work effectively if they can identify the resource demands made by each workload and how that demand impacts capacity in the rest of the datacenter, Kusnetzky wrote.

Romonet’s Portal takes data from metering apps and uses it to recalculate and present a picture of datacenter efficiency aimed at business and financial managers relying on Activity Based Costing (ABC) or other techniques to match business processes with the datacenter functions that support them.
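An ABC-style rollup of the kind described above can be sketched as a many-to-many mapping: each business process consumes a share of several datacenter functions, and summing those shares yields a cost per process. The process names, functions, and usage fractions below are invented for illustration.

```python
# Hypothetical Activity Based Costing rollup. All data is invented;
# in practice the usage fractions would come from metering data.
function_costs = {"compute": 20_000, "storage": 9_000, "network": 3_000}

# Fraction of each function's cost consumed by each business process.
usage = {
    "order-entry": {"compute": 0.5, "storage": 0.4, "network": 0.6},
    "reporting":   {"compute": 0.3, "storage": 0.5, "network": 0.2},
}

# Roll function costs up into a cost per business process.
process_costs = {
    process: sum(function_costs[f] * share for f, share in shares.items())
    for process, shares in usage.items()
}
print(process_costs)
```

Laid out this way, a manager can compare what order-entry actually costs against what it earns, which is the business-facing view the article describes.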

Portal focuses on the output of the datacenter—the services or applications, customer transactions or information and business processes that are the actual services consumed by line-of-business (LoB) employees running the business, according to company CEO Zahl Limbuwala. Comparing business processes, and the IT services they depend on, against the measured performance of datacenter systems lets business managers see how much effort particular functions demand and how much that effort costs.

With that in mind, business managers can focus current and future IT spending on the parts of the process that are most financially beneficial, rather than those that make datacenter hardware rev most efficiently, Limbuwala said.


Image: Sashkin/