Times Report Questions Costs of Data Processing

Many organizations have embraced Big Data, deploying all manner of numbers-crunching tools within their departments in order to gain deeper insight. And for a significant number of these organizations, it’s proven a solid decision, allowing them to streamline their operations by identifying sources of profit and waste.

But a new article in The New York Times suggests that the collective hunger for storing and analyzing data (particularly in the context of the cloud) could be creating a good deal of waste of its own, specifically from generating the electricity needed to power all that backend IT infrastructure.

The Times drew its data from industry experts, including a study commissioned by the consulting firm McKinsey & Co. The newspaper found that only 6 percent to 12 percent of the electricity flowing through data centers powered servers performing actual computations, with the rest going to hardware kept idling in case of sudden demand. The study’s total sample size was 20,000 servers in roughly 70 large data centers.

Worldwide data centers, the Times further claims, “use about 30 billion watts of electricity, roughly equivalent to the output of 30 nuclear power plants.”

SlashDataCenter has an exhaustive breakdown of the Times analysis, highlighting a couple of issues such as its near-total omission of Power Usage Effectiveness (PUE), the governing metric of the data-center industry.
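For readers unfamiliar with the metric, PUE is simple arithmetic: total facility power divided by the power actually reaching IT equipment, where 1.0 is the theoretical ideal. A minimal sketch (the wattage figures below are hypothetical, not drawn from the Times analysis):

```python
def pue(total_facility_watts: float, it_equipment_watts: float) -> float:
    """Power Usage Effectiveness: total facility draw / IT equipment draw.

    1.0 means every watt reaches the IT gear; higher values indicate
    overhead lost to cooling, power distribution, lighting, etc.
    """
    return total_facility_watts / it_equipment_watts

# Hypothetical facility: 10 MW total draw, 5 MW reaching the servers.
print(pue(10_000_000, 5_000_000))  # 2.0
```

A facility at PUE 2.0 spends as much on overhead as on computing, which is why omitting the metric leaves a gap in any analysis of data-center waste.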

All that aside, the Times does raise an interesting point about Big Data, one that few people consider as they’re opening a browser window and studying a report or graph: it can cost a lot, not only in terms of cash but also resources and time.

A recent survey by Symantec found that a typical data-center organization experiences some 16 outages in the course of a year, with respondents citing “complexity” as a primary cause of such downtime. Around 65 percent of those surveyed cited business-critical applications as the root of that increasing complexity, followed by the growth of data (51 percent), mobile computing (44 percent), server virtualization (43 percent), budget shortfalls (43 percent), and the public cloud (41 percent).

So while many companies have enjoyed rises in profitability and productivity thanks to business intelligence and analytics, and a couple of (relatively) simple approaches can help determine whether an organization has the foundation in place to create value from Big Data processes, it’s always worth analyzing the cost of owning one’s own data-crunching servers versus handing those operations off to the cloud. For the environmentally minded, that analysis should also weigh the carbon footprint each choice creates.

 

Image: l i g h t p o e t/Shutterstock.com
