Whether you’re a networking nerd, an electronics nerd or a physics nerd, you’ll find something interesting in Computerworld’s inside look at how the biggest names on the Internet design their data centers to get ever closer to peak efficiency. Data centers are inherently fascinating to everyone in the tech industry, and as we migrate to a more cloud-based future they’re only going to get more integral and more fascinating.
The key metric for a data center is its “power usage effectiveness,” or PUE: the total energy consumed by the facility divided by the energy consumed by its IT equipment. An ideal PUE is 1.0. Many data centers run at around 2.0, meaning that for every watt powering the computing gear, another watt is spent on cooling and other overhead.
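The PUE calculation is simple arithmetic; here's a minimal sketch (the function name and the energy figures are illustrative, not from any real facility):

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: total facility energy over IT energy.

    1.0 is the theoretical ideal; anything above it is overhead
    (cooling, power conversion losses, lighting, and so on).
    """
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# A facility drawing 2,000 kWh in total while its IT gear uses
# 1,000 kWh runs at a PUE of 2.0 -- half the power never reaches
# the computing equipment.
print(pue(2000, 1000))  # 2.0
```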
Data centers devour energy in two ways: running the systems themselves and keeping them cool as they work. Attacking on both fronts—system efficiency and cooling efficiency—is vital. Everything matters, from the type of electricity used (AC or DC) down to the humidity of the local environment and the physical orientation of the data center itself in the landscape. As for uninterruptible power supplies, the one at Cisco’s new data center in Allen, Texas, looks like a steampunk submarine. It’s really cool stuff.