[caption id="attachment_10650" align="aligncenter" width="618"] A screen from "Titanfall."[/caption] An Iowa datacenter project, codenamed “Project Mountain,” is a Microsoft facility that could help power the company’s cloud services, including Office 365 and Xbox. According to the Des Moines Register, the project in West Des Moines will create 24 full-time jobs and cost nearly $700 million. Microsoft recently unveiled the next-generation Xbox One, due in November, and it’s clear that the gaming platform—which depends heavily on the cloud—will require quite a bit of backend IT infrastructure in order to satisfy the typically demanding gamer audience. Microsoft has stated it will dedicate 300,000 servers to powering cloud-connected players. The new data center will rely on Microsoft's Generation 4 Modular Data Center design, in which the whole datacenter is a set of containerized, self-contained units that can be added or subtracted easily to meet changes in demand for capacity without overbuilding the whole facility. The 40-foot metal containers are pre-wired for power and connectivity, rely on natural airflow for most of their cooling, and plug directly into a central backbone supplying both power and bandwidth, according to Fast Company. Like its facilities in Chicago and Dublin, Project Mountain will be built on a "cloud-scale data center design"—a plan to deliver both high performance and high reliability that depends on software rather than redundant hardware to keep the facility online, according to David Gauthier, director of datacenter architecture and design at Microsoft. "As the emerging generation of cloud providers and software developers more readily embrace the fact that hardware fails, service availability is increasingly being engineered at the software platform and application level instead of focusing on reliable hardware," Gauthier wrote in a February blog explaining the approach. The approach boils down to treating the datacenter like a cloud in its own right, complete with integrated hardware, storage and networking systems that can dynamically adjust to deal with traffic spikes or hardware failures. "It is much easier and faster to simulate or make changes in software than physical hardware," Gauthier wrote. "The telemetry and tools available today to debug software are significantly more advanced than even the best data center commissioning program or standard operating procedure. Software error handling routines can resolve an issue far faster than a human with a crash cart.” The upshot for users (if everything works the way it should) is a facility able to consistently deliver high performance without unmanageable costs to the provider. But Microsoft is far from the only vendor relying on automation, intelligent infrastructure-management software or virtualization to manage and optimize datacenters—most major cloud providers are buying or building some version of the same thing. Cloud-service reliability won't make or break the success of Xbox One, the version of Microsoft's game console due to be delivered in November. When Microsoft first announced the console, it suggested that the hardware would need to “check in” with Microsoft at least once every 24 hours; in addition, the company said it would keep a central database and authorization service with databases of games and consoles—and that games played on different machines would be de-authenticated. 
Cloud-service reliability alone won't make or break the Xbox One, however. When Microsoft first announced the console, it suggested that the hardware would need to "check in" with Microsoft at least once every 24 hours; in addition, the company said it would maintain a central authorization service with databases of games and consoles, and that games played on different machines would be de-authenticated.

The intent was to expand Xbox One's abilities via connections to the Internet and specific cloud services, not to shackle users with unreasonable DRM, Microsoft VP Phil Harrison told gaming site Kotaku. But Xbox users freaked out, to put it mildly, and Microsoft reversed its policies.

In addition to features such as chat and video connections for Kinect, Xbox Live will try to improve game performance by taking over some of the less time-sensitive graphics calculations. That workload, which Microsoft calls "latency-insensitive computation," mainly covers rendering the portions of onscreen images that don't have to be updated dozens of times per second to keep gameplay lively and in sync with the game's controllers, Matt Booty, Microsoft GM of Redmond Game Studios and Platforms, told Ars Technica in May. (A sketch of that split appears at the end of this article.)

Having plenty of processing power within the cloud, rather than on thousands of individual game consoles semi-consolidated for massed group play, also gives third-party game developers a better networked platform for their games.

Even more important to game providers, the automation that Microsoft datacenter managers will use to help run their own operating systems is also available to third-party software developers, who can use it to automate or optimize installs and to provide AI or other capabilities that improve gameplay by customizing application performance and functions to match the user's abilities, Jon Shiring, an engineer on the upcoming game "Titanfall," told Forbes.

"That has allowed us to push the boundaries in online multiplayer," he added. "Over time I expect that we'll be using these servers to do a lot more than just dedicated [application platforms]. This is something that's going to let us drive all sorts of new ideas in online games for years to come."

Hosting third-party games as well as its own apps and administrative tools would tax even the biggest datacenter. That may be why Microsoft is already talking with Iowa officials about $300 million worth of expansion and improvements to the $700 million facility.

It won't be obvious until November whether one datacenter can head off the kinds of problems and complaints that tend to follow the launch of a popular console. What is clear from Microsoft's recent experience is that many things go into the success or failure of a game system, and not all of them have to do with capacity.
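To make "latency-insensitive computation" concrete, here is a small, hypothetical sketch of the split Booty describes: work that must land within a single frame stays on the console, while work that can tolerate a round trip to the datacenter becomes a candidate for cloud offload. The task names, deadlines and latency figures below are illustrative assumptions, not details of Xbox Live's actual pipeline.

```python
from dataclasses import dataclass

@dataclass
class RenderTask:
    name: str
    deadline_ms: float  # how soon the result must reach the screen

# At 60 frames per second, the per-frame budget is roughly 16.7 ms;
# the round-trip time to the datacenter is an assumed figure.
FRAME_BUDGET_MS = 16.7
CLOUD_ROUND_TRIP_MS = 80.0

def partition(tasks):
    """Split work into latency-critical (local) and latency-insensitive
    (cloud) buckets based on each task's deadline."""
    local, cloud = [], []
    for task in tasks:
        if task.deadline_ms > CLOUD_ROUND_TRIP_MS:
            cloud.append(task)   # can wait for a network round trip
        else:
            local.append(task)   # must finish within the frame budget
    return local, cloud

tasks = [
    RenderTask("player model and controller response", FRAME_BUDGET_MS),
    RenderTask("distant terrain lighting", 500.0),
    RenderTask("ambient physics debris", 250.0),
]
local, cloud = partition(tasks)
print("render locally:", [t.name for t in local])
print("offload to cloud:", [t.name for t in cloud])
```

Whether a task can leave the console comes down to its deadline versus network latency, which is why slow-changing scene elements such as distant lighting are plausible offload candidates while anything tied to controller input is not.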