[caption id="attachment_13745" align="aligncenter" width="500"] IT moves fast, but follows the lead of business and may never catch up. (Image: Shutterstock.com/sarra22)[/caption]

Datacenters and IT departments are still changing too slowly to keep up with the demands of the businesses they serve, according to new research from Forrester.

The IT infrastructure at most big businesses is evolving quickly, aided by a combination of cloud-based application and infrastructure services, the dynamic capacity offered by both internal and external clouds, and software-defined everything, according to Forrester analyst Rachel Dines, who presented her findings at the Fujitsu Forum 2013 in Munich Nov. 5.

Even so, only one in three business-unit managers surveyed by Forrester believes the technology in their companies meshes effectively with the specific business goals the company is trying to reach, Dines added, according to a story in Channelnomics that covered her talk. A quarter of business leaders actually think IT is holding the company back. "That's not good," she said.

One reason IT can't keep up with changes on the business side is that it has to bring decades' worth of infrastructure along on any leap forward, an effort often hampered by the technology itself. Most IT systems run faster with each generation, but they don't all accelerate at the same pace. Servers and networks have grown in power and throughput far more quickly than enterprise storage hardware, for example, slowing every system that depends on that data.

Business-unit managers see the potential for innovation in technology-based businesses, but don't like that the bulk of IT budgets goes to maintaining and supporting existing systems rather than creating innovative new ones, Dines suggested. Forrester surveys show European businesses are trying to push the share of IT budgets spent on maintenance down from 48 percent to 41 percent.

Datacenters could probably hit that target if they automated as much of their work as is technically possible, but many hesitate because the IT managers making the decisions don't trust the technology to handle it. "Virtualization, mobility, agile development, cloud are all causing complexity to shoot upwards," Dines is quoted as saying in a Nov. 5 story in TechWorld. "We've gotten to the point where no matter how many people we throw at a problem it's beyond human scale and that's where automation can come in."
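As a rough illustration of the kind of "beyond human scale" work Dines describes, the sketch below shows a closed monitoring loop that scales a server pool up or down against fixed utilization thresholds. It is a toy, not any real orchestration API: the metrics are simulated, and every name in it is invented for illustration.

```python
# Toy sketch of a closed-loop automation policy: sample utilization, adjust
# the pool, repeat. The metrics here are random numbers; in a real datacenter
# they would come from a monitoring system, and scale_to() would call an
# actual provisioning API. All names are hypothetical.
import random

SCALE_UP_THRESHOLD = 0.80    # add capacity above 80 percent utilization
SCALE_DOWN_THRESHOLD = 0.30  # reclaim capacity below 30 percent
MIN_SERVERS, MAX_SERVERS = 2, 20

def sample_utilization(servers: int) -> float:
    """Stand-in for a monitoring query; returns average utilization, 0..1."""
    return random.uniform(0.1, 1.0)

def scale_to(servers: int) -> int:
    """Stand-in for a provisioning call; just clamps to the allowed range."""
    return max(MIN_SERVERS, min(MAX_SERVERS, servers))

def control_loop(ticks: int = 10) -> None:
    servers = MIN_SERVERS
    for tick in range(ticks):
        load = sample_utilization(servers)
        if load > SCALE_UP_THRESHOLD:
            servers = scale_to(servers + 2)  # scale out in steps of two
        elif load < SCALE_DOWN_THRESHOLD:
            servers = scale_to(servers - 1)  # scale in gently
        print(f"tick {tick}: load={load:.0%}, servers={servers}")

if __name__ == "__main__":
    control_loop()
```

The point is not the thresholds but the repetition: no operations team could re-run that decision every few seconds across thousands of hosts, which is exactly the gap a written-down policy closes.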
Datacenter managers also have to take a more holistic approach to systems development, Dines argued: spend 80 percent of their time and budgets building relatively generic systems that can be adapted to a range of special-function additions, rather than making every new project a one-off effort with its own stack of support systems. Only 20 percent of the average datacenter should consist of specialty systems, not the bulk of it, as is the case now. "I see organizations getting into these heavy silos of infrastructure that is just one application; that makes me nervous," Dines said.

Even a theoretically perfect software-defined system that could be reconfigured on the fly, add or drop resources as needed, and connect to cloud-based apps or data inside (and outside) the firewall would solve most of the automation and resource problems, but still wouldn't get IT fully up to speed with what the business side genuinely needs. "Software-defined was a good step but it doesn't go far enough," Dines said. "We want to think about order-to-cash, payroll, supply chain management. Actual business processes instead of [applications like] ERP and CRM and HCM and a million other acronyms."

Each of those acronyms contains a million business functions, each of which has to be translated into IT systems, so there will always be a lag between demand from the business and supply from IT. If IT understood the ultimate aim of each round of demands, rather than just the functions involved in an automation project, it might be able to shrink that gap, though likely never close it completely.
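To make that "process-first" view concrete, here is a minimal, purely hypothetical sketch of order-to-cash modeled as a sequence of capabilities, each bound at runtime to whatever system currently provides it, rather than hard-wired to a particular ERP or CRM package. Every capability name and handler here is invented for illustration.

```python
# Hypothetical sketch: describe a business process as a list of capabilities,
# not as a feature of one application. All names and handlers are invented;
# a real implementation would bind each step to an actual backend service.
from typing import Callable, Dict, List

# Registry mapping a capability to whichever system provides it today.
# Swapping out the ERP means re-binding an entry, not rewriting the process.
HANDLERS: Dict[str, Callable[[dict], dict]] = {
    "capture_order": lambda o: {**o, "status": "captured"},
    "check_credit":  lambda o: {**o, "credit_ok": o["amount"] < 10_000},
    "fulfill":       lambda o: {**o, "status": "shipped"},
    "invoice":       lambda o: {**o, "invoiced": True},
    "collect_cash":  lambda o: {**o, "status": "paid"},
}

# The process itself is just an ordered list of capability names:
# order-to-cash as a business process, not as an application.
ORDER_TO_CASH: List[str] = [
    "capture_order", "check_credit", "fulfill", "invoice", "collect_cash",
]

def run_process(steps: List[str], payload: dict) -> dict:
    """Run each step of a process through whatever system is bound to it."""
    for step in steps:
        payload = HANDLERS[step](payload)
        print(f"{step}: {payload}")
    return payload

if __name__ == "__main__":
    run_process(ORDER_TO_CASH, {"id": "ORD-1", "amount": 4_200})
```

The indirection is the point: the process definition survives even when the system behind any one capability changes, which is closer to what Dines describes than an infrastructure silo shaped around a single application.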