Cisco Systems, the dominant vendor in the networking market, has announced plans to buy Composite Software, a data-virtualization company, for $180 million in cash; the deal is scheduled to close in the first quarter of 2014.
The acquisition is designed to give Cisco’s networking products the ability to track, manage and display a company’s data wherever it is stored, including in external and internal clouds, on virtual servers and virtualized storage, within virtual applications or SaaS implementations, and even in public folders on physical file and application servers, according to Hilton Romanski, Cisco VP and chief of business development.
The goal of the acquisition, according to Romanski’s blog post, is to add Composite’s data virtualization to the list of virtualization technologies built directly into Cisco networking and integrated-computing products. Data virtualization is the process of integrating data stored in a variety of applications, databases, cloud platforms and other locations through a single infrastructure that gives companies control over their data regardless of where or in what form it is stored.
Composite Software’s Composite Data Virtualization Platform is built on middleware designed to connect to data stored in any type of database, file system or application, and present it in a single, unified view to be used by any business application or end user, and managed from a central point by corporate IT.
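That adapter-and-unified-view pattern can be sketched in a few lines of code. The sketch below is purely illustrative; Composite’s platform is proprietary middleware, and every class and method name here is a hypothetical stand-in, not its actual API:

```python
# Hypothetical sketch of the data-virtualization pattern: adapters
# normalize heterogeneous back ends behind one query interface.
# Names are illustrative, not Composite Software's API.

class SourceAdapter:
    """Wraps one back end (database, file system, SaaS feed) as rows of dicts."""
    def __init__(self, name, rows):
        self.name = name
        self.rows = rows  # stand-in for a real connection

    def fetch(self):
        # A real adapter would translate the request into SQL, a REST
        # call, a file read, etc.; here the rows are already in memory.
        return [dict(row, _source=self.name) for row in self.rows]

class VirtualView:
    """Presents many adapters as a single, unified result set."""
    def __init__(self, *adapters):
        self.adapters = adapters

    def query(self, predicate=lambda row: True):
        # One query fans out across every source and merges the results.
        return [row for a in self.adapters for row in a.fetch() if predicate(row)]

# Usage: a single query spans a "database" source and a "SaaS" source.
db = SourceAdapter("orders_db", [{"sku": "A1", "qty": 3}])
saas = SourceAdapter("crm_saas", [{"sku": "B2", "qty": 5}])
view = VirtualView(db, saas)
print(view.query(lambda r: r["qty"] > 2))
```

The point of the pattern is that the consuming application sees one view and never needs to know which back end a given row came from.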
Data virtualization is a services layer within corporate IT infrastructures that integrates data without physically replicating or synchronizing it. It is most often used for advanced business-intelligence, enterprise-search or transaction-processing systems that need quick access to data stored anywhere within the company, according to Forrester analysts, who refer to data virtualization as “information as a service.”
In a project for the NYSE Euronext trading system, Composite Software built a data-virtualization system that acts as a data warehouse, giving 14 exchanges unified access to data on trades, orders, quotes and other transactions, according to Cisco’s Romanski.
Once the acquisition closes, Composite Software’s data virtualization will be incorporated into Cisco’s Smart Services products and built into Cisco’s Unified Computing System products, along with APIs that give it access to network-topology and performance information.
Composite’s assets will also be combined with technology from Cisco’s recent acquisition of SolveDirect, whose cloud-based services-management-integration system is designed to simplify and automate data and network connections between companies, letting them create automated processes and workflows that span both organizations.

“By combining our network expertise with the performance of Cisco’s Unified Computing System and Composite’s software, we will provide customers with instant access to data analysis for greater business intelligence,” Gary Moore, Cisco’s president and chief operating officer, wrote in a statement.
In a March 2013 survey of 500 chief information officers, backup-software developer Veeam Software found that virtual infrastructure makes up 51 percent of the average enterprise’s environment, and that 88 percent of CIOs regularly encounter problems with access, backup or recovery of data stored on virtual platforms.
Recovery of a virtual server, for example, took an average of five hours, compared to six hours for a physical server, the survey showed. Those numbers are actually worse than in 2011, when recovering a virtual server took four hours and a physical server took five.
Enterprise data warehouses are one way to address the lack of control over distributed data, but they are too static to keep up with frequent changes in business-intelligence (BI) rules, in the data itself or in where it is stored, according to a report on business intelligence and data virtualization from Capgemini (PDF).
Data virtualization provides a logical layer within the IT infrastructure that not only makes data accessible, but presents it in a consistent format readable by corporate applications, and can apply rules for data quality, transformation, masking, cleansing or other data-management processes as part of its own data-management function, the report said.
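A minimal sketch of that rule-applying layer, with hypothetical function names (no vendor’s actual API), might look like this: each rule is a small transformation, and the layer chains them over every row before the data reaches a consuming application:

```python
# Illustrative sketch of a virtualization layer applying masking and
# cleansing rules. All names are hypothetical, not any vendor's API.

def mask_field(field):
    """Rule: redact a sensitive field before it reaches consumers."""
    def rule(row):
        if field in row:
            row[field] = "***"
        return row
    return rule

def strip_whitespace(field):
    """Rule: cleanse a string field of stray leading/trailing spaces."""
    def rule(row):
        if isinstance(row.get(field), str):
            row[field] = row[field].strip()
        return row
    return rule

def apply_rules(rows, rules):
    """Run every rule over every row; the source data stays untouched."""
    out = []
    for row in rows:
        row = dict(row)  # copy, so the underlying source is not modified
        for rule in rules:
            row = rule(row)
        out.append(row)
    return out

raw = [{"name": "  Alice ", "ssn": "123-45-6789"}]
clean = apply_rules(raw, [strip_whitespace("name"), mask_field("ssn")])
print(clean)  # [{'name': 'Alice', 'ssn': '***'}]
```

Because the rules run in the layer rather than in each application, every consumer sees the same cleansed, masked view while the source systems keep their original data.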