IBM Overhauls Big Data and Storage Offerings

Enterprise Big Data demands a lot of infrastructure. IBM wants to provide that infrastructure.

IBM has unveiled what it calls a “formal approach” to leveraging its portfolio of data-storage and analysis technologies assembled over the past few years. Termed IBM Smarter Storage, the initiative is designed to boost the speed and efficiency of crunching massive amounts of data.

To that end, the initiative also includes enhancements to products such as the IBM System Storage SAN Volume Controller (SVC), a storage virtualization system, and the IBM Storwize V7000, which will increase effective storage capacity through real-time compression and four-way clustering support. IBM has also added Enhanced FlashCopy capabilities to the IBM System Storage DS3500, delivering 50 percent more snapshots for faster backups, along with thin provisioning to boost utilization of disk storage.

On top of that, IBM has introduced software designed to expand and streamline use of IBM TS3500 tape libraries, as well as lifecycle management of multimedia files. The company’s engineers tinkered with the IBM Tivoli Storage Productivity Center (TPC) suite, for managing massive data-storage requirements; a Web-based interface apparently simplifies viewing and management of the infrastructure in question. TPC features integration of IBM Cognos, for creating customized data reports.

“Enterprises are dealing with data that is increasing exponentially in both size and complexity,” Rod Adkins, senior vice president of IBM Systems & Technology Group, wrote in a June 4 statement. “The enhanced systems and storage solutions we’re announcing today have the performance, efficiency and intelligence to handle this Big Data.”

Indeed, these offerings seem very firmly aimed at the enterprise, which generally has millions to spend on complex infrastructure. That being said, some of IBM’s competitors—mostly smaller vendors looking for a slice of the market to call their own—have begun to issue analytics products (mostly cloud-based) meant for small- to midsize businesses.

As part of the initiative, IBM is also introducing the IBM Platform Symphony family, a grid manager for Big Data workloads and analytics; the IBM System x Intelligent Cluster integrated with IBM Platform HPC software, the better to (supposedly) simplify cluster deployment and deliver speedier data results; and the addition of IBM Platform Cluster Manager and IBM Platform LSF to the High Performance Computing (HPC) Cloud portfolio.

That’s in addition to still other offerings such as the IBM System x iDataPlex dx360 M4, a new and more robust update to the platform that can process workloads faster, and IBM General Parallel File System (GPFS) with the addition of Active File Management (AFM) software.

IBM clearly wants a giant piece of the Big Data market, particularly as it applies to technical computing, or the application of heavy-duty analytics and technical programming to visualize and solve commercial issues. Research firm IDC predicts that the technical computing market will grow from $20.3 billion in 2012 to nearly $29.2 billion by 2016.

IBM’s rollout comes at a time when other giant tech vendors, including Oracle and SAP, are devoting more resources toward the development of business intelligence (BI) and data-analytics platforms designed to give businesses insight and an edge over the competition. IBM has also recently made some acquisition plays—including the purchases of Vivisimo and Platform Computing—designed to boost its strength in the category. Technology from the Platform Computing acquisition has already found its way into this latest round of IBM product releases.
