IBM’s Edge2012 conference, held June 4-5 in Orlando, Fla., highlighted the company’s recent work in storage. IBM used the venue to highlight the three thematic pillars undergirding its array of storage products: efficiency, self-optimization, and cloud agility.
A broad range of organizations, including many of IBM’s customers, is wrestling with a veritable tide of structured and unstructured data. Financial institutions, for example, must manage enormous volumes of customer and financial information on an hourly basis; a state or city government wants to mine its mountains of historical data for insights that could help it deliver better services or lower the crime rate. Facebook handles some 2.39 trillion I/Os on a daily basis.
That places considerable demand on data centers to store and deliver information in an efficient and timely way, while (hopefully) saving organizations a few bucks in the process. IBM is focused on reducing complexity for administrators through simpler software interfaces (the aforementioned efficiency) and on tweaking data center software to place data more intelligently (the self-optimization part); it has also begun moving into the cloud on a number of fronts, including development tools.
IBM’s real-world solutions to those issues include its Real-Time Compression, integrated with the Storwize V7000 and SAN Volume Controller (SVC), which can compress and update data at rest without moving it, sparing the data center the I/O traffic associated with reading that data out, compressing it elsewhere, and then writing it back. IBM is also offering a Storage Analytics Engine that examines storage arrays and reports on optimal migration.
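The savings from compressing data where it rests, rather than round-tripping it to a host, can be illustrated with a toy accounting model. This is a hypothetical sketch, not IBM's implementation; the byte counts and function names are made up for illustration:

```python
import zlib

# A highly compressible sample "block" of stored data.
BLOCK = b"highly compressible log line\n" * 1024


def copy_out_io(block: bytes) -> int:
    """Round-trip approach: read the block off the array, compress
    it on a host, write the compressed result back. Returns the
    bytes moved across the storage network."""
    compressed = zlib.compress(block)
    return len(block) + len(compressed)  # read out + write back


def in_place_io(block: bytes) -> int:
    """Array-side approach: the storage controller compresses data
    where it rests, so no block crosses the storage network."""
    zlib.compress(block)  # the work still happens, just on the array
    return 0


saved = copy_out_io(BLOCK) - in_place_io(BLOCK)
print(f"network bytes saved per block: {saved}")
```

The compression work is the same in both cases; what changes is where it runs, and therefore how much data has to traverse the storage network.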
IBM is also looking toward the longer term, revealing plans during the conference for ways to preserve digital data for a half-century. Jane Clabby, an analyst with Clabby Analytics and a conference attendee, wrote that IBM has researchers examining the “advancing standards in this area both for bit preservation (media) and logical preservation (interpretation)” with the goal of producing a device able to store data for 50 years “in a dense (40 petabytes to one rack) low power format.”
However, no matter what equipment rolls out for servers and racks (now or in the far future), another analyst cautions that the data center deserves a more holistic examination:
“IBM understands that making storage smarter and smarter is a long-term process that is fundamental to moving to IT-as-a-service,” David Hill, an analyst with the Mesabi Group, wrote in a June 20 research note. “IBM is responsible for delivering the products and services that make Smarter Storage happen, but customers need to increase their understanding of the process as a whole rather than focusing on individual products.”
That’s probably why IBM, along with data center competitors such as Hewlett-Packard, is so intent on pushing the concept of a “consolidated” or “united” data center at the moment.
Image: Jakub Pavlinec/Shutterstock.com