Federal Agencies Need More Storage, Personnel for Big Data

Federal agencies are filled to bursting with in-house data.

The good news for government agencies: they already possess the mountain of data necessary to make better decisions.

The bad news: those agencies need more data storage, computational horsepower, and personnel trained in analytics in order to unlock the power of that data.

That’s the conclusion of a new report from MeriTalk, an online community for government IT, sponsored by NetApp. The report is based on a March 2012 survey of 151 federal government CIOs and IT managers.

Some 59 percent of surveyed executives saw “improving agency efficiency” as the “top advantage” of Big Data, according to a May 7 statement from MeriTalk announcing the report, followed by “speed/accuracy of decisions” and “ability to forecast.”

Around 60 percent of respondents said their agency analyzes its in-house data, and 40 percent said that data influences strategic decisions. While those seem like healthy percentages, other survey results hinted that, when it comes to utilizing Big Data and analytics, the federal government is moving much more slowly than commercial enterprises. CIOs and IT managers told MeriTalk it would take an average of three years for their agencies to take “full advantage” of Big Data, despite the rapid growth in stored data.

“Government has a gold mine of data at its fingertips,” Mark Weber, president of U.S. Public Sector for NetApp, wrote in a May 7 statement. “The key is turning that data into high-quality information that can increase efficiencies and inform decisions.”

According to survey respondents, obstacles to successful use of Big Data and analytics include a lack of resources: some 57 percent reported at least one in-house dataset too large to analyze with their existing tools and infrastructure. Agencies overall estimated they had less than half the data storage, bandwidth/computational power, and trained personnel needed to drill down effectively into their in-house data.

“Agencies need to look at big data solutions that can help them efficiently process, analyze, manage, and access data,” Weber added, “enabling them to more effectively execute their missions.”

On average, the surveyed executives estimated the time needed to double their data capacity at 10 months. Agencies have also embarked on attempts to optimize data storage, improve overall data security, and train workers to manage and analyze Big Data.

However, it remains to be seen whether the federal government can successfully implement a strategy for harnessing its growing stores of data. President Obama’s Big Data Research and Development Initiative, launched in March, will pump hundreds of millions of dollars into improving the tools and techniques needed to leverage data to its fullest extent; but that process is just beginning.

