It probably should come as no surprise, but a new study by MeriTalk (an online community for government IT professionals), sponsored by NetApp, backs up its title, “The Big Data Gap.” The findings should give U.S. policymakers food for thought. As the announcement of the study states, Federal IT professionals believe big data can improve government, but the promise of big data is locked away in unused or inaccessible data.
Hitting the highlights
The full study is available at the following link: “The Big Data Gap.” It is based on a March 2012 survey of 151 Federal government CIOs and IT managers. As its release points out, it corroborates the need for President Obama’s recently announced Big Data Research and Development Initiative, which focuses on the big data promise – that improving our ability to extract knowledge and insights from large and complex collections of data will help government solve problems.
Here is a summary of just a few of the survey’s results:
- 59 percent of respondents cited improving overall agency efficiency as a top advantage of big data
- 51 percent cited improving the speed and accuracy of decisions
- 30 percent cited the ability to improve forecasting
The good news here is that there is not only awareness but also clear recognition of the benefits. However (and isn’t there always a “however”?), recognition and awareness do not necessarily translate into action.
- 60 percent of civilian agencies and 42 percent of Department of Defense/intelligence agencies say they are just now learning about big data and how it can work for their agency.
- The study notes that, “While the promise of big data is strong, most agencies are still years away from using it. Just 60 percent of IT professionals say their agency is analyzing the data it collects and less than half (40 percent) are using data to make strategic decisions.”
- On average, respondents thought it will take their agencies three years to take full advantage of big data.
Almost as interesting was the acknowledgement of the tsunami of data heading the government’s way:
- Estimates are that government agencies will add a petabyte of new data in the next two years.
- The survey found that respondents believe they have less than half the storage, computing, and personnel resources necessary to leverage big data for the top priorities listed above. Specifically, the IT managers report having:
- Just 49 percent of the data content storage/access they need
- 46 percent of the bandwidth/computational power
- 44 percent of the personnel they need
Anecdotally, 57 percent say they have at least one dataset that has grown too big to work with using their current management tools and/or infrastructure.
For those of you who read my recent piece on a European Communications magazine quarterly survey of telecom executives, the lack of personnel seems to be emerging as a common barrier to the implementation of big data solutions in both the public and private sectors. It is almost analogous to the question, “If a tree falls in the forest and nobody is around to hear it, does it make a sound?” In this case, even assuming there is technology that can generate business intelligence and insights, are there enough capable people around to turn data into insights and insights into improved operations?
This education issue is clearly one that must be addressed. It was a point emphasized by Mark Weber, president of U.S. Public Sector for NetApp, who said, “Government has a gold mine of data at its fingertips…The key is turning that data into high-quality information that can increase efficiencies and inform decisions. Agencies need to look at big data solutions that can help them efficiently process, analyze, manage, and access data, enabling them to more effectively execute their missions.”
Nobody has said that implementing big data solutions would be easy. That said, government agencies may in many ways face larger obstacles than enterprises. As noted in the study, “While agencies have a huge amount of data – that continues to grow – in many agencies the data is locked away.” And not only is the data being captured growing exponentially, but much of the growth is coming in the form of unstructured data, which is harder to handle.
Darkening the picture a bit more is the issue of data ownership.
- 42 percent reported that their IT departments own the data
- 28 percent said it belongs to the department that generates it
- 12 percent said it belongs to the C-level
There is a lot more in the full study to digest. However, the message seems loud and clear: agencies, military and civilian alike, are more than aware that big data should be in their future. The question, which hopefully the administration’s initiative will help answer, is what practical steps can be taken in both the short and the long term for big data to find its rightful place in government.
In closing, I’d like to point everyone to the congressional testimony of William Scherlis, director of Carnegie Mellon’s Institute for Software Research, who presented before the House Ways and Means Committee’s Social Security subcommittee this week. He stated that the Social Security Administration (SSA) “cannot accomplish its mission without effective IT and effective IT leadership.” What can and should be done to bring the SSA up to speed, according to Scherlis, is well worth the read.
Edited by Jamie Epstein