It should probably come as no surprise, but a new study by MeriTalk (an online community for government IT professionals), sponsored by NetApp, backs up its title, “The Big Data Gap.” The findings should give U.S. policymakers food for thought. As the announcement of the study states, it reveals that Federal IT professionals believe big data can improve government, but that the promise of big data is locked away in unused or inaccessible data.
Hitting the highlights
The full study is available at the following link: “The Big Data Gap.” It is based on a March 2012 survey of 151 Federal government CIOs and IT managers. As its release points out, it corroborates the need for President Obama’s recently announced Big Data Research and Development Initiative, which focuses on the big data promise: that improving our ability to extract knowledge and insights from large and complex collections of data will help government solve problems.
Here is a summary of just a few of the survey’s results:
The good news here is that there is not only awareness but also clear recognition of the benefits. However (and isn’t there always a “however”?), that recognition and awareness do not necessarily translate into action.
Almost as interesting was the acknowledgement of the tsunami of data heading the government’s way. Respondents report having only:
- Just 49 percent of the data content storage/access they need
- 46 percent of the bandwidth/computational power they need
- 44 percent of the personnel they need
Anecdotally, 57 percent say they have at least one dataset that has grown too big to work with using their current management tools and/or infrastructure.
For those of you who read my recent piece on European Communications magazine’s quarterly survey of telecom executives, the lack of personnel seems to be emerging as a common barrier to the implementation of big data solutions in both the public and private sectors. It is almost analogous to the question, “If a tree falls in the forest but nobody is around to hear it crash, did it make a sound?” In this case, even assuming the technology exists to generate business intelligence and insights, are there enough capable bodies around to turn data into insights and insights into improved operations?
This education issue clearly must be addressed, a point emphasized by Mark Weber, president of U.S. Public Sector for NetApp, who said, “Government has a gold mine of data at its fingertips…The key is turning that data into high-quality information that can increase efficiencies and inform decisions. Agencies need to look at big data solutions that can help them efficiently process, analyze, manage, and access data, enabling them to more effectively execute their missions.”
Nobody has said that implementing big data solutions would be easy. That said, government agencies may in many ways face larger obstacles than enterprises. As the study notes, “While agencies have a huge amount of data – that continues to grow – in many agencies the data is locked away.” And not only is the captured data growing exponentially, but much of the growth is coming in the form of unstructured data, which is hard to handle.
Darkening the picture a bit more is the issue of data ownership.
There is a lot more in the full study to digest. However, the message seems loud and clear: the agencies (military and civilian) are more than aware that big data should be in their future. The question, which hopefully the administration’s initiative will provide guidance on, is what practical steps can be taken in both the short and the long term for big data to find its rightful place in government.
In closing, I’d like to point everyone to this week’s congressional testimony of William Scherlis, director of Carnegie Mellon’s Institute for Software Research, before the House Ways and Means Committee’s Social Security Subcommittee. He stated that the Social Security Administration (SSA) “cannot accomplish its mission without effective IT and effective IT leadership.” His assessment of what can and should be done to bring the SSA up to speed is well worth the read.