It should probably come as no surprise, but a new study by MeriTalk (an online community for government IT professionals), sponsored by NetApp, backs up its title, "The Big Data Gap." The findings should give U.S. policymakers food for thought. As the announcement of the study states, the study reveals that Federal IT professionals believe big data can improve government, but that the promise of big data is locked away in unused or inaccessible data.
Hitting the highlights
The full study is available at the following link: "The Big Data Gap." It is based on a survey of 151 Federal government CIOs and IT managers conducted in March 2012. As its release points out, it corroborates the need for President Obama's recently announced Big Data Research and Development Initiative, which focuses on the big data promise: that improving our ability to extract knowledge and insights from large and complex collections of data will help government solve problems.
A few of the survey's results stand out. The good news is that there is not only awareness but clear recognition of the benefits. However (and isn't there always a "however"?), that recognition and awareness do not necessarily translate into action.
Almost as interesting was the acknowledgement of the tsunami of data heading the government's way. Respondents report having:
- Just 49 percent of the data content storage/access they need
- 46 percent of the bandwidth/computational power they need
- 44 percent of the personnel they need
Anecdotally, 57 percent say they have at least one dataset that has grown too big to work with using their current management tools and/or infrastructure.
For those of you who read my recent piece on a European Communications magazine quarterly survey of telecom executives, the lack of personnel seems to be emerging as a common barrier to the implementation of big data solutions in both the public and private sectors. It is almost analogous to the question, "If a tree falls in the forest but nobody is around to hear it crash, did it make a sound?" In this case, even assuming there is technology that can generate business intelligence and insights, are there enough capable bodies around to turn data into insights and insights into improved operations?
This education issue is clearly one that must be addressed. This was a thought emphasized by Mark Weber, president of U.S. Public Sector for NetApp who said, “Government has a gold mine of data at its fingertips…The key is turning that data into high-quality information that can increase efficiencies and inform decisions. Agencies need to look at big data solutions that can help them efficiently process, analyze, manage, and access data, enabling them to more effectively execute their missions.”
Nobody has said that implementing big data solutions would be easy. That said, government agencies may in many ways face larger obstacles than enterprises. As noted in the study, “While agencies have a huge amount of data – that continues to grow – in many agencies the data is locked away.” And, not only is the data being captured growing exponentially but much of the growth is coming in the form of unstructured data which is hard to handle.
Darkening the picture a bit more is the issue of data ownership.
There is a lot more in the full study to digest. However, the message seems loud and clear: the agencies (military and civilian) are more than aware that big data should be in their future. The question, which hopefully the administration's initiative will provide guidance on, is what practical steps can be taken in the short as well as the long term for big data to find its rightful place in government.
In closing, I'd like to point everyone to the congressional testimony of William Scherlis, director of Carnegie Mellon's Institute for Software Research, who presented before the House Ways and Means Committee's Social Security subcommittee this week. He stated that the Social Security Administration (SSA) "cannot accomplish its mission without effective IT and effective IT leadership." What can and should be done to bring the SSA up to speed, according to Scherlis, is well worth the read.