One of the ways I stay on top of industry trends in the U.S. is by monitoring what is going on in the government sector. Governments are not only big IT spenders in general; by their very nature, they also spend heavily on the creation, storage, management and analysis of an extraordinary amount of structured and unstructured data. State and local government IT professionals, like their colleagues in the enterprise world, are currently staring at a data explosion. The question is whether they have a strategy for using the full capabilities of “Big Data” to improve decision making and meet mission objectives.
A new study by the folks at MeriTalk, the government IT network (underwritten by NetApp), called “The State and Local Big Data Gap,” answers in a word: no. It reveals that despite the known advantages of big data, few state and local agencies are taking action to harness and analyze it.
The study is based on a survey of 150 state and local government CIOs and IT managers that took place from November to December 2012. On one hand, it found that state and local IT professionals see the value in big data.
Where things get problematic is on the strategy, evaluation and implementation side. Interest is not being translated into action, and this is not simply a trickle-down effect of the sequestration challenges currently facing federal IT professionals.
While state and local IT professionals recognize the benefits of big data, few are acting on it. The survey found that:
- While they understand the promise of big data, just 59 percent of state and local agencies are analyzing the data they collect, and less than half are using it to make strategic decisions.
- On average, state and local IT professionals report that it will take their agencies at least three years to take full advantage of big data.
“State and local agencies have made great strides in consolidating applications and data into fewer physical resources,” said Regina Kunkle, vice president, State & Local Government, NetApp. “Storage efficiencies like de-duplication and compression help to manage the explosive storage growth by reducing the amount of storage required and simplifying data management. However, agencies still have data silos, and they are just beginning to explore how to effectively analyze this disparate data. To help them unlock this valuable wealth of information, agencies should look toward big data solutions.”
There is a reason it is called big data
As MeriTalk notes, “The average state and local agency stores 499 terabytes of data. State and local IT professionals expect that amount of data to continue to grow.”
Data challenges are daunting
The survey looked not only at the state of big data adoption but also at the other challenges state and local governments face in managing the waves of data they are required to handle. The results are illuminating.
Where things get complicated is that respondents are unclear about who owns the data: 47 percent believe that IT owns it, while 31 percent believe ownership belongs to the department that generated it.
The bad news for vendors in the big data space is that these rather sophisticated users believe there is a gap between big data’s promise and big data reality. The numbers are sobering. State and local agencies estimate that they have just 46 percent of the data storage/access, 42 percent of the computational power, and 35 percent of the personnel they need to successfully leverage big data. In addition, 57 percent say their current enterprise architecture is not able to support big data initiatives.
The survey did have at least one somewhat bright spot. Despite the technology challenges, and despite current funding issues that create planning challenges of their own and make new projects difficult to justify even when the ROI and TCO projections are impressive, some state and local agencies understand that employing next-generation technology is important. This is reflected in the fact that 39 percent of respondents are investing in IT systems/solutions to improve data processing, 39 percent are improving the security of stored data, and 37 percent are investing in IT infrastructure to improve data storage.
What the study reveals is that while big data is on state and local government IT professionals’ wish lists, the more practical matters of data processing, storage and data integrity/security remain top of mind. In addition, data ownership challenges stand in the way, and the business case, which must be driven by all of the stakeholders sitting at the table, agreeing to cooperate and then agreeing on a path forward, has yet to be made. This is reflected in the numbers showing limited action, and even limited discussion, regarding big data implementations in this sector. That is unfortunate, since a good case can be made that big data solutions could be invaluable in making government more efficient and effective, particularly at the state and local levels, both in terms of service delivery and the resulting customer satisfaction.
It will be interesting to see if these results change markedly the next time MeriTalk takes a look.