New MeriTalk Study Finds State and Local Agencies Not Prepared to Leverage Big Data

One of the ways I stay on top of industry trends in the U.S. is by monitoring what is going on in the government sector. Not only are governments very big spenders on IT in general, but by their very nature they also spend heavily on the creation, storage, management and analysis of an extraordinary amount of structured and unstructured data. State and local government IT professionals, like their colleagues in the enterprise world, are currently staring at a data explosion. The question is whether they have a strategy for using the capabilities of “Big Data” to improve decision making and meet mission objectives.

A new study by the wonderful folks at MeriTalk, the government IT network (underwritten by NetApp), called “The State and Local Big Data Gap,” answers that question, in a word, with a no. It reveals that despite the known advantages of big data, few state and local agencies are taking action to harness and analyze it.

The study is based on a survey of 150 state and local government CIOs and IT managers that took place from November to December 2012. On the one hand, it found that state and local IT professionals see the value in big data. Respondents cited the following as the top advantages of tapping into big data:

  • Improving overall agency efficiency (57 percent)
  • Improving speed and accuracy of decisions (54 percent)
  • Achieving a greater understanding of citizen needs and how to meet them (37 percent)

Where things get problematic is on the strategy, evaluation and implementation side. Interest is not being translated into action, and this is not a trickle-down effect of the challenges currently facing federal IT as a result of sequestration.

While state and local IT professionals recognize the benefits of big data, few are acting on them. The survey found that:

  • 79 percent of state and local IT professionals say they are just somewhat or not very familiar with the term “big data”
  • Only two percent say they have a complete big data strategy
  • Big data isn’t on the radar screen for 44 percent of state and local agencies – they are not even discussing it

While they understand the promise of big data, just 59 percent of state and local agencies are analyzing the data they collect, and less than half are using it to make strategic decisions.

On average, state and local IT professionals report that it will take their agencies at least three years to take full advantage of big data.

“State and local agencies have made great strides in consolidating applications and data into fewer physical resources,” said Regina Kunkle, vice president, State & Local Government, NetApp. “Storage efficiencies like de-duplication and compression help to manage the explosive storage growth by reducing the amount of storage required and simplifying data management.  However, agencies still have data silos, and they are just beginning to explore how to effectively analyze this disparate data.  To help them unlock this valuable wealth of information, agencies should look toward big data solutions.”

There is a reason it is called big data

As MeriTalk notes, “The average state and local agency stores 499 terabytes of data.  State and local IT professionals expect that amount of data to continue to grow.”

  • 87 percent of state and local agencies say the size of their stored data has grown in the last two years
  • 97 percent expect data to grow by an average of 53 percent in the next two years
  • One in three state and local agencies has a data set that has grown too large to work with given their current capacity limitations
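To put those survey averages in perspective, a quick back-of-the-envelope projection shows what the expected growth implies for the typical agency. This sketch simply assumes the 53 percent two-year growth figure applies to the 499-terabyte average store reported above:

```python
# Rough projection of average agency storage, using the survey's averages:
# 499 TB stored today, expected to grow 53 percent over the next two years.
current_tb = 499           # average state/local agency data store (TB)
two_year_growth = 0.53     # expected growth over the next two years

projected_tb = current_tb * (1 + two_year_growth)
print(f"Projected average store in two years: {projected_tb:.0f} TB")  # ~763 TB
```

In other words, an average agency would need to find room for roughly 260 additional terabytes, which helps explain why storage capacity tops the list of challenges below.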

Data challenges are daunting

The survey looked not only at the state of big data adoption but also at the other challenges state and local governments face in managing these waves of data. The results are illuminating. The top challenges, in order, were:

  • Storage capacity (46 percent)
  • Speed of analysis/processing (34 percent)
  • Analysis (32 percent)

Where things get complicated is that respondents are unclear about who owns the data: 47 percent believe that IT owns it, while 31 percent believe ownership belongs to the department that generated it.

The bad news for vendors in the big data space is that these rather sophisticated users believe there is a gap between big data’s promise and big data reality.  The numbers are sobering. State and local agencies estimate that they have just 46 percent of the data storage/access, 42 percent of the computational power, and 35 percent of the personnel they need to successfully leverage big data.  In addition, 57 percent say their current enterprise architecture is not able to support big data initiatives.

The survey did have at least one somewhat bright spot. Despite the technology challenges, and the current funding issues that create planning challenges of their own and make new projects difficult to justify even when the ROI and TCO projections are impressive, some state and local agencies understand that employing next-generation technology is important. This is reflected in the fact that 39 percent of respondents are investing in IT systems/solutions to improve data processing, 39 percent are improving the security of stored data, and 37 percent are investing in IT infrastructure to improve data storage.

What the study reveals is that while big data is on state and local government IT professionals’ wish lists, the more practical matters of data processing, storage and data integrity/security remain top of mind. Data ownership challenges also stand in the way, and the business case, which must be driven by all of the stakeholders sitting at the table, agreeing to cooperate and then agreeing on a path forward, has yet to be made. That gap is reflected in the numbers showing limited action, and even limited discussion, around big data implementations in this sector. This is unfortunate, since a good case can be made that big data solutions could be invaluable in making government more efficient and effective, particularly at the state and local levels where service delivery is most direct, both in terms of that delivery and the resultant customer satisfaction.

It will be interesting to see if these results change markedly the next time MeriTalk takes a look.    




Edited by Jamie Epstein