New MeriTalk Study Finds State and Local Agencies Not Prepared to Leverage Big Data


One of the ways I stay on top of industry trends in the U.S. is by monitoring what is going on in the government sector. The reason is that not only are governments big spenders on IT in general, but by their very nature they spend a lot on the creation, storage, management and analysis of an extraordinary amount of structured as well as unstructured data. State and local government IT professionals, like their colleagues in the enterprise world, are currently staring at a data explosion. The question is whether they have a strategy for using the full capabilities of “Big Data” to improve decision making and meet mission objectives.

A new study by the wonderful folks at MeriTalk, the government IT network (underwritten by NetApp), called “The State and Local Big Data Gap,” answers, in a word: NO. It reveals that despite the known advantages of big data, few state and local agencies are taking action to harness and analyze it.

The study is based on a survey of 150 state and local government CIOs and IT managers that took place from November to December 2012. On the one hand, it found that state and local IT professionals see the value in big data.

Respondents cited the following as the top advantages of tapping into big data:

  • Improving overall agency efficiency (57 percent)
  • Improving the speed and accuracy of decisions (54 percent)
  • Achieving a greater understanding of citizen needs and how to meet them (37 percent)

Where things get problematic is on the strategy, evaluation and implementation side of things. Interest is not being translated into action, and this is not a reflection of a trickle-down effect of the sequestration-related challenges currently facing federal IT professionals.

While state and local IT professionals recognize the benefits of big data, few are acting on it. The survey found that:

  • 79 percent of state and local IT professionals say they are just somewhat or not very familiar with the term “big data”
  • Only two percent say they have a complete big data strategy
  • Big data isn’t on the radar screen for 44 percent of state and local agencies – they are not even discussing it

While they understand the promise of big data, just 59 percent of state and local agencies are analyzing the data they collect, and fewer than half are using it to make strategic decisions.

On average, state and local IT professionals report that it will take their agencies at least three years to take full advantage of big data.

“State and local agencies have made great strides in consolidating applications and data into fewer physical resources,” said Regina Kunkle, vice president, State & Local Government, NetApp. “Storage efficiencies like de-duplication and compression help to manage the explosive storage growth by reducing the amount of storage required and simplifying data management.  However, agencies still have data silos, and they are just beginning to explore how to effectively analyze this disparate data.  To help them unlock this valuable wealth of information, agencies should look toward big data solutions.”

There is a reason it is called big data

As MeriTalk notes, “The average state and local agency stores 499 terabytes of data.  State and local IT professionals expect that amount of data to continue to grow.”

  • 87 percent of state and local agencies say the size of their stored data has grown in the last two years
  • 97 percent expect data to grow by an average of 53 percent in the next two years
  • One in three state and local agencies has a data set that has grown too large to work with given their current capacity limitations
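Taken together, these figures imply substantial growth in absolute terms. A quick back-of-the-envelope projection, assuming the expected 53 percent growth applies to the 499-terabyte average reported in the study, illustrates the scale:

```python
# Back-of-the-envelope projection of average agency storage needs,
# using the study's figures: 499 TB stored today, 53% expected
# growth over the next two years.
current_tb = 499
growth_rate = 0.53

projected_tb = current_tb * (1 + growth_rate)
print(f"Projected average storage in two years: {projected_tb:.0f} TB")
```

In other words, the average agency would need to accommodate roughly 763 terabytes within two years, which helps explain why storage capacity tops the list of challenges below.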

Data challenges are daunting

The survey looked not only at the state of big data adoption but also at other challenges state and local governments face in managing these growing waves of data. The results are illuminating. The top challenges, in order, were:

  • Storage capacity (46 percent)
  • Speed of analysis/processing (34 percent)
  • Analysis (32 percent)

Where things get complicated is that respondents are unclear about who owns the data: 47 percent believe that IT owns the data, while 31 percent believe ownership belongs to the department that generated it.

The bad news for vendors in the big data space is that these rather sophisticated users believe there is a gap between big data’s promise and big data reality.  The numbers are sobering. State and local agencies estimate that they have just 46 percent of the data storage/access, 42 percent of the computational power, and 35 percent of the personnel they need to successfully leverage big data.  In addition, 57 percent say their current enterprise architecture is not able to support big data initiatives.

The survey did have at least one somewhat bright spot. Despite the technology challenges, and despite current funding issues that create planning challenges of their own and make new projects difficult to justify even when the ROI and TCO projections are impressive, some state and local agencies understand that employing next-generation technology is important. This is reflected in the fact that 39 percent of respondents are investing in IT systems/solutions to improve data processing, 39 percent are improving the security of stored data, and 37 percent are investing in IT infrastructure to improve data storage.

What the study reveals is that while big data is on state and local government IT professionals' wish lists, the more practical matters of data processing, storage and data integrity/security remain top of mind. Data ownership challenges also stand in the way, and the difficulty of building a business case, which requires all of the stakeholders to sit at the table, agree to cooperate and then agree on a path forward, is reflected in the numbers showing limited action and even limited discussion of big data implementations in this sector. This is unfortunate, since a good case can be made that big data solutions could be invaluable in making government more efficient and effective, particularly at the state and local levels, both in terms of service delivery and the resultant customer satisfaction.

It will be interesting to see if these results change markedly the next time MeriTalk takes a look.    

Edited by Jamie Epstein