New MeriTalk Study Finds State and Local Agencies Not Prepared to Leverage Big Data

By Peter Bernstein May 01, 2013

One of the ways I stay on top of industry trends in the U.S. is by monitoring what is going on in the government sector. The reason is that not only are governments very big spenders on IT in general, but by their very nature they also spend heavily on the creation, storage, management and analysis of an extraordinary amount of structured as well as unstructured data. State and local government IT professionals, like their colleagues in the enterprise world, are currently staring at a data explosion. The question these challenges raise is whether they have a strategy for using the capabilities of “Big Data” to improve decision making and meet mission objectives.

A new study by the wonderful folks at MeriTalk, the government IT network (underwritten by NetApp), called “The State and Local Big Data Gap,” in a word says NO! It reveals that despite the known advantages of big data, few state and local agencies are taking action to harness and analyze it.

The study is based on a survey of 150 state and local government CIOs and IT managers conducted from November to December 2012. On one hand, it found that state and local IT professionals see the value in big data. 

Respondents cited the following as the top advantages of tapping into big data:

  • Improving overall agency efficiency (57 percent)
  • Improving the speed and accuracy of decisions (54 percent)
  • Achieving a greater understanding of citizen needs and how to meet them (37 percent) 

Where things get problematic is on the strategy, evaluation and implementation side. Interest is not being translated into action, and this is not a trickle-down effect of the challenges currently facing federal IT people because of sequestration. 

While state and local IT professionals recognize the benefits of big data, few are acting on them. The survey found that:  

  • 79 percent of state and local IT professionals say they are just somewhat or not very familiar with the term “big data”
  • Only two percent say they have a complete big data strategy
  • Big data isn’t on the radar screen for 44 percent of state and local agencies – they are not even discussing it

While they understand the promise of big data, just 59 percent of state and local agencies are analyzing the data they collect and less than half are using it to make strategic decisions. 

On average, state and local IT professionals report that it will take their agencies at least three years to take full advantage of big data.

“State and local agencies have made great strides in consolidating applications and data into fewer physical resources,” said Regina Kunkle, vice president, State & Local Government, NetApp. “Storage efficiencies like de-duplication and compression help to manage the explosive storage growth by reducing the amount of storage required and simplifying data management.  However, agencies still have data silos, and they are just beginning to explore how to effectively analyze this disparate data.  To help them unlock this valuable wealth of information, agencies should look toward big data solutions.”

There is a reason it is called big data

As MeriTalk notes, “The average state and local agency stores 499 terabytes of data.  State and local IT professionals expect that amount of data to continue to grow.”

  • 87 percent of state and local agencies say the size of their stored data has grown in the last two years
  • 97 percent expect data to grow by an average of 53 percent in the next two years
  • One in three state and local agencies has a data set that has grown too large to work with given their current capacity limitations
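Taken together, the survey's averages imply a substantial near-term storage bill. As a rough back-of-the-envelope illustration (assuming the reported averages apply uniformly and growth is a single two-year step, which the study does not claim), the projection works out as follows:

```python
# Illustrative projection only, using the survey's reported averages:
# an average agency stores 499 TB today and expects ~53 percent growth
# over the next two years.

CURRENT_TB = 499         # average data stored per agency (terabytes)
TWO_YEAR_GROWTH = 0.53   # expected growth over the next two years

projected_tb = CURRENT_TB * (1 + TWO_YEAR_GROWTH)
added_tb = projected_tb - CURRENT_TB

print(f"Projected average per-agency storage: {projected_tb:.0f} TB")
print(f"New storage to provision: {added_tb:.0f} TB")
```

Under those assumptions, the average agency would need to house roughly 763 TB within two years, i.e. provision on the order of 264 TB of additional capacity, which puts the capacity complaints below in context.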

Data challenges are daunting

The survey looked not only at the state of big data adoption but also at the other challenges state and local governments face in managing the waves of data they are required to handle. The results are illuminating. The top challenges, in order, were:

  • Storage capacity (46 percent)
  • Speed of analysis/processing (34 percent)
  • Analysis (32 percent)

Where things get complicated is that respondents are unclear about who owns the data: 47 percent believe that IT owns it, while 31 percent believe ownership belongs to the department that generated it.

The bad news for vendors in the big data space is that these rather sophisticated users believe there is a gap between big data’s promise and big data reality.  The numbers are sobering. State and local agencies estimate that they have just 46 percent of the data storage/access, 42 percent of the computational power, and 35 percent of the personnel they need to successfully leverage big data.  In addition, 57 percent say their current enterprise architecture is not able to support big data initiatives.

The survey did have at least one somewhat bright spot. Despite the technology challenges, and despite funding issues that create planning challenges of their own and make new projects difficult to justify even when the projected ROI and TCO are impressive, some state and local agencies understand that employing next-generation technology is important. This is reflected in the fact that 39 percent of respondents are investing in IT systems/solutions to improve data processing, 39 percent are improving the security of stored data, and 37 percent are investing in IT infrastructure to improve data storage.

What the study reveals is that while big data is on state and local government IT professionals' wish lists, the more practical matters of data processing, storage and data integrity/security remain top of mind. In addition, data ownership questions stand in the way, and building a business case requires all of the stakeholders to sit at the table, agree to cooperate and then agree on a path forward. That difficulty is reflected in the numbers showing limited action, and even limited discussion, around big data implementations in this sector. This is unfortunate, since a good case can be made that a more efficient and effective government, particularly at the state and local levels where agencies are most experienced in serving citizens directly, is exactly where big data solutions could be invaluable, both in terms of service delivery and the resulting customer satisfaction. 

It will be interesting to see if these results change markedly the next time MeriTalk takes a look.    

Edited by Jamie Epstein