When it comes to being a rich resource of information on U.S. Federal government IT operations, the public-private partnership MeriTalk has few peers. To paraphrase a financial services firm's tagline from years ago, "When MeriTalk talks, people listen." One can only hope that, based on its new report (underwritten by Actifio), "Consolidation Aggravation: Tip of the Data Management Iceberg," those in charge are not only listening but will react.
The study, based on a poll of 150 Federal IT decision makers, reveals that by 2024 agencies will spend as much as $16.5 billion storing redundant copies of non-production data – working directly against the Federal Data Center Consolidation Initiative (FDCCI). You read correctly: $16.5 billion on what should be considered unnecessary data storage.
The poll revealed that while Federal agencies have prioritized consolidation and transitioned to more efficient and agile cloud-based systems, 72 percent of Federal IT managers said their agency has maintained or increased their number of data centers since FDCCI launched in 2010. Only 6 percent gave their agency an “A” for consolidation efforts against FDCCI’s 2015 deadline.
This is not a pretty picture, to say the least, and the report found more than a few barriers preventing consolidation. Particularly disturbing was the revelation that overall resistance, data management challenges, and data growth are preventing data center optimization and are "actually driving copy data growth, resulting in increased storage costs."
Respondents indicated they are focused on both managing data growth and consolidating data centers as top priorities for next year. Of interest, the study found that agencies don't necessarily have too many servers or too much space; they have too many systems creating redundant copies of data for multiple purposes.
Just how bad is it?
More than one in four agencies use 50 to 88 percent of their data storage to hold copy, or non-primary, data – and storing these copies is costly. In fact, 27 percent of the average agency's storage budget went toward non-primary data in 2013, a figure expected to grow to 31 percent for full-year 2014. This translates to a cost of $2.7 billion in 2014, $3.1 billion in 2015, and as much as $16.5 billion over the next ten years. It is reminiscent of a remark attributed to the late U.S. Senator Everett Dirksen of Illinois, who reportedly said, "A billion here, a billion there, and pretty soon you're talking real money."
A growing number of applications and multiple data owners are propelling growth in the number of data copies – and one in three agencies admit that they do not vary the number of copies based on an original copy’s significance or the likelihood that it will be used again. In fact, 40 percent of Federal data assets exist four or more times.
The pain points highlighted
Ironically, when asked for the top pain points associated with copy management, respondents listed regulatory requirements, culture challenges, and storage shortfalls – all ahead of data growth. In fact, the chart on pain points is instructive.
Source: Consolidation Aggravation: Tip of the Data Management Iceberg, MeriTalk
The point is important, as it shows the impact culture can have on change. This is not just a government issue; it holds true in large enterprises globally: old habits are hard to break.
“We’ve seen the dramatic impact of a more holistic approach to copy data management in the private sector for years now,” said Ash Ashutosh, Founder and CEO of Actifio. “Frankly I’m not surprised by the magnitude of the potential savings at the Federal level, or that this has now come to light as a significant barrier to FDCCI. Copy data virtualization is today where server virtualization was 10 years ago. We’re thrilled it’s now been identified as a strategy that can dramatically accelerate the process of data center consolidation, and get FDCCI back on track.”
The majority of survey respondents said better management of copy data will help make their agency’s consolidation efforts under FDCCI successful, though just 9 percent of agencies have implemented projects to better manage storage and data growth today.
The lesson articulated by the authors is that, as agencies work toward the FDCCI deadline and, ultimately, a transition to the cloud, they must shift the discussion of FDCCI from server virtualization to enhanced data management and virtualization.
“With the public flogging that is healthcare.gov, agencies’ IT departments have a siege mentality,” said Steve O’Keeffe, founder of MeriTalk. “Leaders like Terry Halvorsen, the new CIO for the Department of Defense, are showing real leadership – going at the root causes for today’s Federal IT malaise. Data and application sprawl are the enemies of government IT efficiency. We need leadership to empower Federal IT innovators to change the failing equation. We need a cultural and acquisition shift to enable new models and the shared services that will unlock new efficiencies and real savings.”
As noted at the top, and in the previous quote, this is not just about technology, although clearly Federal IT professionals would like some additional tools. It is about leadership and effecting cultural change. As the U.S. government is committed to a cloud-based future, it will be interesting to see if the “Federal IT malaise” can be overcome.