Murphy's Revenge: Higher Complexity Networks Have Higher Chances for Failure

July 17, 2014
By: Steve Anderson

One of the most universal precepts ever devised is Murphy's Law. Though it's been phrased dozens of different ways over the years (some trace the law as far back as the 1800s), it boils down to the same essential concept: whatever can go wrong, will go wrong. Over those same decades, there have been plenty of chances for Murphy's Law to be proven out, and a new report from MeriTalk shows that, indeed, the more complex a network is, the higher the chances that something within it will go very wrong.

The MeriTalk report, titled “The Federal Simplicity Report: Navigating Network Complexity” and underwritten by Brocade, showed that Murphy is indeed alive and well, particularly in the federal government's network systems. The study found that agencies with more complex networks are three times more likely to suffer “frequent disruptions” than those with comparatively simpler networks. Moreover, 94 percent of the agencies surveyed have experienced such downtime (defined here as downtime that has impacted the agency's mission) in the last 12 months, and the problem may well get worse from there.

A majority (54 percent) of federal IT managers say that network complexity has increased in the last year, while 68 percent believe that complexity will only continue to increase. The reason behind all these increases? New technologies: 33 percent of respondents point to server virtualization as a driver of further network complexity, 32 percent point to cloud computing, and 30 percent point to the move toward mobile devices and the bring your own device (BYOD) philosophy.

That increasing complexity and accompanying downtime doesn't come without costs. Respondents estimate that cutting network complexity in half from its current levels could save as much as $14.8 billion across the entire government, around 18 percent of its total IT budget. Given that a further 81 percent of network managers believe network complexity can slow performance, and 68 percent believe it can halt the rollout of new technologies altogether, that is a serious problem.

But what to do about it? There's one common thread on which the agencies agree: moving to open, non-proprietary standards would go a long way toward reducing the overall complexity of the network. Beyond that, there were other points that were less universally agreed on: 44 percent called for more bandwidth, 28 percent called for added redundancy, and 22 percent suggested increasing virtualization. Any of these would be a reasonably sound idea. While this may sound like just more government spending, most businesses understand the value of redundancy and extra bandwidth; it's never a bad idea to have a spare around for when something goes awry, and it's wise to have a little more capacity than is strictly needed for unforeseen events.

There's a lot of value in simplicity. The simpler something is, the easier it is to fix when something goes wrong. It's true of cars, it's true of plumbing, it's true of networks. The simpler something can be kept, the easier—the cheaper—it is to maintain. Murphy's Law is alive and well, so remember its inverse: the fewer things that can go wrong, the fewer things will go wrong.

 

