Is There a Silver Bullet for Ransomware?
CEOs, CIOs, CROs and business owners. Listen up for a minute. If you have never been in the situation where your entire IT infrastructure is down, it’s hard to understand the complete feeling of uselessness. You can sympathize, but you can’t understand.
If I could only convey the feeling of uselessness…of second-guessing. What could we have done better? How are our backups? Is Best Buy open at 3 a.m. to get the jump drive needed to restore a critical system? At that moment you realize, “OH NO!*@# … every system you have is down.” If I could convey that feeling, the topic of ransomware would be very easy.
Until you are there, you cannot imagine what it feels like. And if you don’t have good backups, your systems are never coming back up. Imagine that.
Tales from a Petya Outbreak
We were just called in to assist with another ransomware incident. Today it’s a different strain, but the circumstances are mostly the same.
The story is pretty repetitive and I can tell it in my sleep. It goes like this: the technical people have the expertise, the tools are available, the best practices are clear, but the culture doesn’t allow for basic competence.
Today, all of the technical people down in the weeds are super frustrated. They have been for years, but the job is OK and it pays the bills. They all joke about the status of their network. They joke when the most talented of them leave for a better environment. They joke when the headhunters send them emails in the middle of the crisis. They joke because the solutions to their problems are obvious. No one will give them authority to do what they need to do.
They can’t even get a maintenance window to patch systems.
Today is much more frustrating than most. Today the Petya virus affected all of their systems. All of their systems.
The Petya virus payload had encrypted the MBR (master boot record), the first sector of the disk that holds the code needed to boot the operating system. This means that for the next 72 hours there will be very little sleep, a lot of free food (because management feeds them) and a whole lot of frustration.
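As an aside: because the MBR is just the first 512-byte sector of the disk, it can even be backed up ahead of time. A minimal sketch on Linux (using a dummy disk image here as a stand-in; on a real system the input would be the boot disk itself, e.g. /dev/sda):

```shell
# Stand-in for a real boot disk (a 1 MiB empty image)
dd if=/dev/zero of=disk.img bs=1M count=1 2>/dev/null

# Copy the first 512-byte sector -- the MBR on a real BIOS boot disk
dd if=disk.img of=mbr-backup.bin bs=512 count=1 2>/dev/null

# The backup is exactly one sector
wc -c mbr-backup.bin
```

A sector-level backup like this is no substitute for full system backups, but it illustrates how small and recoverable the structure Petya destroys actually is.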
Why is there frustration? Because as I stated above this could have been easily avoided. All of the technical people know (or knew before being poached by the headhunter) exactly what to do. The fault lies with the executive management team.
CEOs, CIOs, CROs and business owners, listen up. Your technical people know what to do. If you got hit by the latest strain of ransomware or any ransomware, it means that there were failures on many levels.
Is network, data, email, cloud, apps important to you in running your business? Then stop treating it as an expense line item. These apps and interfaces are more and more the face of your company to the customer.
How is the culture in your IT department? Are they telling you what you want to hear? Ask them what the real problems are. Get HR involved, conduct 360-degree feedback reviews, and take them seriously.
The IT community is unique in that they have a vibrant online community. They troubleshoot issues and post their solutions. They have developed best practices and standards. They help each other with issues. They are always improving their skill set. Encourage them.
What is your status today?
Ransomware is actually not difficult to stop. It simply takes support from the top. I’m going to describe the situation I just found myself in, and you can ask yourself whether you are in a similar one.
If any of your answers are “N/A” (you don’t know the answer to these questions), I would recommend reevaluating your entire IT function.
Is it difficult to stop ransomware? What is the silver bullet?
Is there a silver bullet? Well, yes and no. There is not one simple thing you can do to protect yourself; there are many. If you don’t do any of them, you are at very high risk; if you do four out of five, you are probably OK.
At the place I left yesterday, the 12 hours of brainstorming, testing, and validating could all have been prevented with a few simple steps. However, it was the perfect storm. Patches were not applied. The network design was flawed. No advanced threat protections were in place. There was no visibility into east/west traffic. There were multiple ingress/egress points, and at the remote sites Internet traffic was not even logged. And to top it all off, there were “any, accept” rules in the firewall (meaning a firewall that looks like Swiss cheese).
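To make that last point concrete: an “any, accept” rule matches every source, destination, and service, so the firewall is effectively not filtering at all. Sketched in iptables syntax (a hypothetical illustration with made-up addresses; the affected network ran a commercial firewall, but the logic is the same), the difference between Swiss cheese and least privilege looks like this:

```text
# Swiss cheese: "any, accept" -- every host can reach every port, everywhere
-A FORWARD -j ACCEPT

# Least privilege: permit only the flows the business actually needs...
-A FORWARD -p tcp -s 10.0.1.0/24 -d 10.0.2.5 --dport 443 -j ACCEPT
# ...and drop everything else, so a worm like Petya cannot spread east/west
-A FORWARD -j DROP
```

The second pattern is what stops a single infected workstation from reaching every other machine on the network.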
How can we take steps today?
To reconfigure a network after 20 years of existence with hundreds of applications is a very hard thing to do. I completely understand that. But if you can’t reconfigure your network properly, take other precautions.
CEOs, CIOs, CROs and business owners. Listen up. You can prevent this from occurring, but it starts with you.
I don’t know of a single company that has a security mandate from the top that has gotten encrypted.
As always, if you have any questions or concerns about this or if you want to know if you are at risk, please contact us. We exist to secure your network. We can at least point you in the right direction.
About the Author
Paul Warnagiris is the CEO and Senior Security Analyst of The Teneo Group, an IT security services company. Paul has 20+ years of experience in IT security, including network security roles at UUNet and a position as CISO at Promontory Interfinancial Network. Paul is the author of Teneo’s mission: to secure the networks and data of their clients.