We recently had the opportunity to sit down with Adam Geisser, CISSP, to discuss the SolarWinds hack. Adam has over 25 years of experience managing and delivering systems architecture, security auditing, disaster recovery and operations of enterprise financial systems.
Is there a loss of trust in vendors when they suffer a breach like SolarWinds?
Adam: I would say there has been a lot of brand damage done by this breach. The main problem is that SolarWinds is widely used network management software within the federal government. The hackers were able to get a foothold in the software, place malicious code in the latest revisions and then have those revisions downloaded by thousands of entities trying to update to the newest code. It gave the attackers a backdoor into those systems, with visibility and control over the entire environment. A lot of the companies and federal agencies running these versions of SolarWinds were forced to either disconnect those systems entirely from the internet or find an emergency patch from the vendor immediately. Many companies also faced the difficult decision of whether or not to keep those systems offline, since no one knew how deep-seated the compromise was.
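As an aside on that update chain, the kind of basic integrity check many teams bolt onto an update pipeline looks roughly like the sketch below. The file name and expected hash are hypothetical placeholders, and a check like this would not by itself have caught Sunburst, since the trojanized builds reportedly carried SolarWinds’ own valid code signature; it simply illustrates how much trust sits in the vendor update path.

```python
# Minimal sketch: compare a downloaded update against a vendor-published
# SHA-256 value before staging it for deployment. The path and hash are
# hypothetical placeholders, not real SolarWinds artifacts.
import hashlib
import sys

EXPECTED_SHA256 = "replace-with-the-vendor-published-sha256-value"  # hypothetical

def sha256_of(path, chunk_size=1 << 20):
    """Stream the file so large installers don't need to fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

if __name__ == "__main__":
    update_path = sys.argv[1]  # e.g. "orion-update.msi" (hypothetical name)
    actual = sha256_of(update_path)
    if actual != EXPECTED_SHA256:
        print(f"Checksum mismatch ({actual}); do not deploy this update")
        sys.exit(1)
    print("Checksum matches the published value; update cleared for staging")
```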
This hack will damage the company because it received so much attention on a national stage and affected federal government agencies. I believe that if it had happened mostly in the private sector, it wouldn’t have been as bad. Because it affected the federal government, everybody’s records in the United States could have been compromised, and that’s a big deal.
Why were some companies not affected by the SolarWinds hack?
Adam: My company was not affected by what is being called the “Sunburst” or “Solar Gate” hack because we were using older versions of SolarWinds – the hack only affected the 19.4 and 20.4 codebases. We also did what most companies did after the news broke: we put additional safeguards in place around our EDR (Endpoint Detection and Response) platform.
What is the difference between the SolarWinds and FireEye hacks? Was one worse than the other?
Adam: When it comes to the SolarWinds hack, most people don’t understand that the FireEye breach was the bigger issue. When FireEye was compromised, all of their penetration testing and application security testing tools were compromised too. So anyone who had FireEye as their endpoint detection solution potentially had bigger issues than anything that could have come out of the SolarWinds attack. Once hackers got into those endpoint detection tools, they could reverse engineer them and then hack into other companies. The FireEye hack, in my opinion, was much bigger and more dangerous to companies out there than the SolarWinds hack.
Concerning SolarWinds – is there additional work from a compliance perspective for vendors who can access a corporate environment and may use these tools?
Adam: With the vendor assessment process we had in place before SolarWinds, vendors already had to go through some level of scrutiny. We do a security assessment on vendors to understand whether they have the same controls and regulations that we have in place. In most cases, we require our vendors to have at least comparable security and regulatory controls for managing PII data, Social Security numbers, bank account information, etc. Some questions we use (and most companies use) to determine this are listed below, with a short sketch after the list showing how the encryption-in-transit question can be spot-checked:
- What are you doing with our data?
- Is it encrypted in transit and at rest?
- Can you provide evidence of this?
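As a rough illustration of the second question above (this is not a tool Adam mentions), a spot-check of the “encrypted in transit” claim can be as simple as confirming that a vendor endpoint negotiates modern TLS and presents a currently valid certificate. The hostname below is a hypothetical placeholder, and the vendor’s written evidence remains the primary artifact in the assessment.

```python
# Minimal sketch: confirm a vendor endpoint negotiates TLS 1.2+ and report
# how long its certificate remains valid. The hostname is a hypothetical example.
import socket
import ssl
import time

VENDOR_HOST = "api.example-vendor.com"  # hypothetical vendor endpoint
PORT = 443

context = ssl.create_default_context()            # verifies cert chain and hostname
context.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse legacy protocols

with socket.create_connection((VENDOR_HOST, PORT), timeout=10) as sock:
    with context.wrap_socket(sock, server_hostname=VENDOR_HOST) as tls:
        print("Negotiated protocol:", tls.version())  # e.g. 'TLSv1.3'
        cert = tls.getpeercert()
        expires = ssl.cert_time_to_seconds(cert["notAfter"])
        print("Certificate valid for", int((expires - time.time()) // 86400), "days")
```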
These questions are particularly important to me because I was at a company five years ago when Experian was hacked. We were the ones who had to reveal to our customers that their accounts were compromised, since we used Experian for our customer credit card verifications. Even though Experian was the one that was hacked, we were the ones who had to deal with the reputational risk. The third-party vendor did all of our verifications and, apparently, had held onto all of the credit card numbers and the names associated with our customers in the compromised database.
When it comes to notifying customers about a hack, a company may have to send out millions of notifications telling customers that their data was potentially compromised and then mitigate the fallout for those customers. Back in the 90s, this might have cost around $3-4 per notification. Today, it costs about $200-300 per notification; at those rates, notifying millions of customers quickly runs into the hundreds of millions of dollars. That’s why companies are now very reluctant to reveal to their customer base that they have been hacked before they have a clear understanding of the scope of the breach.
Why don’t companies typically invest in security?
Adam: There are a lot of companies out there that don’t invest enough money and people in security. Most of the time, that’s because all they care about is how it will affect their bottom line and what revenue will be generated by putting security in place when, in reality, there isn’t any. However, the companies that fare best in these types of security breaches have leadership and boards that are fully invested in ensuring the proper security controls are in place. The companies that don’t are the ones that typically get compromised.
Regarding the Experian hack, they should have had much more rigid security controls in place to prevent what happened, as they had simply forgotten to patch a server. The server they didn’t patch, or didn’t get around to patching, had a known vulnerability, which made it easier to compromise and to exfiltrate millions of records from their database. As a result, all of the companies that used Experian for their credit card processing had to notify their customers that their information had been compromised. It’s not the fault of the companies that used Experian, but they are still held accountable. There were some interesting conversations between our legal team and their legal team for months after the hack happened. We assumed that they had the proper controls in place. When we did the assessment, they said they did these things, but they were compromised because, in reality, they didn’t.
Why is patching such an issue?
Adam: Patching is one of the biggest issues, if not the biggest, that keeps CISOs up at night because most companies don’t patch well. It seems like the simplest thing to do, but it is often just swept under the rug.
For example, how often do you install the latest iOS or Android version on your phone when a new version comes out? Many of those updates include security fixes, but most people don’t see updating as a priority. If your phone, tablet, or computer gets compromised because you didn’t install the latest and greatest, it’s on you.
That’s what companies have done. They haven’t focused their attention on the one thing they can control: keeping systems patched to the latest version or putting the appropriate security controls in place to mitigate some of those risks. However, most companies don’t see this as a priority, so we will continue to see compromises caused by poor patching.
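To make that concrete, a minimal version of the baseline check Adam is alluding to might look like the sketch below. The component names and version numbers are hypothetical, and in practice the installed inventory would come from a CMDB or an endpoint agent rather than a hard-coded dictionary.

```python
# Minimal sketch: flag components whose installed version is behind the
# minimum patched baseline. All names and versions here are hypothetical.

def parse(version):
    """Turn '2.3.30' into (2, 3, 30) so versions compare numerically."""
    return tuple(int(part) for part in version.split("."))

# Hypothetical inventory of what is installed on a host
installed = {"web-frontend": "2.3.30", "report-service": "1.8.2", "monitoring-agent": "19.4.1"}

# Hypothetical minimum patched versions taken from vendor advisories
baseline = {"web-frontend": "2.5.0", "report-service": "1.8.2", "monitoring-agent": "20.2.1"}

for component, required in baseline.items():
    current = installed.get(component)
    if current is None:
        print(f"{component}: not found in inventory; investigate")
    elif parse(current) < parse(required):
        print(f"{component}: {current} is behind required {required}; patch")
    else:
        print(f"{component}: {current} meets the baseline")
```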
About Adam
“I have over 25 years of Cybersecurity, Information Security and Security Operations experience in my background. I have worked in the Banking and Financial Services, Telecommunications, Travel and Staffing Services industries over that time. My areas of focus have been Vulnerability Management, Application Security, Penetration Testing, Incident Response, Threat Hunting, Risk Management and Regulatory Compliance. I have worked as a Security Analyst, Security Engineer, Security Architect and Manager of Operations, and most recently have taken on management of the Systems Administration team along with being responsible for all Cybersecurity Operations from cradle to grave.
I am the father of four and have been married to HR (my wife) for the last 27 years.”
