Whenever there is a breach or security incident, the infosec armchair quarterbacks come out in full force, speculating about what went wrong and pointing out what "should" have been done. A quieter group commiserates with the security team working around the clock to mitigate the issue, but for the most part, the blame game is strong.
In response to reports that Boeing had been hit by malware, the armchair CISOs were ready to skewer the aviation giant for missing security basics. Since the report suggested Boeing had been infected with WannaCry, the ransomware worm that used an exploit allegedly developed by the National Security Agency to infect Windows machines, critics wondered why Boeing hadn't already patched all its systems. The exploit used by WannaCry targets a Windows vulnerability that Microsoft had patched a year earlier.
There are lots of unknowns here: whether it was actually WannaCry or a variant (Boeing said media reports were "overstated and inaccurate"), which systems were infected ("small number of systems"), how the malware got on the network ("limited intrusion"), and how it spread. Pretty much the only known thing is that Boeing resolved the situation ("Remediations were applied, and this is not a production or delivery issue.").
Treating all victims with dignity.
If the tendency is to blame the organization for not prioritizing security, or to assume the organization did something obviously wrong, then the focus isn't on understanding the root cause of the incident or breach. Every organization has a different set of circumstances; security is not one-size-fits-all, which means the root causes of security failures also vary from organization to organization. The root cause—why something happened—depends on a long list of factors, including the company structure, the technology, the culture, and the people.
The city of Atlanta was infected by SamSam ransomware, which crippled several city services. Part of the reason was the city's technology debt—long-standing issues that had gone unaddressed because there was neither the time nor the resources. But that doesn't mean the city deserved to get infected.
"One of the most difficult lessons I learned over 25 years of EMS/rescue was having empathy for patients without excusing or condoning poor choices. Treating all victims with dignity. Made me a much better paramedic," Rich Mogull, founder and CEO of analyst firm Securosis, wrote on Twitter. "Hint hint infosec..."
It is easy from the outside to look at the most recent chain of events and the vulnerabilities involved, and then make judgments about why the security incident happened. It is easy to blame the victim for the security failure, for not doing something differently. But everyone's situation is different, so the right choices for one organization won't be the same for another. Organizations can be in the same industry, face similar threats, and have the exact same technology stack, yet still have very different security programs and risk profiles, because the people, the culture, and the processes will be different.
Criticizing an organization for not patching ignores the reality that there can be valid reasons for not being able to patch. Noting that a certain piece of technology was not in place ignores all the prior events that justified that decision.
Let's go back to Boeing for a second. The company has defense contracts with the United States and other countries around the world. John Martin, who runs Boeing's application security program, is an information security veteran who has done a lot to improve the industry, such as helping found the Aviation ISAC. The security team understands how to manage supply chain security. This is a company that is under attack all the time.
Boeing being hit by fast-moving malware isn't the moment to denigrate the company's efforts, but to realize that if Boeing can get infected, then so can anyone else. Instead of asking what Boeing didn't do, a better question is whether other organizations could respond as effectively and quickly as Boeing appears to have done. Do their security teams know how to respond and mitigate? Is the environment architected in a way that contains the damage?
Having empathy for the victims doesn't mean avoiding discussion of what happened, or failing to learn from it. There is a way to discuss what worked, what helped with defense as well as recovery, and what didn't work, without resorting to smug superiority. Apply the lessons learned.