The revelations contained in the whistleblower disclosure by a former Twitter security executive paint a bleak picture of the company’s security culture and data governance practices, but many of the problems and shortcomings are not unique and reflect broader challenges many large organizations face while trying to get a handle on what data they’re collecting and who may have access to it. Those issues are attracting the attention of lawmakers and may lead to new legislation or further regulation of tech companies.
In his disclosure last month and in testimony Tuesday before the Senate Judiciary Committee, Peiter Zatko described serious security issues, including a lack of meaningful security controls, broad access to user data by thousands of employees with no need for such access, and no logging on critical internal systems. Zatko, who was in charge of information security, privacy, IT, and physical security at Twitter from November 2020 until he was fired in January 2022, said that when he first joined the company he discovered that it had at least 10 years of technical debt and was well behind its peers on security and technical innovation. Perhaps most worryingly, Zatko, known as Mudge in the security community, described an environment in which Twitter engineers and officials did not know how much data the company gathered on users, where that data was stored, or who had access to it.
“They don’t know what data they have, where it lives, or where it came from, and so unsurprisingly, they can’t protect it. So employees have to have too much access to too much data,” Zatko said in his testimony Tuesday.
“Twitter didn’t even know what it was collecting. Why do they keep having the same incidents year after year? What is fundamentally broken under the hood?”
Zatko said that shortly after he joined Twitter, engineers raised concerns about some of these issues with him, and he in turn brought them to the attention of senior executives several times.
“This was a ticking bomb of security vulnerabilities. Staying true to my ethical disclosure philosophy, I repeatedly disclosed those security failures to the highest levels of the company. It was only after my reports went unheeded that I submitted my disclosures to government agencies and regulators,” Zatko said in his testimony.
A Twitter spokesperson said that the company manages access to data with access controls, monitoring, and detection systems.
Some of the problems Zatko describes in his disclosures apply to just a small handful of companies. The misinformation and disinformation operations, impersonations, and mass influence campaigns are serious issues for Twitter and Facebook, but they don’t really apply to most enterprises, even the largest ones. The broader concerns Zatko raised about excess data collection, mishandling of data, and a lack of security controls, however, are everyday challenges for many organizations, regardless of the industry they’re in.
“The lack of ability to internally identify inappropriate access in our own systems, it was extremely difficult to track people. The lack of logging, what info was accessed, or to contain activities let alone set steps for reconstitution or remediation,” Zatko said.
“Trying to understand what an adversary is doing would be pretty challenging without logs.”
The idea that you can’t protect what you don’t know you have is axiomatic in security, and it applies not just to devices but to the information an organization collects and stores. Knowing where user and customer data resides, what it’s used for, who can access it, and why are all difficult questions for any organization to answer.
"It’s not a tech problem, it’s a hard thing to overcome years of neglect. We’ve underestimated and underinvested in privacy for decades because privacy is just air and no one wants to invest in air," said Michelle Finneran Dennedy, co-founder of Privacy Code and co-author of The Privacy Engineer's Manifesto.
"Privacy is contextual and time based, it’s storytelling. If you haven’t built data intentionality and data flows, you get that answer that we don't know where things are."
Platforms such as Twitter, Facebook, Instagram, and others collect huge amounts of data for a variety of purposes, and protecting that information is no small task. Many companies have discovered this firsthand, including Twitter, which in 2011 signed a consent decree with the Federal Trade Commission as a result of incidents in which attackers were able to gain admin privileges on Twitter systems in 2009. Under the terms of the agreement, Twitter was subject to 10 years of independent audits of its security practices.
Zatko said that in his time at Twitter, the consent decree did not seem to be a huge concern.
“Foreign regulators were much more feared than the FTC. One-time fines are priced in. Regulators do have tools, but they don’t know if they’re working. The laws get gamed by companies’ ability to answer questions in the affirmative without having done the work. They’re grading their own homework in a sense,” he said.
“In big tech, (regulators) are absolutely outgunned. From what I have seen, the tools that are used out of the toolbelt aren’t working. Other tools in the toolbelt do work but the regulators haven’t been able to quantify them in order to use them.”
During Tuesday’s hearing, several senators raised the possibility of increased regulation of large tech platforms, with Sen. Richard Blumenthal (D-Conn.) going so far as to suggest the creation of a new federal agency to handle the job, an idea that Sen. Lindsey Graham (R-S.C.) said he supported.
“We’re going to create a regulatory agency with teeth. The regulatory environment is insufficient for the task. It’s time to up our game,” Graham said.
The idea of a data privacy agency is not a new one and has been part of several privacy bills over the years, but Finneran Dennedy said it is not a simple concept to bring to fruition.
"I'm not sure I want to see another giant bureaucracy in Washington and staffing it would be difficult. It’s such a specialty and you have to be a zealot to really do privacy well. We’re in a place where we’re making up for years of neglect. We’re in the clean up the abattoirs phase. We have to do the dirty, boring work," she said.
"We have real risk on the table and real tech debt. It's a wicked problem and there are no easy answers. There are going to be compromises all along the way."
For Zatko, who has spent 30 years working in the private sector and government trying to raise awareness about endemic security vulnerabilities, better oversight of data practices could be one positive outcome of an unpleasant process.
“I’m basically risking my career and reputation, and if something good comes from this five or ten years down the road, then it’s worth it,” he said.