
Disclose.io Offers Security Researchers Safe Harbor


Vulnerability research has its ups and downs. On the upside, researchers make software more secure by finding and reporting flaws. On the downside, some companies respond with lawsuits and cease-and-desist letters. Enter Disclose.io, a legal framework intended to protect researchers who report vulnerabilities responsibly.

Disclose.io is a “collaborative, open-sourced and vendor-agnostic project to standardize best practices for providing a safe harbor for security researchers within bug bounty and vulnerability disclosure programs.” The goal is to protect those engaged in good-faith security research from legal action.

Some organizations run bug bounty programs and give researchers clear guidelines on how to report issues, what kind of research to submit, and what kind of rewards to expect. Others work with platforms like Bugcrowd to run these programs. But there are no common rules or standard guidelines on what researchers are expected to do or how organizations should work with them.

“Some bug bounties and vulnerability disclosure programs still put white hat hackers at legal risk, shifting the risk for liability towards the hacker instead of authorizing access and creating legal safe harbors,” wrote Amit Elazari, a doctoral candidate at the University of California at Berkeley School of Law and a grantee of Berkeley’s Center for Long-Term Cybersecurity. It’s not just about authorizing hackers, but also about providing clear guidelines and setting communication expectations.

Many programs don’t include “safe harbor” language in their rules to protect researchers trying to do the right thing. “Yet, this is the very language necessary to allow hackers to find and responsibly disclose software vulnerabilities legally,” Elazari said.

To encourage more companies to adopt best practices around vulnerability disclosure, Bugcrowd partnered with Elazari to launch Disclose.io. The project combines Bugcrowd’s past work with customers on standardizing programs, Elazari’s #LegalBugBounties project, and the Open Source Vulnerability Disclosure Framework, a legal effort launched jointly by Bugcrowd and law firm CipherLaw. Elements of Dropbox’s pledge to protect security researchers are also present.

Encouraging white-hat hackers to find vulnerabilities “can be a frightening concept for people who build, run and protect software, but it's necessary to compete against the intelligence of the adversaries that are out there,” said Casey Ellis, founder and CTO of Bugcrowd. “Standardization is the best way to negate any legal or reputational blowback while still attracting the best hunters to your program.”

Companies supporting the Disclose.io framework will display the logo and commit to responding to reports in a timely manner. They are required to provide clear definitions of research scope, establish one or more official communication channels, and publish a formal disclosure policy. They also agree not to take legal action against researchers acting in good faith, creating a safe harbor. Researchers, in turn, promise to stay within scope, disclose responsibly, and access only the minimum amount of data required to create a working proof-of-concept.

The framework is ambitious: clear, concise language readable by people without a legal background, safe harbor elements for both program owners and researchers, and compliance with legal requirements. Plain language is especially important because many researchers don’t speak English as a first language, and obscure legalese can inadvertently trip them up.

Not Extortion

While many organizations are beginning to embrace bug bounties and vulnerability disclosure programs, some still view vulnerability reports as the infosec equivalent of a shakedown.

Recently, a security company filed a lawsuit against a Google security researcher and reporters over a security vulnerability. A Hungarian transportation authority called the police on an 18-year-old for reporting a bug in its travel ticket system. A drone vendor threatened a researcher with prosecution under the Computer Fraud and Abuse Act after the researcher found the company had committed credentials for Amazon Web Services accounts to its publicly accessible GitHub repository.

Threats of lawsuits and jail time can discourage cooperation and have a chilling effect on the community. In a September 2015 survey of 414 security researchers by the National Telecommunications and Information Administration (NTIA), 60 percent said the threat of legal action was a reason they might not work with a vendor to disclose vulnerabilities. The US Computer Fraud and Abuse Act (CFAA) and Digital Millennium Copyright Act (DMCA) can trip up even the most well-intentioned researcher.

Just Getting Started

As Elazari noted, there is no consensus on who defines the rules that safeguard researchers’ legal interests. The project just launched and there is a lot to figure out, including who will be in charge of enforcement and maintenance.

"Who will update terms when new laws are introduced? Who enforces penalties due to infractions by a vendor or researcher?” asked Dustin Childs, communications manager for Trend Micro’s Zero Day Initiative. ZDI pays independent researchers who find vulnerabilities and then work with companies to get the flaws fixed. “With this sort of project, there doesn’t seem to be any top-level agency providing accountability to either researcher or vendor.”

About 20 companies running bug bounty and vulnerability disclosure programs have adopted language that follows current Department of Justice guidelines on legal safe harbor for security research and addresses the DMCA, Bugcrowd said. ZDI’s Childs also noted that Disclose.io’s goals overlap with the ISO 29147 standard, which covers vulnerability disclosure.

"[If] I have limited resources, I would expect vendors to focus on being ISO compliant first. Since there’s so much overlap, you could likely do both,” Childs said. Open source, battle-tested boilerplate contracts guiding how disclosure and bug bounty programs are created and run may be a lot more effective than letting the government regulate security research.

"If we don’t actively push towards adoption of better standards that minimize the legal risks for this [white hat hacker] community, across the industry, we risk undermining the fundamental trust relationship needed to sustain it,” Elazari said.