It’s a good time to be a hacker. Maybe.
Sure, ethical hackers are earning more than ever through corporate “bug bounty” programs – in rare cases netting them a six-figure bounty for a single security flaw – but legal protections for security researchers haven’t caught up. Researchers are still getting sued in disputes over the terms of bounty programs and continue to run afoul of intellectual property safeguards or anti-hacking laws.
It’s time for both sides to get their house in order.
Amit Elazari, a doctoral candidate at the University of California, Berkeley, who has reviewed “hundreds” of bug bounty agreements, says the rules need to be changed and standardised to better protect white hats who are trying to do the right thing.
Elazari has a simple plan. At a recent cybersecurity conference, she called on hackers to boycott bug bounty programs with less-than-perfect terms.
Companies with bounty programs should adopt the relatively new guidelines drawn up by the U.S. Justice Department, which help companies establish best practices for their programs. Ultimately, the guidelines boil down to the need for complete clarity around scope and an acceptance that researchers who operate correctly within that scope must receive legal protection for their actions.
Elazari and others point to white hat Kevin Finisterre’s experience last year, when he discovered a major flaw in Chinese drone-maker DJI’s systems. As his probing increased, the company threatened him with the Computer Fraud and Abuse Act. Finisterre walked away from the $30,000 bounty after DJI refused to provide legal protections, and went public with his claim that DJI threatened him—a claim DJI denies.
Adam Bacchus, director of program operations for HackerOne, a bug bounty management platform, summed up the issue in a recent interview.
“If you see your neighbor’s door is unlocked, you might say, ‘Hey, your door is unlocked’. You don’t expect your neighbor to open up the door with a shotgun and try to attack you.”
It’s a good soundbite but it’s a little disingenuous. You don’t really see that your neighbor’s door is unlocked. To find out it’s unlocked, you go next door, and turn the handle. If someone I don’t know is walking around my house, turning door handles to see if one is open, I assume they have bad intentions.
It’s this complexity that means it will be an uphill battle to persuade companies and platforms to incorporate rigorous protections for hackers into their bounty terms. Elazari said that of the dozens of bug bounty programs she’s studied, only four currently offer sufficient legal protections.
Meanwhile, even as tens of thousands of hackers participate in bug bounty programs, there’s a legacy of suspicion. In 2015, 60 percent of them cited legal liability as a reason they might not work with a vendor to disclose a vulnerability.
On the other side, anyone who has worked on an enterprise cybersecurity team will have a story of being approached with a request along the lines of: “I have found vulnerabilities in your systems; pay me and I’ll tell you about them.”
We regularly have clients get extorted by “researchers” with requests for payment for vulnerabilities they’ve supposedly found. Equally, staff have received threats from vendors for vulnerabilities they have tried to report responsibly.
Three layers of change are required to clarify the situation.
First, legitimate bug bounty programs need to offer sufficient protection to the researchers using them. Platforms such as Bugcrowd and HackerOne are significantly accelerating this maturation process, standardising the industry and improving relations between both parties.
Secondly, researchers must understand that security testing of environments without a bug bounty program is generally going to be illegal. Disclosures won’t be well received unless they are unquestionably made for the greater good – that is, genuinely responsible and without the expectation of payment.
And finally, to increase the number of companies in the first group, the market needs to start demanding that bug bounty programs be in place. As a consumer, I want the systems I use each day to be subject to testing by the best minds out there. Anything that encourages and supports that is a good thing.
A well-established bounty program protects not only a business, but also that business’s customers. Paying white-hat hackers to effectively run penetration tests across your entire environment is much cheaper than recruiting the talent to work in-house. It’s also vastly cheaper than the potential fallout if a black hat finds the flaw first.