Another Victim Of The GDPR & CCPA: Security Researchers No Longer Can Get Anonymous Access To Internet Attack Data
from the not-understanding-privacy... dept
We’ve pointed out before that we’re generally bad at regulating privacy because we don’t understand privacy. Privacy regulations tend to treat certain sets of information as “special,” data that must be locked up and hidden. However, as we’ve pointed out over and over again, privacy is actually a set of trade-offs: in some situations certain information should be shared, and in others it shouldn’t. But that requires an awful lot of context — and no privacy regulation that I’ve seen takes that context question into account. Because of that, we end up with nonsensical results that often do more harm than good.
The latest example of this comes from Rapid7, a cybersecurity company that, among other things, tracks network activity and attacks to help fend off attackers. Back in 2018, Rapid7 launched its Open Data project to give more researchers access to the data generated by its Project Sonar and Project Heisenberg research efforts. Through Open Data, researchers could access data useful for improving security and understanding various online threats.
The Open Data project offered researchers two forms of access, both free. The first required signup, at which point registrants were “subject to light vetting and terms of service” before being able to access current and historical data. The second was free access “to a one-month window of recent data” from Project Sonar. However… thanks to laws like the GDPR in the EU and the CCPA in California, sharing that information is apparently becoming a liability. So Rapid7 is doing away with the second type of access: the one-month snapshot that was widely available on its website for anyone to see.
Reading between the lines, it sounds like Rapid7 was facing some threats under the GDPR and/or the CCPA, claiming that the very, very useful service it provided, letting anyone look at the data, was possibly revealing… IP addresses. And, once again, this gets into the trade-off nature of privacy. In some cases, IP address information might reveal sensitive information, but in most instances it absolutely does not. However, courts are getting aggressive about this — as you may recall from our recent story about a German court fining a company for… using Google’s fonts. The violation? Passing IP address info back to Google.
Rapid7 has noticed that this means its data service is potentially a liability:
During the past few years, we have also seen an evolving regulatory environment for data protection. Back in 2018, GDPR was just coming into effect, and everyone was trying to figure out its implications. In 2020, we saw California join the party with the introduction of CCPA. It seems likely we will see more privacy regulations follow.
The surprising thing is not this focus on privacy, which we wholeheartedly support, but rather the inclusion and control of IP addresses as personal data or information. We believe security research supports better security outcomes, which in turn enables better privacy. It’s fundamentally challenging to maintain privacy without understanding and addressing security challenges.
Yet IP addresses make up a significant portion of the data being shared in our security research data. While we believe there is absolutely a legitimate interest in processing this kind of data to advance cybersecurity, we also recognize the need to take appropriate balancing controls to protect privacy and ensure that the processing is “necessary and proportionate” — per the language of Recital 49.
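To make the quoted idea of “balancing controls” a bit more concrete: one common technique for sharing network data while limiting re-identification is to truncate IP addresses to their network prefix and then hash them with a salt. This is just an illustrative sketch, not a description of what Rapid7 actually does, and the salt value here is a made-up placeholder:

```python
import hashlib
import ipaddress

def anonymize_ip(ip: str, salt: str = "example-salt") -> str:
    """Truncate an IPv4 address to its /24 network, then salt-and-hash it.

    Truncation removes host-level identification; the salted hash lets
    researchers still correlate activity from the same network block
    without exposing the underlying address.
    """
    network = ipaddress.ip_network(f"{ip}/24", strict=False)
    digest = hashlib.sha256(f"{salt}:{network.network_address}".encode())
    return digest.hexdigest()[:16]

# Hosts in the same /24 collapse to the same opaque token...
print(anonymize_ip("203.0.113.7") == anonymize_ip("203.0.113.42"))
# ...while different networks yield different tokens.
print(anonymize_ip("203.0.113.7") == anonymize_ip("198.51.100.7"))
```

The point of a control like this is exactly the “necessary and proportionate” balance Rapid7 describes: the data stays useful for spotting attack patterns, but individual hosts are no longer directly identifiable.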
The company says it will still work to make the data available, but from now on it’s going to require registration, rather than just being openly available on the website.
Once again, this seems like it will most likely have a negative impact on actual online security and privacy… all to comply with rules that are supposed to be improving our privacy. Some of us have warned regulators about these kinds of consequences, and have always been brushed off, but we keep seeing this kind of thing happen.