Algorithm Might Protect Non-Targets Caught In Surveillance, But Only If The Government Cares What Happens To Non-Targets

from the something-it-has-yet-to-show dept

Akshat Rathi at Quartz points to an interesting algorithm developed by Michael Kearns of the University of Pennsylvania — one that might give the government something to consider when conducting surveillance. It gauges the possibility of non-targets inadvertently being exposed during investigations, providing intelligence/investigative agencies with warnings that perhaps other tactics should be deployed.

Rathi provides a hypothetical situation in which this algorithm might prove to be of use. A person with a rare medical condition they’d like to keep private visits a clinic that happens to be under investigation for fraud. This person often calls a family member for medical advice (an aunt who works at another clinic). This second clinic is also under investigation.

When the investigation culminates in a criminal case, there’s a good chance the patient — a “non-target” — may have their sensitive medical information exposed.

If the government ends up busting both clinics, there’s a risk that people could find out about your disease. Some friends may know about your aunt and that you visit some sort of clinic in New York; government records related to the investigation, or comments by officials describing how they built their case, may be enough for some people to draw connections between you, the specialized clinic, and the state of your health.

Even though this person isn’t targeted by investigators, the unfortunate byproduct is diminished privacy. This algorithm, detailed in a paper published in the Proceedings of the National Academy of Sciences, aims to add a layer of filtering to investigative efforts. As Kearns describes it, the implementation would both warn of potential collateral damage and inject “noise” to minimize the accidental exposure of non-targets.

In cases where there are only a few connections between the people or organizations under suspicion, Kearns’s algorithm would warn investigators that taking action could result in a breach of privacy for specific people. If a law were to require a greater algorithmic burden of proof for medical-fraud cases, investigators would need to find alternative routes to justify going after the New York clinic.

But if there were lots of people who could serve as links between the two frauds, Kearns’s algorithm would let the government proceed with targeting and exposing both clinics. In this situation, the odds of compromising select individuals’ privacy are lower.
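Neither this post nor the Quartz piece spells out the mechanics, so the following is only a rough illustration of the idea as described above, not the method from Kearns’s paper: it assumes the “noise” is the Laplace mechanism familiar from differential privacy, counts how many people bridge the two entities under investigation, and warns when the noisy count is small. The function names, the `epsilon` noise scale, and the `threshold` are invented for the example.

```python
import math
import random


def count_linking_individuals(contacts, entity_a, entity_b):
    """Count the distinct people connected to both entities under
    investigation (hypothetical helper; the actual paper frames this
    as private search over a social network, not this exact query)."""
    linked_a = {person for person, links in contacts.items() if entity_a in links}
    linked_b = {person for person, links in contacts.items() if entity_b in links}
    return len(linked_a & linked_b)


def laplace_noise(scale):
    """Draw one sample of Laplace(0, scale) noise, the standard
    differential-privacy tool for masking any single person's
    contribution to a count."""
    u = random.random() - 0.5
    sign = 1.0 if u >= 0 else -1.0
    return -scale * sign * math.log(1.0 - 2.0 * abs(u))


def privacy_check(contacts, entity_a, entity_b, epsilon=1.0, threshold=20):
    """Warn when the (noisy) number of people bridging two suspect
    entities is small enough that acting on the link would likely
    expose identifiable non-targets; otherwise let the case proceed."""
    noisy = count_linking_individuals(contacts, entity_a, entity_b) + laplace_noise(1.0 / epsilon)
    if noisy < threshold:
        return "WARN: few bridges; pursuing this link risks exposing identifiable non-targets"
    return "PROCEED: many bridges; no single non-target stands out"


# Toy data mirroring the clinic example: only two people connect the clinics,
# so the check should (almost always) warn.
contacts = {
    "patient": {"ny_clinic", "aunt_clinic"},
    "aunt": {"ny_clinic", "aunt_clinic"},
    "nurse": {"aunt_clinic"},
}
print(privacy_check(contacts, "ny_clinic", "aunt_clinic"))
```

In the clinic scenario, only a couple of people bridge the two clinics, so the check would warn; a pair of clinics connected through hundreds of patients would clear the threshold, and the investigation could proceed without any one non-target standing out.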

Potentially useful, but it suffers from a major flaw: the government.

Of course, if an investigation focused on suspected terrorism instead of fraud, the law may allow the government to risk compromising privacy in the interest of public safety.

Terrorism investigations will trump almost everything else, including privacy protections supposedly guaranteed by our Constitution. Courts have routinely sided with the government’s willingness to sacrifice its citizens’ privacy for security.

It’s highly unlikely investigative or intelligence agencies have much of an interest in protecting the privacy of non-targeted citizens, even in non-terrorist-related surveillance — not if it means using alternate (read: “less effective”) investigative methods or techniques. It has been demonstrated time and time again that law enforcement is more interested in the most direct route to what it seeks, no matter how much collateral damage is generated.

The system has no meaningful deterrents built into it. Violations are addressed after the fact, utilizing a remedy process that can be prohibitively expensive for those whose rights have been violated. On top of that, multiple layers of immunity shield government employees from the consequences of their actions and, in some cases, completely thwart those seeking redress for their grievances.

The algorithm may prove useful in other areas — perhaps in internal investigations performed by private, non-state parties — but our government is generally uninterested in protecting the rights it has granted to Americans. Too many law enforcement pursuits (fraud, drugs, terrorism, etc.) are considered more important than the rights (and lives) of those mistakenly caught in the machinery. If the government can’t be talked out of firing flashbangs through windows or predicating drug raids on random plant matter found in someone’s trash can, then it’s not going to reroute investigations just because a piece of software says a few people’s most private information might be exposed.



Comments on “Algorithm Might Protect Non-Targets Caught In Surveillance, But Only If The Government Cares What Happens To Non-Targets”

That One Guy (profile) says:

"Something that will tell us when we might violate someone's privacy? What possible use is that?"

Yeah, they’re willing to ignore laws when following them might bar them from grabbing as much as they want (everything), so there’s no chance whatsoever that they would voluntarily use a system or algorithm that might restrict their ability to collect everything, even if only with a warning of ‘this search might violate the privacy of non-targets’.

That Anonymous Coward (profile) says:

Because Terrorism...

We have people arriving at airports and being turned away, because terrorism. They are on full public display for this and the justification is because terrorism.

The target might have attended a house of worship that our expansive 17-degrees-of-guilt connections show means they must be a terrorist. It doesn’t matter that to average people the connection is ridiculously tenuous; we have a connection, so we put them on our secret list.

The target might share a name with a Congressman, because just a name alone is enough to get on the secret list.

The target might fall under one of a hundred other scenarios where an innocent person is entangled & left to deal with the fallout with no help.

They don’t care about the people they claim to protect, they care about the large production numbers that get them coverage and headlines.

So what if we out you having AIDS, we stopped corruption.
So what if we out you getting hormone therapy for gender reassignment, we stopped corruption.

This is the system that bent over backwards to justify throwing a flash bang into an occupied crib, they don’t care about the lives they ruin… only the headlines they imagine they will get for winning.

Anonymous Coward says:

the law may allow the government to risk compromising privacy in the interest of public safety

So can anyone demonstrate how any of the mass surveillance programs have benefited public safety? Has one instance of terrorism been prevented due to mass surveillance? Has one violent criminal been apprehended due to mass surveillance having been the key in breaking the case?

That One Guy (profile) says:

Re: Re:

As far as I know, the best and only example they’ve been able to put forward is that they caught someone sending a small amount of money to a foreign group, though I’m not sure if it was a terrorist group or just one that’s been categorized as being ‘questionable’.

That’s a fair trade for the mass violations of rights and privacy, right? /s

That Anonymous Coward (profile) says:

Re: Re:

They generate awesome reports after the fact about people they should have seen before the bad act happened, but in the shuffle of watching everyone they missed the important thing. Only if we pass a bill giving them more power & more cash to spend can we truly avoid these things happening the next time, at which point the demand for more power & cash will be repeated.

GEMont (profile) says:

100%

If the government can’t be talked out of firing flashbangs through windows or predicating drug raids on random plant matter found in someone’s trash can, then it’s not going to reroute investigations just because a piece of software says a few people’s most private information might be exposed.

It’s always nice when there is no need to post the obvious in response to an article about “responsible government”.

Exactly – that.

algorithsmus says:

Drug smuggling and collateral damage

It smuggles drugs into major cities using military planes at the same time as it locks up users for long periods. On the other side of the world, it targets religious radicals with drones after giving them weapons. Its accuracy is pretty terrible as well, and if bystander killings were counted, the ratio would horrify.
