School Security Software Decides Innocent Parent Is Actually A Registered Sex Offender

from the you-can't-argue-with-(search)-results dept

An automated system is only as good as its human backstop. If the humans making the final judgment call are incapable of using good judgment, the system is useless.

School personnel allowed a machine to do all of their critical thinking, resulting in this unfortunate turn of events.

Staff in an Aurora school office mistakenly flagged a man as a registered sex offender when he and his family went to his son’s middle school for a recent event.

Larry Mitchell said he was humiliated Oct. 27 when Aurora Hills Middle School office staff scanned his driver license into a software system used to screen visitors to Aurora Public Schools district schools.

The system, provided by a private company, flagged Mitchell as a potential match with a registered sex offender in a nationwide database. Staff compared Mitchell’s information with the potential match and determined that the match was correct, even though there are no offenders in the national sex offender registry with his exact name and date of birth.

Not only did these stats not match, but the photos of registered sex offenders with the same name looked nothing like Larry Mitchell. The journalists covering the story ran Mitchell’s info through the same databases, including Mitchell’s birth name (he was adopted), and found zero matches. What the search did turn up was a 62-year-old white sex offender who also sported the alias “Jesus Christ,” and a black man roughly the same age as Mitchell, who is white.

School administration has had little to say about this botched security effort, other than that policies and protocols were followed. But if so, school personnel need better training… or maybe at least an eye check. Raptor, which provides the security system used to misidentify Mitchell, says photo-matching is a key step in the vetting process [PDF].

In order to determine a False Positive Match the system operator will:

i. Compare the picture from the identification to the picture from the database.

ii. If the picture is unclear, we will check the date of birth, middle name, and other identifying information such as height and eye color.

iii. The Raptor System has a screen for the operator to view and compare photos.

iv. If the person or identifying characteristics are clearly not from the same person, the person will then be issued a badge and established procedures will be followed.
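Taken literally, Raptor’s checklist amounts to a simple decision procedure. Here is a minimal sketch in Python (the function name, the tri-state photo result, and the record fields are all hypothetical illustrations; in the real system a human operator performs each step on a review screen):

```python
def is_false_positive(photo_result, visitor, db_hit):
    """Steps i-iv of the checklist: should a flagged match be dismissed?

    photo_result: the operator's visual call on the two photos,
                  one of "match", "no_match", or "unclear".
    visitor, db_hit: dicts of identifying details from the scanned ID
                     and from the registry entry.
    """
    if photo_result == "match":
        return False  # treat as a genuine hit; follow escalation procedures
    if photo_result == "no_match":
        return True   # step iv: clearly not the same person, issue a badge
    # Step ii: photo unclear, so fall back to identifying details.
    fields = ("date_of_birth", "middle_name", "height", "eye_color")
    return any(visitor.get(f) != db_hit.get(f) for f in fields)
```

In Mitchell’s case both branches pointed the same way: the photos clearly didn’t match, and neither did the date of birth or name, yet staff confirmed the hit anyway.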

Even if you move past the glaring mismatch in photos (the photos returned in the Sentinel’s search of Raptor’s system are embedded in the article), neither the school nor Raptor can explain how Raptor’s system returned results that can’t be duplicated by journalists.

Mitchell said he was adopted, and his birth name is Lawrence Michael Evans. The Sentinel did not find a match with that or his legal name and date of birth in the national sex offender registry.

Raptor says its system is reliable, stating it only returned one false positive in that county last year. (And now the number has doubled!) That’s heartening, but that number will only increase as system deployment expands. Raptor’s self-assessment may be accurate, but statements about the certainty of its search results are hardly useful.

The company’s sales pitch likely includes its low false positive rate, which, in turn, leads school personnel to believe the system rather than the person standing in front of them — one who bears no resemblance (physical or otherwise) to the registry search results. Mitchell still isn’t allowed into the building without a security escort and is hoping that presenting school admins with his spotless criminal background check will finally jostle their apparently unshakeable belief in Raptor’s search results.

This failure is also an indictment of security-over-sanity thinking. The Sentinel asked government officials if there were any incidents in which sex offenders had gained access to schools, thus necessitating this $100,000+ investment in Raptor’s security system. No results were returned.

Neither local school or state public safety or education officials could point to data showing how many registered offenders try to seek access to schools, or if a registered offender visiting a school has ever harmed a student in Aurora or Colorado.

Given this history, Raptor’s system is always going to be better known — at least at this school — for locking out non-criminals than catching sex offenders trying to be somewhere they shouldn’t. If the schools haven’t seen activity that necessitates the use of this system, it will always produce more false positives than actual hits. When there’s no one to catch, you’re only going to end up stigmatizing innocent parents. It’s a lot of money to pay for solving a problem that doesn’t exist. The school has purchased a tiger-proof rock and somehow managed to hurt someone with it.
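The base-rate problem here is easy to put numbers on. A back-of-the-envelope sketch (every figure below is invented for illustration; neither the article nor Raptor publishes prevalence or error-rate data):

```python
def expected_flags(scans, prevalence, false_positive_rate, detection_rate=1.0):
    """Expected true hits vs. false accusations from a screening system."""
    offenders = scans * prevalence    # visitors actually on the registry
    innocents = scans - offenders     # everyone else
    true_hits = offenders * detection_rate
    false_hits = innocents * false_positive_rate
    return true_hits, false_hits

# 100,000 scans, a 1-in-100,000 chance the visitor is a registered
# offender, and a generously low 1-in-50,000 false positive rate:
true_hits, false_hits = expected_flags(100_000, 1e-5, 2e-5)
# roughly 1 real hit against 2 false accusations
```

And if the local prevalence is effectively zero, as the Sentinel’s reporting suggests, every flag the system ever raises at this school will be a false one.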

Companies: raptor


Comments on “School Security Software Decides Innocent Parent Is Actually A Registered Sex Offender”

58 Comments
A. "Nomma" Lee says:

And you want to let GOOGLE keep data on everyone?

What if Google swaps your searches with, oh, Hitler’s?

Or is Google’s software perfect, besides its motives, so should be allowed — and it IS entirely the choice of “natural” persons whether that or any corporate fiction is even written on paper, but we do need to inform our elected servants when it’s clearly become a hazard to our privacy — to track everyone everywhere forever?

PaulT (profile) says:

Re: Re

“What if Google swaps your searches with, oh, Hitler’s?”

Well, first I’d be either laughing at the obviously erroneous data or congratulating whoever came up with the first time machine.

But, the data being stored is not a problem. It’s what’s done with the data that’s the issue. Here, the issue was not the false flagging. The problem was the school acting on it even though 5 seconds of due diligence would have told them it was not correct.

Similarly, Google having false information about me is not a problem. It’s when the people you defend insist they have to automatically process that data in case I watch 2 seconds of video without them collecting a toll when it becomes problematic. It’s when governments insist they have to become de facto state censors because they can’t be bothered to train or fund law enforcement properly when it becomes an issue.

Surely, even someone as rabidly committed to being an idiot as you are can see the difference?

That Anonymous Coward (profile) says:

“Raptor says its system is reliable, stating it only returned one false positive in that county last year.”

Are they only deployed in 1 county? Or are they trying to keep the numbers low by subdividing incidents into tiny categories to bolster their claims of being awesome?

When “Do Something!!!!!!!!” meets snake oil salesmen.

We picked a boogeyman & did something, at the expense of your children’s education. Don’t judge us by this absolute failure of our system & users, stay focused on the imagined stories of sex offenders grouping up to kidnap hundreds of children at once from schools under the unwitting noses of staff.

This doesn’t even begin to touch on the wildly different ways one can end up on the sex offender registry – abusing a child all the way to peed in an alley to mooned people from a bus to his girlfriend was a year younger & the parents screamed DO SOMETHING!!!!!!!!!

Wouldn’t this money have been better spent on active shooter drills & teaching kids how to apply pressure to gunshots to save lives??

Anonymous Coward says:

Re: Re:

Wouldn’t this money have been better spent on active shooter drills & teaching kids how to apply pressure to gunshots to save lives??

Or here’s a novel idea, maybe use the money on paying teachers more and buying school supplies so the teachers don’t have to provide them themselves?

I know, it’s far fetched. What school district would do such a thing?

Anonymous Coward says:

Re: Re: Re: Re:

“Thats why we pay those who educate them so shittily & pay administrators who sign these contracts way more than they are worth.”

The school administrators who sign these contracts need to be paid a king’s ransom, because otherwise they’ll quit and work for one of the companies who sold them some multi-million dollar polished turd.

Anonymous Anonymous Coward (profile) says:

To think or not to think, but the machine told me so!

There’s the concept of doing things right, or doing the right thing, a concept many politicians have a hard time with, and seemingly bureaucrats as well.

This incident also points out the lack of critical thinking training in our educational systems. If the person running the machine couldn’t decide that two unlike photographs meant a machine error, then they shouldn’t be able to get a license to walk, let alone a position of responsibility with any school system.

PaulT (profile) says:

Re: To think or not to think, but the machine told me so!

Indeed, this is where the real problem lies. A system that’s obviously wrong on a regular basis keeps people vigilant. A system that’s 100% correct every time is impossible.

So, we end up in dangerous ground, where people trust the machine even when it’s obviously wrong, because it’s so rarely in that state. People drive their cars off cliffs because the sat nav told them to, even though their own eyes tell them it’s wrong.

At least in those cases we can laugh at the idiots, but when the person is a victim because other people are gullible morons it’s less entertaining. Gilliam’s Brazil and Kafka’s The Trial are not places we want people to actually be held in real life.

That One Guy (profile) says:

When ego and profits are on the line

Mitchell still isn’t allowed into the building without a security escort and is hoping that presenting school admins with his spotless criminal background check will finally jostle their apparently unshakeable belief in Raptor’s search results.

The match was bogus and couldn’t be repeated. Rather than admit that the system and the staff screwed up, they instead doubled down and are acting as though the system was right the first time.

There’s pigheadedness, and then there’s outright denial of reality, and I suspect I know why (beyond just a simple refusal to admit to having screwed up)…

The Sentinel asked government officials if there were any incidents in which sex offenders had gained access to schools, thus necessitating this $100,000+ investment in Raptor’s security system. No results were returned.

Admitting that it and they were wrong here, blatantly so, could lead to people asking just how good the product and ‘procedures’ involved are, and since that would require admitting that both are notably flawed, well, it’s easier to just pretend it was spotless all along, even if it leads to an innocent man being treated as a sex offender.

Anonymous Anonymous Coward (profile) says:

Re: When ego and profits are on the line

There is that statistic that says most sexual abuse comes from people the abused knew and/or even family members, not strangers. Just what activity at a school event, with lots of other adults and one would think faculty members around, is a sex offender going to do? The odds tell us that they already know their victim, most of the time.

DB (profile) says:

Re: Re: When ego and profits are on the line

It’s difficult to estimate the percentage of stranger sexual abuse of children because we can’t know what incidents aren’t reported.

But a reasonable proxy is abductions. It’s not an exact analogy, but it’s somewhat informative. Children taken by strangers or slight acquaintances represent only one-hundredth of 1 percent of all reported missing children.

Christenson says:

Re: Re: Re: When ego and profits are on the line

Here’s another approximate proxy:
Number of people in your life that have been in positions of trust and you have withdrawn that trust.

I count one of those, but I haven’t heard of any cases of a stranger abusing someone. Publicly, lots of Catholic priests. We all know of Larry Nassar. Ariel Castro is the only abduction case I can think of.

Plenty of ways to ensure there is no hanky panky at a school event; simply keeping an eye on someone specific will do that.

Matthew Cline (profile) says:

A possible scenario...

A possible scenario to explain what happened: whoever was running the system realized that Larry Mitchell wasn’t the person in the database. However, that person thought

What if, by pure coincidence, the Larry Mitchell in front of me is a sex offender who has so far evaded detection? If that turns out to be the case, then stupid people will blame me for letting in a sex offender, and I’ll probably be fired by higher-ups covering their asses. I can avoid that by pretending to think that false positives are impossible.

Kitsune106 says:

Please please please

Tell me this system is separate, and not WiFi enabled. And hopefully the lists are kept up to date. Otherwise, it sounds like it could easily have issues if people can hack in, or insert false data?

But seriously, how is the list updated, what is the process to check it, and can the data be manipulated? Since data like that is only as good as its weakest link. And since just following orders apparently is okay (sarcasm), well….

All I can think of is the earth 2140 thing. Where the computer overlord causes earth’s destruction due to bad data!

Anonymous Coward says:

To think that only 20 years ago, no one ever needed to show a driver’s license to board an airplane, and now everything about us is suspect and needs to be checked, from state-issued ID to luggage to genitals. And if the TSA’s unaccountable No Fly List says you can’t fly, then that’s that, there’s no way to prove your innocence, or even know what you’re accused of doing.

It’s sad to see that the same sort of airport-grade security that we’ve become accustomed to is now trickling down into public schools, apparently to prevent some sort of rare crime that’s in all probability never happened before.

But when do other, more common crimes get added to the list? And how long before visiting parents, in addition to being ID’d, get radiated and body searched upon entering a school, in addition to all the children?

Anonymous Coward says:

Favourite quote from the Sentinel article is the Aurora Public Schools spokesperson subtly shifting the blame to the innocent parent:

“Safety is a top priority,” Christiansen said. “This was just a matter of us following our district protocol. Unfortunately the parent who showed up this morning declined the escort and left.”

That One Guy (profile) says:

Re: Re:

Translation: ‘He refused to play along with the whole ‘I’m a sex offender’ accusation and be paraded around with a visible escort to make it crystal clear that he could not be trusted.’

Yeah, can’t imagine why he wasn’t willing to go along with that, though nice of them to admit that it’s apparently ‘district protocol’ to double-down on groundless accusations rather than admit fault.

Anonymous Coward says:

Minimizing

Raptor says its system is reliable, stating it only returned one false positive in that county last year.

I love that minimization. One false positive in the county…and just how many systems did they install in that county?

So why did they go by county? Well, Aurora, Colorado spans three counties. How many false positives were there in Aurora? In each of its two school districts?

Far be it that they should discuss false positive rates, or mention how many false positives they had nationwide.

Anonymous Coward says:

And is he gonna sue? If not, why not? The school must still be using the result the software gave, as he has to be escorted when going to the school, so regardless of anything else, as far as the school is concerned, he IS a sex offender. This stigma MUST be removed, and going to court is the only way for that to happen! A shame, but it means the public money wasted on this software will now grow by the compensation owed to Mr Mitchell.

Anonymous Coward says:

When the school’s administration decided what to spend money on, they had to weigh the risks and consequences:
– A mass shooting at the school.
– Kids getting a lousy education because of underpaid teachers and lack of materials.
– A visit by a registered sex offender.

Apparently, the latter was deemed most harmful — which might actually be correct, if you only consider the harm to the administration’s reputation.

Personanongrata says:

You are a Slave to the Device and then You Die!

An automated system is only as good as its human backstop. If the humans making the final judgment call are incapable of using good judgment, the system is useless.

Therein lies the problem a lot of people relish the opportunity of ceding their own thought-process in favor of an algorithm or device doing the "thinking" (comment author used scare quotes to alert any potential readers that the word "thinking" is being used out of context as algorithms/devices do not think) for them.

Critical thinking and problem solving skills – whatever are they good for in rote memorization nation?

Smartassicus the Roman says:

Wait a Cotton-Pickin Minute!

Our school system is going to start using Raptor Technologies systems. I’ve asked Raptor SEVENTEEN TIMES (via social media, email, phone, and return-receipt mail which they did receive) for information concerning what they do with the information they log on people forced to use the system and have never received a reply.
