Chicago PD Believes It Can See The Future, Starts Warning Citizens About Crimes They Might Commit

from the buttle/tuttle dept

We’ve talked a lot over the years about attempts to get out “ahead of crime” by using computer programs and algorithms to try to predict who might commit a crime. Predictive computing can then target either specific areas or specific people deemed to be in need of some extra law enforcement attention. Except, as we’ve noted repeatedly, these programs are only as valuable as the data they use. Garbage in, garbage out, but in this case you’ve got a human being on the other end of the equation whose life can be dramatically impacted by law enforcement holding what they believe is “proof” that you’ll soon be up to no good.

With that in mind, there are growing concerns about efforts in Chicago to use predictive analytical systems to generate a “heat list” — a list of 400 or so individuals most likely to be involved in violent crime. The Chicago efforts are based on a Yale sociologist’s studies and use an algorithm created by an engineer at the Illinois Institute of Technology. People who find themselves on the list get personal visits from law enforcement warning them that they’d better be nice. The result is a collision between law enforcement that believes in the righteousness of these efforts and those who worry that they could, as an EFF rep states, create “an environment where police can show up at anyone’s door at any time for any reason.”

Law enforcement and the code creators, as you’d expect, argue that it’s only the bad guys that need to worry about a system like this:

A press liaison for the NIJ explains in an email: “These are persons who the model has determined are those most likely to be involved in a shooting or homicide, with probabilities that are hundreds of times that of an ordinary citizen.” Commander Steven Caluris, who also works on the CPD’s predictive policing program, put it a different way. “If you end up on that list, there’s a reason you’re there.”

Unless law enforcement makes a mistake, your data is wrong (which it often will be), or we decide to expand the program significantly, right? Another concern bubbling up in Chicago is that the programs are effectively using racial profiling to target already-troubled areas where crime naturally would be greater due to poverty, without anybody bothering to perform a deeper analysis of why those areas might be having problems (aka targeting symptoms, not disease):

“…how are we deciding who gets on the list and who decides who gets on the list?” (EFF staff attorney Hanni) Fakhoury asks…”Are people ending up on this list simply because they live in a crappy part of town and know people who have been troublemakers? We are living in a time when information is easily shareable and easily accessible,” Fakhoury says. “So, let’s say we know that someone is connected to another person who was arrested. Or, let’s say we know that someone’s been arrested in the past. Is it fair to take advantage of that information? Are we just perpetuating the problem?” He continues: “How many people of color are on this heat list? Is the list all black kids? Is this list all kids from Chicago’s South Side? If so, are we just closing ourselves off to this small subset of people?”

Chicago PD denies that there’s any “racial, neighborhood, or other such information” being used in their heat list calculations, but a FOIA request to actually confirm that was denied, under the pretense that releasing such information could “endanger the life or physical safety of law enforcement personnel or any other person.” So yeah, there’s great transparency at work here as well.
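It’s worth doing the arithmetic on the NIJ’s “hundreds of times that of an ordinary citizen” claim quoted above. A relative risk that sounds enormous can still be a small absolute probability; the base rate and multiplier below are assumed illustrative figures, not anything published by CPD or NIJ:

```python
# Back-of-envelope check on "hundreds of times that of an ordinary citizen".
# All numbers here are illustrative assumptions, not CPD or NIJ data.
base_rate = 1 / 10_000     # assumed yearly chance an "ordinary citizen" is involved in a shooting
relative_risk = 300        # "hundreds of times" the ordinary rate
flagged_rate = base_rate * relative_risk

# Even at 300x the baseline, the absolute probability is 3%:
# roughly 97 of every 100 people on the list will NOT be involved in a shooting.
print(f"absolute risk for a listed person: {flagged_rate:.0%}")
```

In other words, a multiplier that sounds damning is perfectly compatible with the overwhelming majority of listed people never committing a violent crime.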

Predictive computing is excellent for a good many things, from improving traffic congestion to designing sewer networks, but calculating the future movements of highly complicated and emotional human beings is a bridge too far. It’s not particularly difficult to imagine a future where law enforcement (not always known for nuanced thinking or honest crime stat record keeping) starts using their belief in the infallibility of mathematics as the underpinnings for bad behavior, with the horrible experiences of the falsely accused dismissed as anecdotal experiences (“well shucks, most of the time the system is right, so its existence is justified”). It might just be time for a re-watch of Terry Gilliam’s Brazil with an eye on reminding ourselves what a simple clerical error can do to the Archibald Buttles of the world.



Comments on “Chicago PD Believes It Can See The Future, Starts Warning Citizens About Crimes They Might Commit”

Ninja (profile) says:

Re: I got a GREAT idea!

Another concern bubbling up in Chicago is that the programs are effectively using racial profiling to target already-troubled areas where crime naturally would be greater due to poverty, without anybody bothering to perform a deeper analysis of why those areas might be having problems (aka targeting symptoms, not disease)

Actually, it’s the laziness idea. Go for the low-hanging fruit instead of doing proper investigative work. Pretty much the default for law enforcement these days.

Ed Allen (profile) says:

Re: Re: ...

We all saw a real-world example of the “infallibility” of math when derivatives crashed the financial system, didn’t we?

And of course, EVERY police organization will assure us that they fall under the “TOO BIG to fail” umbrella. If we disband them or even just cut back on their military weaponry, “Lives will be lost!!!”

Watch for some idiot to propose being on the list automatically labels you a felon so you can no longer own a gun.

trollificus (profile) says:

Re: Re: Re: ...


No “math” was at fault in that crash. The fact is, the math was applied to financial instruments that contained wildly over-valued components (and some fraudulently mis-valued). Now, was there some kind of math error involved there? No.

It was fraud. Too many 1%ers making too much money off the overvaluation of homes, and when the only way to maintain the impetus of the bubble was to find new buyers…well, the government, under the guise of “helping the poor” was perfectly willing to front your tax dollars to MAKE new buyers out of people who really couldn’t afford homes.

Those ridiculous mortgages, lumped together, were the “rotten apples” (or ‘tranches’, I believe they were called) of the derivative market.

The math itself, and the fallibility thereof, had nothing to do with it.

Anonymous Coward says:

Seems to me that if the public were to make “a list” based on actions and past homicides, most law enforcement institutions and government officials would rank pretty high on the “heat list”. This is a very slippery slope they’ve decided to descend, and they need to be careful or they may just end up on the wrong side of it. I hope they make this a law: pass it, let us be the judges and jury of our elected officials, and the heat list will soon be a flaming pit of horror.
It would be much easier for the people to watch each party or each agency, so bring it.

Anonymous Coward says:

This whole profiling algorithm system sounds like a huge waste of time and money. Even if this alleged prediction system were miraculously correct in its predictions (highly unlikely), it still doesn’t predict the exact time and place a supposed future crime would occur.

The only thing this profiling system is good at is adding people who have yet to commit a crime to a list. I hope this list isn’t available to employers; otherwise this system will simply compound crime by making it hard for people to find jobs. Which pushes them towards crime!

vastrightwing (profile) says:

Minority Report Redux

At this point, it doesn’t take too much insight to understand what’s going on. The system’s been in place for decades already. Now that the light has been cast on it, they are threatening us with it. They don’t even need to tell us how they’re doing it, because we all know, thanks to Snowden.

Thanks to grants to small cities and towns, they have military force available to carry out their threats. Deep Politics is coming to the surface, or at least is beginning to show its façade. Keep in mind: this is only the tip of the iceberg. They are scared. Their mass has become too big and is out of balance. Be cautious around scared creatures; you don’t know what they will do when frightened.

You need to extricate yourself from the system: stop feeding it. Stop using credit. In fact, avoid using federal reserve notes. This is the blood. The future is neighbor to neighbor.

DogBreath says:

Re: Re: Re:2 Idea stolen from Futurama

Yes, but only ideas from the future.

This will occur only when the Future Crimes Division learns how to predict future ideas and who would have really thought of them; then they can know whom to arrest when that person comes up with an idea they couldn’t have possibly come up with on their own in a million years.

Ideas stolen from the past will still only be investigated by the Past Crimes Division. For what purpose, no one knows.

Anonymous Coward says:

If we don’t fight against this NOW, they WILL want to recreate the Minority Report future. Because that’s how they think, and that’s what you get when you allow the government and authorities to put “safety” above EVERYTHING ELSE.

“Knock, knock – it’s the police. We’ve been alerted by our software system that you have an 80 percent chance to punch someone this week. We’re here to arrest you.”

This stuff WILL happen, if we LET IT happen. But it’s going to be a trend, not something that happens all of a sudden. That’s why it’s important to kill it before it becomes too strong of a trend, and before the lobbying for such technologies becomes too strong in Congress, whether from law enforcement itself or from companies making money off it.

Anonymous Coward says:

It beats working

Chicago’s queue of unsolved homicides, rapes, assaults, and other violent crimes is enormous. But working through those is tedious and unglamorous, plus it takes intelligence and diligence.

But this — hey now, THIS is sexy and fun. Much better than pulling out an old file and trying to find a fresh insight in it or attempting to chase down a lead. Never mind all the actual real live crime victims and all the actual real live criminals on the streets, no no no, they’re not important. Let’s worry about theoretical possibilities, about hypothetical situations, about could-be maybe futures that may never come to pass.

slow clap Congratulations, Chicago PD. Well done.

musterion (profile) says:

Watching too much TV

I think they’ve been watching too much “Person of Interest”.

The usual trope comes out:

“Law enforcement and the code creators, as you’d expect, argue that it’s only the bad guys that need to worry about a system like this:”

And I provide the usual response:
So what constitutes a crime? Say, at some time in Dearborn, MI, walking around with your head uncovered if you are a woman becomes a punishable offence. You can no doubt think of other behaviours that are not criminal now but could be made criminal in the future. And given that there is a new federal regulation about every 3 hours (I really don’t know on this; could be every 3 minutes), crime becomes a creeping thing.

Mega1987 (profile) says:

Bad apple analogy...

Those guys are declaring a barrel of apples bad because they found a handful rotten while the rest are still fresh….

Jeez… very discriminatory toward the whole group of people over the few who are REALLY the cause of the problem….

I say… they are TOO LAZY to do investigation and want to clean up in the fastest way possible, with the greatest risk in it… -_-

John Fenderson (profile) says:

I'm reminded of a TED talk

I forget who it was that gave the talk, but it centered around his being a victim of a hit & run by a drunk driver (who hit a telephone pole shortly after his car, and so was apprehended on the scene).

The police report was terribly confused, mixing up the two people and cars in a way that obviously made no sense. However, the cop made a single mistake that screwed the speaker: he checked the wrong box for who was at fault, and so the speaker was facing a bill of around $10,000 to repair damages to the drunk driver’s car.

The speaker obsessed over this and tried to rectify it — to no avail. His insurance company said nothing could be done unless the report was corrected, and the police were totally unwilling to revisit the issue, telling him “do the right thing and pay for the guy’s car.”

If there are such extreme problems with correctly handling crimes that have already been committed, just imagine how accurately predicting crimes that haven’t been committed will work. It’s all coming from the same data, after all.

Nick (profile) says:

“Another concern bubbling up in Chicago is that the programs are effectively using racial profiling to target already-troubled areas where crime naturally would be greater due to poverty, without anybody bothering to perform a deeper analysis of why those areas might be having problems (aka targeting symptoms, not disease)”

Well, it’s not the police’s job to prevent poverty (or the disease as we use in the analogy), their job is supposed to be treatment of the symptoms. Their local legislature is supposed to “regulate” away the disease through social programs.

art guerrilla (profile) says:

Re: Re:

that’s great, nick at nite, but you are missing the forest for the trees: whites use -say- illegal drugs at a slightly HIGHER rate than blacks, and yet blacks are FAR MORE likely to be arrested and jailed for possession of said illegal drugs…
one might be -if one were not obeisant to authority- curious as to why that is so…
well, i’ll tell you why: IF the piggies went no-knocking in more affluent and/or whiter neighborhoods and picked up all the white people with coke/etc, there would be such screaming and kvetching that you would have thought some sort of real injustice was going on…
so, IF the piggies were actually doing their jobs without fear or favor, instead of picking on the poor and benighted who can’t get out of the way of piggies, you would find equal numbers of drug crimes in each of the neighborhoods, and thus their ‘predictions’ should present them each as being equivalent…
they do not…
they will not…
they dare not…
they are picking on the poor and weak because they can; EVEN IF the donut eaters wanted to go after wealthier/whiter people, they will not do so for fear of losing their jobs…

That One Guy (profile) says:

Re: Re:

Thing is though, with a system like this in place, if you don’t answer for any reason (whether you’re distracted or occupied by something else, or you just don’t feel like talking to a group of thugs doing house calls to tell people how bad they are), then, since the system has tagged you as a ‘likely threat’, odds are they’d feel fully justified in breaking in to stop whatever nefarious crime you were obviously planning.

Anonymous Coward says:

Garbage in, garbage out... but what about the algorithm itself?

“Unless law enforcement makes a mistake, your data is wrong (which it often will be), or we decide to expand the program significantly, right?”

Are you assuming the algorithm itself is scientifically sound? Even if it is, what if the police have more confidence in the algorithm’s output than is statistically warranted?

Consider how such a system would work if it were fed all the metadata NSA is routinely collecting and search warrants were issued based on its output. We’d end up with a police state propped up by questionable science. We’d find a lot of bad guys, to be sure, but we’d also harass and invade the privacy of many innocents.

Even if the system were 99% effective, its output should not be taken as evidence of future wrongdoing. Being considered likely to commit a crime in the future is not probable cause for anything.
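The commenter’s point about a “99% effective” system can be made concrete with a standard false-positive calculation. The population and offender counts below are assumptions chosen for illustration, not real figures:

```python
# Why "99% effective" is nowhere near probable cause: a base-rate sketch.
# All inputs are assumed for illustration.
population = 2_700_000          # roughly Chicago's population
future_offenders = 3_000        # assumed number who actually will commit violent crime
sensitivity = 0.99              # system flags 99% of true future offenders
false_positive_rate = 0.01      # and wrongly flags 1% of everyone else

flagged_guilty = future_offenders * sensitivity                           # 2,970
flagged_innocent = (population - future_offenders) * false_positive_rate  # 26,970
precision = flagged_guilty / (flagged_guilty + flagged_innocent)

print(f"innocent people flagged: {flagged_innocent:,.0f}")
print(f"chance a flagged person is a real future offender: {precision:.0%}")
```

Even under these generous accuracy assumptions, roughly nine out of ten flagged people are innocent, which is exactly the point: population-level statistical likelihood is not individualized suspicion.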

Anonymous Coward says:

Re: Garbage in, garbage out... but what about the algorithm itself?

“Being considered likely to commit a crime in the future is not probable cause for anything.”

This should read “likely to commit some crime in the future”. If there’s reason to believe a specific crime is about to be committed in the very near future, greater suspicion is warranted. If it’s merely likely that some kind of crime will be committed at some point in the future, then the confidence is poor, no matter the statistics.

Anonymous Coward says:


Errors are not the only problem. What about the person who maliciously gets added to the list simply because the police want an excuse to harass him or the police simply blaming the harassment of a citizen on a predictive error of the computer that never actually existed? Oh wait. What am I thinking. Law enforcement would never do anything so unethical, malicious, and manipulative as that. Would they?

Anonymous Coward says:

I don’t see the problem in a law enforcement officer showing up and reminding someone to follow the law.

A program like this shouldn’t be used as evidence in a prosecution or justification for a search, but I don’t think there’s anything preventing police from randomly talking to people (much less talking to people based on data analysis).

Anonymous Coward says:

Re: Another Biased Journalist?

Look, all algorithmic predictions will come up with wrong conclusions at least some of the time. Look at weather models. Google is often wrong about what people are looking for. It’s the nature of algorithmic predictions. The difference, though, is that with those you don’t have the police making judgements about whether a person needs to be arrested based on a computer program that sometimes gets things wrong.

Anonymous Coward says:

Re: Re: Another Biased Journalist?

And if you look at the response given by the person in charge of the system, it is quite obvious that he doesn’t understand (or doesn’t care about) this:

“If you end up on that list, there’s a reason you’re there.”

He believes that a computer system, which by its very nature cannot be perfect, is in fact perfect. Either that or he doesn’t care about when it is not and an innocent person is affected.

That is a BIG problem.

Anonymous Coward says:

Re: Re: Re: Another Biased Journalist?

Of course they aren’t. NYPD isn’t arresting people with Stop and Frisk either, are they? They are just harassing people based on profiling. This is profiling that they’ve just automated with a computer, with the convenient excuse of saying, “Oh, it’s not our fault, the computer must have made a mistake.” That’s how it always happens, too. They say how perfect and infallible the system is when they are trying to sell it to the public, but conveniently forget all of that when innocent people get wronged later, once it’s implemented and they want to avoid accountability.

Anonymous Coward says:

Re: Re: Re:3 Another Biased Journalist?

Let’s say that Stop and Frisk was instead Stop and Talk. Then MAYBE that would be ok. That is a cop walking down the street and happening upon someone in public and addressing them. It’s completely different if they decide to go to someone’s house to “talk” to them based on the fact that a computer gave them some statistical analysis data. If you or I did something like that to someone more than once, we would be quickly slapped with a restraining order for stalking them.

Anonymous Coward says:

Re: Re: Re:6 Another Biased Journalist?

There is nothing inherently wrong with “profiling”, especially if it is just used to talk with someone (not a stop, not a search, not an arrest).

If you fit the “profile” of a guy running down the street with a gun in one hand and a bank bag in the other, it’s completely reasonable to stop you on suspicion that you may have committed a crime.

If you are “profiling” someone based on a prohibited basis (e.g., race), that’s a different question. Even then, though, it still depends on the actions the “profiling” is used for. If you want to talk to “community leaders” on an issue, maybe you want to make sure you talk to leaders who racially/ethnically represent the community, for example.

Anonymous Coward says:

Re: Re: Re:5 Another Biased Journalist?

Did you also stop to think where they might get some of the data that they plug into the algorithm? Given the reported problems of warrantless spying generating large amounts of data on individuals, and effectively no oversight on this collection or how the data is used, it is not a stretch that this sort of information will be fed into such programs, which will then point law enforcement to profile people based on such pointers. Then take into account that law enforcement seems to have no problem with the practice of evidence laundering, which is what they euphemistically call parallel construction, and the whole thing becomes an efficient gutting of the Fourth Amendment.

DogBreath says:

Re: Re: Re:7 Another Biased Journalist?

But by that logic (X might not be bad, but X might lead to or be correlated with Y, which is bad), we should get rid of police departments (since they might lead to police abuse).

Nice argument, but like a bucket with a hole in the bottom, it fails to hold water.

With law enforcement sharing access to all kinds of databases these days, it’s not so hypothetical after all:

Parallel Construction Revealed: How The DEA Is Trained To Launder Classified Surveillance Info

Anonymous Coward says:

Re: Re: Re: Another Biased Journalist?

And I didn’t say that they were arresting them based on that. I said that they were deciding whether the person NEEDS to be arrested based on that. A police officer may decide that someone committed a crime but not have enough evidence to make a strong case so they don’t arrest them. They are still making the judgement that the person must be guilty beforehand based on limited information that could be faulty even if they choose not to arrest them.

art guerrilla (profile) says:

They've already stopped 12 active shooters....

1. in spite of my criticism to follow, thank you for your post and point…
2. um, i call ‘bullshit’ on the 12 ‘active shooters’ (whatever, when LEO start in on their own lingo, perverting and changing the meanings of normal words, you know they are hiding shit they are up to)…
and the feebs have ‘foiled’ a dozen ‘terrorist’ plots, too, doncha know ! ! ! /sarc off
3. simply put: an academic who is ‘selling’ (whether literally or not) a system (‘his’ system?) which predicts stuff and he says others say it did and stopped 12 shooters ? ? ? well, by gosh and by golly that surely must be true, then…
4. not only does LEO and the (in)justice department have a license to kill, they have a license to lie; i don’t trust ANY OF THEM, EVER… (they have earned that distrust)
5. you know what, if i spy on 100% of the pre-criminal citizens, maybe i can stop a few crimes too…
they lose their souls, and we lose our freedom…
NOT a ‘trade-off’ i want to make…
even remotely…
our brief *SHOULD* be to maximize freedom, not minimize it… we are going the wrong way…

Chicago Citizen says:

I think they should first apply this technology to predict which politicians are most likely to accept bribes, kickbacks and other corrupt forms of compensation.

They can follow that up by using it to predict which police officers, prosecutors and other members of the justice department are most likely to accept bribes, manufacture evidence, torture suspects, etc…

After that, if there is anyone left to manage such a program, they can direct it at “potential criminals”.

John Fenderson (profile) says:

They've already stopped 12 active shooters....

The claim that this already stopped “12 shooters” seems very specious at best, for a number of reasons — the most important one being that you’ve defined “active shooter” as someone who is not, in fact, an active shooter:

active shooters defined as those whose online behavior in conjunction with information collected from other sources manifested “intent”

In other words, the definition is: an active shooter is one that has been designated as such by this system. It’d be more impressive if they were, you know, actually active shooters.

There are 2.6 billion people on the internet world wide so far that voluntarily provide through social media and internet use open source intelligence.

This is a mischaracterization. I doubt that a sizable percentage of people using social media are voluntarily donating their data for this use. It’s more likely that they don’t really understand that they’re being actively spied on.

By the way, this reads like you’re saying that everyone who uses the internet is voluntarily donating data for this use. That can’t be what you mean, but I wanted to double-check.

When I listen to what they are attempting to do, on some level as a citizen I am comforted.

You and I are diametrically opposed in our reactions to this. I am the exact opposite of comforted by this effort. In fact, it’s pretty terrifying. Certainly scarier than terrorism or random nutcases shooting up schools.

Didit says:

Re: They've already stopped 12 active shooters....

Active shooters appears to be their term, not the poster’s definition. I’m getting the vibe they mean people who clearly were in the stages of putting together some terrible event and just short of acting the event out.

As to people ‘voluntarily’ providing information, it is considered open source in that when we comment, like now, there is no expectation of privacy, and we voluntarily participate on the internet in so many ways. No one is forcing us to do so. That the majority don’t understand the lack of privacy doesn’t change the voluntary nature of the participation.

The video tapes mention the open source intelligence as social media. I also read the post and the comment ‘comforted’ was part of the response. The other part was the author was less than comforted by the reality it could go too far.

John Fenderson (profile) says:

Re: Re: They've already stopped 12 active shooters....

“Active shooters appears to be their term, not the poster’s definition.”

Yes, but one that the commenter accepted and embraced in her essay.

“I’m getting the vibe they mean people who clearly were in the stages of putting together some terrible event and just short of acting the event out.”

You don’t need to go with a “vibe”. They specifically defined the term, and I quoted their definition.

” it is considered open source in that when we comment – like now, there is no expectation of privacy and we voluntarily participate on the internet in so many ways.”

In a venue like this, yes. But not on the internet as a whole, which is what it sounded like she was saying — thus my request for clarification.

“That the majority don’t understand the lack of privacy doesn’t change the voluntary nature of the participation.”

That’s a different thing. Not understanding that you’re being spied on absolutely does change the voluntary nature of “donating your data” to the spies. Nobody is volunteering to donate their data to the NSA, for example.

“The video tapes mention the open source intelligence as social media”

Yes, my question was about the commenter’s inclusion of “the internet” along with social media. “The internet” is a much larger sphere and includes lots of things that are unambiguously private.

Anonymous Coward says:

“If you end up on that list, there’s a reason you’re there.”

Ask Rahinah Ibrahim about that. Sure there was a reason she was on that list. Someone fucked up and everyone believed that she was there for a good reason. And when it finally became obvious that it was because someone fucked up, they did everything they could to avoid accountability.

Anonymous Coward says:

What if they started feeding officers’ personal information and entire employment history, including every case they were involved with, partners they had, training, etc., into a computer model designed to predict the likelihood of officers engaging in corruption and/or abusing their authority with regard to the public, for the purposes of flagging potentially bad officers that needed to be watched? Do you think the Chicago PD would support such a program if they were assured that the algorithm was really good, and “if you end up on that list, there’s a reason you’re there”? If not, then why would it be ok for such a program to be used to analyze the public in such a way?

Sunhawk (profile) says:

Speaking as someone who works with cognitive models of human thinking and is quite familiar with the benefits and detriments of computer-run algorithms…

Whoever is championing the code should be taken out back and given some wall-to-wall therapy before being tied to a chair at the university’s “Programmer Ethics” equivalent. And probably a basic class on algorithms, heuristics and the like.

And if their university doesn’t cover such material… then that explains the problems, I suppose.

Computer decision-making should only be an *adjunct* to human decision-making in non-trivial circumstances. An algorithm is only as good as the worst of the people who created it AND the people who implemented it.

And while human decision-making, with on-the-fly application and development of new heuristics as needed, can generally adapt to situations with incomplete or incorrect information at least reasonably well (all things considered), computer decision-making just sits there and craps itself.

Sunhawk (profile) says:

Re: Re:

An algorithm that works as described (identifies individuals at risk of committing a crime) is working with extremely incomplete information for its modeling.

Now, using it to predict future lawbreaking and then comparing to future data (i.e., do they commit violent crimes) could lead (assuming the model is clean) to pinning down innocuous causal factors that can be fixed without human risk or cost, but the way they’re using it is most certainly Doing It Wrong.

Anonymous Coward says:

This would be a violation of the 4th amendment, as it is illegal search and seizure. Collecting information and storing it before a crime has been committed. This has already been tested in the Supreme Court on mass collections of DNA for a database that would quickly pinpoint the perp of a crime. The Supreme Court says, not no, but hell no.

trollificus (profile) says:

"Slippery Slope" is usually a fallacy...BUT

But in every instance of government action, it’s a valid predictive method. I mean, people are worried the algorithm might not be accurate? What the hell, they use dogs ‘alerting’ as evidence to warrant sometimes destructive searches, and just suppress studies that place the accuracy of dope-sniffing dogs at around 60%.

So yeah, everything about this “It’s just to give those people a warning.” system is just teetering on a steep, greased hill. Combined with militarization of PDs across the country and an ever intensifying ‘us against them’ mentality, I don’t think the citizenry will be well-served (or protected) by these pre-crime efforts.

That said, a couple of points:
a) I very much doubt a single person on the list is without previous violent violations. They are NOT going to anticipate someone’s first crime.

b) IF we were to allow for the possibility of good intentions from the CPD (not a given), whatever the underlying causes (or ‘disease’ in the analogy) of crime, if the ‘symptom’ is murder or violent assault, the police are obliged to deal with it, without any obligation to make sure every demographic is happy, prosperous, well-educated and non-violent.

Just sayin’…

Chewybeea (user link) says:

cops and robbers

I always thought that cops loved crime. Don’t firemen love a fire? Don’t painters love a peeling or unpainted wall? Doesn’t a seamstress love a fraying seam of cloth?

Cops are the purported repairmen of societal fray. But since all municipal police departments ultimately become unionized, they are also intent on making sure that their livelihood is maintained and expanded. They do this in tandem with the bureaucratic/political unions, who also use crime as a platform for the justification of their existence, to a partial extent.

Show me one police officer that wants crime to end, and I’ll then show you an axeman that wants the forest to die.
