All the people who have always brushed off concerns about surveillance tech, please come get your kids. And then let someone else raise them.
Lots of people are fine with mass surveillance because they believe the horseshit spewed by the immediate beneficiaries of this tech: law enforcement agencies that claim every encroachment on your rights might (MIGHT!) lead to the arrest of a dangerous criminal.
Running neck and neck with government surveillance state enthusiasts are extremely wealthy Americans. When they’re not adding new levels of surveillance to the businesses they own, they’re scattering cameras all around their gated communities and giving cops unfettered access to any images these cameras record.
Here’s how it plays out at the ground level: parents can’t get their kids enrolled in the nearest school because of surveillance tech. In one recent case, license plate reader data was used to deny enrollment because the data collected claimed the parent didn’t actually reside in the school district.
Just over a year ago, Thalía Sánchez became the proud owner of a home in Alsip. She decided to leave the bustle of the city for a quiet neighborhood setting and the best possible education for her daughter.
However, to this day, despite providing all required paperwork including her driver’s license, utility bills, vehicle registration, and mortgage statement, the Alsip Hazelgreen Oak Lawn School District 126 has repeatedly denied her daughter’s enrollment.
Why would the district do this? Well, it’s apparently because it has decided to trust the determinations made by its surveillance tech partner, rather than documents actually seen in person by the people making these determinations.
According to the school district, her daughter’s new student enrollment form was denied due to “license plate recognition software showing only Chicago addresses overnight” in July and August. In an email sent to Sánchez in August, the school district told her, “Although you are the owner on record of a house in our district boundaries, your license plate recognition shows that is not the place where you reside.”
But that’s obviously not true. Sánchez says the only reason plate reader data would have shown her car as “staying” in Chicago was because she lent it to a relative during that time period. The school insists this data is enough to overturn the documents she’s provided because… well, it doesn’t really say. It just claims it “relies” on this information gathering to determine residency for students.
All of this can be traced back to Thomson Reuters, which apparently has branched out into the AI-assisted, ALPR-enabled business of denying enrollment to students based on assumptions made by its software.
Here’s what little there is of additional information, as obtained by The Register while reporting on this case:
Thomson Reuters Clear, which more broadly is an AI-assisted records investigation tool, has a page dedicated to its application for school districts. It sells Clear as a tool for residency verification, claiming that it can “automate” such tasks with “enhanced reliability,” and can take care of them “in minutes, not months.”
One of the particular things the Clear page notes is its ability to access license plate data “and develop pattern of life information” that helps identify whether those who are claiming they’re residents for the sake of getting a kid enrolled in school are lying or not.
Thomson Reuters does not specify where it gets its license plate reader data and did not respond to questions.
We’ll get to the highlighted sentence in a moment, but let’s just take a beat and consider how creepy and weird this Thomson Reuters promotional pitch is:
The text reads:
Gain deeper insights into mismatched data to support meaningful conversations with families and ensure students are where they need to be. Identify where cars have been seen to establish pattern of life information.
No one expects a law enforcement agency to do this (at least without a warrant or reasonable suspicion), much less a school district. Government agencies shouldn’t have unfettered access to “pattern of life” information just because. It’s not like the people being surveilled here are engaged in criminal activity. They’re just trying to make sure their kids receive an education. And while there will always be people who game the system to get their kids into better schools, that’s hardly justification for subjecting every enrolling student’s family to expansive surveillance-enabled background checks.
And while Thomson Reuters (and the district itself) has refused to comment on the source of its plate reader data, it can safely be assumed that it’s Flock Safety. Flock Safety has never shown any concern about who accesses the data it compiles, much less why they choose to do it. Flock is swiftly becoming the leading provider of ALPR cameras, and given its complete lack of internal or external oversight, it’s more than likely the case that it’s feeding this data to third parties like Thomson Reuters that are willing to pay a premium for data that simply can’t be had elsewhere.
We’re not catching criminals with this tech. Sure, it may happen now and then. But the real value is repeated resale of “pattern of life” data to whoever is willing to purchase it. That’s a massive problem that’s only going to get worse… full stop.
According to AdWeek, the price for a 30-second commercial during Super Bowl LX has soared to $8 million, after NBC opened in the summer by offering spots for $7 million. As AdWeek notes, “due to demand, the company has already reached its cap for the number of spots that were available for advertisers to buy during the upfront season.”
$8 million for 30 seconds sometimes means turning a niche product into a national phenomenon. The 30 seconds purchased by Ring went the other way. If you want to see how $8 million can be used to promote mass surveillance enabled by consumer products, here you go:
Sure, it looks pretty innocuous. And what could be better than turning Ring and Flock Safety’s network of cameras into a digital proxy for posting “LOST DOG” signs all over the neighborhood? Well, as it turns out, pretty much everyone saw how problematic this offering was, especially considering what’s already known about Ring, Flock Safety, and both companies’ rather cavalier attitude towards privacy and other aspects of the Fourth Amendment.
To begin with, the “Search Party” feature that allows people to access recordings and images captured by other people’s cameras is already on, which likely comes as a surprise to owners of these devices. Here’s what The Verge’s Jennifer Tuohy discovered last October, shortly after Ring announced its partnership with Flock Safety — a company best known for allowing cops to hunt down people seeking abortions and/or allowing federal officers to perform nationwide searches for whoever they might be looking for (which, of course, would be anyone looking kinda like an immigrant).
[I]t turns out that Search Party is enabled by default. In an email to customers this week, Siminoff wrote that the feature is rolling out to Ring outdoor cameras in November and noted, “You can always turn off Search Party.”
I checked my cameras this morning, and they were all automatically set to enable Search Party. And I’m not alone; Ring users on Reddit have also reported that their cameras have been enabled for Search Party.
This under-reported “feature” was exposed by Ring’s Super Bowl ad, which resulted in enough backlash that Flock Safety no longer has a Ring to wear. Back to Jennifer Tuohy and The Verge:
In a statement published on Ring’s blog and provided to The Verge ahead of publication, the company said: “Following a comprehensive review, we determined the planned Flock Safety integration would require significantly more time and resources than anticipated. We therefore made the joint decision to cancel the integration and continue with our current partners … The integration never launched, so no Ring customer videos were ever sent to Flock Safety.”
While that last sentence may be true, it appears sharing was on by default when it came to Ring’s own cameras. That Flock Safety never got a chance to participate is good to know, but “Search Party” has apparently been active since its implementation last year, even if it was limited to Ring devices.
And while Ring claims the Search Party feature can’t be used to search for “human biometrics,” that’s hardly comforting when it appears Ring definitely wants to add more of this kind of thing to its existing cameras.
On top of this, the company recently launched a new facial recognition feature, Familiar Faces. Combined with Search Party, the technological leap to using neighborhood cameras to search for people through a mass-surveillance network suddenly seems very small.
Ring insists this is not another mass surveillance tool, but rather something that attempts to recognize who’s at any user’s door when sending alerts, in order to differentiate friends and family members from strangers who might be within camera range. Again, there’s some utility to this offering, but the tech lends itself to surveillance abuses, especially when law enforcement may only be a subpoena away from accessing images and recordings captured by privately-owned devices.
Finally, the statement given by Ring only states that this won’t be happening right now, which is a wise choice considering its unpopularity at the moment. But that doesn’t mean Ring and Flock won’t seek to consummate this marriage of surveillance tech, albeit in a more private fashion that doesn’t involve alarming hundreds of millions of sports viewers simultaneously.
Surveillance technology vendors, federal agencies, and wealthy private donors have long helped provide local law enforcement “free” access to surveillance equipment that bypasses local oversight. The result is predictable: serious accountability gaps and data pipelines to other entities, including Immigration and Customs Enforcement (ICE), that expose millions of people to harm.
The collection and sharing of our data quietly generates detailed records of people’s movements and associations that can be exposed, hacked, or repurposed without their knowledge or consent. Those records weaken sanctuary and First Amendment protections while facilitating the targeting of vulnerable people.
Cities can and should use their power to reject federal grants, vendor trials, donations from wealthy individuals, or participation in partnerships that facilitate surveillance and experimentation with spy tech.
If these projects are greenlit, oversight is imperative. Mechanisms like public hearings, competitive bidding, public records transparency, and city council supervision help ensure these acquisitions include basic safeguards — like use policies, audits, and consequences for misuse — to protect the public from abuse and from creeping contracts that grow into whole suites of products.
Clear policies and oversight mechanisms must be in place before using any surveillance tools, free or not, and communities and their elected officials must be at the center of every decision about whether to bring these tools in at all.
Here are some of the most common routes by which “free” surveillance tech makes its way into communities.
Trials and Pilots
Police departments are regularly offered free access to surveillance tools and software through trials and pilot programs that often aren’t accompanied by appropriate use policies. In many jurisdictions, trials do not trigger the same requirements to go before decision-makers outside the police department. This means the public may have no idea that a pilot program for surveillance technology is happening in their city.
In Denver, Colorado, the police department is running trials of possible unmanned aerial vehicles (UAVs) for a drone-as-first-responder (DFR) program from two competing drone vendors: Flock Safety Aerodome drones (through August 2026) and drones from the company Skydio, partnering with Axon, the multi-billion dollar police technology company behind tools like Tasers and AI-generated police reports. Drones create unique issues given their vantage for capturing private property and unsuspecting civilians, as well as their capacity to make other technologies, like ALPRs, airborne.
Functional, Even Without Funding
We’ve seen cities decide not to fund a tool, or run out of funding for it, only to have a company continue providing it in the hope that money will turn up. This happened in Fall River, Massachusetts, where the police department decided not to fund ShotSpotter’s $90,000 annual cost and its frequent false alarms, but continued using the system when the company provided free access.
Importantly, police technology companies are developing more features and subscription-based models, so what’s “free” today frequently results in taxpayers footing the bill later.
Gifts from Police Foundations and Wealthy Donors
Police foundations and the wealthy have pushed surveillance-driven agendas in their local communities by donating equipment and making large monetary gifts, another means of acquiring these tools without public oversight or buy-in.
In Atlanta, the Atlanta Police Foundation (APF) attempted to use its position as a private entity to circumvent transparency. Following a court challenge from the Atlanta Community Press Collective and Lucy Parsons Labs, a Georgia court determined that the APF must comply with public records laws related to some of its actions and purchases on behalf of law enforcement. In San Francisco, billionaire Chris Larsen has financially supported a supercharging of the city’s surveillance infrastructure, donating $9.4 million to fund the San Francisco Police Department’s (SFPD) Real-Time Investigation Center, where a menu of surveillance technologies and data come together to surveil the city’s residents. This move comes after the billionaire backed a ballot measure, which passed in March 2024, eroding the city’s surveillance technology law and allowing the SFPD free rein to use new surveillance technologies for a full year without oversight.
Free Tech for Federal Data Pipelines
Federal grants and Department of Homeland Security funding are another way surveillance technology appears free to municipalities, only to lock them into long-term data-sharing and recurring costs.
Through the Homeland Security Grant Program, which includes the State Homeland Security Program (SHSP) and the Urban Area Security Initiative (UASI), and Department of Justice programs like Byrne JAG, the federal government reimburses states and cities for “homeland security” equipment and software, including law-enforcement surveillance tools, analytics platforms, and real-time crime centers. Grant guidance and vendor marketing materials make clear that these funds can be used for automated license plate readers, integrated video surveillance and analytics systems, and centralized command-center software—in other words, purchases framed as counterterrorism investments but deployed in everyday policing.
Vendors have learned to design products around this federal money, pitching ALPR networks, camera systems, and analytic platforms as “grant-ready” solutions that can be acquired with little or no upfront local cost. Motorola Solutions, for example, advertises how SHSP and UASI dollars can be used for “law enforcement surveillance equipment” and “video surveillance, warning, and access control” systems. Flock Safety, partnering with Lexipol, a company that writes use policies for law enforcement, offers a “License Plate Readers Grant Assistance Program” that helps police departments identify federal and state grants and tailor their applications to fund ALPR projects.
Grant assistance programs let police chiefs fast‑track new surveillance: the paperwork is outsourced, the grant eats the upfront cost, and even when there is a formal paper trail, the practical checks from residents, councils, and procurement rules often get watered down or bypassed.
On paper, these systems arrive “for free” through a federal grant; in practice, they lock cities into recurring software, subscription, and data‑hosting fees that quietly turn into permanent budget lines—and a lasting surveillance infrastructure—as soon as police and prosecutors start to rely on them. In Santa Cruz, California, the police department explicitly sought to use a DHS-funded SHSP grant to pay for a new citywide network of Flock ALPR cameras at the city’s entrances and exits, with local funds covering additional cameras. In Sumner, Washington, a $50,000 grant was used to cover the entire first year of a Flock system — including installation and maintenance — after which the city is on the hook for roughly $39,000 every year in ongoing fees. The free grant money opens the door, but local governments are left with years of financial, political, and permanent surveillance entanglements they never fully vetted.
The most dangerous cost of this “free” funding is not just budgetary; it is the way it ties local systems into federal data pipelines. Since 9/11, DHS has used these grant streams to build a nationwide network of roughly 80 state and regional fusion centers that integrate and share data from federal, state, local, tribal, and private partners. Research shows that state fusion centers rely heavily on the DHS Homeland Security Grant Program (especially SHSP and UASI) to “mature their capabilities,” with some centers reporting that 100 percent of their annual expenditures are covered by these grants.
Civil rights investigations have documented how this funding architecture creates a backdoor channel for ICE and other federal agencies to access local surveillance data for their own purposes. A recent report by the Surveillance Technology Oversight Project (S.T.O.P.) describes ICE agents using a Philadelphia‑area fusion center to query the city’s ALPR network to track undocumented drivers in a self‑described sanctuary city.
Ultimately, federal grants follow the same script as trials and foundation gifts: what looks “free” ends up costing communities their data, their sanctuary protections, and their power over how local surveillance is used.
Protecting Yourself Against “Free” Technology
The most important protection against “free” surveillance technology is to reject it outright. Cities do not have to accept federal grants, vendor trials, or philanthropic donations. Saying no to “free” tech is not just a policy choice; it is a political power that local governments possess and can exercise. Communities and their elected officials can and should refuse surveillance systems that arrive through federal grants, vendor pilots, or private donations, regardless of how attractive the initial price tag appears.
For those cities that have already accepted surveillance technology, the imperative is equally clear: shut it down. When a community has rejected use of a spying tool, the capabilities, equipment, and data collected from that tool should be shut off immediately. Full stop.
And for any surveillance technology that remains in operation, even temporarily, there must be clear rules: when and how equipment is used, how that data is retained and shared, who owns data and how companies can access and use it, transparency requirements, and consequences for any misuse and abuse.
“Free” surveillance technology is never free. Someone profits or gains power from it. Police technology vendors, federal agencies, and wealthy donors do not offer these systems out of generosity; they offer them because surveillance serves their interests, not ours. That is the real cost of “free” surveillance.
A California police department is none too happy that its license plate reader records were accessed by federal employees it never gave explicit permission to peruse. And, once again, it’s Flock Safety shrugging itself into another PR black eye.
Mountain View police criticized the company supplying its automated license plate reader system after an audit turned up “unauthorized” use by federal law enforcement agencies.
At least six offices of four agencies accessed data from the first camera in the city’s Flock Safety license-tracking system from August to November 2024 without the police department’s permission or knowledge, according to a press release Friday night.
Flock has been swimming in a cesspool of its own making for several months now, thanks to it being the public face of “How To Hunt Down Someone Who Wanted An Abortion.” That debacle was followed by even more negative press (and congressional rebuke) for its apparent unwillingness to place any limits at all on access to the hundreds of millions of license plate records its cameras have captured, including those owned by private individuals.
Mountain View is in California. And that’s only one problem with everything in this paragraph:
The city said its system was accessed by Bureau of Alcohol, Tobacco, Firearms and Explosives offices in Kentucky and Tennessee, which investigate crimes related to guns, explosives, arson and the illegal trafficking of alcohol and tobacco; the inspector general’s office of the U.S. General Services Administration, which manages federal buildings, procurement, and property; Air Force bases in Langley, Virginia, and in Ohio; and the Lake Mead National Recreation Area in Nevada.
Imagine trying to explain this to anyone. While it’s somewhat understandable that the ATF might be running nationwide searches on Flock’s platform, it’s almost impossible to explain why images captured by a single camera in Mountain View, California were accessed by the Inspector General for the GSA, much less Lake Mead Recreation Area staffers.
This explains how this happened. But it doesn’t do anything to explain why.
They accessed Mountain View’s system for one camera via a “nationwide” search setting that was turned on by Flock Safety, police said.
Apparently, this is neither opt-in nor opt-out. It just is. The Mountain View police said they “worked closely” with Flock to block out-of-state access, as well as limit internal access to searches expressly approved by the department’s police chief.
Flock doesn’t seem to care what its customers want. Either it can’t do what this department asked or it simply chose not to because a system that can’t be accessed by government randos scattered around the nation is much tougher to sell than a locked-down portal that actually serves the needs of the people paying for it.
The privacy protection that Flock promised to Oregonians — that Flock software will automatically examine the reason provided by law enforcement officers for terms indicating an abortion- or immigration-related search — is meaningless when law enforcement officials provide generic reasons like “investigation” or “crime.” Likewise, Flock’s filters are meaningless if no reason for a search is provided in the first place. While the search reasons collected by Flock, obtained by press and activists through open records requests, have occasionally revealed searches for immigration and abortion enforcement, these are likely just the tip of the iceberg. Presumably, most officers using Flock to hunt down immigrants and women who have received abortions are not going to type that in as the reason for their search. And, regardless, given that Flock has washed its hands of any obligation to audit its customers, Flock customers have no reason to trust a search reason provided by another agency.
I now believe that abuses of your product are not only likely but inevitable, and that Flock is unable and uninterested in preventing them.
Flock just keeps making Wyden’s points for him. The PD wanted limited access with actual oversight. Flock gave the PD a lending library of license plate/location images anyone with or without a library card (so to speak) could check out at will. Flock is part of the surveillance problem. And it’s clear it’s happy being a tool that can be readily and easily abused, no matter what its paying customers actually want from its technology.
“Flock Safety” may be the brand name, but this company’s earliest sales successes had nothing to do with safety. Its target audience was homeowners associations and people running gated communities in upscale neighborhoods. The purpose of the cameras (and, eventually, the attached license plate reader tech) was to make sure people who were plenty safe already weren’t annoyed by occasional intrusions by the rest of the world outside of their gates.
Then it went the Ring route, offering cheap cameras to cops. It was just inkjet printers all over again. The cameras were affordable. Subscription fees for access to footage and the company’s search engine were the real moneymaker.
And, much like Ring, Flock has ended up on the wrong side of public opinion. While it hasn’t quite generated the amount of negative press Ring’s cozy relationship with cop shops has (yet!), it’s been getting eyeballed pretty fiercely by people who aren’t fans of its access-it-all-from-anywhere attitude. A report from 404 Media showed Texas law enforcement officers utilizing the nationwide network of Flock ALPR data to hunt down someone who had engaged in a medication abortion. Weeks later, it was discovered this search was performed on behalf of her vengeful boyfriend, who sought to press criminal charges against her.
Other news has surfaced as well, making Flock Safety look even worse. It has placed almost no restrictions on access by anyone from anywhere, which has resulted in a lot of local law enforcement agencies performing searches federal agencies like CBP, US Border Patrol, and ICE can’t perform themselves. In some cases, Flock’s lack of restraint and nonexistent privacy policies have made its cameras pretty much illegal. In other cases, local lawmakers are finally reining in use of this camera network due to its steady abuse by federal officers.
Police departments in Redmond and Lynnwood have temporarily shut down their Flock license plate reader systems following growing public concerns about privacy and system access, according to city officials.
Redmond’s City Council voted unanimously Monday to turn off its Automated License Plate Reader (ALPR) cameras after learning that U.S. Border Patrol improperly accessed Auburn’s Flock system last month.
Redmond’s police chief, Darrell Lowe, insists no improper or proxy access has happened on his watch. But that doesn’t mean much, because it’s unclear whether Flock Safety would even inform local cops if federal agencies accessed their data. For that matter, agencies running proxy searches on behalf of federal partners generally have access to records generated anywhere in the country. So it’s hardly comforting to assure people your agency hasn’t been approached directly by federal officers.
That was the point Senator Ron Wyden made in his letter to Flock Safety — one in which he pointed out that Flock has zero desire to deter abuse of its camera network, much less engage in good faith discussions about how it could go about siloing its networks so searches are restricted to areas directly overseen by local law enforcement.
The police chief in Lynnwood, however, didn’t try to make excuses. He actually attempted to do something when these concerns were first raised.
“Flock cameras have already proven to be an invaluable investigative tool in solving crimes and keeping our community safe,” Lynnwood Police Chief Cole Langdon said. “However, it’s equally important that we maintain the public’s trust.”
The ALPR program in Lynnwood launched June 29, 2025, with 25 cameras funded through a Washington Auto Theft Prevention Authority grant.
Shortly after implementation, the department learned a vendor-enabled “nationwide search” feature allowed broader access than Lynnwood authorized.
Police said they worked with Flock Safety to disable that feature on July 8.
While Flock complied with its customer’s request in that case, it has gone the other way just as often. The company has previously been caught illegally installing cameras. In September, it was caught reinstalling cameras the city of Evanston, Illinois had ordered removed because the network (and Flock’s access options) violated the state’s privacy laws.
Private surveillance vendor Flock Safety reinstalled all of its stationary license plate cameras in Evanston that had previously been removed, apparently doing so without authorization from the city, which sent the company a cease-and-desist order Tuesday afternoon demanding that the cams be taken back down.
The city previously ordered Flock to shut down 19 automated license plate readers (18 stationary and one flex camera that can be attached to a squad car) provided by the company and put its contract with Flock on a 30-day termination notice on Aug. 26.
Predictably, this push-back against Flock is generally occurring in areas already being threatened/invaded on a daily basis by the US military and swarms of federal officers. But that’s to be expected. Those most threatened by federal abuse of local camera networks are always going to be the first to fight back. The reason it’s not happening in “red” states is because the people running those states honestly don’t care what route enables authoritarianism, just so long as it does so while their party still holds power.
More than 80 law enforcement agencies across the United States have used language perpetuating harmful stereotypes against Romani people when searching the nationwide Flock Safety automated license plate reader (ALPR) network, according to audit logs obtained and analyzed by the Electronic Frontier Foundation.
When police run a search through the Flock Safety network, which links thousands of ALPR systems, they are prompted to leave a reason and/or case number for the search. Between June 2024 and October 2025, cops performed hundreds of searches for license plates using terms such as “roma” and “g*psy,” and in many instances, without any mention of a suspected crime. Other uses include “g*psy vehicle,” “g*psy group,” “possible g*psy,” “roma traveler” and “g*psy ruse,” perpetuating systemic harm by demeaning individuals based on their race or ethnicity.
These queries were run through thousands of police departments’ systems—and it appears that none of these agencies flagged the searches as inappropriate.
These searches are, by definition, racist.
Word Choices and Flock Searches
We are using the terms “Roma” and “Romani people” as umbrella terms, recognizing that they represent different but related groups. Since 2020, the U.S. federal government has officially recognized “Anti-Roma Racism” as including behaviors such as “stereotyping Roma as persons who engage in criminal behavior” and using the slur “g*psy.” According to the U.S. Department of State, this language “leads to the treatment of Roma as an alleged alien group and associates them with a series of pejorative stereotypes and distorted images that represent a specific form of racism.”
Nevertheless, police officers have run hundreds of searches for license plates using the terms “roma” and “g*psy.” (Unlike the police ALPR queries we’ve uncovered, we substitute an asterisk for the Y to avoid repeating this racist slur). In many cases, these terms have been used on their own, with no mention of crime. In other cases, the terms have been used in contexts like “g*psy scam” and “roma burglary,” when ethnicity should have no relevance to how a crime is investigated or prosecuted.
A “g*psy scam” and “roma burglary” do not exist in criminal law separate from any other type of fraud or burglary. Several agencies contacted by EFF have since acknowledged the inappropriate use and expressed efforts to address the issue internally.
“The use of the term does not reflect the values or expected practices of our department,” a representative of the Palos Heights (IL) Police Department wrote to EFF after being confronted with two dozen searches involving the term “g*psy.” “We do not condone the use of outdated or offensive terminology, and we will take this inquiry as an opportunity to educate those who are unaware of the negative connotation and to ensure that investigative notations and search reasons are documented in a manner that is accurate, professional, and free of potentially harmful language.”
Of course, the broader issue is that allowing “g*psy” or “Roma” as a reason for a search isn’t just offensive, it implies the criminalization of an ethnic group. In fact, the Grand Prairie Police Department in Texas searched for “g*psy” six times while using Flock’s “Convoy” feature, which allows an agency to identify vehicles traveling together—in essence targeting an entire traveling community of Roma without specifying a crime.
At the bottom of this post is a list of agencies and the terms they used when searching the Flock system.
Anti-Roma Racism in an Age of Surveillance
Racism against Romani people has been a problem for centuries, with one of its most horrific manifestations during the Holocaust, when the Third Reich and its allies perpetrated genocide by murdering hundreds of thousands of Romani people and sterilizing thousands more. Despite efforts by the UN and EU to combat anti-Roma discrimination, this form of racism persists. As scholars Margareta Matache and Mary T. Bassett explain, it is perpetuated by modern American policing practices:
In recent years, police departments have set up task forces specialised in “G*psy crimes”, appointed “G*psy crime” detectives, and organised police training courses on “G*psy criminality”. The National Association of Bunco Investigators (NABI), an organisation of law enforcement professionals focusing on “non-traditional organised crime”, has even created a database of individuals arrested or suspected of criminal activity, which clearly marked those who were Roma.
Thus, it is no surprise that a 2020 Harvard University survey of Romani Americans found that 4 out of 10 respondents reported being subjected to racial profiling by police. This demonstrates the ongoing challenges they face due to systemic racism and biased policing.
Notably, many police agencies using surveillance technologies like ALPRs have adopted some sort of basic policy against biased policing or the use of these systems to target people based on race or ethnicity. But even when such policies are in place, an agency’s failure to enforce them allows these discriminatory practices to persist. These searches were also run through the systems of thousands of other police departments that may have their own policies and state laws that prohibit bias-based policing—yet none of those agencies appeared to have flagged the searches as inappropriate.
The Flock search data in question here shows that surveillance technology exacerbates racism, and even well-meaning policies to address bias can quickly fall apart without proper oversight and accountability.
Cops In Their Own Words
EFF reached out to a sample of the police departments that ran these searches. Here are five representative responses we received from police departments in Illinois, California, and Virginia. They do not inspire confidence.
1. Lake County Sheriff’s Office, IL
In June 2025, the Lake County Sheriff’s Office ran three searches for a dark colored pick-up truck, using the reason: “G*PSY Scam.” The search covered 1,233 networks, representing 14,467 different ALPR devices.
In response to EFF, a sheriff’s representative wrote via email:
“Thank you for reaching out and for bringing this to our attention. We certainly understand your concern regarding the use of that terminology, which we do not condone or support, and we want to assure you that we are looking into the matter.
Any sort of discriminatory practice is strictly prohibited at our organization. If you have the time to take a look at our commitment to the community and our strong relationship with the community, I firmly believe you will see discrimination is not tolerated and is quite frankly repudiated by those serving in our organization.
We appreciate you bringing this to our attention so we can look further into this and address it.”
2. Sacramento Police Department, CA
In May 2025, the Sacramento Police Department ran six searches using the term “g*psy.” The search covered 468 networks, representing 12,885 different ALPR devices.
In response to EFF, a police representative wrote:
“Thank you again for reaching out. We looked into the searches you mentioned and were able to confirm the entries. We’ve since reminded the team to be mindful about how they document investigative reasons. The entry reflected an investigative lead, not a disparaging reference.
We appreciate the chance to clarify.”
3. Palos Heights Police Department, IL
In September 2024, the Palos Heights Police Department ran more than two dozen searches using terms such as “g*psy vehicle,” “g*psy scam” and “g*psy concrete vehicle.” Most searches hit roughly 1,000 networks.
In response to EFF, a police representative said the searches were related to a single criminal investigation into a vehicle involved in a “suspicious circumstance/fraudulent contracting incident” and were “not indicative of a general search based on racial or ethnic profiling.” However, the agency acknowledged the language was inappropriate:
“The use of the term does not reflect the values or expected practices of our department. We do not condone the use of outdated or offensive terminology, and we will take this inquiry as an opportunity to educate those who are unaware of the negative connotation and to ensure that investigative notations and search reasons are documented in a manner that is accurate, professional, and free of potentially harmful language.
We appreciate your outreach on this matter and the opportunity to provide clarification.”
4. Irvine Police Department, CA
In February and May 2025, the Irvine Police Department ran eight searches using the term “roma” in the reason field. The searches covered 1,420 networks, representing 29,364 different ALPR devices.
In a call with EFF, an IPD representative explained that the cases were related to a series of organized thefts. However, they acknowledged the issue, saying, “I think it’s an opportunity for our agency to look at those entries and to use a case number or use a different term.”
5. Fairfax County Police Department, VA
Between December 2024 and April 2025, the Fairfax County Police Department ran more than 150 searches involving terms such as “g*psy case” and “roma crew burglaries.” Fairfax County PD continued to defend its use of this language.
In response to EFF, a police representative wrote:
“Thank you for your inquiry. When conducting searches in investigative databases, our detectives must use the exact case identifiers, terms, or names connected to a criminal investigation in order to properly retrieve information. These entries reflect terminology already tied to specific cases and investigative files from other agencies, not a bias or judgment about any group of people. The use of such identifiers does not reflect bias or discrimination and is not inconsistent with our Bias-Based Policing policy within our Human Relations General Order.”
A National Trend
Roma individuals and families are not the only ones being systematically and discriminatorily targeted by ALPR surveillance technologies. For example, Flock audit logs show agencies ran 400 more searches using terms targeting Traveller communities more generally, with a specific focus on Irish Travellers, often without any mention of a crime.
Across the country, these tools are enabling and amplifying racial profiling by embedding longstanding policing biases into surveillance technologies. For example, data from Oak Park, IL, show that 84% of drivers stopped in Flock-related traffic incidents were Black—despite Black people making up only 19% of the local population. ALPR systems are far from being neutral tools for public safety and are increasingly being used to fuel discriminatory policing practices against historically marginalized people.
The racially coded language in Flock’s logs mirrors long-standing patterns of discriminatory policing. Terms like “furtive movements,” “suspicious behavior,” and “high crime area” have always been cited by police to try to justify stops and searches of Black, Latine, and Native communities. These phrases might not appear in official logs because they’re embedded earlier in enforcement—in the traffic stop without clear cause, the undocumented stop-and-frisk, the intelligence bulletin flagging entire neighborhoods as suspect. They function invisibly until a body-worn camera, court filing, or audit brings them to light. Flock’s network didn’t create racial profiling; it industrialized it, turning deeply encoded and vague language into scalable surveillance that can search thousands of cameras across state lines.
The Path Forward
U.S. Sen. Ron Wyden, D-OR, recently recommended that local governments reevaluate their decisions to install Flock Safety in their communities. We agree, but we also understand that sometimes elected officials need to see the abuse with their own eyes first.
We know which agencies ran these racist searches, and they should be held accountable. But we also know that the vast majority of Flock Safety’s clients—thousands of police and sheriffs—also allowed those racist searches to run through their Flock Safety systems unchallenged.
Elected officials must act decisively to address the racist policing enabled by Flock’s infrastructure. First, they should demand a complete audit of all ALPR searches conducted in their jurisdiction and a review of search logs to determine (a) whether their police agencies participated in discriminatory policing and (b) what safeguards, if any, exist to prevent such abuse. Second, officials should institute immediate restrictions on data-sharing through Flock’s nationwide network. As demonstrated by California law, for example, police agencies should not be able to share their ALPR data with federal authorities or out-of-state agencies, thus eliminating a vehicle for discriminatory searches spreading across state lines.
Ultimately, elected officials must terminate Flock Safety contracts entirely. The evidence is now clear: audit logs and internal policies alone cannot prevent a surveillance system from becoming a tool for racist policing. The fundamental architecture of Flock—thousands of cameras feeding into a nationwide searchable network—makes discrimination inevitable when enforcement mechanisms fail.
As Sen. Wyden astutely explained, “local elected officials can best protect their constituents from the inevitable abuses of Flock cameras by removing Flock from their communities.”
Table Overview and Notes
The following table compiles terms used by agencies to describe the reasons for searching the Flock Safety ALPR database. In a small number of cases, we removed additional information such as case numbers, specific incident details, and officers’ names that were present in the reason field.
We removed one agency from the list because it indicated the word was a person’s name and not a reference to Romani people.
In general, we did not include searches that used the term “Romanian,” although many of those may also be indicative of anti-Roma bias. We also did not include uses of “traveler” or “Traveller” when it did not include a clear ethnic modifier; however, we believe many of those searches are likely relevant.
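For readers curious about the mechanics of this kind of review, here is a minimal, hypothetical sketch of how reason-field filtering might work once audit logs have been exported to CSV. The file name and column names (“reason,” “agency”) are illustrative assumptions rather than Flock’s actual export format, and, as noted above, any matches would still need manual review to weed out false positives such as personal names or the term “Romanian.”

import csv
import re

# Hypothetical file name and column names, for illustration only; real
# audit-log exports obtained through records requests vary by agency.
AUDIT_LOG_PATH = "flock_audit_log.csv"

# Reason-field terms discussed above. The character classes keep the slur
# out of the source while still matching common spellings; "roma" is matched
# as a whole word to reduce false positives like "Romanian."
FLAGGED = re.compile(r"g[iy]ps[iey]|\broma\b|travell?er", re.IGNORECASE)

def flag_searches(path: str) -> list[dict]:
    """Return audit-log rows whose search reason matches a flagged term."""
    with open(path, newline="", encoding="utf-8") as f:
        return [row for row in csv.DictReader(f)
                if FLAGGED.search(row.get("reason") or "")]

if __name__ == "__main__":
    for row in flag_searches(AUDIT_LOG_PATH):
        print(row.get("agency", "?"), "|", row.get("reason", ""))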
A text-based version of the spreadsheet is available here.
I’m not saying this unholy matrimony wouldn’t have occurred under any other regime, but it’s definitely the sort of thing that plays well with the Oval Office while it’s housing Donald Trump.
Both Flock Safety and Ring have weathered plenty of negative press, largely because they were doing the sort of thing they’re going back to doing now: turning private cameras into extensions of government surveillance networks.
Flock Safety began by pitching its products to some of the most secure people in the nation: wealthy white homeowners. Flock Safety became just another way for gated communities and HOAs to keep tabs on residents while also casting a skeptical eye towards anyone (or any vehicles) those running the cameras didn’t immediately recognize.
Then it invited cops to play with its equipment and install some of their own. It went from keeping Black people out of white neighborhoods to becoming a tool to be wielded by cops as they searched for a woman who had terminated a pregnancy — not because cops cared about her well-being, but at the behest of her apparently abusive boyfriend. Law enforcement investigators and officials claimed the nationwide searches for the person seeking an abortion were all about ensuring her safety. Even after internal documents revealed it was actually about finding her in hopes of pressing charges for violating Texas’s abortion ban, Flock Safety has continued to criticize journalists for reporting on this apparent abuse of its camera network.
Ring democratized front door surveillance, for better or worse. It gave people a cheap option for keeping crime off their literal doorstep. But it also invited cops along for the ride, giving them free cameras to hand out to citizens with the implied suggestion a free camera would result in warrantless access to footage any time the cops felt like looking at it.
Ring finally rolled back its carte blanche cop access and demanded a bit more paperwork from law enforcement before allowing it to raid its cloud storage. Flock Safety — in response to congressional criticism — made vague statements about limiting abuse of camera access by law enforcement. Of course, those words were meaningless, as Senator Ron Wyden recently pointed out in a letter to Flock Safety CEO Garret Langley:
In August, 9 News in Denver revealed that Flock granted U.S. Customs and Border Protection (CBP) access to its systems, enabling the agency to search data collected by Flock’s cameras, including using the National Lookup Tool. Officials from Flock subsequently confirmed to my office in September that the company provided access to CBP, Homeland Security Investigations (HSI), the Secret Service, and the Naval Criminal Investigative Service as part of a pilot earlier this year. Flock told my office that during the pilot, which has now ended, CBP and HSI conducted approximately 200 and 175 searches respectively. Flock also confirmed that it misled its state and local law enforcement customers, telling my office that “due to internal miscommunication, customers were inaccurately informed that Flock did not have any relationship with DHS, while pilot programs with sub-agencies of DHS were briefly active.”
The abortion investigation described above is also mentioned in the letter, which closes with Ron Wyden telling the company that no one should trust what Flock Safety says because when it’s not misleading people, it’s both incapable and unwilling to place meaningful restrictions on law enforcement access to its nationwide network of cameras:
The privacy protection that Flock promised to Oregonians — that Flock software will automatically examine the reason provided by law enforcement officers for terms indicating an abortion- or immigration-related search — is meaningless when law enforcement officials provide generic reasons like “investigation” or “crime.” Likewise, Flock’s filters are meaningless if no reason for a search is provided in the first place. While the search reasons collected by Flock, obtained by press and activists through open records requests, have occasionally revealed searches for immigration and abortion enforcement, these are likely just the tip of the iceberg. Presumably, most officers using Flock to hunt down immigrants and women who have received abortions are not going to type that in as the reason for their search. And, regardless, given that Flock has washed its hands of any obligation to audit its customers, Flock customers have no reason to trust a search reason provided by another agency.
I now believe that abuses of your product are not only likely but inevitable, and that Flock is unable and uninterested in preventing them.
Law enforcement agencies will soon have easier access to footage captured by Amazon’s Ring smart cameras. In a partnership announced this week, Amazon will allow approximately 5,000 local law enforcement agencies to request access to Ring camera footage via surveillance platforms from Flock Safety.
[…]
According to Flock’s announcement, its Ring partnership allows local law enforcement members to use Flock software “to send a direct post in the Ring Neighbors app with details about the investigation and request voluntary assistance.” Requests must include “specific location and timeframe of the incident, a unique investigation code, and details about what is being investigated,” and users can look at the requests anonymously, Flock said.
[…]
Flock said its local law enforcement users will gain access to Ring Community Requests in “the coming months.”
We absolutely didn’t need these two major players in the private surveillance market to team up and offer expanded access to US law enforcement — especially when so much of US law enforcement is focused on the “criminal” acts listed in Wyden’s letter: abortions and immigration.
According to Ars Technica’s reporting, Ring is the most active participant in this new surveillance dragnet. First, Ring rolled back its promise to limit law enforcement access to Ring footage by partnering with Axon, a heavy-hitter in the US body camera marketplace. Then it decided to court one of the rivals in its own marketplace, which means both companies can still pretend to hold unique ideals while ensuring the bastard child of this coupling will render those ideals irrelevant.
Flock says that its cameras don’t use facial recognition, which has been criticized for racial biases. But local law enforcement agencies using Flock will soon have access to footage from Ring cameras with facial recognition.
Both companies will be able to blame each other the next time abusive access is revealed. And Ring’s network will presumably gain features it doesn’t currently have via its meshing with Flock, like license plate recognition and analytics that can be applied to Ring footage, allowing cops to do things they can’t do with Ring alone, like search for suspects using nothing but vehicle or clothing descriptions.
And this assurance is especially meaningless, given what’s already known about both of these companies:
Amazon and Flock say their collaboration will only involve voluntary customers and local enforcement agencies.
When both companies store recordings in their own clouds, “voluntary” is beside the point. Law enforcement can just approach either company directly with warrants or subpoenas and get what has been denied to them by these companies’ customers. And restraining searches to “local law enforcement” agencies is impossible if neither company is interested in limiting searches to local areas and/or taking steps to prevent local agencies from performing searches on behalf of federal officers.
Even if both companies take heat for doing this, they’ll still do it. After all, they’ve got an entire administration standing behind them that’s willing to call anyone who questions or criticizes this unofficial merger a friend of criminals, if not an actual enemy of the nation.
New documents and court records obtained by EFF show that Texas deputies queried Flock Safety’s surveillance data in an abortion investigation, contradicting the narrative promoted by the company and the Johnson County Sheriff that the woman was “being searched for as a missing person,” and that “it was about her safety.”
The new information shows that deputies had initiated a “death investigation” of a “non-viable fetus,” logged evidence of a woman’s self-managed abortion, and consulted prosecutors about possibly charging her.
Johnson County Sheriff Adam King repeatedly denied the automated license plate reader (ALPR) search was related to enforcing Texas’s abortion ban, and Flock Safety called media accounts “false,” “misleading” and “clickbait.” However, according to a sworn affidavit by the lead detective, the case was in fact a death investigation in response to a report of an abortion, and deputies collected documentation of the abortion from the “reporting person,” her alleged romantic partner. The death investigation remained open for weeks, with detectives interviewing the woman and reviewing her text messages about the abortion.
The documents show that the Johnson County District Attorney’s Office informed deputies that “the State could not statutorily charge [her] for taking the pill to cause the abortion or miscarriage of the non-viable fetus.”
An excerpt from the JCSO detective’s sworn affidavit.
The records include previously unreported details about the case that shocked public officials and reproductive justice advocates across the country when it was first reported by 404 Media in May. The case serves as a clear warning sign that when data from ALPRs is shared across state lines, it can put people at risk, including abortion seekers. And, in this case, the use may have run afoul of laws in Washington and Illinois.
A False Narrative Emerges
Last May, 404 Media obtained data revealing the Johnson County Sheriff’s Office conducted a nationwide search of more than 83,000 Flock ALPR cameras, giving the reason in the search log: “had an abortion, search for female.” Both the Sheriff’s Office and Flock Safety have attempted to downplay the search as akin to a search for a missing person, claiming deputies were only looking for the woman to “check on her welfare” and that officers found a large amount of blood at the scene – a claim now contradicted by the responding investigator’s affidavit. Flock Safety went so far as to assert that journalists and advocates covering the story intentionally misrepresented the facts, describing it as “misreporting” and “clickbait-driven.”
As Flock wrote of EFF’s previous commentary on this case (bold in original statement):
Earlier this month, there was purposefully misleading reporting that a Texas police officer with the Johnson County Sheriff’s Office used LPR “to target people seeking reproductive healthcare.” This organization is actively perpetuating narratives that have been proven false, even after the record has been corrected.
According to the Sheriff in Johnson County himself, this claim is unequivocally false.
… No charges were ever filed against the woman and she was never under criminal investigation by Johnson County. She was being searched for as a missing person, not as a suspect of a crime.
That sheriff has since been arrested and indicted on felony counts in an unrelated sexual harassment and whistleblower retaliation case. He has also been charged with aggravated perjury for allegedly lying to a grand jury. EFF filed public records requests with Johnson County to obtain a more definitive account of events.
The newly released incident report and affidavit unequivocally describe the case as a “death investigation” of a “non-viable fetus.” These documents also undermine the claim that the ALPR search was in response to a medical emergency, since, in fact, the abortion had occurred more than two weeks before deputies were called to investigate.
In recent years, anti-abortion advocates and prosecutors have increasingly attempted to use “fetal homicide” and “wrongful death” statutes – originally intended to protect pregnant people from violence – to criminalize abortion and pregnancy loss. These laws, which exist in dozens of states, establish legal personhood of fetuses and can be weaponized against people who end their own pregnancies or experience a miscarriage.
In fact, a new report from Pregnancy Justice found that in just the first two years since the Supreme Court’s decision in Dobbs, prosecutors initiated at least 412 cases charging pregnant people with crimes related to pregnancy, pregnancy loss, or birth–most under child neglect, endangerment, or abuse laws that were never intended to target pregnant people. Nine cases included allegations around individuals’ abortions, such as possession of abortion medication or attempts to obtain an abortion–instances just like this one. The report also highlights how, in many instances, prosecutors use tangentially related criminal charges to punish people for abortion, even when abortion itself is not illegal.
By framing their investigation of a self-administered abortion as a “death investigation” of a “non-viable fetus,” Texas law enforcement was signaling their intent to treat the woman’s self-managed abortion as a potential homicide, even though Texas law does not allow criminal charges to be brought against an individual for self-managing their own abortion.
The Investigator’s Sworn Account
Over two days in April, the woman went through the process of taking medication to induce an abortion. Two weeks later, her partner–who would later be charged with domestic violence against her–reported her to the sheriff’s office.
The documents confirm that the woman was not present at the home when the deputies “responded to the death (Non-viable fetus).” As part of the investigation, officers collected the evidence of the self-managed abortion that the man had assembled, including photographs, the FedEx envelope the medication arrived in, and the instructions for self-administering the medication.
Another Johnson County official ran two searches through the ALPR database with the note “had an abortion, search for female,” according to Flock Safety search logs obtained by EFF. The first search, which has not been previously reported, probed 1,295 Flock Safety networks–composed of 17,684 different cameras–going back one week. The second search, which was originally exposed by 404 Media, was expanded to a full month of data across 6,809 networks, including 83,345 cameras. Both searches listed the same case number that appears on the death investigation/incident report obtained by EFF.
After collecting the evidence from the woman’s partner, the investigators say they consulted the district attorney’s office, only to be told they could not press charges against the woman.
An excerpt from the JCSO detective’s sworn affidavit.
Nevertheless, when the subject showed up at the Sheriff’s office a week later, officers were under the impression that she came in to “tell her side of the story about the non-viable fetus.” They interviewed her, inspected text messages about the abortion on her phone, and watched her write a timeline of events.
Only after all that did they learn that she actually wanted to report a violent assault by her partner–the same individual who had called the police to report her abortion. She alleged that less than an hour after the abortion, he choked her, put a gun to her head, and made her beg for her life. The man was ultimately charged in connection with the assault, and the case is ongoing.
This documented account runs completely counter to what law enforcement and Flock have said publicly about the case.
Johnson County Sheriff Adam King told 404 Media: “Her family was worried that she was going to bleed to death, and we were trying to find her to get her to a hospital.” He later told the Dallas Morning News: “We were just trying to check on her welfare and get her to the doctor if needed, or to the hospital.”
The account by the detective on the scene makes no mention of concerned family members or a medical investigator. To the contrary, the affidavit says that they questioned the man as to why he “waited so long to report the incident,” and he responded that he needed to “process the event and call his family attorney.” The ALPR search was recorded 2.5 hours after the initial call came in, as documented in the investigation report.
The Desk Sergeant’s Report—One Month Later
EFF obtained a separate “case supplemental report” written by the sergeant who says he ran the May 9 ALPR searches.
The sergeant was not present at the scene, and his account was written belatedly on June 5, almost a month after the incident and nearly a week after 404 Media had already published the sheriff’s alternative account of the Flock Safety search, kicking off a national controversy. The sheriff’s office provided this sergeant’s report to the Dallas Morning News.
In the report, the sergeant claims that the officers on the ground asked him to start “looking up” the woman due to there being “a large amount of blood” found at the residence—an unsubstantiated claim that is in conflict with the lead investigator’s affidavit. The sergeant repeatedly expresses that the situation was “not making sense.” He claims he was worried that the partner had hurt the woman and her children, so “to check their welfare,” he used TransUnion’s TLO commercial investigative database system to look up her address. Once he identified her vehicle, he ran the plate through the Flock database, returning hits in Dallas.
Two abortion-related searches in the JCSO’s Flock Safety ALPR audit log
The sergeant’s report, filed after the case attracted media attention, notably omits any mention of the abortion at the center of the investigation, although it does note that the caller claimed to have found a fetus. The report does not explain, or even address, why the sergeant used the phrase “had an abortion, search for female” as the official reason for the ALPR searches in the audit log.
It’s also unclear why the sergeant submitted the supplemental report at all, weeks after the incident. By that time, the lead investigator had already filed a sworn affidavit that contradicted the sergeant’s account. For example, the investigator, who was on the scene, does not describe finding any blood or taking blood samples into evidence, only photographs of what the partner believed to be the fetus.
One area where they concur: both reports are clearly marked as a “death investigation.”
Correcting the Record
Since 404 Media first reported on this case, King has perpetuated the false narrative, telling reporters that the woman was never under investigation, that officers had not considered charges against her, and that “it was all about her safety.”
But here are the facts:
The reports that have been released so far describe this as a death investigation.
The lead detective described himself as “working a death investigation… of a non-viable fetus” at the time he interviewed the woman (a week after the ALPR searches).
The detective wrote that they consulted the district attorney’s office about whether they could charge her for “taking the pill to cause the abortion or miscarriage of the non-viable fetus.” They were told they could not.
Investigators collected extensive evidence, including photos and documentation of the abortion, and ran her through multiple databases. They even reviewed her text messages about the abortion.
The death investigation was open for more than a month.
The death investigation was only marked closed in mid-June, weeks after 404 Media’s article and a mere days before the Dallas Morning News published its report, in which the sheriff inaccurately claimed the woman “was not under investigation at any point.”
Flock has promoted this unsupported narrative on its blog and in multimedia appearances. We did not reach out to Flock for comment on this article, as their communications director previously told us the company will not answer our inquiries until we “correct the record and admit to your audience that you purposefully spread misinformation which you know to be untrue” about this case.
Consider the record corrected: It turns out the truth is even more damning than initially reported.
The Aftermath
In the aftermath of the original reporting, government officials began to take action. The networks searched by Johnson County included cameras in Illinois and Washington state, both states where abortion access is protected by law. Since then:
The Illinois Secretary of State has announced his intent to “crack down on unlawful use of license plate reader data,” and urged the state’s Attorney General to investigate the matter.
In California, which also prohibits sharing ALPR data out of state and using it for abortion-ban enforcement, the legislature cited the case in support of pending legislation to restrict ALPR use.
Ranking Members of the House Oversight Committee and one of its subcommittees launched a formal investigation into Flock’s role in “enabling invasive surveillance practices that threaten the privacy, safety, and civil liberties of women, immigrants, and other vulnerable Americans.”
Senator Ron Wyden secured a commitment from Flock to protect Oregonians’ data from out-of-state immigration and abortion-related queries.
In response to mounting pressure, Flock announced a series of new features supposedly designed to prevent future abuses. These include blocking “impermissible” searches, requiring that all searches include a “reason,” and implementing AI-driven audit alerts to flag suspicious activity. But as we’ve detailed elsewhere, these measures are cosmetic at best—easily circumvented by officers using vague search terms or reusing legitimate case numbers. The fundamental architecture that enabled the abuse remains unchanged.
Meanwhile, as the news continued to harm the company’s sales, Flock CEO Garrett Langley embarked on a press tour to smear reporters and others who had raised alarms about how the system was used. In an interview with Forbes, he even doubled down and extolled the ALPR search in this case.
So when I look at this, I go “this is everything’s working as it should be.” A family was concerned for a family member. They used Flock to help find her, when she could have been unwell. She was physically okay, which is great. But due to the political climate, this was really good clickbait.
Nothing about this is working as it should, but it is working as Flock designed.
The Danger of Unchecked Surveillance
This case reveals the fundamental danger of allowing companies like Flock Safety to build massive, interconnected surveillance networks that can be searched across state lines with minimal oversight. When a single search query can access more than 83,000 cameras spanning almost the entire country, the potential for abuse is staggering, particularly when weaponized against people seeking reproductive healthcare.
The searches in this case may have violated laws in states like Washington and Illinois, where restrictions exist specifically to prevent this kind of surveillance overreach. But those protections mean nothing when a Texas deputy can access cameras in those states with a few keystrokes, without any external review of whether the search is legal and legitimate under local law. In this case, external agencies should have seen the word “abortion” and questioned the search, but the next time an officer is investigating such a case, they may use a vaguer or more misleading term to justify the search. In fact, it’s possible that has already happened.
ALPRs were marketed to the public as tools to find stolen cars and locate missing persons. Instead, they’ve become a dragnet that allows law enforcement to track anyone, anywhere, for any reason—including investigating people’s healthcare decisions. This case makes clear that neither the companies profiting from this technology nor the agencies deploying it can be trusted to tell the full story about how it’s being used.
States must ban law enforcement from using ALPRs to investigate healthcare decisions and prohibit sharing data across state lines. Local governments may try remedies like reducing data retention periods to minutes instead of weeks or months—but, really, ending their ALPR programs altogether is the strongest way to protect their most vulnerable constituents. Without these safeguards, every license plate scan becomes a potential weapon against a person seeking healthcare.
Flock Safety, the police technology company most notable for their extensive network of automated license plate readers spread throughout the United States, is rolling out a new and troubling product that may create headaches for the cities that adopt it: detection of “human distress” via audio. As part of their suite of technologies, Flock has been pushing Raven, their version of acoustic gunshot detection. These devices capture sounds in public places and use machine learning to try to identify gunshots and then alert police—but EFF has long warned that they are also high-powered microphones parked above densely populated city streets. Cities now have one more reason to follow the lead of many other municipalities and cancel their Flock contracts, before this new feature causes civil liberties harms to residents and headaches for cities.
In marketing materials, Flock has been touting new features to their Raven product—including the ability of the device to alert police based on sounds, including “distress.” The online ad for the product, which allows cities to apply for early access to the technology, shows the image of police getting an alert for “screaming.”
It’s unclear how this technology works. For acoustic gunshot detection, generally the microphones are looking for sounds that would signify gunshots (though in practice they often mistake car backfires or fireworks for gunshots). Flock needs to come forward now with an explanation of exactly how their new technology functions. It is unclear how these devices will interact with state “eavesdropping” laws that limit listening to or recording the private conversations that often take place in public.
Flock is no stranger to causing legal challenges for the cities and states that adopt their products. In Illinois, Flock was accused of violating state law by allowing Immigration and Customs Enforcement (ICE), a federal agency, access to license plate reader data taken within the state. That’s not all. In 2023, a North Carolina judge halted the installation of Flock cameras statewide for operating in the state without a license. When the city of Evanston, Illinois recently canceled its contract with Flock, it ordered the company to take down their license plate readers–only for Flock to mysteriously reinstall them a few days later. The city has since sent Flock a cease and desist order and, in the meantime, has put black tape over the cameras. For some, the technology isn’t worth its mounting downsides. As one Illinois village trustee wrote while explaining his vote to cancel the city’s contract with Flock, “According to our own Civilian Police Oversight Commission, over 99% of Flock alerts do not result in any police action.”
Gunshot detection technology is dangerous enough as it is—police showing up to alerts they think are gunfire only to find children playing with fireworks is a recipe for innocent people to get hurt. This isn’t hypothetical: in Chicago a child really was shot at by police who thought they were responding to a shooting thanks to a ShotSpotter alert. Introducing a new feature that allows these pre-installed Raven microphones all over cities to begin listening for human voices in distress is likely to open up a whole new can of unforeseen legal, civil liberties, and even bodily safety consequences.
Langley offers a prediction: In less than 10 years, Flock’s cameras, airborne and fixed, will eradicate almost all crime in the U.S.
That would be Flock Safety CEO (and co-founder) Garrett Langley speaking to Thomas Brewster of Forbes. Flock Safety has grown a lot over the past few years, following paths paved by Amazon’s doorbell surveillance camera acquisition, Ring, and other upstarts in the public/private surveillance mesh network field.
Like Ring, Flock has sold a bunch of products to regular people, starting with the people most likely to have discretionary income and the desire to wield that against other human beings: homeowners associations and residents of gated communities.
Like Ring, Flock has allowed racists to convert their bigotry into action. And it has also allowed (and encouraged) law enforcement agencies to treat privately-owned cameras as extensions of their own surveillance networks.
Not content to add license plate reader tech to cameras owned by non-cops, Flock now wants to fill the air with another mesh network of public/private ownership via its latest offering:
Since its founding in 2017, Flock, which was valued at $7.5 billion in its most recent funding round, has quietly built a network of more than 80,000 cameras pointed at highways, thoroughfares and parking lots across the U.S. They record not just the license plate numbers of the cars that pass them, but their make and distinctive features—broken windows, dings, bumper stickers. Langley estimates its cameras help solve 1 million crimes a year. Soon they’ll help solve even more. In August, Flock’s cameras will take to the skies mounted on its own “made in America” drones.
Sure, Flock and its CEO may be worth billions. But that doesn’t actually make Langley smart. It just makes him opportunistic enough to take advantage of perpetual false narratives (some perpetuated by Flock itself!) about crime rates. Some people say “Orwellian.” Others, like Garrett Langley, just say “year-over-year growth.”
“I’ve talked to plenty of activists who think crime is just the cost of modern society. I disagree,” Langley says. “I think we can have a crime-free city and civil liberties. . . . We can have it all.” In municipalities in which Flock is deployed, he adds, the average criminal—those between 16 and 24 committing nonviolent crime—“will most likely get caught.”
And there it is: a person in the surveillance tech business refusing to discuss civil liberty concerns honestly, choosing instead to wave them away with a statement that indicates anything that stands in the way of Flock’s continued profitability (or the pipe dream of removing any and all crime and/or 16-24-year-old citizens from US streets) isn’t worth his attention.
But maybe he should be paying more attention to the law, especially the stuff about civil liberties. His company has already been accused of ignoring local laws while selling and/or installing cameras. And Flock got dragged back into whatever the opposite of the limelight is earlier this year, when it was discovered Texas cops were able to access Flock license plate reader data all over the nation as they tried to locate a Texas resident who had apparently left the state to obtain an abortion — something that’s illegal in Texas.
Flock didn’t have much to say about this turn of events at the time. And the officers who performed the search claimed their only interest was in locating this person to ensure she was safe, even as they work for a state whose anti-abortion laws make extremely clear that the state government doesn’t actually care about the safety of women.
Meanwhile, in Illinois, Flock is rapidly backpedaling on its information sharing agreements after multiple lawmakers alleged the company broke state data privacy laws by allowing pretty much any government agency from anywhere in the nation to access Flock ALPR data.
Flock Safety, whose cameras are mounted in more than 4,000 communities nationwide, put a hold last week on pilot programs with the Department of Homeland Security’s Customs and Border Protection and its law enforcement arm, Homeland Security Investigations, according to a statement by its founder and CEO, Garrett Langley.
Among officials in other jurisdictions, Illinois Secretary of State Alexi Giannoulias raised concerns. He announced Monday that an audit found Customs and Border Protection had accessed Illinois data, although he didn’t say that the agency was seeking immigration-related information. A 2023 law the Democrat pushed bars sharing license plate data with police investigating out-of-state abortions or undocumented immigrants.
“This sharing of license plate data of motorists who drive on Illinois roads is a clear violation of the state law,” Giannoulias said in a statement. “This law, passed two years ago, aimed to strengthen how data is shared and prevent this exact thing from happening.”
That has led to contracts being cancelled in the state of Illinois, which certainly isn’t going to contribute to Langley’s fantasies of “ending crime” via massively profitable mass surveillance systems sold by his company.
Oak Park voted to terminate its contract with Flock earlier this month.
Tuesday, following the state’s audit, the city of Evanston did the same, saying in a statement, in part:
“The findings of the Illinois Secretary of State’s audit, combined with Flock’s admission that it failed to establish distinct permissions and protocols to ensure local compliance while running a pilot program with federal users, are deeply troubling. As a result, the City has deactivated the cameras and issued a termination notice to Flock, effective September 26, 2025.”
And it’s not just limited to a state with some of the most robust privacy laws in the nation. It’s also happening in Texas, the same state that kicked this backlash off when officers decided it was okay to use Flock’s system to try to locate someone who might have been considering violating the state’s anti-abortion laws.
Austin organizers turned out to rebuke the city’s misguided contract with Flock Safety — and won. This successful pushback from the community means at the end of the month Austin police will no longer be able to use the surveillance network of automated license plate readers (ALPRs) across the city.
It’s pretty rich to claim you can stop all crime when you can’t even stop breaking the law, or stop enabling users of your tech to break the law. What’s quoted above are the words of a salesman, not a person truly concerned about crime rates or making communities safer. The pitch thrives on the perennial American belief that crime rates are worse than they actually are. And it depends on the wealth of people and governments who also feel civil liberties are privileges that should only be enjoyed by the richest (and, obviously, whitest) people in the nation. Everyone else should just learn to accept their loss of rights and privacy gratefully, for the good of the nation.