According to AdWeek, the price for a 30-second commercial during Super Bowl LX has soared to $8 million, after NBC opened in the summer by offering spots for $7 million. As AdWeek notes, “due to demand, the company has already reached its cap for the number of spots that were available for advertisers to buy during the upfront season.”
$8 million for 30 seconds can sometimes turn a niche product into a national phenomenon. The 30 seconds purchased by Ring went the other way. If you want to see how $8 million can be used to promote mass surveillance enabled by consumer products, here you go:
Sure, it looks pretty innocuous. And what could be better than turning Ring and Flock Safety’s network of cameras into a digital proxy for posting “LOST DOG” signs all over the neighborhood? Well, as it turns out, pretty much everyone saw how problematic this offering was, especially considering what’s already known about Ring, Flock Safety, and both companies’ rather cavalier attitude towards privacy and other aspects of the Fourth Amendment.
To begin with, the “Search Party” feature that allows people to access recordings and images captured by other people’s cameras is already on, which likely comes as a surprise to owners of these devices. Here’s what The Verge’s Jennifer Tuohy discovered last October, shortly after Ring announced its partnership with Flock Safety — a company best known for allowing cops to hunt down people seeking abortions and/or allowing federal officers to perform nationwide searches for whoever they might be looking for (which, of course, would be anyone looking kinda like an immigrant).
[I]t turns out that Search Party is enabled by default. In an email to customers this week, Siminoff wrote that the feature is rolling out to Ring outdoor cameras in November and noted, “You can always turn off Search Party.”
I checked my cameras this morning, and they were all automatically set to enable Search Party. And I’m not alone; Ring users on Reddit have also reported that their cameras have been enabled for Search Party.
This under-reported “feature” was exposed by Ring’s Super Bowl ad, which resulted in enough backlash that Flock Safety no longer has a Ring to wear. Back to Jennifer Tuohy and The Verge:
In a statement published on Ring’s blog and provided to The Verge ahead of publication, the company said: “Following a comprehensive review, we determined the planned Flock Safety integration would require significantly more time and resources than anticipated. We therefore made the joint decision to cancel the integration and continue with our current partners … The integration never launched, so no Ring customer videos were ever sent to Flock Safety.”
While that last sentence may be true, it appears sharing was on by default when it came to Ring’s own cameras. That Flock Safety never got a chance to participate is good to know, but “Search Party” has apparently been active since its implementation last year, even if it was limited to Ring devices.
And while Ring claims the Search Party feature can’t be used to search for “human biometrics,” that’s hardly comforting when it appears Ring definitely wants to add more of this kind of thing to its existing cameras.
On top of this, the company recently launched a new facial recognition feature, Familiar Faces. Combined with Search Party, the technological leap to using neighborhood cameras to search for people through a mass-surveillance network suddenly seems very small.
Ring insists this is not another mass surveillance tool, but rather something that attempts to recognize who’s at any user’s door when sending alerts, in order to differentiate friends and family members from strangers who might be within camera range. Again, there’s some utility to this offering, but the tech lends itself to surveillance abuses, especially when law enforcement may only be a subpoena away from accessing images and recordings captured by privately-owned devices.
Finally, the statement given by Ring only states that this won’t be happening right now, which is a wise choice considering its unpopularity at the moment. But that doesn’t mean Ring and Flock won’t seek to consummate this marriage of surveillance tech, albeit in a more private fashion that doesn’t involve alarming hundreds of millions of sports viewers simultaneously.
Surveillance technology vendors, federal agencies, and wealthy private donors have long helped provide local law enforcement “free” access to surveillance equipment that bypasses local oversight. The result is predictable: serious accountability gaps and data pipelines to other entities, including Immigration and Customs Enforcement (ICE), that expose millions of people to harm.
The collection and sharing of our data quietly generates detailed records of people’s movements and associations that can be exposed, hacked, or repurposed without their knowledge or consent. Those records weaken sanctuary and First Amendment protections while facilitating the targeting of vulnerable people.
Cities can and should use their power to reject federal grants, vendor trials, donations from wealthy individuals, or participation in partnerships that facilitate surveillance and experimentation with spy tech.
If these projects are greenlit, oversight is imperative. Mechanisms like public hearings, competitive bidding, public records transparency, and city council supervision help ensure these acquisitions include basic safeguards — like use policies, audits, and consequences for misuse — to protect the public from abuse and from creeping contracts that grow into whole suites of products.
Clear policies and oversight mechanisms must be in place before using any surveillance tools, free or not, and communities and their elected officials must be at the center of every decision about whether to bring these tools in at all.
Here are some of the most common methods by which “free” surveillance tech makes its way into communities.
Trials and Pilots
Police departments are regularly offered free access to surveillance tools and software through trials and pilot programs that often aren’t accompanied by appropriate use policies. In many jurisdictions, trials do not trigger the same requirements to go before decision-makers outside the police department. This means the public may have no idea that a pilot program for surveillance technology is happening in their city.
In Denver, Colorado, the police department is running trials of possible unmanned aerial vehicles (UAVs) for a drone-as-first-responder (DFR) program from two competing drone vendors: Flock Safety Aerodome drones (through August 2026) and drones from the company Skydio, partnering with Axon, the multi-billion dollar police technology company behind tools like Tasers and AI-generated police reports. Drones create unique issues given their vantage for capturing private property and unsuspecting civilians, as well as their capacity to make other technologies, like ALPRs, airborne.
Functional, Even Without Funding
We’ve seen cities decide not to fund a tool, or run out of funding for it, only to have a company continue providing it in the hope that money will turn up. This happened in Fall River, Massachusetts, where the police department decided not to fund ShotSpotter’s $90,000 annual cost and its frequent false alarms, but continued using the system when the company provided free access.
Importantly, police technology companies are developing more features and subscription-based models, so what’s “free” today frequently results in taxpayers footing the bill later.
Gifts from Police Foundations and Wealthy Donors
Police foundations and the wealthy have pushed surveillance-driven agendas in their local communities by donating equipment and making large monetary gifts, another means of acquiring these tools without public oversight or buy-in.
In Atlanta, the Atlanta Police Foundation (APF) attempted to use its position as a private entity to circumvent transparency. Following a court challenge from the Atlanta Community Press Collective and Lucy Parsons Labs, a Georgia court determined that the APF must comply with public records laws related to some of its actions and purchases on behalf of law enforcement. In San Francisco, billionaire Chris Larsen has financially supported a supercharging of the city’s surveillance infrastructure, donating $9.4 million to fund the San Francisco Police Department’s (SFPD) Real-Time Investigation Center, where a menu of surveillance technologies and data come together to surveil the city’s residents. This move comes after the billionaire backed a ballot measure, which passed in March 2025, eroding the city’s surveillance technology law and allowing the SFPD free rein to use new surveillance technologies for a full year without oversight.
Free Tech for Federal Data Pipelines
Federal grants and Department of Homeland Security funding are another way surveillance technology appears free, only to lock municipalities into long‑term data‑sharing and recurring costs.
Through the Homeland Security Grant Program, which includes the State Homeland Security Program (SHSP) and the Urban Area Security Initiative (UASI), and Department of Justice programs like Byrne JAG, the federal government reimburses states and cities for “homeland security” equipment and software, including law‑enforcement surveillance tools, analytics platforms, and real‑time crime centers. Grant guidance and vendor marketing materials make clear that these funds can be used for automated license plate readers, integrated video surveillance and analytics systems, and centralized command‑center software—in other words, purchases framed as counterterrorism investments but deployed in everyday policing.
Vendors have learned to design products around this federal money, pitching ALPR networks, camera systems, and analytic platforms as “grant-ready” solutions that can be acquired with little or no upfront local cost. Motorola Solutions, for example, advertises how SHSP and UASI dollars can be used for “law enforcement surveillance equipment” and “video surveillance, warning, and access control” systems. Flock Safety, partnering with Lexipol, a company that writes use policies for law enforcement, offers a “License Plate Readers Grant Assistance Program” that helps police departments identify federal and state grants and tailor their applications to fund ALPR projects.
Grant assistance programs let police chiefs fast‑track new surveillance: the paperwork is outsourced, the grant eats the upfront cost, and even when there is a formal paper trail, the practical checks from residents, councils, and procurement rules often get watered down or bypassed.
On paper, these systems arrive “for free” through a federal grant; in practice, they lock cities into recurring software, subscription, and data‑hosting fees that quietly turn into permanent budget lines—and a lasting surveillance infrastructure—as soon as police and prosecutors start to rely on them. In Santa Cruz, California, the police department explicitly sought to use a DHS-funded SHSP grant to pay for a new citywide network of Flock ALPR cameras at the city’s entrances and exits, with local funds covering additional cameras. In Sumner, Washington, a $50,000 grant was used to cover the entire first year of a Flock system — including installation and maintenance — after which the city is on the hook for roughly $39,000 every year in ongoing fees. The free grant money opens the door, but local governments are left with years of financial, political, and permanent surveillance entanglements they never fully vetted.
The most dangerous cost of this “free” funding is not just budgetary; it is the way it ties local systems into federal data pipelines. Since 9/11, DHS has used these grant streams to build a nationwide network of roughly 80 state and regional fusion centers that integrate and share data from federal, state, local, tribal, and private partners. Research shows that state fusion centers rely heavily on the DHS Homeland Security Grant Program (especially SHSP and UASI) to “mature their capabilities,” with some centers reporting that 100 percent of their annual expenditures are covered by these grants.
Civil rights investigations have documented how this funding architecture creates a backdoor channel for ICE and other federal agencies to access local surveillance data for their own purposes. A recent report by the Surveillance Technology Oversight Project (S.T.O.P.) describes ICE agents using a Philadelphia‑area fusion center to query the city’s ALPR network to track undocumented drivers in a self‑described sanctuary city.
Ultimately, federal grants follow the same script as trials and foundation gifts: what looks “free” ends up costing communities their data, their sanctuary protections, and their power over how local surveillance is used.
Protecting Yourself Against “Free” Technology
The most important protection against “free” surveillance technology is to reject it outright. Cities do not have to accept federal grants, vendor trials, or philanthropic donations. Saying no to “free” tech is not just a policy choice; it is a political power that local governments possess and can exercise. Communities and their elected officials can and should refuse surveillance systems that arrive through federal grants, vendor pilots, or private donations, regardless of how attractive the initial price tag appears.
For those cities that have already accepted surveillance technology, the imperative is equally clear: shut it down. When a community has rejected use of a spying tool, the capabilities, equipment, and data collected from that tool should be shut off immediately. Full stop.
And for any surveillance technology that remains in operation, even temporarily, there must be clear rules: when and how equipment is used, how that data is retained and shared, who owns data and how companies can access and use it, transparency requirements, and consequences for any misuse and abuse.
“Free” surveillance technology is never free. Someone profits or gains power from it. Police technology vendors, federal agencies, and wealthy donors do not offer these systems out of generosity; they offer them because surveillance serves their interests, not ours. That is the real cost of “free” surveillance.
A California police department is none too happy that its license plate reader records were accessed by federal employees it never gave explicit permission to peruse. And, once again, it’s Flock Safety shrugging itself into another PR black eye.
Mountain View police criticized the company supplying its automated license plate reader system after an audit turned up “unauthorized” use by federal law enforcement agencies.
At least six offices of four agencies accessed data from the first camera in the city’s Flock Safety license-tracking system from August to November 2024 without the police department’s permission or knowledge, according to a press release Friday night.
Flock has been swimming in a cesspool of its own making for several months now, thanks to it being the public face of “How To Hunt Down Someone Who Wanted An Abortion.” That debacle was followed by even more negative press (and congressional rebuke) for its apparent unwillingness to place any limits at all on access to the hundreds of millions of license plate records its cameras have captured, including those owned by private individuals.
Mountain View is in California. And that’s only one problem with everything in this paragraph:
The city said its system was accessed by Bureau of Alcohol, Tobacco, Firearms and Explosives offices in Kentucky and Tennessee, which investigate crimes related to guns, explosives, arson and the illegal trafficking of alcohol and tobacco; the inspector general’s office of the U.S. General Services Administration, which manages federal buildings, procurement, and property; Air Force bases in Langley, Virginia, and in Ohio; and the Lake Mead National Recreation Area in Nevada.
Imagine trying to explain this to anyone. While it’s somewhat understandable that the ATF might be running nationwide searches on Flock’s platform, it’s almost impossible to explain why images captured by a single camera in Mountain View, California, were accessed by the Inspector General for the GSA, much less by Lake Mead Recreation Area staffers.
This explains how this happened. But it doesn’t do anything to explain why.
They accessed Mountain View’s system for one camera via a “nationwide” search setting that was turned on by Flock Safety, police said.
Apparently, this is neither opt-in nor opt-out. It just is. The Mountain View police said they “worked closely” with Flock to block out-of-state access, as well as to limit internal access to searches expressly approved by the department’s police chief.
Flock doesn’t seem to care what its customers want. Either it can’t do what this department asked or it simply chose not to because a system that can’t be accessed by government randos scattered around the nation is much tougher to sell than a locked-down portal that actually serves the needs of the people paying for it.
The privacy protection that Flock promised to Oregonians — that Flock software will automatically examine the reason provided by law enforcement officers for terms indicating an abortion- or immigration-related search — is meaningless when law enforcement officials provide generic reasons like “investigation” or “crime.” Likewise, Flock’s filters are meaningless if no reason for a search is provided in the first place. While the search reasons collected by Flock, obtained by press and activists through open records requests, have occasionally revealed searches for immigration and abortion enforcement, these are likely just the tip of the iceberg. Presumably, most officers using Flock to hunt down immigrants and women who have received abortions are not going to type that in as the reason for their search. And, regardless, given that Flock has washed its hands of any obligation to audit its customers, Flock customers have no reason to trust a search reason provided by another agency.
I now believe that abuses of your product are not only likely but inevitable, and that Flock is unable and uninterested in preventing them.
Flock just keeps making Wyden’s points for him. The PD wanted limited access with actual oversight. Flock gave the PD a lending library of license plate/location images anyone with or without a library card (so to speak) could check out at will. Flock is part of the surveillance problem. And it’s clear it’s happy being a tool that can be readily and easily abused, no matter what its paying customers actually want from its technology.
Let’s start with Flock, the company behind a number of automated license plate reader (ALPR) and other camera technologies. You might be surprised at how many Flock cameras there are in your community. Many large and small municipalities around the country have signed deals with Flock for license plate readers to track the movement of all cars in their city. Even though these deals are signed by local police departments, oftentimes ICE also gains access.
Because of their ubiquity, people are interested in finding out where and how many Flock cameras are in their community. One project that can help with this is the OUI-SPY, a small piece of open source hardware. The OUI-SPY runs on a cheap Arduino-compatible chip called an ESP-32. There are multiple programs available for loading on the chip, such as “Flock You,” which allows people to detect Flock cameras, and “Sky-Spy,” which detects overhead drones. There’s also “BLE Detect,” which detects various Bluetooth signals, including ones from Axon devices, Meta’s Ray-Bans that secretly record you, and more. It also has a mode commonly known as “fox hunting” to track down a specific device. Activists and researchers can use this tool to map out different technologies and quantify the spread of surveillance.
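The core trick behind tools like OUI-SPY is straightforward: Wi-Fi and Bluetooth radios broadcast MAC addresses whose first three bytes (the organizationally unique identifier, or OUI) identify the manufacturer, so matching observed addresses against a list of vendor prefixes flags nearby devices. Here is a minimal sketch of that matching logic in Python; the prefixes and labels are hypothetical placeholders, not real vendor OUIs, and real detectors like Flock You ship curated prefix lists and do the sniffing in radio firmware:

```python
# Sketch of OUI-based device detection, the technique used by tools
# like OUI-SPY. The OUI is the first three octets of a MAC address and
# identifies the vendor. NOTE: these prefixes are hypothetical
# placeholders, not real Flock/Axon OUIs.

WATCHLIST = {
    "AA:BB:CC": "Hypothetical ALPR vendor",
    "DE:AD:BE": "Hypothetical body-cam vendor",
}

def classify(mac: str):
    """Return a vendor label if the MAC's OUI is on the watchlist."""
    oui = mac.upper()[:8]  # first three octets, e.g. "AA:BB:CC"
    return WATCHLIST.get(oui)

# In a real tool, MACs come from passively observed Wi-Fi probe
# requests or Bluetooth advertisements; here we feed in samples.
seen = ["aa:bb:cc:12:34:56", "11:22:33:44:55:66"]
hits = {mac: classify(mac) for mac in seen if classify(mac)}
print(hits)  # only the watchlisted device is flagged
```

A "fox hunting" mode is the same idea narrowed to a single full MAC address, typically combined with signal strength to walk toward the device.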
There’s also the open source WiGLE app, which is primarily designed for mapping Wi-Fi networks but can also sound an audio alert when a specific Wi-Fi or Bluetooth identifier is detected. This means you can set it up to notify you when it detects products from Flock, Axon, or other nasties in your vicinity.
One enterprising YouTuber, Benn Jordan, figured out a way to fool Flock cameras into not recording his license plate simply by painting some minor visual noise on his license plate. This is innocuous enough that any human will still be able to read his license plate, but it completely prevented Flock devices from recognizing his license plate as a license plate at the time. Some states have outlawed drivers obscuring their license plates, so taking such action is not recommended.
Jordan later went on to discover hundreds of misconfigured Flock cameras that were exposing their administrator interface without a password on the public internet. This would allow anyone with an internet connection to view a live surveillance feed, download 30 days of video, view logs, and more. The cameras pointed at parks, public trails, busy intersections, and even a playground. This was a massive breach of public trust and a huge mistake for a company that claims to be working for public safety.
Other hackers have taken on the task of open-source intelligence and community reporting. One interesting example is deflock.me and alpr.watch, which are crowdsourced maps of ALPR cameras. Much like the OUI-SPY project, this allows activists to map out and expose Flock surveillance cameras in their community.
Another interesting project documenting ICE and creating a trove of open-source intelligence is ICE List Wiki which contains info on companies that have contracts with ICE, incidents and encounters with ICE, and vehicles ICE uses.
People without programming knowledge can also get involved. In Chicago, people used whistles to warn their neighbors that ICE was present or in the area. Many people 3D-printed whistles along with instructional booklets to hand out to their communities, allowing a wider distribution of whistles and consequently earlier warnings for their neighbors.
There is also EFF’s own Rayhunter project for detecting cell-site simulators, about which we have written extensively. Rayhunter runs on a cheap mobile hotspot and doesn’t require deep technical knowledge to use.
It’s important to remember that we are not powerless. Even in the face of a domestic law enforcement presence with massive surveillance capabilities and military-esque technologies, there are still ways to engage in surveillance self-defense. We cannot give into nihilism and fear. We must continue to find small ways to protect ourselves and our communities, and when we can, fight back.
EFF is not affiliated with any of these projects (other than Rayhunter) and does not endorse them. We don’t make any statements about the legality of using any of these projects. Please consult with an attorney to determine what risks there may be.
It’s no secret that 2025 has given Americans plenty to protest about. But as news cameras showed protesters filling the streets of cities across the country, law enforcement officers—including U.S. Border Patrol agents—were quietly watching those same streets through different lenses: Flock Safety automated license plate readers (ALPRs) that tracked every passing car.
Through an analysis of 10 months of nationwide searches on Flock Safety’s servers, we discovered that more than 50 federal, state, and local agencies ran hundreds of searches through Flock’s national network of surveillance data in connection with protest activity. In some cases, law enforcement specifically targeted known activist groups, demonstrating how mass surveillance technology increasingly threatens our freedom to demonstrate.
Flock Safety provides ALPR technology to thousands of law enforcement agencies. The company installs cameras throughout their jurisdictions, and these cameras photograph every car that passes, documenting the license plate, color, make, model and other distinguishing characteristics. This data is paired with time and location, and uploaded to a massive searchable database. Flock Safety encourages agencies to share the data they collect broadly with other agencies across the country. It is common for an agency to search thousands of networks nationwide even when they don’t have reason to believe a targeted vehicle left the region.
Via public records requests, EFF obtained datasets representing more than 12 million searches logged by more than 3,900 agencies between December 2024 and October 2025. The data shows that agencies logged hundreds of searches related to the 50501 protests in February, the Hands Off protests in April, the No Kings protests in June and October, and other protests in between.
The Tulsa Police Department in Oklahoma was one of the most consistent users of Flock Safety’s ALPR system for investigating protests, logging at least 38 such searches. This included running searches that corresponded to a protest against deportation raids in February, a protest at Tulsa City Hall in support of pro-Palestinian activist Mahmoud Khalil in March, and the No Kings protest in June. During the most recent No Kings protests in mid-October, agencies such as the Lisle Police Department in Illinois, the Oro Valley Police Department in Arizona, and the Putnam County (Tenn.) Sheriff’s Office all ran protest-related searches.
While EFF and other civil liberties groups argue the law should require a search warrant for such searches, police are simply prompted to enter text into a “reason” field in the Flock Safety system. Usually this is only a few words–or even just one.
In these cases, that word was often just “protest.”
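Analytically, finding these searches boils down to a keyword filter over the free-text “reason” column of the audit logs. A minimal sketch of that kind of query in Python, using hypothetical column names and made-up rows rather than the actual records obtained through public records requests:

```python
# Sketch of keyword-filtering ALPR audit logs for protest-related
# searches. The CSV layout and rows below are hypothetical; real
# agency exports vary in format and field names.
import csv, io

LOG = """agency,timestamp,reason,networks_queried
Tulsa PD,2025-06-14,protest,425
Lisle PD,2025-10-18,No Kings,812
Anytown PD,2025-06-14,stolen vehicle,50
"""

KEYWORDS = ("protest", "no kings", "50501")

def protest_searches(rows):
    """Yield rows whose free-text reason matches a protest keyword."""
    for row in rows:
        reason = row["reason"].lower()
        if any(kw in reason for kw in KEYWORDS):
            yield row

matches = list(protest_searches(csv.DictReader(io.StringIO(LOG))))
print([m["agency"] for m in matches])  # the two protest-related rows
```

As the article notes, this kind of filter necessarily undercounts: an officer who types only “investigation” or leaves the field blank never surfaces in a keyword match.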
Crime does sometimes occur at protests, whether that’s property damage, pick-pocketing, or clashes between groups on opposite sides of a protest. Some of these searches may have been tied to an actual crime that occurred, even though in most cases officers did not articulate a criminal offense when running the search. But the truth is, the only reason an officer is able to even search for a suspect at a protest is because ALPRs collected data on every single person who attended the protest.
Search and Dissent
2025 was an unprecedented year of street action. In June and again in October, thousands across the country mobilized under the banner of the “No Kings” movement—marches against government overreach, surveillance, and corporate power. By some estimates, the October demonstrations ranked among the largest single-day protests in U.S. history, filling the streets from Washington, D.C., to Portland, OR.
EFF identified 19 agencies that logged dozens of searches associated with the No Kings protests in June and October 2025. In some cases the term “No Kings” was explicitly used as the search reason, while in others the term “protest” was used but the searches coincided with the massive demonstrations.
Law Enforcement Agencies that Ran Searches Corresponding with “No Kings” Rallies

* Anaheim Police Department, Calif.
* Arizona Department of Public Safety
* Beaumont Police Department, Texas
* Charleston Police Department, S.C.
* Flagler County Sheriff’s Office, Fla.
* Georgia State Patrol
* Lisle Police Department, Ill.
* Little Rock Police Department, Ark.
* Marion Police Department, Ohio
* Morristown Police Department, Tenn.
* Oro Valley Police Department, Ariz.
* Putnam County Sheriff’s Office, Tenn.
* Richmond Police Department, Va.
* Riverside County Sheriff’s Office, Calif.
* Salinas Police Department, Calif.
* San Bernardino County Sheriff’s Office, Calif.
* Spartanburg Police Department, S.C.
* Tempe Police Department, Ariz.
* Tulsa Police Department, Okla.
* U.S. Border Patrol
For example:
In Washington state, the Spokane County Sheriff’s Office listed “no kings” as the reason for three searches on June 15, 2025. The agency queried 95 camera networks, looking for vehicles matching the description of “work van,” “bus” or “box truck.”
In Texas, the Beaumont Police Department ran six searches related to two vehicles on June 14, 2025, listing “KINGS DAY PROTEST” as the reason. The queries reached across 1,774 networks.
In California, the San Bernardino County Sheriff’s Office ran a single search for a vehicle across 711 networks, logging “no king” as the reason.
In Arizona, the Tempe Police Department made three searches for “ATL No Kings Protest” on June 15, 2025, searching through 425 networks. “ATL” is police code for “attempt to locate.” The agency appears not to have been looking for a particular plate, but for any red vehicle on the road during a certain time window.
But the No Kings protests weren’t the only demonstrations drawing law enforcement’s digital dragnet in 2025.
For example:
In Nevada’s state capital, the Carson City Sheriff’s Office ran three searches that correspond to the February 50501 Protests against DOGE and the Trump administration. The agency searched for two vehicles across 178 networks with “protest” as the reason.
In Florida, the Seminole County Sheriff’s Office logged “protest” for five searches that correspond to a local May Day rally.
In Alabama, the Homewood Police Department logged four searches in early July 2025 for three vehicles with “PROTEST CASE” and “PROTEST INV.” in the reason field. The searches, which probed 1,308 networks, correspond to protests against the police shooting of Jabari Peoples.
In Texas, the Lubbock Police Department ran two searches for a Tennessee license plate on March 15 that correspond to a rally to highlight the mental health impact of immigration policies. The searches hit 5,966 networks, with the logged reason “protest veh.”
In Michigan, the Grand Rapids Police Department ran five searches that corresponded with the Stand Up and Fight Back Rally in February. The searches hit roughly 650 networks, with the reason logged as “Protest.”
Some agencies have adopted policies that prohibit using ALPRs for monitoring activities protected by the First Amendment. Yet many officers probed the nationwide network with terms like “protest” without articulating an actual crime under investigation.
In a few cases, police were using Flock’s ALPR network to investigate threats made against attendees or incidents where motorists opposed to the protests drove their vehicle into crowds. For example, throughout June 2025, an Arizona Department of Public Safety officer logged three searches for “no kings rock threat,” and a Wichita (Kan.) Police Department officer logged 22 searches for various license plates under the reason “Crime Stoppers Tip of causing harm during protests.”
Even when law enforcement is specifically looking for vehicles engaged in potentially criminal behavior such as threatening protesters, it cannot be ignored that mass surveillance systems work by collecting data on everyone driving to or near a protest—not just those under suspicion.
Border Patrol’s Expanding Reach
As U.S. Border Patrol (USBP), ICE, and other federal agencies tasked with immigration enforcement have massively expanded operations into major cities, advocates for immigrants have responded through organized rallies, rapid-response confrontations, and extended presences at federal facilities.
USBP has made extensive use of Flock Safety’s system for immigration enforcement, but also to target those who object to its tactics. In June, a few days after the No Kings Protest, USBP ran three searches for a vehicle using the descriptor “Portland Riots.”
USBP also used the Flock Safety network to investigate a motorist who had “extended his middle finger” at Border Patrol vehicles that were transporting detainees. The motorist then allegedly drove in front of one of the vehicles and slowed down, forcing the Border Patrol vehicle to brake hard. An officer ran seven searches for his plate, citing “assault on agent” and “18 usc 111,” the federal criminal statute for assaulting, resisting or impeding a federal officer. The individual was charged in federal court in early August.
USBP had access to the Flock system during a trial period in the first half of 2025, but the company says it has since paused the agency’s access to the system. However, Border Patrol and other federal immigration authorities have been able to access the system’s data through local agencies who have run searches on their behalf or even lent them logins.
Targeting Animal Rights Activists
Law enforcement’s use of Flock’s ALPR network to surveil protesters isn’t limited to large-scale political demonstrations. Three agencies also used the system dozens of times to specifically target activists from Direct Action Everywhere (DxE), an animal-rights organization known for using civil disobedience tactics to expose conditions at factory farms.
Delaware State Police queried the Flock national network nine times in March 2025 related to DxE actions, logging reasons such as “DxE Protest Suspect Vehicle.” DxE advocates told EFF that these searches correspond to an investigation the organization undertook of a Mountaire Farms facility.
Additionally, the California Highway Patrol logged dozens of searches related to a “DXE Operation” throughout the day on May 27, 2025. The organization says this corresponds with an annual convening in California that typically ends in a direct action. Participants leave the event early in the morning, then drive across the state to a predetermined but previously undisclosed protest site. Also in May, the Merced County Sheriff’s Office in California logged two searches related to “DXE activity.”
As an organization engaged in direct activism, DxE has experienced criminal prosecution for its activities, and so the organization told EFF they were not surprised to learn they are under scrutiny from law enforcement, particularly considering how industrial farmers have collected and distributed their own intelligence to police.
The targeting of DxE activists reveals how ALPR surveillance extends beyond conventional and large-scale political protests to target groups engaged in activism that challenges powerful industries. For animal-rights activists, the knowledge that their vehicles are being tracked through a national surveillance network undeniably creates a chilling effect on their ability to organize and demonstrate.
Fighting Back Against ALPR
ALPR systems are designed to capture information on every vehicle that passes within view. That means they don’t just capture data on “criminals” but on everyone, all the time—and that includes people engaged in their First Amendment right to publicly dissent. Police are sitting on massive troves of data that can reveal who attended a protest, and this data shows they are not afraid to use it.
Our analysis only includes searches where agencies explicitly mentioned protests or related terms in the “reason” field when documenting their search. It’s likely that scores more searches were conducted under less obvious pretexts. According to our analysis, approximately 20 percent of all searches we reviewed listed vague language like “investigation,” “suspect,” and “query” in the reason field. Those terms could well be cover for spying on a protest, an abortion prosecution, or an officer stalking a spouse, and no one would be the wiser–including the agencies whose data was searched. Flock has said it will now require officers to select a specific crime under investigation, but that requirement can and will also be used to obfuscate dubious searches.
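The kind of reason-field triage described above can be sketched in a few lines of Python. The reason strings and term lists below are illustrative assumptions for the sake of example, not EFF’s actual dataset or methodology:

```python
import re

# Hypothetical reason strings of the kind found in Flock audit logs.
REASONS = [
    "no kings rock threat",
    "DxE Protest Suspect Vehicle",
    "investigation",
    "suspect",
    "query",
    "18 usc 111",
]

# Illustrative term lists: explicit protest-related language vs. the
# vague one-word reasons that reveal nothing about the search.
PROTEST_TERMS = re.compile(r"protest|riot|rally|demonstration|no kings", re.I)
VAGUE_TERMS = {"investigation", "suspect", "query"}

def classify(reason: str) -> str:
    """Bucket a search reason as protest-related, vague, or other."""
    if PROTEST_TERMS.search(reason):
        return "protest-related"
    if reason.strip().lower() in VAGUE_TERMS:
        return "vague"
    return "other"

counts: dict[str, int] = {}
for r in REASONS:
    bucket = classify(r)
    counts[bucket] = counts.get(bucket, 0) + 1
```

The point of the “vague” bucket is exactly the problem described above: a one-word reason passes any keyword screen while revealing nothing about what the officer was actually doing.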
For protesters, this data should serve as confirmation that ALPR surveillance has been and will be used to target activities protected by the First Amendment. Depending on your threat model, you should think carefully about how you arrive at protests and explore options such as biking, walking, carpooling, taking public transportation, or simply parking a little farther from the action. Our Surveillance Self-Defense project has more information on steps you can take to protect your privacy when traveling to and attending a protest.
For local officials, this should serve as another example of how systems marketed as protecting your community may actually threaten the values your communities hold most dear. The best way to protect people is to shut down these camera networks.
Everyone should have the right to speak up against injustice without ending up in a database.
More than 80 law enforcement agencies across the United States have used language perpetuating harmful stereotypes against Romani people when searching the nationwide Flock Safety automated license plate reader (ALPR) network, according to audit logs obtained and analyzed by the Electronic Frontier Foundation.
When police run a search through the Flock Safety network, which links thousands of ALPR systems, they are prompted to leave a reason and/or case number for the search. Between June 2024 and October 2025, cops performed hundreds of searches for license plates using terms such as “roma” and “g*psy,” and in many instances, without any mention of a suspected crime. Other uses include “g*psy vehicle,” “g*psy group,” “possible g*psy,” “roma traveler” and “g*psy ruse,” perpetuating systemic harm by demeaning individuals based on their race or ethnicity.
These queries were run through thousands of police departments’ systems—and it appears that none of these agencies flagged the searches as inappropriate.
These searches are, by definition, racist.
Word Choices and Flock Searches
We are using the terms “Roma” and “Romani people” as umbrella terms, recognizing that they represent different but related groups. Since 2020, the U.S. federal government has officially recognized “Anti-Roma Racism” as including behaviors such as “stereotyping Roma as persons who engage in criminal behavior” and using the slur “g*psy.” According to the U.S. Department of State, this language “leads to the treatment of Roma as an alleged alien group and associates them with a series of pejorative stereotypes and distorted images that represent a specific form of racism.”
Nevertheless, police officers have run hundreds of searches for license plates using the terms “roma” and “g*psy.” (Unlike the police ALPR queries we’ve uncovered, we substitute an asterisk for the Y to avoid repeating this racist slur). In many cases, these terms have been used on their own, with no mention of crime. In other cases, the terms have been used in contexts like “g*psy scam” and “roma burglary,” when ethnicity should have no relevance to how a crime is investigated or prosecuted.
A “g*psy scam” and a “roma burglary” do not exist in criminal law as offenses separate from any other type of fraud or burglary. Several agencies contacted by EFF have since acknowledged the inappropriate language and said they would address the issue internally.
“The use of the term does not reflect the values or expected practices of our department,” a representative of the Palos Heights (IL) Police Department wrote to EFF after being confronted with two dozen searches involving the term “g*psy.” “We do not condone the use of outdated or offensive terminology, and we will take this inquiry as an opportunity to educate those who are unaware of the negative connotation and to ensure that investigative notations and search reasons are documented in a manner that is accurate, professional, and free of potentially harmful language.”
Of course, the broader issue is that allowing “g*psy” or “Roma” as a reason for a search isn’t just offensive; it implies the criminalization of an ethnic group. In fact, the Grand Prairie Police Department in Texas searched for “g*psy” six times while using Flock’s “Convoy” feature, which allows an agency to identify vehicles traveling together—in essence targeting an entire traveling community of Roma without specifying a crime.
At the bottom of this post is a list of agencies and the terms they used when searching the Flock system.
Anti-Roma Racism in an Age of Surveillance
Racism against Romani people has been a problem for centuries, with one of its most horrific manifestations during the Holocaust, when the Third Reich and its allies perpetuated genocide by murdering hundreds of thousands of Romani people and sterilizing thousands more. Despite efforts by the UN and EU to combat anti-Roma discrimination, this form of racism persists. As scholars Margareta Matache and Mary T. Bassett explain, it is perpetuated by modern American policing practices:
In recent years, police departments have set up task forces specialised in “G*psy crimes”, appointed “G*psy crime” detectives, and organised police training courses on “G*psy criminality”. The National Association of Bunco Investigators (NABI), an organisation of law enforcement professionals focusing on “non-traditional organised crime”, has even created a database of individuals arrested or suspected of criminal activity, which clearly marked those who were Roma.
Thus, it is no surprise that a 2020 Harvard University survey of Romani Americans found that 4 out of 10 respondents reported being subjected to racial profiling by police. This demonstrates the ongoing challenges they face due to systemic racism and biased policing.
Notably, many police agencies using surveillance technologies like ALPRs have adopted some sort of basic policy against biased policing or the use of these systems to target people based on race or ethnicity. But even when such policies are in place, an agency’s failure to enforce them allows these discriminatory practices to persist. These searches were also run through the systems of thousands of other police departments that may have their own policies and state laws that prohibit bias-based policing—yet none of those agencies appeared to have flagged the searches as inappropriate.
The Flock search data in question here shows that surveillance technology exacerbates racism, and even well-meaning policies to address bias can quickly fall apart without proper oversight and accountability.
Cops In Their Own Words
EFF reached out to a sample of the police departments that ran these searches. Here are five representative responses we received from police departments in Illinois, California, and Virginia. They do not inspire confidence.
1. Lake County Sheriff’s Office, IL
In June 2025, the Lake County Sheriff’s Office ran three searches for a dark colored pick-up truck, using the reason: “G*PSY Scam.” The search covered 1,233 networks, representing 14,467 different ALPR devices.
In response to EFF, a sheriff’s representative wrote via email:
“Thank you for reaching out and for bringing this to our attention. We certainly understand your concern regarding the use of that terminology, which we do not condone or support, and we want to assure you that we are looking into the matter.
Any sort of discriminatory practice is strictly prohibited at our organization. If you have the time to take a look at our commitment to the community and our strong relationship with the community, I firmly believe you will see discrimination is not tolerated and is quite frankly repudiated by those serving in our organization.
We appreciate you bringing this to our attention so we can look further into this and address it.”
2. Sacramento Police Department, CA
In May 2025, the Sacramento Police Department ran six searches using the term “g*psy.” The search covered 468 networks, representing 12,885 different ALPR devices.
In response to EFF, a police representative wrote:
“Thank you again for reaching out. We looked into the searches you mentioned and were able to confirm the entries. We’ve since reminded the team to be mindful about how they document investigative reasons. The entry reflected an investigative lead, not a disparaging reference.
We appreciate the chance to clarify.”
3. Palos Heights Police Department, IL
In September 2024, the Palos Heights Police Department ran more than two dozen searches using terms such as “g*psy vehicle,” “g*psy scam” and “g*psy concrete vehicle.” Most searches hit roughly 1,000 networks.
In response to EFF, a police representative said the searches were related to a single criminal investigation into a vehicle involved in a “suspicious circumstance/fraudulent contracting incident” and were “not indicative of a general search based on racial or ethnic profiling.” However, the agency acknowledged the language was inappropriate:
“The use of the term does not reflect the values or expected practices of our department. We do not condone the use of outdated or offensive terminology, and we will take this inquiry as an opportunity to educate those who are unaware of the negative connotation and to ensure that investigative notations and search reasons are documented in a manner that is accurate, professional, and free of potentially harmful language.
We appreciate your outreach on this matter and the opportunity to provide clarification.”
4. Irvine Police Department, CA
In February and May 2025, the Irvine Police Department ran eight searches using the term “roma” in the reason field. The searches covered 1,420 networks, representing 29,364 different ALPR devices.
In a call with EFF, an IPD representative explained that the cases were related to a series of organized thefts. However, they acknowledged the issue, saying, “I think it’s an opportunity for our agency to look at those entries and to use a case number or use a different term.”
5. Fairfax County Police Department, VA
Between December 2024 and April 2025, the Fairfax County Police Department ran more than 150 searches involving terms such as “g*psy case” and “roma crew burglaries.” Fairfax County PD continued to defend its use of this language.
In response to EFF, a police representative wrote:
“Thank you for your inquiry. When conducting searches in investigative databases, our detectives must use the exact case identifiers, terms, or names connected to a criminal investigation in order to properly retrieve information. These entries reflect terminology already tied to specific cases and investigative files from other agencies, not a bias or judgment about any group of people. The use of such identifiers does not reflect bias or discrimination and is not inconsistent with our Bias-Based Policing policy within our Human Relations General Order.”
A National Trend
Roma individuals and families are not the only ones being systematically and discriminatorily targeted through ALPR surveillance. For example, Flock audit logs show agencies ran an additional 400 searches using terms targeting Traveller communities more generally, with a specific focus on Irish Travellers, often without any mention of a crime.
Across the country, these tools are enabling and amplifying racial profiling by embedding longstanding policing biases into surveillance technologies. For example, data from Oak Park, IL, show that 84% of drivers stopped in Flock-related traffic incidents were Black—despite Black people making up only 19% of the local population. ALPR systems are far from being neutral tools for public safety and are increasingly being used to fuel discriminatory policing practices against historically marginalized people.
The racially coded language in Flock’s logs mirrors long-standing patterns of discriminatory policing. Terms like “furtive movements,” “suspicious behavior,” and “high crime area” have always been cited by police to try to justify stops and searches of Black, Latine, and Native communities. These phrases might not appear in official logs because they’re embedded earlier in enforcement—in the traffic stop without clear cause, the undocumented stop-and-frisk, the intelligence bulletin flagging entire neighborhoods as suspect. They function invisibly until a body-worn camera, court filing, or audit brings them to light. Flock’s network didn’t create racial profiling; it industrialized it, turning deeply encoded and vague language into scalable surveillance that can search thousands of cameras across state lines.
The Path Forward
U.S. Sen. Ron Wyden, D-OR, recently recommended that local governments reevaluate their decisions to install Flock Safety in their communities. We agree, but we also understand that sometimes elected officials need to see the abuse with their own eyes first.
We know which agencies ran these racist searches, and they should be held accountable. But we also know that the vast majority of Flock Safety’s clients—thousands of police and sheriffs—also allowed those racist searches to run through their Flock Safety systems unchallenged.
Elected officials must act decisively to address the racist policing enabled by Flock’s infrastructure. First, they should demand a complete audit of all ALPR searches conducted in their jurisdiction and a review of search logs to determine (a) whether their police agencies participated in discriminatory policing and (b) what safeguards, if any, exist to prevent such abuse. Second, officials should institute immediate restrictions on data-sharing through Flock’s nationwide network. As demonstrated by California law, for example, police agencies should not be able to share their ALPR data with federal authorities or out-of-state agencies, thus eliminating a vehicle for discriminatory searches spreading across state lines.
Ultimately, elected officials must terminate Flock Safety contracts entirely. The evidence is now clear: audit logs and internal policies alone cannot prevent a surveillance system from becoming a tool for racist policing. The fundamental architecture of Flock—thousands of cameras feeding into a nationwide searchable network—makes discrimination inevitable when enforcement mechanisms fail.
As Sen. Wyden astutely explained, “local elected officials can best protect their constituents from the inevitable abuses of Flock cameras by removing Flock from their communities.”
Table Overview and Notes
The following table compiles terms used by agencies to describe the reasons for searching the Flock Safety ALPR database. In a small number of cases, we removed additional information such as case numbers, specific incident details, and officers’ names that were present in the reason field.
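The cleanup step described above, stripping case numbers and officers’ names from the reason field before publication, can be sketched roughly as follows. The regular expressions and sample strings are hypothetical illustrations, not the actual procedure or log data:

```python
import re

# Illustrative examples of raw reason fields (hypothetical, not real log data).
raw_reasons = [
    "g*psy vehicle CASE 24-001234",
    "roma burglary / Ofc. Smith",
    "possible g*psy",
]

# Assumed patterns: a "case" keyword followed by an identifier, and a
# rank abbreviation followed by a surname.
CASE_NUMBER = re.compile(r"\bcase\s*#?\s*[\w-]+", re.I)
OFFICER_NAME = re.compile(r"\b(?:ofc|det|sgt)\.?\s+\w+", re.I)

def redact(reason: str) -> str:
    """Strip case numbers and officer names, keeping the search term itself."""
    cleaned = CASE_NUMBER.sub("", reason)
    cleaned = OFFICER_NAME.sub("", cleaned)
    return cleaned.strip(" /-")

redacted = [redact(r) for r in raw_reasons]
```

The goal of a pass like this is to preserve the term an agency actually typed while removing incidental identifying details, matching the redaction approach the table notes describe.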
We removed one agency from the list because it indicated that the word was a person’s name, not a reference to Romani people.
In general, we did not include searches that used the term “Romanian,” although many of those may also be indicative of anti-Roma bias. We also did not include uses of “traveler” or “Traveller” when it did not include a clear ethnic modifier; however, we believe many of those searches are likely relevant.
A text-based version of the spreadsheet is available here.
New documents and court records obtained by EFF show that Texas deputies queried Flock Safety’s surveillance data in an abortion investigation, contradicting the narrative promoted by the company and the Johnson County Sheriff that the woman was “being searched for as a missing person” and that “it was about her safety.”
The new information shows that deputies had initiated a “death investigation” of a “non-viable fetus,” logged evidence of a woman’s self-managed abortion, and consulted prosecutors about possibly charging her.
Johnson County Sheriff Adam King repeatedly denied the automated license plate reader (ALPR) search was related to enforcing Texas’s abortion ban, and Flock Safety called media accounts “false,” “misleading” and “clickbait.” However, according to a sworn affidavit by the lead detective, the case was in fact a death investigation in response to a report of an abortion, and deputies collected documentation of the abortion from the “reporting person,” her alleged romantic partner. The death investigation remained open for weeks, with detectives interviewing the woman and reviewing her text messages about the abortion.
The documents show that the Johnson County District Attorney’s Office informed deputies that “the State could not statutorily charge [her] for taking the pill to cause the abortion or miscarriage of the non-viable fetus.”
An excerpt from the JCSO detective’s sworn affidavit.
The records include previously unreported details about the case that shocked public officials and reproductive justice advocates across the country when it was first reported by 404 Media in May. The case serves as a clear warning sign that when data from ALPRs is shared across state lines, it can put people at risk, including abortion seekers. And, in this case, the use may have run afoul of laws in Washington and Illinois.
A False Narrative Emerges
Last May, 404 Media obtained data revealing the Johnson County Sheriff’s Office conducted a nationwide search of more than 83,000 Flock ALPR cameras, giving the reason in the search log: “had an abortion, search for female.” Both the Sheriff’s Office and Flock Safety have attempted to downplay the search as akin to a search for a missing person, claiming deputies were only looking for the woman to “check on her welfare” and that officers found a large amount of blood at the scene – a claim now contradicted by the responding investigator’s affidavit. Flock Safety went so far as to assert that journalists and advocates covering the story intentionally misrepresented the facts, describing it as “misreporting” and “clickbait-driven.”
As Flock wrote of EFF’s previous commentary on this case (bold in original statement):
Earlier this month, there was purposefully misleading reporting that a Texas police officer with the Johnson County Sheriff’s Office used LPR “to target people seeking reproductive healthcare.” This organization is actively perpetuating narratives that have been proven false, even after the record has been corrected.
According to the Sheriff in Johnson County himself, this claim is unequivocally false.
… No charges were ever filed against the woman and she was never under criminal investigation by Johnson County. She was being searched for as a missing person, not as a suspect of a crime.
That sheriff has since been arrested and indicted on felony counts in an unrelated sexual harassment and whistleblower retaliation case. He has also been charged with aggravated perjury for allegedly lying to a grand jury. EFF filed public records requests with Johnson County to obtain a more definitive account of events.
The newly released incident report and affidavit unequivocally describe the case as a “death investigation” of a “non-viable fetus.” These documents also undermine the claim that the ALPR search was in response to a medical emergency, since, in fact, the abortion had occurred more than two weeks before deputies were called to investigate.
In recent years, anti-abortion advocates and prosecutors have increasingly attempted to use “fetal homicide” and “wrongful death” statutes – originally intended to protect pregnant people from violence – to criminalize abortion and pregnancy loss. These laws, which exist in dozens of states, establish legal personhood of fetuses and can be weaponized against people who end their own pregnancies or experience a miscarriage.
In fact, a new report from Pregnancy Justice found that in just the first two years since the Supreme Court’s decision in Dobbs, prosecutors initiated at least 412 cases charging pregnant people with crimes related to pregnancy, pregnancy loss, or birth–most under child neglect, endangerment, or abuse laws that were never intended to target pregnant people. Nine cases included allegations around individuals’ abortions, such as possession of abortion medication or attempts to obtain an abortion–instances just like this one. The report also highlights how, in many instances, prosecutors use tangentially related criminal charges to punish people for abortion, even when abortion itself is not illegal.
By framing their investigation of a self-administered abortion as a “death investigation” of a “non-viable fetus,” Texas law enforcement was signaling their intent to treat the woman’s self-managed abortion as a potential homicide, even though Texas law does not allow criminal charges to be brought against an individual for self-managing their own abortion.
The Investigator’s Sworn Account
Over two days in April, the woman went through the process of taking medication to induce an abortion. Two weeks later, her partner–who would later be charged with domestic violence against her–reported her to the sheriff’s office.
The documents confirm that the woman was not present at the home when the deputies “responded to the death (Non-viable fetus).” As part of the investigation, officers collected evidence that the man had assembled of the self-managed abortion, including photographs, the FedEx envelope the medication arrived in, and the instructions for self-administering the medication.
Another Johnson County official ran two searches through the ALPR database with the note “had an abortion, search for female,” according to Flock Safety search logs obtained by EFF. The first search, which has not been previously reported, probed 1,295 Flock Safety networks–composed of 17,684 different cameras–going back one week. The second search, which was originally exposed by 404 Media, was expanded to a full month of data across 6,809 networks, including 83,345 cameras. Both searches listed the same case number that appears on the death investigation/incident report obtained by EFF.
After collecting the evidence from the woman’s partner, the investigators say they consulted the district attorney’s office, only to be told they could not press charges against the woman.
An excerpt from the JCSO detective’s sworn affidavit.
Nevertheless, when the subject showed up at the Sheriff’s office a week later, officers were under the impression that she came to “to tell her side of the story about the non-viable fetus.” They interviewed her, inspected text messages about the abortion on her phone, and watched her write a timeline of events.
Only after all that did they learn that she actually wanted to report a violent assault by her partner–the same individual who had called the police to report her abortion. She alleged that less than an hour after the abortion, he choked her, put a gun to her head, and made her beg for her life. The man was ultimately charged in connection with the assault, and the case is ongoing.
This documented account runs completely counter to what law enforcement and Flock have said publicly about the case.
Johnson County Sheriff Adam King told 404 Media: “Her family was worried that she was going to bleed to death, and we were trying to find her to get her to a hospital.” He later told the Dallas Morning News: “We were just trying to check on her welfare and get her to the doctor if needed, or to the hospital.”
The account by the detective on the scene makes no mention of concerned family members or a medical investigator. To the contrary, the affidavit says that they questioned the man as to why he “waited so long to report the incident,” and he responded that he needed to “process the event and call his family attorney.” The ALPR search was recorded 2.5 hours after the initial call came in, as documented in the investigation report.
The Desk Sergeant’s Report—One Month Later
EFF obtained a separate “case supplemental report” written by the sergeant who says he ran the May 9 ALPR searches.
The sergeant was not present at the scene, and his account was written belatedly on June 5, almost a month after the incident and nearly a week after 404 Media had published the sheriff’s alternative account of the Flock Safety search, kicking off a national controversy. The sheriff’s office provided this sergeant’s report to the Dallas Morning News.
In the report, the sergeant claims that the officers on the ground asked him to start “looking up” the woman due to there being “a large amount of blood” found at the residence—an unsubstantiated claim that is in conflict with the lead investigator’s affidavit. The sergeant repeatedly expresses that the situation was “not making sense.” He claims he was worried that the partner had hurt the woman and her children, so “to check their welfare,” he used TransUnion’s TLO commercial investigative database system to look up her address. Once he identified her vehicle, he ran the plate through the Flock database, returning hits in Dallas.
Two abortion-related searches in the JCSO’s Flock Safety ALPR audit log
The sergeant’s report, filed after the case attracted media attention, notably omits any mention of the abortion at the center of the investigation, although it does note that the caller claimed to have found a fetus. The report does not explain, or even address, why the sergeant used the phrase “had an abortion, search for female” as the official reason for the ALPR searches in the audit log.
It’s also unclear why the sergeant submitted the supplemental report at all, weeks after the incident. By that time, the lead investigator had already filed a sworn affidavit that contradicted the sergeant’s account. For example, the investigator, who was on the scene, does not describe finding any blood or taking blood samples into evidence, only photographs of what the partner believed to be the fetus.
One area where they concur: both reports are clearly marked as a “death investigation.”
Correcting the Record
Since 404 Media first reported on this case, King has perpetuated the false narrative, telling reporters that the woman was never under investigation, that officers had not considered charges against her, and that “it was all about her safety.”
But here are the facts:
The reports that have been released so far describe this as a death investigation.
The lead detective described himself as “working a death investigation… of a non-viable fetus” at the time he interviewed the woman (a week after the ALPR searches).
The detective wrote that they consulted the district attorney’s office about whether they could charge her for “taking the pill to cause the abortion or miscarriage of the non-viable fetus.” They were told they could not.
Investigators collected a lot of data, including photos and documentation of the abortion, and ran her through multiple databases. They even reviewed her text messages about the abortion.
The death investigation was open for more than a month.
The death investigation was only marked closed in mid-June, weeks after 404 Media’s article and mere days before the Dallas Morning News published its report, in which the sheriff inaccurately claimed the woman “was not under investigation at any point.”
Flock has promoted this unsupported narrative on its blog and in multimedia appearances. We did not reach out to Flock for comment on this article, as their communications director previously told us the company will not answer our inquiries until we “correct the record and admit to your audience that you purposefully spread misinformation which you know to be untrue” about this case.
Consider the record corrected: It turns out the truth is even more damning than initially reported.
The Aftermath
In the aftermath of the original reporting, government officials began to take action. The networks searched by Johnson County included cameras in Illinois and Washington state, both states where abortion access is protected by law. Since then:
The Illinois Secretary of State has announced his intent to “crack down on unlawful use of license plate reader data,” and urged the state’s Attorney General to investigate the matter.
In California, which also has prohibitions on sharing ALPR out of state and for abortion-ban enforcement, the legislature cited the case in support of pending legislation to restrict ALPR use.
Ranking Members of the House Oversight Committee and one of its subcommittees launched a formal investigation into Flock’s role in “enabling invasive surveillance practices that threaten the privacy, safety, and civil liberties of women, immigrants, and other vulnerable Americans.”
Senator Ron Wyden secured a commitment from Flock to protect Oregonians’ data from out-of-state immigration and abortion-related queries.
In response to mounting pressure, Flock announced a series of new features supposedly designed to prevent future abuses. These include blocking “impermissible” searches, requiring that all searches include a “reason,” and implementing AI-driven audit alerts to flag suspicious activity. But as we’ve detailed elsewhere, these measures are cosmetic at best—easily circumvented by officers using vague search terms or reusing legitimate case numbers. The fundamental architecture that enabled the abuse remains unchanged.
Meanwhile, as the news continued to harm the company’s sales, Flock CEO Garrett Langley embarked on a press tour to smear reporters and others who had raised alarms about how the technology was being used. In an interview with Forbes, he even doubled down and extolled the use of ALPRs in this case.
So when I look at this, I go “this is everything’s working as it should be.” A family was concerned for a family member. They used Flock to help find her, when she could have been unwell. She was physically okay, which is great. But due to the political climate, this was really good clickbait.
Nothing about this is working as it should, but it is working as Flock designed.
The Danger of Unchecked Surveillance
This case reveals the fundamental danger of allowing companies like Flock Safety to build massive, interconnected surveillance networks that can be searched across state lines with minimal oversight. When a single search query can access more than 83,000 cameras spanning almost the entire country, the potential for abuse is staggering, particularly when weaponized against people seeking reproductive healthcare.
The searches in this case may have violated laws in states like Washington and Illinois, where restrictions exist specifically to prevent this kind of surveillance overreach. But those protections mean nothing when a Texas deputy can access cameras in those states with a few keystrokes, without external review that the search is legal and legitimate under local law. In this case, external agencies should have seen the word “abortion” and questioned the search, but the next time an officer is investigating such a case, they may use a more vague or misleading term to justify the search. In fact, it’s possible it has already happened.
ALPRs were marketed to the public as tools to find stolen cars and locate missing persons. Instead, they’ve become a dragnet that allows law enforcement to track anyone, anywhere, for any reason—including investigating people’s healthcare decisions. This case makes clear that neither the companies profiting from this technology nor the agencies deploying it can be trusted to tell the full story about how it’s being used.
States must ban law enforcement from using ALPRs to investigate healthcare decisions and prohibit sharing data across state lines. Local governments may try remedies like reducing data retention periods to minutes instead of weeks or months—but, really, ending their ALPR programs altogether is the strongest way to protect their most vulnerable constituents. Without these safeguards, every license plate scan becomes a potential weapon against a person seeking healthcare.
Two recent statements from the surveillance company—one addressing Illinois privacy violations and another defending the company’s national surveillance network—reveal a troubling pattern: when confronted by evidence of widespread abuse, Flock Safety has blamed users, downplayed harms, and doubled down on the very systems that enabled the violations in the first place.
Flock’s aggressive public relations campaign to salvage its reputation comes as no surprise. Last month, we described how investigative reporting from 404 Media revealed that a sheriff’s office in Texas searched data from more than 83,000 automated license plate reader (ALPR) cameras to track down a woman suspected of self-managing an abortion. (A scenario that may have been avoided, it’s worth noting, had Flock taken action when they were first warned about this threat three years ago).
Flock calls the reporting on the Texas sheriff’s office “purposefully misleading,” claiming the woman was searched for as a missing person at her family’s request rather than for her abortion. But that ignores the core issue: this officer used a nationwide surveillance dragnet (again: over 83,000 cameras) to track someone down, and used her suspected healthcare decisions as a reason to do so. Framing this as concern for her safety plays directly into anti-abortion narratives that depict abortion as dangerous and traumatic in order to justify increased policing, criminalization, control—and, ultimately, surveillance.
As if that weren’t enough, the company has also come under fire for how its ALPR network data is being actively used to assist in mass deportation. Despite U.S. Immigration and Customs Enforcement (ICE) having no formal agreement with Flock Safety, public records revealed “more than 4,000 nation and statewide lookups by local and state police done either at the behest of the federal government or as an ‘informal’ favor to federal law enforcement, or with a potential immigration focus.” The network audit data analyzed by 404 exposed an informal data-sharing environment that creates an end-run around oversight and accountability measures: federal agencies can access the surveillance network through local partnerships without the transparency and legal constraints that would apply to direct federal contracts.
Flock Safety is adamant this is “not Flock’s decision,” and by implication, not their fault. Instead, the responsibility lies with each individual local law enforcement agency. In the same breath, they insist that data sharing is essential, loudly claiming credit when the technology is involved in cross-jurisdictional investigations—but failing to show the same attitude when that data-sharing ecosystem is used to terrorize abortion seekers or immigrants.
Flock Safety: The Surveillance Social Network
In growing from a 2017 startup to a $7.5 billion company “serving over 5,000 communities,” Flock allowed individual agencies wide latitude to set and regulate their own policies. In effect, this approach offered cheap surveillance technology with minimal restrictions, leaving major decisions and actions in the hands of law enforcement while the company scaled rapidly.
And they have no intention of slowing down. Just this week, Flock launched its Business Network, facilitating unregulated data sharing amongst its private sector security clients. “For years, our law enforcement customers have used the power of a shared network to identify threats, connect cases, and reduce crime. Now, we’re extending that same network effect to the private sector,” Flock Safety’s CEO announced.
The company is building out a new mass surveillance network using the exact template that ended with the company having to retrain thousands of officers in Illinois on how not to break state law—the same template that made it easy for officers to do so in the first place. Flock’s continued integration of disparate surveillance networks across the public and private spheres—despite the harms that have already occurred—is owed in part to the one thing that it’s gotten really good at over the past couple of years: facilitating a surveillance social network.
Employing marketing phrases like “collaboration” and “force multiplier,” Flock encourages as much sharing as possible, going as far as to claim that network effects can significantly improve case closure rates. The company cultivates a sense of shared community and purpose among users so that they opt into good-faith sharing relationships with other law enforcement agencies across the country. But it’s precisely that social layer that creates uncontrollable risk.
The possibility of human workarounds at every level undermines any technical safeguards Flock may claim. Search term blocking relies on officers accurately labeling search intent—a system easily defeated by entering vague reasons like “investigation” or incorrect justifications, made either intentionally or not. And, of course, words like “investigation” or “missing person” can mean virtually anything, offering no value to meaningful oversight of how and for what the system is being used. Moving forward, sheriff’s offices looking to avoid negative press can surveil abortion seekers or immigrants with ease, so long as they use vague and innocuous reasons.
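To see why keyword blocking on a self-reported field offers so little, consider this toy sketch. The blocklist, function name, and example reasons are hypothetical illustrations of the general technique, not Flock's actual implementation:

```python
# Toy model of keyword-based blocking on a free-text "reason" field.
# Hypothetical blocklist -- not Flock's actual terms or code.
BLOCKED_TERMS = {"abortion", "immigration"}

def search_allowed(reason: str) -> bool:
    """Reject a camera search if its stated reason contains a blocked term."""
    lowered = reason.lower()
    return not any(term in lowered for term in BLOCKED_TERMS)

# An explicit reason is caught...
print(search_allowed("had an abortion, search for female"))  # False
# ...but the identical search sails through under a vaguer label.
print(search_allowed("missing person"))  # True
print(search_allowed("investigation"))   # True
```

Because the reason field is free text supplied by the searcher, the filter constrains only what officers type, not what they actually search for.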
The same can be said for case number requirements, which depend on manual entry and can easily be circumvented by reusing legitimate case numbers for unauthorized searches. Audit logs only track inputs, not contextual legitimacy. And Flock’s proposed AI-driven audit alerts, which at best flag suspicious activity after searches (and harm) have already occurred, rely on local agencies to self-monitor misuse—despite their demonstrated inability to do so.
And, of course, even the most restrictive department policy may not be enough. Austin, Texas, had implemented one of the most restrictive ALPR programs in the country, and the program still failed: the city’s own audit revealed systematic compliance failures that rendered its guardrails meaningless. The company’s continued appeal to “local policies” means nothing when Flock’s data-sharing network does not account for how law enforcement policies, regulations, and accountability vary by jurisdiction. You may have a good relationship with your local police, who solicit your input on what their policy looks like; you don’t have that same relationship with hundreds or thousands of other agencies with whom they share their data. So if an officer on the other side of the country violates your privacy, it’d be difficult to hold them accountable.
ALPR surveillance systems are inherently vulnerable to both technical exploitation and human manipulation. These vulnerabilities are not theoretical—they represent real pathways for bad actors to access vast databases containing millions of Americans’ location data. When surveillance databases are breached, the consequences extend far beyond typical data theft—this information can be used to harass, stalk, or even extort. The intimate details of people’s daily routines, their associations, and their political activities may become available to anyone with malicious intent. Flock operates as a single point of failure that can compromise—and has compromised—the privacy of millions of Americans simultaneously.
Don’t Stop de-Flocking
Rather than addressing legitimate concerns about privacy, security, and constitutional rights, Flock has only promised updates that fall short of meaningful reforms. These software tweaks and feature rollouts cannot assuage the fear engendered by the massive surveillance system it has built and continues to expand.
Flock’s insistence that what’s happening with abortion criminalization and immigration enforcement has nothing to do with them—that these are just red-state problems or the fault of rogue officers—is concerning. Flock designed the network that is being used, and the public should hold them accountable for failing to build in protections from abuse that cannot be easily circumvented.
Thankfully, that’s exactly what’s happening: cities like Austin, San Marcos, Denver, Norfolk, and San Diego are pushing back. And it’s not nearly as hard a choice as Flock would have you believe: Austinites are weighing the benefits of a surveillance system that generates a hit less than 0.02% of the time against the possibility that scanning 75 million license plates will result in an abortion seeker being tracked down by police, or an immigrant being flagged by ICE in a so-called “sanctuary city.” These are not hypothetical risks. They are already happening.
Given how pervasive, sprawling, and ungovernable ALPR sharing networks have become, the only feature update we can truly rely on to protect people’s rights and safety is no network at all. And we applaud the communities taking decisive action to dismantle this surveillance infrastructure.
Here’s yet another worrying development in the world of privately owned security cameras. Flock Safety has made aggressive inroads in both the private and public sectors, something aided greatly by the company’s ability to blend the two.
Much like Ring before it, Flock is pitching cheap cameras with local law enforcement buy-in, nudging residents towards leaving their cameras (some of which have license plate reader capabilities) open so law enforcement can search their plate captures without a warrant. Law enforcement agencies are also buying their own cameras to ensure people can’t travel very far without leaving at least a temporary record of their travels the government can access pretty much at will.
And this is how that meshing of public-private is playing out in real life. As Joseph Cox and Jason Koebler report for 404 Media, at least one law enforcement officer has used this meshed network of Flock ALPR cameras to help locate a woman who recently had an abortion.
On May 9, an officer from the Johnson County Sheriff’s Office in Texas searched Flock cameras and gave the reason as “had an abortion, search for female,” according to the multiple sets of data. Whenever officers search Flock cameras they are required to provide a reason for doing so, but generally do not require a warrant or any sort of court order. Flock cameras continually scan the plates, color, and model of any vehicle driving by, building a detailed database of vehicles and by extension peoples’ movements.
Cops are able to search cameras acquired in their own district, those in their state, or those in a nationwide network of Flock cameras. That single search for the woman spread across 6,809 different Flock networks, with a total of 83,345 cameras, according to the data. The officer looked for hits over a month long period, it shows.
Some of these cameras were likely owned and operated by private purchasers. But even with those excluded, it’s still a massive data set the government can access without having to offer up much in the way of justification. The justification here (one that was reflected in access audits from Flock systems located as far away as Washington state) seems especially ominous and especially flimsy: “had an abortion, search for female.”
The Johnson County Sheriff’s Office claims this search was performed to help, not harm.
Sheriff Adam King of the Johnson County Sheriff’s Office told 404 Media in a phone call that the woman self-administered the abortion “and her family was worried that she was going to bleed to death, and we were trying to find her to get her to a hospital.”
“We weren’t trying to block her from leaving the state or whatever to get an abortion,” he said. “It was about her safety.”
Even if that’s completely true, it’s not that comforting to know Texas law enforcement officers can perform the same searches for the purpose of prosecuting people who have sought abortions in nearby states where this is still legal. The justifications offered during the acquisition process always stress that the equipment will be used to deal with the most violent crimes. While utilizing the tech to search for a missing person is something most people would find acceptable, its proximity to the state’s recent abortion ban definitely isn’t an encouraging sign.
If these tools can be used this way, you can guarantee they will be used this way. Once one law enforcement agency gets the ball rolling on abortion arrests and weathers the press storm that it will provoke, the rest will follow suit, especially in areas populated by prosecutors with anti-abortion beliefs. Companies like Flock will just make everything easier for people looking to punish women for daring to explore their options and retain what’s left of their bodily autonomy.
When your local police department buys one piece of surveillance equipment, you can easily expect that the company that sold it will try to upsell them on additional tools and upgrades.
At the end of the day, public safety vendors are tech companies, and their representatives are salespeople using all the tricks from the marketing playbook. But these companies aren’t just after public money—they also want data.
And each new bit of data that police collect contributes to a pool of information to which the company can attach other services: storage, data processing, cross-referencing tools, inter-agency networking, and AI analysis. The companies may even want the data to train their own AI model. The landscape of the police tech industry is changing, and companies that once specialized in a single technology (such as hardware products like automated license plate readers (ALPRs) or gunshot detection sensors) have developed new capabilities or bought up other tech companies and law enforcement data brokers—all in service of becoming the corporate giant that serves as a one-stop shop for police surveillance needs.
One of the most alarming trends in policing is that companies are regularly pushing police to buy more than they need. Vendors regularly pressure police departments to lock in the price now for a whole bundle of features and tools in the name of “cost savings,” often claiming that the cost à la carte for any of these tools will be higher than the cost of a package, which they warn will also be priced more expensively in the future. Market analysts have touted the benefits of creating “moats” between these surveillance ecosystems and any possible competitors. By making it harder to switch service providers due to integrated features, these companies can lock their cop customers into multi-year subscriptions and long-term dependence.
Think your local police are just getting body-worn cameras (BWCs) to help with public trust or ALPRs to aid their hunt for stolen vehicles? Don’t assume that’s the end of it. If there’s already a relationship between a company and a department, that department is much more likely to get access to a free trial of whatever other device or software that company hopes the department will put on its shopping list.
These vendors also regularly help police departments apply for grants and waivers, and provide other assistance to find funding, so that as soon as there’s money available for a public safety initiative, those funds can make their way directly to their business.
Companies like Axon have been particularly successful at using their relationships and leveraging the ability to combine equipment into receiving “sole source” designations. Typically, government agencies must conduct a bidding process when buying a new product, be it toilet paper, computers, or vehicles. For a company to be designated a sole-source provider, it is supposed to provide a product that no other vendor can provide. If a company can get this designation, it can essentially eliminate any possible competition for particular government contracts. When Axon is under consideration as a vendor for equipment like BWCs, for which there are multiple possible other providers, it’s not uncommon to see a police department arguing for a sole-source procurement for Axon BWCs based on the company’s ability to directly connect their cameras to the Fusus system, another Axon product.
Here are a few of the big players positioning themselves to collect your movements, analyze your actions, and make you—the taxpayer—bear the cost for the whole bundle of privacy invasions.
Axon Enterprise’s ‘Suite’
Axon expects to have yet another year of $2 billion-plus in revenue in 2025. The company first got its hooks into police departments through the Taser, the electric stun gun. Axon then plunged into the BWC market amidst Obama-era outrage at police brutality and the flood of grant money flowing from the federal government to local police departments for BWCs, which were widely promoted as a police accountability tool. Axon parlayed its relationships with hundreds of police departments, and its capture and storage of growing terabytes of police footage, into a menu of new technological offerings.
In its annual year-end securities filing, Axon told investors it was “building the public safety operating system of the future” through its suite of “cloud-hosted digital evidence management solutions, productivity and real-time operations software, body cameras, in-car cameras, TASER energy devices, robotic security and training solutions” to cater to agencies in the federal, corrections, justice, and security sectors.
Axon controls an estimated 85 percent of the police body-worn camera market. Its Evidence.com platform, once a trial add-on for BWC customers, is now also one of the biggest records management systems used by police. Its other tools and services include record management, video storage in the cloud, drones, connected private cameras, analysis tools, virtual reality training, and real-time crime centers.
An image from the Quarter 4 2024 slide deck for investors, which describes different levels of the “Officer Safety Plan” (OSP) product package and highlights how 95% of Axon customers are tied to a subscription plan.
Axon has been adding AI to its repertoire, and it now features a whole “AI Era” bundle plan. One recent offering is Draft One, which connects to Axon’s body-worn cameras and uses AI to generate police reports based on the audio captured in the BWC footage. While use of the tool may start off as a free trial, Axon sees Draft One as another key product for capturing new customers, despite widespread skepticism about the accuracy of the reports, the inability to determine which reports have been drafted using the system, and the liability they could bring to prosecutions.
In 2024, Axon acquired a company called Fusus, a platform that combines the growing stores of data that police departments collect—notifications from gunshot detection and automated license plate reader (ALPR) systems; footage from BWCs, drones, public cameras, and sometimes private cameras; and dispatch information—to create “real-time crime centers.” The company now claims that Fusus is being used by more than 250 different policing agencies.
Fusus claims to bring the power of the real-time crime center to police departments of all sizes, which includes the ability to help police access and use live footage from both public and private cameras through an add-on service that requires a recurring subscription. It also claims to integrate nicely with surveillance tools from other providers. Recently, it has been cutting ties, most notably with Flock Safety, as it starts to envelop some of the options its frenemies had offered.
In the middle of April, Axon announced that it would begin offering fixed ALPR, a key feature of the Flock Safety catalogue, and an AI Assistant, which has been a core offering of Truleo, another Axon competitor.
Flock Safety’s Bundles and FlockOS
Flock Safety is another major police technology company that has expanded its focus from one primary technology to a whole package of equipment and software services.
Flock Safety started with ALPRs. These tools use a camera to read vehicle license plates, collecting the make, model, location, and other details which can be used for what Flock calls “Vehicle Fingerprinting.” The details are stored in a database that sometimes finds a match among a “hot list” provided by police officers, but otherwise just stores and shares data on how, where, and when everyone is driving and parking their vehicles.
Much of what Flock Safety does now comes together in their FlockOS system, which claims to bring together various surveillance feeds and facilitate real-time “situational awareness.”
Motorola Solutions’ ‘Ecosystem’
When you think of Motorola, you may think of phones—but there’s a good chance that you missed the moment in 2011 when the phone side of the company, Motorola Mobility, split off from Motorola Solutions, which is now a big player in police surveillance.
On its website, Motorola Solutions claims that departments are better off using a whole list of equipment from the same ecosystem, boasting the tagline, “Technology that’s exponentially more powerful, together.” Motorola describes this as an “ecosystem of safety and security technologies” in its securities filings. In 2024, the company also reported $2 billion in sales, but unlike Axon, its customer base is not exclusively law enforcement and includes private entities like sports stadiums, schools, and hospitals.
Motorola’s technology includes 911 services, radio, BWCs, in-car cameras, ALPRs, drones, face recognition, crime mapping, and software that supposedly unifies it all. Notably, video can also come with artificial intelligence analysis, in some cases allowing law enforcement to search video and track individuals across cameras.
In January 2019, Motorola Solutions acquired Vigilant Solutions, one of the big players in the ALPR market, as part of its takeover of Vaas International Holdings. Now the company (under the subsidiary DRN Data) claims to have billions of scans saved from police departments and private ALPR cameras around the country. Marketing language for its Vehicle Manager system highlights that “data is overwhelming,” because the amount of data being collected is “a lot.” It’s a similar claim made by other companies: Now that you’ve bought so many surveillance tools to collect so much data, you’re finding that it is too much data, so you now need more surveillance tools to organize and make sense of it.
SoundThinking’s ‘SafetySmart Platform’
SoundThinking began as ShotSpotter, a so-called gunshot detection tool that uses microphones placed around a city to identify and locate sounds of gunshots. As news reports of the tool’s inaccuracy have mounted and criticism has grown, the company has rebranded as SoundThinking and added ALPRs, case management, and weapons detection to its offerings. The company is now marketing its SafetySmart platform, which claims to integrate different stores of data and apply AI analytics.
In 2024, SoundThinking laid out its whole scheme in its annual report, referring to it as the “cross-sell” component of their sales strategy.
The “cross-sell” component of our strategy is designed to leverage our established relationships and understanding of the customer environs by introducing other capabilities on the SafetySmart platform that can solve other customer challenges. We are in the early stages of the upsell/cross-sell strategy, but it is promising – particularly around bundled sales such as ShotSpotter + ResourceRouter and CaseBuilder + CrimeTracer. Newport News, VA, Rocky Mount, NC, Reno, NV and others have embraced this strategy and recognized the value of utilizing multiple SafetySmart products to manage the entire life cycle of gun crime…. We will seek to drive more of this sales activity as it not only enhances our system’s effectiveness but also deepens our penetration within existing customer relationships and is a proof point that our solutions are essential for creating comprehensive public safety outcomes. Importantly, this strategy also increases the average revenue per customer and makes our customer relationships even stickier.
Many of SoundThinking’s new tools rely on a push toward “data integration” and artificial intelligence. ALPRs can be integrated with ShotSpotter. ShotSpotter can be integrated with the CaseBuilder records management system, and CaseBuilder can be integrated with CrimeTracer. CrimeTracer, once known as COPLINK X, is a platform that SoundThinking describes as a “powerful law enforcement search engine and information platform that enables law enforcement to search data from agencies across the U.S.” EFF tracks this type of tool in the Atlas of Surveillance as a third-party investigative platform: software tools that combine open-source intelligence data, police records, and other data sources, including even those found on the dark web, to generate leads or conduct analyses.
SoundThinking, like a lot of surveillance tech, can be costly for departments, but the company seems to see the value in fostering its existing police department relationships even when it’s not getting paid. In Baton Rouge, budget cuts recently resulted in the elimination of the $400,000 annual contract for ShotSpotter, but the city continues to use it.
“They have agreed to continue that service without accepting any money from us for now, while we look for possible other funding sources. It was a decision that it’s extremely expensive and kind of cost-prohibitive to move the sensors to other parts of the city,” Baton Rouge Police Department Chief Thomas Morse told a local news outlet, WBRZ.
Beware the Bundle
Government surveillance is big business. The companies that provide surveillance and police data tools know that it’s lucrative to cultivate police departments as loyal customers. They’re jockeying for monopolization of the state surveillance market that they’re helping to build. While they may be marketing public safety in their pitches for products, from ALPRs to records management to investigatory analysis to AI everything, these companies are mostly beholden to their shareholders and bottom lines.
The next time you come across BWCs or another piece of tech on your city council’s agenda or police department’s budget, take a closer look to see what other strings and surveillance tools might be attached. You are not just looking at one line item on the sheet—it’s probably an ongoing subscription to a whole package of equipment designed to erode your privacy, and no discount makes that a price worth paying.