Cities are pulling back from a surveillance network expanding rapidly beyond their control
License plate readers, home security cameras, and outsourced data labeling have created a control grid that was approved piecemeal, before anyone realized how chillingly it would coalesce against us.

This September, Evanston, Illinois took deliberate steps to regain authority over its own streets. The city ordered Flock Safety to shut down 19 cameras and issued a 30-day termination notice after learning that federal immigration authorities had accessed the statewide Flock network to conduct raids.
Flock complied, removing 15 stationary cameras. Then, days later, the company reinstalled them without authorization and without notifying the city. Evanston issued a cease-and-desist letter instructing Flock to take the cameras down once more. For several days, the only available interim measure was to cover the devices in black plastic bags while the dispute played out.
The image of those poles, their cameras hooded like condemned prisoners, was an eerie metaphor for Evanston's predicament: the municipality was no longer in control of a surveillance system marketed to it as a tool for 'public safety.'
Over the past year, a wave of municipalities — from Cambridge, Massachusetts to San Marcos, Texas — has reversed course on Flock, the automatic license plate reader company whose devices now monitor more than 6,000 communities across the United States. These cities had, in many cases, approved the cameras with little debate. Public safety is a persuasive argument, and the promise of a network that could track vehicles across state lines appealed to departments facing staffing shortages and rising caseloads.
The scale of ‘interoperability’ changed the mood. Local officials began to recognize how easily the cameras exchanged information with a national data pool and how often federal agencies queried the system. Cambridge paused its program after learning that federal investigators could access data despite assurances of local oversight. Evanston ended its contract after a state audit revealed immigration authorities had been pulling footage from multiple Illinois jurisdictions. San Marcos withdrew after sustained resident concerns about privacy, and Hays County, Texas, soon followed.
In Eugene, Oregon, officials learned that the Postal Inspection Service and the Bureau of Alcohol, Tobacco, Firearms and Explosives had used the city's Flock network while investigating mail fraud and narcotics cases. Each community reached the same conclusion: a system sold as local in scope had become the de facto backbone of a national surveillance apparatus. Worse, it was unstoppable.

This outcome was baked into Flock's design, something most local councils should have known but probably didn't comprehend. The company built a distributed surveillance grid capable of storing up to 30 days of movement patterns, license plates, and vehicle characteristics. A car first spotted in Massachusetts could be traced through New Jersey and all the way to Florida, Texas, or beyond. City councils did not grasp what activating the benign-sounding "national sharing network" entailed, or how broadly their local data would circulate.
Plans for expansion certainly don’t stop at license plates. Ring — an Amazon subsidiary and now the country’s most widely adopted home camera security system — announced a new integration with Flock that would allow law enforcement to request footage directly through Ring’s ‘Neighbors’ app. Agencies using Flock will be able to send ‘Community Requests’ to Ring users, who can choose to provide footage during an investigation.
This arrangement links residential cameras to neighborhood cameras, which link to national networks, which feed federal databases. The infrastructure of home security, voluntarily installed by private individuals, has now become an extension of public surveillance, folding private footage into the same ecosystem. An unwitting public-private partnership, as it were.
Another layer emerged when WIRED reported last week that Flock has been using overseas gig workers to review and label surveillance footage. Workers, primarily in the Philippines and contracted through Upwork, were instructed to categorize material captured in American neighborhoods. Their tasks included identifying vehicle types, clothing, pedestrians, motorcycle riders, and audio signatures such as gunshots, tire screeching, and screaming. The footage became training material for machine learning models designed to detect these events automatically. The pipeline moved domestic surveillance images through an offshore labor market with little public knowledge or oversight.

Exiting these networks is difficult or impossible. Cities and municipalities have found that data collected over months and years could not be withdrawn. Federal agencies had already run searches and acted on the results. Machine learning models had already internalized patterns of daily movement — who traveled which routes, which vehicles appeared where, which streets held predictable rhythms. Even when municipalities removed physical cameras, the system retained the intelligence it had already accumulated, and in some cases, as Evanston witnessed, the hardware returned without approval. The control grid did not depend on any single installation; it existed in the relationships between them.
The expansion from roadside poles to residential doorbells now appears less like an evolution of convenience and more like an infrastructural shift. Local governments believed they were adopting a tool for recovering stolen cars and solving burglaries. Homeowners counted on a device that would show them when their pizza delivery arrived, or help them nab that pesky neighbor who doesn't pick up after their dog.
Instead, what emerged is a nationwide monitoring ecosystem that has grown beyond its original mandate and operates on a scale that includes police departments, federal agencies, private corporations, offshore labor, and millions of individual households.
The grid expanded not because of a single directive from on high, but because of tens of thousands of small permissions. Go ahead: cover the devices with plastic bags, or dissolve your contract by unsubscribing from a service. That slows the spread of the technology, but it doesn't unpick the ecosystem that has already taken root. The information will continue to influence policing, analytics, and decision-making long after the hardware is removed, if it ever is.
This is the shape of modern control: incremental, networked, and largely invisible until it becomes impossible to ignore. By which time, we will all be in digital handcuffs, and it will indeed be too late.


They are pulling back for two reasons. First is public resistance, but I think that is more of an excuse. They certainly don't feel compelled by public sentiment in their choices most of the time.
I feel that the bigger reason for the pullback is that they are being embarrassed in court when their narrative falls apart due to conflicting evidence.
Look at the January 6 evidence. They wanted to portray all the people at the Capitol as a bloodthirsty mob when most of them were actually behaving very meekly. Things like being shepherded through the building by the Capitol Police and having doors opened for them really went against the story they tried to build.
This plays out over and over with things like body cameras. It has a lot of comedic impact when they make mistakes like forgetting to turn off the cameras while doing things that get them into trouble.
This is why I have mixed emotions about the surveillance. People behave much differently when they are being watched than they do when they think nobody is looking.
You don't really have an expectation of privacy when you are in a public space. You don't have the right to break the law just because nobody sees it happen. Yes, there are some bad laws, but that is a different question.