What the Flock Is This: The Future of Mass Surveillance in the USA

Big Brother’s highway cameras now have AI, capturing 6 to 12 photos of every car that passes. The images are uploaded to a huge national database that out-of-state police and government agencies can access. The cameras are expensive, and the data is retained for 30 days.

Cameras are in 49 states.

It’s like being watched by a bunch of prison guards.

It turned out anyone could access it publicly. Then came the lawsuits. What does this mean for mass surveillance in the United States?

Keep reading

UK Government Plans to Use Delegated Powers to Undermine Encryption and Expand Online Surveillance

The UK government wants to scan people’s photos before they send them. Not just children’s photos. Everyone’s.

Technology Secretary Liz Kendall spelled it out on BBC Breakfast, floating a proposal to “block photographs being sent that are potentially nude photographs by anybody or block children from sending those.” That second clause is the tell. Blocking “anybody” from sending potentially nude images requires scanning everybody’s messages. There’s no technical path to that outcome that doesn’t involve reading content the sender assumed was private.

Kendall said the government is conducting a consultation on “whether we should have age limits on things like live streaming” and whether there should be “age limits on what’s called stranger pairing, for example, on games online.” The consultation, she said, will look at all of these. That list now covers messaging apps, photo sharing, gaming, and live streaming. Any feature that lets you share an image with another person potentially falls inside it.

This is how the mandate grows. The government announced a push for new delegated powers on February 16, framing them around age verification for social media and VPNs.

Keep reading

Was It a Coincidental Traffic Stop or AI-Powered Surveillance?

Seth Ferranti was driving his Ford pickup on a southeastern Nebraska stretch of the interstate in November 2024 when law enforcement pulled him over, claiming that he had wobbled onto the hard shoulder.

As the Seward County sheriff’s deputies questioned Ferranti, a filmmaker who had spent 21 years in prison for distributing LSD, they allegedly smelled cannabis. Declaring this probable cause, they searched the vehicle and discovered more than 400 pounds of marijuana.

But were those the actual reasons for the stop and search? When Ferranti went on trial, his attorneys presented a license plate reader report produced by the security communications company Motorola Solutions. It revealed Ferranti had been consistently monitored prior to his arrest, including by the local sheriff on the day he was apprehended. (Neither the sheriff’s office nor Motorola responded to Reason’s requests for comment.)

Ferranti’s legal team argued that it was unconstitutional to surveil somebody based on his previous crimes. The argument did not carry the day: Last month their client was sentenced to up to two and a half years for possession of cannabis with intent to distribute. But the case still raises substantial moral and constitutional questions about both the scale of these public-private surveillance partnerships and the ways they’re being used.

Ferranti had long been a celebrity in the drug-reform world, going back to that LSD arrest in the early ’90s. After that first bust, he jumped bail, went on the lam, landed on the U.S. Marshals’ 15 Most Wanted Fugitives list, and even staged his own drowning to evade the authorities. After he started serving his sentence in 1993, he became a prolific prison journalist, writing the “I’m Busted” column for Vice. The New Jersey native always insisted that his crimes were nonviolent and that the drugs he sold, LSD and cannabis, had medicinal or therapeutic benefits.

After Ferranti came out of prison, his 2017 documentary White Boy—the true story of a teenage FBI informant who became a major cocaine trafficker—was a success on Netflix. He produced a number of further films, including 2023’s Secret History of the LSD Trade. And apparently, the government kept watching him.

It’s been watching a lot of people—and Motorola isn’t the only company helping it. Flock Safety was founded in 2017, and within five years it had tens of thousands of cameras operational. As the American Civil Liberties Union (ACLU) has warned, Flock’s AI-assisted automated license plate recognition (ALPR) system has been undergoing an “insidious expansion” beyond its supposed purposes of identifying vehicles of interest, such as stolen cars and hit-and-run suspects. Immigration and Customs Enforcement has used it to locate illegal migrants, and law enforcement in Texas used it to investigate a self-administered abortion, foreshadowing its potential use as a predictive policing tool for all Americans. Lee Schmidt, a veteran in Virginia, recently learned that the system logged him more than 500 times in four months. 

“I don’t know whether law enforcement officers are using [ALPRs] to do predictive policing,” says Joshua Windham of the Institute for Justice, a public interest law firm that is campaigning to stop the warrantless use of license plate reader cameras. “We know that [Customs and Border Protection] is using ALPRs generally to stop cars with what they deem ‘suspicious’ travel patterns.”

After reviewing the document cataloguing Ferranti’s vehicle monitoring, Windham adds: “The records are consistent with an officer either looking up a car in his system to see where else that car was captured by ALPRs, or that car showing up as a ‘hot list’ alert in the Motorola system. But it’s hard to tell, from the records alone, whether the stop was a ‘predictive policing’ stop.”

Ferranti is convinced it was. “There were no warrants, investigations, informants, state police, DEA, or FBI involvement, just Seward County Sheriff’s office [and an] AI-assisted license plate tracking service to perpetuate their outdated War on Drugs mission,” he said in an Instagram post published by his family following his sentencing. “Traveling the highways as a person with a record is now considered [suspicious] activity by the AI.”

Keep reading

Ring Cancels Flock Safety Integration After Public Backlash

Public backlash has forced Ring to cancel its partnership with Flock Safety, the law enforcement surveillance company whose camera network has reportedly given ICE and other federal agencies access to footage across the country.

Ring announced the cancellation this week, saying the integration never went live.

The company’s statement was careful:

“Following a comprehensive review, we determined the planned Flock Safety integration would require significantly more time and resources than anticipated. We therefore made the joint decision to cancel the integration and continue with our current partners…The integration never launched, so no Ring customer videos were ever sent to Flock Safety.”

That last sentence is doing a lot of work. Ring users responding to the Flock announcement went further than strongly worded tweets. People smashed cameras. Others announced publicly that they were throwing their devices away. The Amazon-owned company had badly misread the moment.

Flock Safety is a surveillance technology company that operates a nationwide network of AI-powered cameras, primarily known for license plate readers, and sells access to the resulting database of vehicle movements to roughly 5,000 law enforcement agencies across the United States.

The Flock partnership was announced back in October 2025, and you may remember the feature report How Amazon Is Turning Your Neighborhood Into a Police Database, which gave deeper insight into the plans.

It got pushback at the time, but it became a full-blown crisis only after the recent outrage in some cities over ICE enforcement activity, when social media posts claimed Ring was providing a direct pipeline through Flock to ICE.

That specific claim isn’t accurate, since the Flock connection never went live. But Ring’s broader relationship with the police is real and extensive, which gave the fear enough traction to land.

Keep reading

Amazon’s Ring and Google’s Nest Unwittingly Reveal the Severity of the U.S. Surveillance State

That the U.S. Surveillance State is rapidly growing to the point of ubiquity has been demonstrated over the past week by seemingly benign events. While the picture that emerges is grim, to put it mildly, at least Americans can again see with crystal clarity how severe this has become.

The latest round of valid panic over privacy began during the Super Bowl held on Sunday. During the game, Amazon ran a commercial for its Ring camera security system. The ad manipulatively exploited people’s love of dogs to induce them to ignore the consequences of what Amazon was touting. It seems that trick did not work.

The ad highlighted what the company calls its “Search Party” feature, whereby one can upload a picture, for example, of a lost dog. Doing so will activate multiple other Amazon Ring cameras in the neighborhood, which will, in turn, use AI programs to scan all dogs, it seems, and identify the one that is lost. The 30-second commercial was full of heart-tugging scenes of young children and elderly people being reunited with their lost dogs.

But the graphic Amazon used seems to have unwittingly depicted how invasive this technology can be. That this capability now exists in a product long pitched as nothing more than a simple tool for homeowners to monitor their own homes created, it seems, an unavoidable contrast between the public’s understanding of Ring and what Amazon was now boasting it could do.

Keep reading

‘No Privacy’ CBDCs Will Come, Warns Billionaire Ray Dalio

American billionaire and hedge fund manager Ray Dalio has warned that central bank digital currencies (CBDCs) are coming, offering benefits but also potentially allowing governments to exert more control over people’s finances.

“I think it will be done,” said Dalio of CBDCs in a wide-ranging interview on the Tucker Carlson Show on Monday, which also covered the US debt crisis, gold prices, and even a potential civil war.

Ray Dalio is a billionaire hedge fund manager who has been co-chief investment officer of Bridgewater Associates since 1985, after founding the firm in 1975. 

During the interview, Dalio said CBDCs could be appealing due to the ease of transactions, likening them to money market funds in terms of functionality, but he also cautioned about their downsides.

He said there will be a debate, but CBDCs “probably won’t” offer interest, so they will not be “an effective vehicle to hold because you’ll have the depreciation [of the dollar].”

Dalio also cautioned that all CBDC transactions will be known to the government, which is good for controlling illegal activity, but also provides a great deal of control in other areas. 

“There will be no privacy, and it’s a very effective controlling mechanism by the government.”

Keep reading

The Lost Dog That Made Constant Surveillance Feel Like a Favor

Amazon picked the Super Bowl for a reason. Nothing softens a technological land grab like a few million viewers, a calm voice, and a lost dog.

Ring’s commercial introduced “Search Party,” a feature that links doorbell cameras through AI and asks users to help find missing pets. The tone was gentle. The scale is enormous.

Jamie Siminoff, Ring’s founder, narrated the ad over images of taped-up dog posters and surveillance footage polished to look comforting rather than clinical. “Pets are family, but every year, 10 million go missing,” he said. The answer arrived on cue. “Search Party from Ring uses AI to help families find lost dogs.”

This aired during a broadcast already stuffed with AI branding, where commercial breaks felt increasingly automated. Ring’s spot stood out because it described a system already deployed across American neighborhoods rather than a future promise.

Search Party lets users post a missing dog alert through the Ring app. Participating outdoor cameras then scan their footage for dogs resembling the report. When the system flags a possible match, the camera owner receives an alert and can decide whether to share the clip.
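The flow described above can be sketched as a simple match-and-notify pipeline. Everything here is hypothetical: the attribute names, the matcher, and the data structures are invented for illustration, since Ring’s actual pipeline is not public.

```python
# Hypothetical sketch of the Search Party flow as the ad describes it:
# an alert goes out, participating cameras scan footage for a match,
# and matching camera owners are notified so they can decide whether
# to share the clip. All names and logic are invented.
from dataclasses import dataclass

@dataclass
class Alert:
    dog_color: str
    dog_size: str

@dataclass
class Clip:
    camera_owner: str
    detected: dict  # attributes the camera's AI extracted from footage

def matches(alert: Alert, clip: Clip) -> bool:
    # stand-in for the AI matcher: compare extracted attributes
    return (clip.detected.get("color") == alert.dog_color
            and clip.detected.get("size") == alert.dog_size)

def search_party(alert: Alert, clips: list[Clip]) -> list[str]:
    # owners of matching clips get an alert and choose whether to share
    return [c.camera_owner for c in clips if matches(alert, c)]

clips = [Clip("neighbor_a", {"color": "brown", "size": "small"}),
         Clip("neighbor_b", {"color": "black", "size": "large"})]
print(search_party(Alert("brown", "small"), clips))  # -> ['neighbor_a']
```

Note what the sketch makes explicit: the scanning happens across every participating camera before any owner consents to anything; the consent step only gates sharing the clip, not the analysis itself.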

Siminoff framed the feature as a community upgrade. “Before Search Party, the best you could do was drive up and down the neighborhood, shouting your dog’s name in hopes of finding them,” he said.

The new setup allows entire neighborhoods to participate at once. He emphasized that it is “available to everyone for free right now” in the US, including people without Ring cameras.

Amazon paired the launch with a $1 million initiative to equip more than 4,000 animal shelters with Ring systems. The company says the goal is faster reunification and shorter shelter stays.

Every element of the rollout leaned toward public service language.

The system described in the ad already performs pattern detection, object recognition, and automated scanning across a wide network of private cameras.

The same system that scans footage for a missing dog already supports far broader forms of identification. Software built to recognize an animal by color and shape also supports license plate reading, facial recognition, and searches based on physical description.

Ring already operates a process that allows police to obtain footage without a warrant in situations they classify as emergencies. Once those capabilities exist inside a shared camera network, expanding their use becomes a matter of policy choice rather than technical limitation.

Ring also typically enables new AI features by default, leaving users responsible for finding the controls to disable them.

Keep reading

EU Law Could Extend Scanning of Private Messages Until 2027

The European Parliament is considering another extension of Chat Control 1.0, the “temporary” exemption that allows communications providers to scan private messages (under the premise of preventing child abuse) despite the protections of the EU’s ePrivacy Directive.

A draft report presented by rapporteur Birgit Sippel (S&D) would prolong the derogation until April 3, 2027.

At first glance, the proposal appears to roll back some of the most controversial elements of Chat Control. Text message scanning and automated analysis of previously unknown images would be explicitly excluded. Supporters have framed this as a narrowing of scope.

However, the core mechanism of Chat Control remains untouched.

The draft continues to permit mass hash scanning of private communications for so-called “known” material.

According to former MEP and digital rights activist Patrick Breyer, approximately 99 percent of all reports generated under Chat Control 1.0 originate from hash-based detection.

Almost all of those reports come from a single company, Meta, which already limits its scanning to known material only. Under the new proposal, Meta’s practices would remain fully authorized.

As a result, the draft would not meaningfully reduce the volume, scope, or nature of surveillance. The machinery keeps running, with a few of its most visibly controversial attachments removed.

Hash scanning is often portrayed as precise and reliable. The evidence points in the opposite direction.

First, the technology is incapable of understanding context or intent. Hash databases are largely built using US legal definitions of illegality, which do not map cleanly onto the criminal law of EU Member States.

The German Federal Criminal Police Office (BKA) reports that close to half of all Chat Control reports are criminally irrelevant.

Each false positive still requires assessment, documentation, and follow-up. Investigators are forced to triage noise rather than pursue complex cases involving production, coercion, and organized abuse.

The strategic weakness is compounded by a simple reality. Offenders adapt. As more services adopt end-to-end encryption, abusers migrate accordingly. Since 2022, the number of chat-based reports sent to police has fallen by roughly 50 percent, not because abuse has declined, but because scanning has become easier to evade.
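At bottom, scanning for “known” material is a database lookup: hash the attachment, check whether the hash is in a list of previously identified images. The sketch below uses exact SHA-256 hashes for simplicity; real deployments use perceptual hashes (such as PhotoDNA) that tolerate re-encoding, but the structural point survives, since any sufficiently altered or newly created image has no entry to match.

```python
# Minimal sketch of "known material" hash matching. Exact SHA-256 is
# used here for illustration only; production systems use perceptual
# hashing. With exact hashes, even a one-byte change defeats the match,
# which illustrates why this approach only ever finds old, known images.
import hashlib

known_hashes = {
    # database of hashes of previously identified images (invented value)
    hashlib.sha256(b"previously-identified-image-bytes").hexdigest(),
}

def scan_attachment(attachment: bytes, db: set[str]) -> bool:
    """Report True if the attachment's hash appears in the known database."""
    return hashlib.sha256(attachment).hexdigest() in db

print(scan_attachment(b"previously-identified-image-bytes", known_hashes))   # exact copy matches
print(scan_attachment(b"previously-identified-image-bytes!", known_hashes))  # any change evades
```

This is also why end-to-end encryption removes the mechanism entirely: the provider never sees the attachment bytes, so there is nothing to hash.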

“Both children and adults deserve a paradigm shift in online child protection, not token measures,” Breyer said in a statement to Reclaim The Net.

“Whether looking for ‘known’ or ‘unknown’ content, the principle remains: the post office cannot simply open and scan every letter at random. Searching only for known images fails to stop ongoing abuse or rescue victims.”

Keep reading

Palantir’s ELITE: Not All Maps Are Meant To Guide Us

Many memorable journeys start with a map. Maps have been around for ages, guiding humanity on its way in grand style. Maps have helped sailors cross oceans, caravans traverse deserts, and armies march into the pages of history. Maps have been staple tools of exploration, survival, and sovereignty. And today? Today, they’re on our devices, and we use them to find literally everything, including the nearest taco truck, coffee shop, and gas station. Yet, today’s maps don’t just show us where we are and where we are going. Increasingly, they also tell someone else the gist of who we are. What does that mean exactly? It means not all maps are made for us. Some maps are made about us. Case in point—the objective of Palantir’s ELITE demands our immediate attention. ELITE is a digital map used by ICE to identify neighborhoods, households, and individuals for targeted enforcement, drawing on data that was never meant to become ammunition.

No, Palantir’s ELITE is not strictly limited to use by U.S. Immigration and Customs Enforcement (ICE), but its primary and reported use is specifically for immigration enforcement. ELITE, which stands for Enhanced Leads Identification & Targeting for Enforcement, is a software tool developed by Palantir for ICE to find, classify, and prioritize suspected illegal immigrants for deportation. It was rolled out in late 2025, with reports of use starting in September 2025. Essentially, ELITE is a map that pulls data from across federal systems—including Medicaid and Health Department records—and uses it to compile dossiers on people, complete with address confidence scores and patterns of residence density. It tells ICE agents where individuals live and how likely they are to be there so that ICE can prioritize “target-rich environments” for raids.

In other words, data that was once siloed for entirely different purposes—health records, public assistance, demographic lists—is now being fused into a single dashboard designed to help federal agents decide where to show up and who to detain. While no one wants criminal illegal aliens freely roaming the streets of our nation, the result of the operation is not “analytics”—it is anticipatory policing dressed as operational efficiency. One might think the scenario sounds like something only seen in dystopian fiction, and others agree. Advocates for freedom have pointed out that ELITE’s model resembles (in unsettling ways) systems designed to anticipate behavior rather than respond to actual wrongdoing. Beyond that, what else could it be used for, and when will that next step begin?

Keep reading

Britain To Roll Out Facial Recognition in Police Overhaul

Britain’s policing system, we are told, is broken. And on Monday, the home secretary, Shabana Mahmood, announced that the fix would arrive in the form of algorithms, facial recognition vans, and a large check made out to the future.

The government plans to spend £140m ($191m) on artificial intelligence and related technology, with the promise that it will free up six million police hours a year, the equivalent of 3,000 officers.

It is being billed as the biggest overhaul of policing in England and Wales in 200 years, aimed at dragging a creaking system into the modern world.

The ambition is serious. The implications are too.

The plan is for AI software that will analyze CCTV, doorbell, and mobile phone footage, detect deepfakes, carry out digital forensics, and handle administrative tasks such as form filling, redaction, and transcription. Mahmood’s argument is that criminals are getting smarter, while parts of the police service are stuck with tools that belong to another era.

She put it plainly: “Criminals are operating in increasingly sophisticated ways. However, some police forces are still fighting crime with analogue methods.”

And she promised results: “We will roll out state-of-the-art tech to get more officers on the streets and put rapists and murderers behind bars.”

There is logic here. Few people would argue that trained officers should be buried in paperwork. Technology can help with that. The concern is what else comes with it.

Live facial recognition is being expanded aggressively. The number of police vans equipped with the technology will increase fivefold, from ten to fifty, operating across the country. These systems scan faces in public spaces and compare them to watch lists of wanted individuals.

This is a form of mass surveillance, and when automated systems get things wrong, the consequences fall on real people.

Keep reading