UK’s New Pandemic Plan Would Turn Big Tech Into a Mass Location Tracking Network

Britain’s new £1 billion ($1.3 billion) pandemic strategy treats a future outbreak as a “certainty” and proposes building a contact tracing system that would feed on real-time location data harvested with the help of Silicon Valley’s biggest companies.

The plan, published by the Department of Health and Social Care, also calls for PPE stockpiles, new emergency legislation, and a biosecurity research hub in Essex.

But the centerpiece that deserves the most scrutiny is the contact tracing proposal, which would create a surveillance architecture designed to track the movements of millions of people, ready to switch on at a moment’s notice.

The UK Health Security Agency (UKHSA) will run the new system, which the strategy document says will use “live location data” and artificial intelligence to provide “a more rapid, large-scale detection and alert system during pandemics.”

The agency plans to “explore options to work with ‘big tech’” to build it, with deployment targeted for 2030. The government is pre-building a location surveillance system in partnership with companies whose entire business model depends on harvesting as much personal data as possible.

The strategy doesn’t name which companies, what data-sharing agreements would look like, or what happens to your location history once the pandemic ends.

The UK government has already tracked its own citizens through their phones without telling them. A 2021 report by the Scientific Pandemic Influenza Group on Behaviors (SPI-B) revealed that government-funded researchers tracked one in ten people in Britain via their mobile phones in February of that year, without the users’ knowledge or permission.

Researchers used cell phone mobility data to select over 4,200 vaccinated individuals, then monitored them through 40 call data records with corresponding location observations. The data was used for behavioral analysis, tracking radius of movement on vaccination day, whether people visited businesses during opening hours, and whether they went straight home afterwards. None of this was made public at the time.

When the tracking came to light, a spokesperson for Big Brother Watch said citizens would be “disturbed to discover they were unwittingly tracked and subjected to behavioral analysis via their phones.”

“No one expects that by going to get a vaccine they will be tracked and monitored by their own Government,” the spokesperson said. “This is deeply chilling and could be extremely damaging to public trust in medical confidentiality. Between looming Covid passports and vaccine phone surveillance, this Government is turning Britain into a Big Brother state under the cover of Covid. This should be a wake up call to us all.”

The government’s defense was that the data was collected at cell tower level, not the individual level, and that it was “GDPR-compliant” data provided by a company that “collected, cleaned, and anonymized” it.


FC Barcelona Fined for Privacy Violations Over Biometric Data Collection

FC Barcelona got fined €500,000 ($579,219) for scanning the faces and recording the voices of over 100,000 members without doing the legal homework first.

Spain’s data protection authority, the AEPD, found the club had deployed biometric identity verification during a membership census update and processed all of it without a valid Data Protection Impact Assessment.

Members renewing their details remotely were required to either submit a facial scan through their device camera or record their voice. Both systems were live, both were processing biometric data at scale, and the documentation Barcelona produced to justify any of it didn’t meet the bar GDPR sets for high-risk processing.

Article 35 of the GDPR requires organizations to conduct a DPIA before deploying any system likely to create a high risk for individuals. Biometric data used for identification qualifies automatically. Processing that touches more than 100,000 people, including minors, qualifies. Using new technologies qualifies. Barcelona’s system hit all three.

The AEPD concluded the club’s documentation was missing the essential components of a genuine assessment: no real necessity and proportionality analysis, no adequate evaluation of what the processing actually risks for the people whose faces and voices it captured.

The AEPD’s decision in case PS-00450-2024 makes one point with particular clarity: consent doesn’t substitute for a DPIA. Barcelona had asked members to agree to biometric data collection, and members had agreed.

That agreement is legally irrelevant to the separate procedural obligation to assess risk before the system goes live. The GDPR treats them as independent requirements. Satisfying one doesn’t discharge the other.

What a valid DPIA actually requires, according to the decision, is a clear description of the processing, a genuine necessity and proportionality assessment, a detailed risk evaluation, proposed mitigation measures, and a residual risk assessment after mitigations are applied. Organizations that generate DPIA documentation as a compliance checkbox, without substantively working through those questions, remain exposed regardless of what consent language they put in front of users.

The appetite for facial biometric data has become near-universal across industries, and the Barcelona case lands in a moment when that appetite is accelerating faster than the rules meant to govern it.


Canada’s Public Safety Minister Defends Mass Surveillance Bill

Canada’s Public Safety Minister, Gary Anandasangaree, wants you to know that Bill C-22 is not a surveillance bill. He said so twice.

“I want to be very clear about what C-22 is not. It is not about the surveillance of honest, hard-working Canadians going on about their daily lives,” Anandasangaree told an audience that included police chiefs and law enforcement officials.

Then, a few sentences later: “We’re not looking for sneaky ways to surveil Canadians. We are doing our part to combat bad actors in both the physical and digital worlds.”

What he described is a surveillance bill.

The Lawful Access Act, introduced this month, compels electronic service providers to retain Canadians’ metadata for a year and gives police and CSIS new mechanisms to access it. That includes location data, device identifiers, and daily movement patterns, all stored in advance, on every Canadian, not just suspects, held ready for law enforcement retrieval.


GrapheneOS Defies Age Verification Surveillance Laws, Vowing to Protect User Privacy Worldwide

GrapheneOS has a simple answer to the wave of age verification laws moving through US state legislatures and already live in Brazil: no.

The privacy-focused Android fork announced last Friday that it won’t implement the age data collection these laws demand. “GrapheneOS will remain usable by anyone around the world without requiring personal information, identification, or an account,” the project stated.

“If GrapheneOS devices can’t be sold in a region due to their regulations, so be it.” That’s a blunter response than most OS developers are willing to give, and it’s worth understanding what it’s actually refusing.

Brazil’s Digital ECA (Law 15.211) came into force on March 17, hitting OS providers with fines of up to R$50 million, roughly $9.5 million per violation, for failing to build age verification into device setup.

California’s Digital Age Assurance Act, AB-1043, signed by Governor Newsom in October 2025 and effective January 1, 2027, goes further: it requires every OS provider to collect a user’s age or date of birth during account setup, then push that data to app stores and developers through a real-time API.

Colorado’s SB26-051 cleared the state senate on March 3 with similar demands. The architecture these laws collectively envision is an age-linked identity layer baked into the operating system itself, present before you’ve opened a single app.

GrapheneOS is developed by the GrapheneOS Foundation, a registered Canadian nonprofit.

California’s AB-1043 carries civil penalties of up to $2,500 per affected child for negligent violations and $7,500 for intentional ones, enforced by the state attorney general. The Canadian nonprofit status provides some distance but not a guarantee.
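To put those statutory caps in perspective, here is a back-of-the-envelope sketch of maximum exposure. The per-child penalty rates come from the article; the function name and the 100,000-user figure are illustrative assumptions, not anything in the law itself.

```python
# Illustrative arithmetic for California AB-1043's civil penalty caps,
# as described in the article: $2,500 per affected child for negligent
# violations, $7,500 per affected child for intentional ones.
NEGLIGENT_PENALTY = 2_500    # USD per affected child
INTENTIONAL_PENALTY = 7_500  # USD per affected child

def max_exposure(affected_children: int, intentional: bool) -> int:
    """Maximum civil penalty for a violation affecting the given number of children."""
    rate = INTENTIONAL_PENALTY if intentional else NEGLECT if False else (
        INTENTIONAL_PENALTY if intentional else NEGLIGENT_PENALTY
    )
    return affected_children * rate

# A hypothetical negligent violation touching 100,000 minors:
print(max_exposure(100_000, intentional=False))  # 250000000, i.e. $250 million
```

Even under the lower negligence rate, a violation at the scale of a mainstream device launch would dwarf the fines seen in the Barcelona and Brazil cases above.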

The stakes grew more concrete after GrapheneOS and Motorola announced a partnership at MWC on March 2, bringing the hardened OS to future Motorola hardware and ending GrapheneOS’s long exclusivity to Google Pixel devices. A GrapheneOS-powered Motorola phone is expected in 2027.

Once a major hardware manufacturer ships devices with GrapheneOS pre-installed, those products need to comply with local regulations in every market where they’re sold, or Motorola will have to restrict sales geographically.

The defiant stance that’s easy for a nonprofit software project becomes a commercial problem for a global device manufacturer.


The Feds Are Investing in Wearable Health Trackers. That Could Put Your Private Data at Risk.

By gathering continuous data about sleep, heart rate, and physical activity, biowearable devices can give individuals more control over their well-being. But they also create a detailed digital record of our daily lives—one that the federal government may soon be able to access readily.

Consider this scenario.

You’ve recently received a government-subsidized biowearable. Accordingly, the authorities now know when you’re sleeping, because the device reports your sleep cycle, location, and daily movements in real time to a cloud server accessible through a legal process. It knows when you’re home. It knows when you leave.

Those data are then obtained by an FBI field office (either through direct purchase or, if necessary, a legal process), because a federal prosecutor has decided that your criticism of immigration enforcement operations and your social media posts supporting Immigration and Customs Enforcement protesters constitute “incitement to violence” against federal agents. Under the Trump administration’s elastic (and legally dubious) domestic terrorism definitions and designations, that is enough to open a criminal investigation.

And because the government has known for weeks when you’re at home sleeping, it knows exactly when to break down your door.

That scenario may sound far-fetched, but it is getting closer to reality. In March, the Department of Health and Human Services (HHS) announced that the Advanced Research Projects Agency for Health (ARPA-H) would begin investing in new biowearable technologies through a program it called Delphi, after the ancient Greek sanctuary where the maxim “know thyself” was inscribed. It’s a fitting name for a program designed to help people understand their bodies, but it also raises an uncomfortable question: Who else might come to know them just as well?

The program aims to develop biosensors capable of continuously monitoring cytokines (cellular inflammation markers) and hormone levels, going substantially beyond what current wearables can detect. Funding will be determined on a competitive basis as private-sector stakeholders submit proposals; no specific appropriation has been announced.

It remains unclear why this taxpayer funding is necessary in a field that is already thriving. The global wearables market was valued at roughly $43 billion in 2024 and is projected to exceed $168 billion by 2030.

Devices worn on the wrist, finger, or skin can already monitor heart rates, blood oxygen levels, sleep patterns, physical activity, and—in the case of continuous glucose monitors—blood sugar levels in real time. Some smartwatches can even conduct electrocardiograms capable of detecting irregular heart rhythms, such as atrial fibrillation.

Until recently, people could access most of this information only during periodic visits to a clinic or hospital. Biowearables now enable people to monitor many of these signals continuously in everyday life.


FBI Resumes Buying Americans’ Location Data Without Warrants

The FBI is buying Americans’ location data again. Director Kash Patel confirmed to lawmakers on Wednesday what we already knew: the bureau has resumed purchasing commercial surveillance data, including detailed location histories, from data brokers.

The brokers feeding that data pipeline source much of it from phone apps and games that people use daily without realizing they’re being tracked.

By the time a precise location record reaches a federal agency, it may have originated from a weather app or a mobile game, passed through an advertising middleman, and been packaged for resale, with the person who generated it never consulted or notified.

Senator Ron Wyden asked Patel directly whether the FBI would commit to not buying Americans’ location data without a warrant. Patel declined. The agency “uses all tools…to do our mission,” he told the committee.

He followed up by confirming that “we do purchase commercially available information that is consistent with the Constitution and the laws under the Electronic Communications Privacy Act,” adding that it “has led to some valuable intelligence for us.”

Wyden called that arrangement exactly what it is: the government buying what it cannot legally seize. Purchasing information on Americans without a warrant is “an outrageous end-run around the Fourth Amendment,” he said, referring to the constitutional protection against unreasonable searches and seizures.

The workaround is not unique to the FBI. Federal agencies are generally required to convince a judge that probable cause exists before demanding private records from a tech or phone company.

The commercial data market offers a way around that requirement entirely. Agencies simply purchase what they would otherwise need a warrant to obtain, creating a lucrative market for bulk data collection and exploiting a legal gap that courts have not yet addressed.

Wyden and other lawmakers introduced the Government Surveillance Reform Act last week, which would require a court-authorized warrant before any federal agency can purchase Americans’ data from brokers. The bill is bipartisan and bicameral. Without it, the gap that lets agencies buy their way around the Fourth Amendment remains open.


The FISA Surveillance Tool Is Up for Renewal, and the SAVE Act Is Riding Shotgun

Congress is about to stage one of its annual spectacles: reauthorizing Section 702 of the Foreign Intelligence Surveillance Act.

This is a messy affair even in a normal year, but President Trump has decided to spice things up by suggesting that Republicans attach the SAVE America Act to the must-pass FISA bill.

The result is a headache for House Speaker Mike Johnson.

“Maybe you put them together, because a lot of people feel very strongly about FISA,” Trump told House Republicans at their retreat last week.

That might be the understatement of the year.

The Foreign Intelligence Surveillance Act, or FISA, was created to let the US government collect intelligence on foreigners. In theory, it targets only non-US citizens abroad. In reality, it has become a tool for sweeping up Americans’ communications on a massive scale.

Section 702, the part now up for reauthorization, allows intelligence agencies to grab emails, texts, and calls from foreign targets, and in doing so routinely capture the American side of those conversations.

This incidental collection has become anything but incidental. The FBI treats Section 702 data as a domestic treasure trove, conducting millions of warrantless searches of Americans’ communications each year.

It effectively bypasses the Fourth Amendment, giving federal agencies legal cover to monitor Americans without warrants, often funneling the information into ordinary criminal investigations. FISA’s original promise of balancing security and privacy has been eroded by decades of routine overreach.

GOP leadership had been planning a clean extension, but Trump’s intervention opens the door for a faction of conservatives, led by Rep. Anna Paulina Luna, to insist on a legislative package deal.

Luna didn’t vote to reauthorize FISA in 2024, but she and other SAVE supporters are already signaling they will use their leverage to shape the House floor debate.

Johnson likely has the votes to pass FISA with bipartisan support, but the rule vote, the procedural step determining how the floor debate proceeds, is the real landmine. Conservatives have yet to announce support, and procedural votes have long been the preferred weapon for those who want leverage without responsibility.


Britain’s Business Registry Left Director Data Wide Open — Yet the Government Is Still Building a National Digital ID

Companies House in the UK briefly turned its own corporate register into a self-service fraud toolkit. A vulnerability in the dashboard of the UK’s official business registry let anyone access other companies’ private records by pressing the back button, no hacking required.

Directors’ home addresses, email addresses, and dates of birth were all sitting there, readable and editable by anyone who knew where to look.

Companies House is the government body where every limited company must register to legally exist. It holds the official record of who runs Britain’s businesses, including the personal details of every director. When you incorporate a company in the UK, your information goes into this register. There is no opt-out.

The timing is what makes this even more interesting. Since November 2025, all directors in the UK have been legally required to verify their identity through GOV.UK One Login to act in their roles, feeding passport scans, biometric data, and government credentials into the same Companies House infrastructure.


Canada’s Bill C-22 Mandates Mass Metadata Surveillance of Canadians

Canada’s Liberal government has introduced Bill C-22, the Lawful Access Act, 2026, a surveillance bill that compels electronic service providers to store Canadians’ metadata for a year and hands police and intelligence agencies new tools to access it.

We obtained a copy of the bill.

The bill follows a failed first attempt, Bill C-2, which collapsed under the weight of near-universal criticism from opposition parties, rights groups, and the tech industry.

This is a mandatory data retention regime that forces companies to hold location data, device information, and other sensitive metadata on every Canadian, not just those suspected of crimes, ready for law enforcement retrieval via warrant. The logic is familiar: build the haystack first, search it later.


OpenAI on Surveillance and Autonomous Killings: You’re Going to Have to Trust Us

OpenAI claims it has accomplished what Anthropic couldn’t: securing a Pentagon contract that won’t cross professed red lines against dragnet domestic spying and the use of artificial intelligence to order lethal military strikes. Just don’t expect any proof.

Sam Altman, OpenAI’s CEO, announced the company’s big win with the Defense Department in a post on X on February 27.

“Two of our most important safety principles are prohibitions on domestic mass surveillance and human responsibility for the use of force, including for autonomous weapon systems,” he wrote. The Pentagon “agrees with these principles, reflects them in law and policy, and we put them into our agreement.”

The deal came after the very public implosion of what was to be a similar contract between the U.S. military and Anthropic, one of OpenAI’s chief rivals. Anthropic had said negotiations collapsed because it could not enshrine prohibitions against killer robots and domestic spying in its contract. The company’s insistence on these two points earned it the wrath of the Pentagon and President Donald Trump, who ordered the government to phase out use of Anthropic’s tools within six months.

But if the government booted Anthropic for refusing mass surveillance and autonomous weapons, how could OpenAI take over the contract without having the same problem?

OpenAI has attempted to square this circle through a string of posts to X by company executives and researchers, including Katrina Mulligan, its national security chief, and a claim by Altman that the company negotiated stricter protections around domestic surveillance.

The company and the government, however, are not releasing the only proof that matters: the contract itself.

The Department of Defense did not respond to a request for comment.
