Germany Pressures Apple and Google to Ban Chinese AI App DeepSeek

Apple and Google are facing mounting pressure from German authorities to remove the Chinese AI app DeepSeek from their app stores in Germany over data privacy violations.

The Berlin Commissioner for Data Protection and Freedom of Information, Meike Kamp, has flagged the app for transferring personal data to China without adhering to EU data protection standards.

Kamp’s office examined DeepSeek’s practices and found that the company failed to offer “convincing evidence” that user information is safeguarded as mandated by EU law.

She emphasized the risks linked to Chinese data governance, warning that “Chinese authorities have far-reaching access rights to personal data within the sphere of influence of Chinese companies.”

With this in mind, Apple and Google have been urged to evaluate the findings and consider whether to block the app in Germany.

Authorities in Berlin had already asked DeepSeek to either meet EU legal requirements for data transfers outside the bloc or remove its app from German availability.

DeepSeek did not take action to address these concerns, according to Kamp.

Germany’s move follows Italy’s earlier decision this year to block DeepSeek from local app stores, citing comparable concerns about data security and privacy.

Keep reading

UK Leads Global Push For Notification Data Requests

Back in 2023, we reported on how US agencies have used push notification metadata on smartphones for surveillance, pressuring tech companies like Apple and Google to hand over user information. Prompted by Senator Ron Wyden’s inquiry, Apple revealed it had been legally barred from disclosing this practice, which raises serious concerns about civil liberties and government overreach.

Fast-forward to today, and government demands for user information tied to Apple’s push notification system continued into the first half of 2024, with the United Kingdom, despite its relatively small size, submitting 141 requests, and the United States following with 129.

Germany also obtained data during this period. Singapore, despite making inquiries, received none. These figures come from Apple’s most recent transparency report, shedding light on global government interest in a lesser-known surveillance vector.

Even some privacy apps can be undermined by surveillance at the push notification level. Many apps have to rely on Apple or Google to deliver notifications, and these services can expose critical metadata such as which app sent a notification, when it was sent, and how often.

This metadata can be used by governments to infer user activity and social connections, and even to de-anonymize users. It bypasses app-level encryption entirely, exploiting a layer outside the user’s or developer’s control.

Apple’s report outlines what’s at stake with these requests. When someone enables notifications for an app, the system generates a “push token” that links the device and app to a specific Apple account.
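To make the stakes concrete, here is an invented sketch of the kind of record a push provider can observe for every delivered notification. The field names and values are purely illustrative, not Apple’s or Google’s actual schema: even when the payload itself is encrypted, the surrounding metadata is visible to the provider.

```python
# Illustrative (invented) log record for a single delivered push
# notification. The payload may be unreadable to the provider, but
# the metadata around it is not.
push_log_entry = {
    "push_token": "a1b2c3d4e5f6",        # links this device + app to an account
    "sending_app": "com.example.messenger",  # which app sent the notification
    "timestamp": "2024-03-14T09:21:07Z",     # when it was sent
    "payload": "<encrypted>",                # contents hidden; metadata is not
}

# A government request keyed on the push token can recover the app,
# timing, and frequency of notifications without touching the payload.
print(push_log_entry["sending_app"])
```

Timing and frequency alone, aggregated across such records, are enough to support the activity inference described above.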

Keep reading

UK Tribunal Blocks Government’s Attempt to Keep Apple Surveillance Case Secret

With a necessary reality check, a UK tribunal has told the government that, no, it cannot hold a secret legal battle against Apple over encryption. The Investigatory Powers Tribunal (IPT), the body meant to oversee the country’s surveillance powers, has dismissed efforts by the Home Office to keep the entire case hidden from public view. And in doing so, it has delivered a quietly important win for press freedom and digital rights. That said, things are far from over.

The case revolves around Apple’s Advanced Data Protection system, or ADP. It’s a security feature that gives users the option to encrypt their iCloud data in a way that even Apple itself cannot access. Not through a backdoor, not with a master key, not at all. It’s the kind of robust end-to-end encryption that governments around the world have grown increasingly nervous about.

The UK, it turns out, is no exception.

Keep reading

Spies, Secrets, and iCloud: Apple’s Legal Showdown in London

The Investigatory Powers Tribunal (IPT) in London will consider Apple’s appeal against the UK Home Office’s secret order to include an encryption backdoor in the company’s iCloud service.

As things stand now, pending the outcome of the legal – and political – wrangling, iCloud users in the UK no longer enjoy the security and privacy benefits of Advanced Data Protection (ADP).

This affects the following iCloud categories: iCloud Backup, iCloud Drive, Photos, Notes, Reminders, Safari Bookmarks, Siri Shortcuts, Voice Memos, Wallet Passes, and Freeform.

Meanwhile, the tribunal itself is “secret,” and the date it will consider Apple’s attempt to avoid the permanent breaking of encryption, and of the trust of its users worldwide, has been set for Friday, March 14.

But privacy activists like Privacy International (PI) want these hearings to be public, since the outcome of the UK’s anti-encryption push potentially affects millions, possibly billions of people around the world.

Secret as it may be, the IPT – which is believed to normally deal with national security issues – announced Friday’s closed-door meeting, a move that is described as “unusual.”

Unusual perhaps, but not illogical – Apple’s appeal against the original secret order was also apparently meant to be secret but has in the meantime been “leaked” to the public.

The original order came from Home Secretary Yvette Cooper, who targeted the US company with a “technical capability notice.” The end result of compliance would have been to give the UK’s spies and law enforcement access to user data by compromising iCloud encryption.

Keep reading

Apple Confirms Infowars Report iPhone Voice Dictation Swaps ‘Trump’ for ‘Racist’

Apple says it’s fixing an issue after users noticed their iPhones temporarily showed the name “Trump” when they attempted to type the word “racist” by using the voice-to-text feature.

The issue was reported and replicated by Infowars host Alex Jones Tuesday morning after his daughter saw it on TikTok and duplicated it on several phones.

Jones was then able to replicate the issue on several other iPhones, warning it could be a “subliminal attack on President Trump.”

Keep reading

Apple Pulls Privacy Protections For UK Citizens After The UK Is the First Country to Demand a Backdoor Into Your Private Data

Apple has effectively told the UK government to get lost when it comes to inserting a worldwide surveillance backdoor into its iCloud encryption. Instead of playing along with Britain’s ever-expanding digital police state, the tech giant has chosen to pull its most secure data protection feature — Advanced Data Protection (ADP) — for users in the UK. Because nothing says “we respect your privacy” like stripping away the very feature designed to protect it.

The whole mess started when the British government, wielding the notoriously invasive Investigatory Powers Act (a law that might as well be named the “We Own Your Data Act”), demanded that Apple sabotage its own encryption. The UK’s authorities wanted a golden key to every citizen’s iCloud storage, under the guise of “public safety.” But here’s the wider issue: the directive wouldn’t only affect Brits — it would have compromised Apple’s encryption system worldwide.

This was an attempt to strong-arm one of the world’s most powerful tech companies into submission, setting a precedent that could crack open user privacy like an egg.

Rather than comply, Apple responded with a very diplomatic version of hell no. Instead of weakening encryption for everyone, the company opted to remove ADP from the UK entirely. In a statement that practically oozed frustration, Apple declared:

“We are gravely disappointed that the protections provided by Advanced Data Protection will not be available to our customers in the United Kingdom, given the continuing rise of data breaches and other threats to customer privacy.”

They continued, insisting that they remain committed to offering users “the highest level of security” and expressing “hope” that they’ll be able to restore ADP in the UK at some point in the future. That’s corporate-speak for, maybe when your current government stops acting like the digital arm of Big Brother.

Keep reading

UK Government Secretly Orders Apple to Build Global iCloud Backdoor, Threatening Digital Privacy Worldwide

Imagine waking up one morning to find out your government has demanded the master key to every digital iPhone lock on Earth — without telling anyone. That’s exactly what British security officials have tried to pull off, secretly ordering Apple to build a backdoor into iCloud that would allow them to decrypt any user’s data, anywhere in the world. Yes, not just suspected criminals, not just UK citizens — everyone. And they don’t even want Apple to talk about it.

This breathtakingly authoritarian stunt, first reported by The Washington Post, is one of the most aggressive attempts to dismantle digital privacy ever attempted by a so-called Western democracy. It’s the kind of thing you’d expect from regimes that plaster their leader’s face on every street corner, not from a country that still pretends to believe in civil liberties.

This isn’t about catching a single terrorist or cracking a single case. No, this order — issued in secret last month by Keir Starmer’s Labour government — demands universal decryption capabilities, effectively turning Apple into a surveillance arm of the UK government. Forget warrants, forget oversight, forget even the pretense of targeted investigations. If this order were obeyed, British authorities would have the power to rifle through anyone’s iCloud account at will, no justification required.

The officials pushing for this monstrosity are hiding behind the UK’s Investigatory Powers Act of 2016, a law so Orwellian it’s lovingly referred to as the “Snoopers’ Charter.” This piece of legislative overreach forces tech companies to comply with government spying requests while making it illegal to even disclose that such demands have been made. It’s the surveillance state’s dream: limitless power, zero accountability.

Keep reading

Apple Reaches $95M Settlement Over Lawsuit Accusing ‘Siri’ Of Eavesdropping On Consumers

Apple has agreed to pay $95 million to settle a lawsuit that accuses the company of infringing on its users’ privacy by utilizing “Siri,” Apple’s artificial intelligence (AI) assistant, to eavesdrop on individuals with Apple devices.

The agreed-upon settlement, which was filed on December 13th, 2024, in Oakland, California, is currently awaiting approval by a U.S. district judge.

The five-year-old lawsuit alleged that Apple had activated Siri without users’ knowledge “for over a decade,” recording conversations unbeknownst to the device owner and sharing those conversations and certain keywords with advertisers in order to push products and services.

Apple has long marketed itself as a “pioneer” in protecting its consumers’ privacy. Yet users have long suspected that their devices are listening to them, after seeing targeted ads on social media apps for products, services, or public figures they had only discussed out loud.

Two plaintiffs in the suit recall that after merely mentioning Air Jordan shoes, their iPhones began showing them ads for the shoes more often. Another noted that after discussing a specific surgical treatment with his doctor, he began receiving medical ads related to that treatment.

The claims fly in the face of Apple CEO Tim Cook’s claim that the right to privacy is a “fundamental human right.”

If the district judge approves the settlement, tens of millions of Apple consumers who have owned devices since September 17, 2014, would be able to file claims and receive up to $20 per device, depending on the volume of claims, according to court documents.

Keep reading

Apple auto-opts everyone into having their photos analyzed by AI for landmarks

Apple last year deployed a mechanism for identifying landmarks and places of interest in images stored in the Photos application on its customers’ iOS and macOS devices, and enabled it by default, seemingly without explicit consent.

Apple customers have only just begun to notice.

The feature, known as Enhanced Visual Search, was called out last week by software developer Jeff Johnson, who expressed concern in two write-ups about Apple’s failure to explain the technology, which is believed to have arrived with iOS 18.1 and macOS 15.1 on October 28, 2024.

In a policy document dated November 18, 2024 (not indexed by the Internet Archive’s Wayback Machine until December 28, 2024, the date of Johnson’s initial article), Apple describes the feature thus:

Enhanced Visual Search in Photos allows you to search for photos using landmarks or points of interest. Your device privately matches places in your photos to a global index Apple maintains on our servers. We apply homomorphic encryption and differential privacy, and use an OHTTP relay that hides [your] IP address. This prevents Apple from learning about the information in your photos. You can turn off Enhanced Visual Search at any time on your iOS or iPadOS device by going to Settings > Apps > Photos. On Mac, open Photos and go to Settings > General.

Apple did explain the technology in a technical paper published on October 24, 2024, around the time that Enhanced Visual Search is believed to have debuted. A local machine-learning model analyzes photos to look for a “region of interest” that may depict a landmark. If the AI model finds a likely match, it calculates a vector embedding – an array of numbers – representing that portion of the image.

The device then uses homomorphic encryption to scramble the embedding in such a way that it can be run through carefully designed algorithms that produce an equally encrypted output. The goal is that the encrypted data can be sent to a remote system for analysis without whoever operates that system learning the contents of that data; they can only perform computations on it, the results of which remain encrypted. The input and output are end-to-end encrypted and are not decrypted during the mathematical operations, or so it’s claimed.
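Apple’s actual scheme, per its technical paper, is far more elaborate than this, but the core property it relies on, performing arithmetic on ciphertexts the server cannot read, can be demonstrated with a toy additively homomorphic scheme such as Paillier. The sketch below is illustrative only; the tiny hardcoded primes are wildly insecure and nothing here reflects Apple’s code.

```python
# Toy Paillier cryptosystem: multiplying two ciphertexts yields an
# encryption of the SUM of their plaintexts, so a server can compute
# on data it cannot decrypt. Small primes for illustration only.
import math
import random

p, q = 61, 53                    # toy primes; never use in practice
n = p * q                        # public modulus
n2 = n * n
g = n + 1                        # standard generator choice
lam = math.lcm(p - 1, q - 1)     # private key component (lambda)
mu = pow(lam, -1, n)             # modular inverse of lambda mod n

def encrypt(m: int) -> int:
    """Encrypt m (0 <= m < n) with fresh randomness r."""
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    """Recover the plaintext using the private key (lam, mu)."""
    x = pow(c, lam, n2)
    return ((x - 1) // n * mu) % n

# "Server side": combine ciphertexts without ever decrypting them.
a, b = 42, 101
c_sum = (encrypt(a) * encrypt(b)) % n2
print(decrypt(c_sum))   # → 143
```

Production schemes like the one Apple describes support richer operations (enough to evaluate similarity between encrypted embeddings), but the principle is the same: the operator sees only ciphertexts in and ciphertexts out.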

The dimension and precision of the embedding are adjusted to reduce the high computational demands of this homomorphic encryption (presumably at the cost of labeling accuracy) “to meet the latency and cost requirements of large-scale production services.” That is to say, Apple wants to minimize its cloud compute costs and mobile device resource usage for this free feature.

With some server optimization metadata and the help of Apple’s private nearest neighbor search (PNNS), the relevant Apple server shard receives a homomorphically encrypted embedding from the device and performs the aforementioned encrypted computations on that data to find a landmark match in a database, returning the result to the client device without providing identifying information to Apple or its OHTTP partner Cloudflare.
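The lookup the server performs is, at heart, a nearest-neighbor search: which entry in the landmark index is most similar to the query embedding? The sketch below shows that comparison in the clear for readability; in Apple’s system it happens under homomorphic encryption, and the landmark names, vectors, and cosine-similarity metric here are invented for illustration.

```python
# Plaintext sketch of the server-side lookup: match an image-region
# embedding against a landmark index by cosine similarity. All data
# is invented; real embeddings have hundreds of dimensions.
import math

landmark_index = {
    "Eiffel Tower": [0.90, 0.10, 0.00],
    "Golden Gate":  [0.10, 0.90, 0.20],
    "Big Ben":      [0.00, 0.20, 0.95],
}

def cosine(u, v):
    """Cosine similarity between two vectors (1.0 = identical direction)."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.hypot(*u) * math.hypot(*v))

def nearest_landmark(embedding):
    """Return the index entry most similar to the query embedding."""
    return max(landmark_index, key=lambda name: cosine(embedding, landmark_index[name]))

print(nearest_landmark([0.85, 0.15, 0.05]))   # → Eiffel Tower
```

Sharding the index (as Apple describes) simply narrows which slice of this database a given query is compared against, keeping per-query cost bounded.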

Keep reading

Apple patent uses FRT with ‘body data’ so cameras can ID people without seeing faces

Apple has been granted a patent for “identity recognition utilizing face-associated body characteristics.” The face recognition technology is anticipated to appear in a forthcoming smart security product from the tech giant.

Patent No. 12154386 B2, filed in May 2022 and granted on November 26, 2024, describes a system that associates facial recognition with other body characteristics, which might include things like clothing, gait, or gesture, to recognize certain people even if their faces are not visible to the camera.

The patent outlines the problem it intends to solve in clear terms: “sometimes a video camera may not be able to perform facial recognition of a person, given a particular video feed.”

It then describes the capability to monitor a video feed and determine, based on the analysis of video frames and previously stored face and body biometrics, whether an identification can be made with a primary body characteristic (face) or requires a secondary characteristic.

The system might work by linking a gallery of “body croppings” such as torso, arms or legs with their face biometrics, then comparing the data with a live video feed. It proceeds in a stepped approach, identifying face, then body parts, then, if needed, “physical characteristics” that could include body shape, skin color, or the texture or color of clothing. The order of operations is adaptable to the scenario.

The resulting data constitutes a cluster of “bodyprints” which can be assigned a confidence score against a person’s faceprint and other characteristics. Since there is a limited time in which certain identifiers are useful (clothing, for instance), the technology can utilize storage periods as brief as 24 hours.
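The stepped approach the patent describes, face first, then stored “bodyprint” characteristics as fallbacks, each gated by a confidence score, can be sketched as follows. Every name, threshold, and data structure here is hypothetical; the patent specifies no concrete values.

```python
# Hypothetical sketch of stepped identification per the patent:
# try the primary characteristic (face), then fall back to secondary
# "bodyprint" characteristics, each compared against a confidence
# threshold. Scores are mocked; a real system would compute them
# from video frames and stored biometrics.
from dataclasses import dataclass, field
from typing import Optional, Tuple

CONFIDENCE_THRESHOLD = 0.8   # invented cutoff

@dataclass
class Candidate:
    name: str
    face_score: Optional[float]          # None if the face is not visible
    bodyprint_scores: dict = field(default_factory=dict)  # e.g. {"clothing": 0.9}

def identify(c: Candidate) -> Optional[Tuple[str, str]]:
    """Return (name, matched characteristic), or None if no match."""
    # Step 1: primary characteristic, the face.
    if c.face_score is not None and c.face_score >= CONFIDENCE_THRESHOLD:
        return c.name, "face"
    # Step 2: secondary characteristics, in a fixed (but adaptable) order.
    for part, score in c.bodyprint_scores.items():
        if score >= CONFIDENCE_THRESHOLD:
            return c.name, part
    return None

# Face not visible, but recently stored clothing bodyprint matches.
person = Candidate("resident", face_score=None,
                   bodyprint_scores={"torso": 0.6, "clothing": 0.9})
print(identify(person))   # → ('resident', 'clothing')
```

The short storage periods mentioned above would map to expiring entries like the clothing score here, which is only meaningful while the person is wearing the same outfit.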

Keep reading