Apple Confirms Infowars Report: iPhone Voice Dictation Swaps ‘Trump’ for ‘Racist’

Apple says it’s fixing an issue after users noticed their iPhones temporarily showed the name “Trump” when they tried to dictate the word “racist” using the voice-to-text feature.

The issue was reported by Infowars host Alex Jones on Tuesday morning after his daughter saw it on TikTok and duplicated it on several phones.

Jones was then able to replicate the issue on several other iPhones, warning it could be a “subliminal attack on President Trump.”

Keep reading

Apple Pulls Privacy Protections For UK Citizens After The UK Becomes the First Country to Demand a Backdoor Into Your Private Data

Apple has effectively told the UK government to get lost when it comes to inserting a worldwide surveillance backdoor into its iCloud encryption. Instead of playing along with Britain’s ever-expanding digital police state, the tech giant has chosen to pull its most secure data protection feature — Advanced Data Protection (ADP) — for users in the UK. Because nothing says “we respect your privacy” like stripping away the very feature designed to protect it.

The whole mess started when the British government, wielding the notoriously invasive Investigatory Powers Act (a law that might as well be named the “We Own Your Data Act”), demanded that Apple sabotage its own encryption. The UK’s authorities wanted a golden key to every citizen’s iCloud storage, under the guise of “public safety.” But here’s the wider issue: the directive wouldn’t only affect Brits — it would have compromised Apple’s encryption system worldwide.

This was an attempt to strong-arm one of the world’s most powerful tech companies into submission, setting a precedent that could crack open user privacy like an egg.

Rather than comply, Apple responded with a very diplomatic version of hell no. Instead of weakening encryption for everyone, the company opted to remove ADP from the UK entirely. In a statement that practically oozed frustration, Apple declared:

“We are gravely disappointed that the protections provided by Advanced Data Protection will not be available to our customers in the United Kingdom, given the continuing rise of data breaches and other threats to customer privacy.”

They continued, insisting that they remain committed to offering users “the highest level of security” and expressing “hope” that they’ll be able to restore ADP in the UK at some point in the future. That’s corporate-speak for “maybe when your current government stops acting like the digital arm of Big Brother.”

Keep reading

UK Government Secretly Orders Apple to Build Global iCloud Backdoor, Threatening Digital Privacy Worldwide

Imagine waking up one morning to find out your government has demanded the master key to every digital iPhone lock on Earth — without telling anyone. That’s exactly what British security officials have tried to pull off, secretly ordering Apple to build a backdoor into iCloud that would allow them to decrypt any user’s data, anywhere in the world. Yes, not just suspected criminals, not just UK citizens — everyone. And they don’t even want Apple to talk about it.

This breathtakingly authoritarian stunt, first reported by The Washington Post, is one of the most aggressive attempts to dismantle digital privacy ever attempted by a so-called Western democracy. It’s the kind of thing you’d expect from regimes that plaster their leader’s face on every street corner, not from a country that still pretends to believe in civil liberties.

This isn’t about catching a single terrorist or cracking a single case. No, this order — issued in secret last month by Keir Starmer’s Labour government — demands universal decryption capabilities, effectively turning Apple into a surveillance arm of the UK government. Forget warrants, forget oversight, forget even the pretense of targeted investigations. If this order were obeyed, British authorities would have the power to rifle through anyone’s iCloud account at will, no justification required.

The officials pushing for this monstrosity are hiding behind the UK’s Investigatory Powers Act of 2016, a law so Orwellian it’s lovingly referred to as the “Snoopers’ Charter.” This piece of legislative overreach forces tech companies to comply with government spying requests while making it illegal to even disclose that such demands have been made. It’s the surveillance state’s dream—limitless power, zero accountability.

Keep reading

Apple Reaches $95M Settlement Over Lawsuit Accusing ‘Siri’ Of Eavesdropping On Consumers

Apple has agreed to pay $95 million to settle a lawsuit accusing the company of violating its users’ privacy by using “Siri,” Apple’s artificial intelligence (AI) assistant, to eavesdrop on people with Apple devices.

The agreed-upon settlement, filed on December 13, 2024, in Oakland, California, is awaiting approval by a U.S. district judge.

The five-year-old lawsuit alleged that Apple activated Siri without users’ knowledge “for over a decade,” recording conversations unbeknownst to the phone’s owner and sharing them, along with certain keywords, with advertisers in order to push products and services.

Apple has long marketed itself as a “pioneer” in protecting its consumers’ privacy. However, users have also long suspected that their devices are listening to them, pointing to ads for specific products or services appearing in social media apps shortly after those very topics were discussed out loud.

Two plaintiffs in the suit recalled that after merely mentioning Air Jordan shoes, their iPhones began showing them advertisements for the shoes more often. Another noted that after discussing a specific surgical treatment with his doctor, he began receiving medical ads related to that treatment.

The allegations fly in the face of Apple CEO Tim Cook’s assertion that the right to privacy is a “fundamental human right.”

If the district judge approves the settlement, tens of millions of Apple consumers who have owned devices at any time since September 17, 2014, would be able to file claims for up to $20 per device, depending on the volume of claims, according to court documents.

Keep reading

Apple auto-opts everyone into having their photos analyzed by AI for landmarks

Apple last year deployed a mechanism for identifying landmarks and places of interest in images stored in the Photos application on its customers’ iOS and macOS devices and enabled it by default, seemingly without explicit consent.

Apple customers have only just begun to notice.

The feature, known as Enhanced Visual Search, was called out last week by software developer Jeff Johnson, who expressed concern in two write-ups about Apple’s failure to explain the technology, which is believed to have arrived with iOS 18.1 and macOS 15.1 on October 28, 2024.

In a policy document dated November 18, 2024 (not indexed by the Internet Archive’s Wayback Machine until December 28, 2024, the date of Johnson’s initial article), Apple describes the feature thus:

Enhanced Visual Search in Photos allows you to search for photos using landmarks or points of interest. Your device privately matches places in your photos to a global index Apple maintains on our servers. We apply homomorphic encryption and differential privacy, and use an OHTTP relay that hides [your] IP address. This prevents Apple from learning about the information in your photos. You can turn off Enhanced Visual Search at any time on your iOS or iPadOS device by going to Settings > Apps > Photos. On Mac, open Photos and go to Settings > General.

Apple did explain the technology in a technical paper published on October 24, 2024, around the time that Enhanced Visual Search is believed to have debuted. A local machine-learning model analyzes photos to look for a “region of interest” that may depict a landmark. If the AI model finds a likely match, it calculates a vector embedding – an array of numbers – representing that portion of the image.

The device then uses homomorphic encryption to scramble the embedding in such a way that it can be run through carefully designed algorithms that produce an equally encrypted output. The goal is that the encrypted data can be sent to a remote system for analysis without whoever operates that system learning the contents of that data; they can only perform computations on it, the results of which remain encrypted. The input and output are end-to-end encrypted, and not decrypted during the mathematical operations, or so it’s claimed.
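
For readers unfamiliar with the idea of computing on ciphertexts, here is a deliberately toy illustration of the homomorphic property using textbook, unpadded RSA, which happens to be multiplicatively homomorphic. This is not the scheme Apple uses and it is not secure; it only shows how a party holding ciphertexts can do useful arithmetic on them without ever learning the underlying values.

```python
# Toy demo of the homomorphic property with textbook (unpadded) RSA.
# NOT Apple's scheme and NOT secure -- it only illustrates that a server
# can multiply two ciphertexts and the result decrypts to the product of
# the plaintexts, without the server ever seeing those plaintexts.

p, q = 61, 53                       # tiny primes, illustration only
n = p * q                           # public modulus
e = 17                              # public exponent
d = pow(e, -1, (p - 1) * (q - 1))   # private exponent

def encrypt(m: int) -> int:
    return pow(m, e, n)

def decrypt(c: int) -> int:
    return pow(c, d, n)

m1, m2 = 7, 6                       # plaintexts known only to the client
c1, c2 = encrypt(m1), encrypt(m2)

# The "server" works only on ciphertexts:
c_product = (c1 * c2) % n

assert decrypt(c_product) == (m1 * m2) % n   # 42
print("decrypted result of the server's computation:", decrypt(c_product))
```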

The dimension and precision of the embedding are adjusted to reduce the high computational demands of this homomorphic encryption (presumably at the cost of labeling accuracy) “to meet the latency and cost requirements of large-scale production services.” That is to say, Apple wants to minimize its cloud compute cost and mobile device resource usage for this free feature.
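
What “adjusting dimension and precision” can look like in practice is sketched below. Apple has not published the exact parameters, so the sizes here are invented: a float32 embedding is truncated and quantized to 8-bit integers, trading some matching accuracy for a much smaller input to the expensive encrypted computation.

```python
import numpy as np

# Hypothetical numbers: a 768-dim float32 embedding shrunk to 128 dims of int8.
# Apple's actual dimensions and precision are not public; this only shows the
# kind of dimension/precision trade-off the paper describes.
rng = np.random.default_rng(0)
embedding = rng.standard_normal(768).astype(np.float32)  # stand-in for the model output

reduced = embedding[:128]                 # crude dimensionality cut (a real system
                                          # would use a learned projection)
scale = np.abs(reduced).max() / 127.0
quantized = np.clip(np.round(reduced / scale), -127, 127).astype(np.int8)

print(embedding.nbytes, "bytes ->", quantized.nbytes, "bytes")  # 3072 -> 128
```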

With some server optimization metadata and the help of Apple’s private nearest neighbor search (PNNS), the relevant Apple server shard receives a homomorphically encrypted embedding from the device and performs the aforementioned encrypted computations on that data to find a landmark match from a database, returning the result to the client device without providing identifying information to Apple or its OHTTP partner Cloudflare.
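
Stripped of the encryption, sharding, and the OHTTP relay, the server’s job is essentially a nearest-neighbor lookup: compare the query embedding against a database of landmark embeddings and return the closest one. The sketch below is that plaintext analogue, with invented landmark names and random vectors; in the real pipeline the comparison is performed on encrypted data via PNNS.

```python
import numpy as np

# Plaintext analogue of the landmark lookup. Names and vectors are invented;
# the production system performs this comparison on encrypted embeddings.
rng = np.random.default_rng(1)
landmark_db = {
    "Eiffel Tower": rng.standard_normal(128),
    "Golden Gate Bridge": rng.standard_normal(128),
    "Sydney Opera House": rng.standard_normal(128),
}

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def nearest_landmark(query: np.ndarray):
    # Return the best-scoring landmark name and its similarity.
    return max(((name, cosine(query, vec)) for name, vec in landmark_db.items()),
               key=lambda item: item[1])

# A query that is a noisy copy of one database entry should match that entry.
query = landmark_db["Golden Gate Bridge"] + 0.1 * rng.standard_normal(128)
print(nearest_landmark(query))   # ('Golden Gate Bridge', ~0.99)
```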

Keep reading

Apple patent uses FRT with ‘body data’ so cameras can ID people without seeing faces

Apple has been granted a patent for “identity recognition utilizing face-associated body characteristics.” The face recognition technology is anticipated to appear in a forthcoming smart security product from the tech giant.

Patent No. 12,154,386 B2, filed in May 2022 and granted on November 26, 2024, describes a system that associates facial recognition with other body characteristics, which might include things like clothing, gait, or gesture, to recognize certain people even if their faces are not visible to the camera.

The patent outlines the problem it intends to solve in clear terms: “sometimes a video camera may not be able to perform facial recognition of a person, given a particular video feed.”

It then describes the capability to monitor a video feed and determine, based on the analysis of video frames and previously stored face and body biometrics, whether an identification can be made with a primary body characteristic (face) or requires a secondary characteristic.

The system might work by linking a gallery of “body croppings” such as torso, arms or legs with their face biometrics, then comparing the data with a live video feed. It proceeds in a stepped approach, identifying face, then body parts, then, if needed, “physical characteristics” that could include body shape, skin color, or the texture or color of clothing. The order of operations is adaptable to the scenario.

The resulting data constitutes a cluster of “bodyprints” which can be assigned a confidence score against a person’s faceprint and other characteristics. Since there is a limited time in which certain identifiers are useful (clothing, for instance), the technology can utilize storage periods as brief as 24 hours.
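
To make the stepped approach concrete, here is an entirely hypothetical sketch of the decision flow the patent describes: try the faceprint first, fall back to stored bodyprints, and discard short-lived identifiers such as clothing once their storage period lapses. None of the names, thresholds, or data structures below come from Apple; they simply restate the patent’s logic as code.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List, Optional

# Hypothetical sketch of the patent's stepped matching logic.
# Thresholds, field names, and the expiry policy are illustrative only.

@dataclass
class Bodyprint:
    person: str
    kind: str          # e.g. "torso", "gait", "clothing"
    score: float       # similarity of this print against the live video frame
    captured: datetime

FACE_THRESHOLD = 0.90
BODY_THRESHOLD = 0.75
CLOTHING_TTL = timedelta(hours=24)   # clothing-based prints are only trusted briefly

def identify(face_person: Optional[str], face_score: Optional[float],
             bodyprints: List[Bodyprint], now: datetime) -> Optional[str]:
    # Step 1: the primary characteristic -- the face, if the camera can see it.
    if face_score is not None and face_score >= FACE_THRESHOLD:
        return face_person

    # Step 2: secondary characteristics -- previously stored bodyprints,
    # skipping short-lived identifiers (clothing) that have expired.
    usable = [bp for bp in bodyprints
              if bp.kind != "clothing" or now - bp.captured < CLOTHING_TTL]
    best = max(usable, key=lambda bp: bp.score, default=None)
    if best is not None and best.score >= BODY_THRESHOLD:
        return best.person
    return None   # no confident identification

now = datetime.now()
prints = [Bodyprint("Alice", "clothing", 0.92, now - timedelta(hours=30)),  # expired
          Bodyprint("Alice", "torso", 0.81, now - timedelta(hours=2))]
print(identify(face_person=None, face_score=None, bodyprints=prints, now=now))  # Alice
```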

Keep reading

iPhone Now Collects Your Mental Health Data

True Story: The Health app built into iPhones is now collecting as much personal information about the mental health of each and every one of us as it can get hold of.

Yet, a search on Google and Brave yielded no results on the dangers of sharing such information over the phone or the internet. Seriously, not a single MSM outlet has written an article on why such data sharing might be a bad idea?

To start, in sharing such data you aren’t just sharing your own information; your iPhone knows exactly who your family members are. In many cases, those phones are connected via family plans.

iPhone mental health assessments not only ask questions about your mental health but can also infer the mental health status of family members, as demonstrated by the publicly shared image promoting the benefits of a phone mental health assessment.

Keep reading

Musk Declares War on Apple: Threatens to Ban Devices Over “Creepy Spyware” AI Integration

Elon Musk, the CEO of Tesla, SpaceX, and X (formerly Twitter), has declared war on Big Tech giant Apple.

The tech mogul is threatening to ban Apple devices across his companies unless Apple abandons its plans to integrate OpenAI’s woke ChatGPT technology into its operating system.

Apple announced on Monday that it would be integrating ChatGPT into iOS, iPadOS, and macOS. This integration would allow users to access ChatGPT’s capabilities, including image and document understanding, without needing to switch between tools. Siri, Apple’s virtual assistant, could also tap into ChatGPT’s intelligence when necessary.

“We’re excited to partner with Apple to bring ChatGPT to their users in a new way. Apple shares our commitment to safety and innovation, and this partnership aligns with OpenAI’s mission to make advanced AI accessible to everyone. Together with Apple, we’re making it easier for people to benefit from what AI can offer,” said Sam Altman, CEO of OpenAI.

“It’s personal, powerful, and private—and it’s integrated into the apps you rely on every day. Introducing Apple Intelligence—our next chapter in AI,” said Tim Cook, Apple’s CEO.

In response to Tim Cook’s announcement, Musk stated, “Don’t want it. Either stop this creepy spyware or all Apple devices will be banned from the premises of my companies.”

Keep reading

Apple’s Latest iOS 17.5 Update Coerces Millions of Americans into Downloading LGBTQ Propaganda with Phone Update

Apple Inc. has once again sparked outrage with its latest iOS 17.5 update—this time by pushing LGBTQ-themed content onto its millions of users through a mandatory software update.

On Tuesday, the tech giant rolled out an update that introduced a new set of Pride Radiance wallpapers that many see as a coercive push of LGBTQ propaganda.

“This update introduces a new Pride Radiance wallpaper for the Lock Screen, Apple News enhancements, and other features, bug fixes, and security updates for your iPhone,” stated Apple’s update notes. “Some features may not be available for all regions or on all Apple devices.”

The move has been met with mixed reactions. Although using the wallpaper is not mandatory, some users feel that including LGBTQ+-themed wallpapers is an unnecessary politicization of what should be a neutral tech update.

Twitter user Sid commented, “iOS 17.5 is out! Nothing new, just a new wallpaper pack for gay people which even they’ll refuse to use because of how bad it looks.”

Keep reading

Apple APOLOGISES For ‘Soul Crushing’ iPad Ad

Apple has issued an apology following massive backlash to an ad for the new iPad Pro that crushes physical creative tools in an industrial press.

As we highlighted, the commercial shows an industrial press obliterating items such as musical instruments, paint, and cameras, then unveiling an iPad as the replacement for everything creative.

Following an overwhelmingly negative response from viewers who branded the ad ‘soul crushing’, Apple issued a statement.

“Our goal is to always celebrate the myriad of ways users express themselves and bring their ideas to life through iPad,” the company claimed, going on to admit: “We missed the mark with this video, and we’re sorry.”

Keep reading