We’ve handed over our location, our browsing history, our voice, our face, and our purchasing habits. In exchange, we’ve gotten convenience. Now Apple wants the one thing each of us might have thought was still ours—the electrical activity of our brain. And this time, they’re not even asking. What are we talking about here?
In January 2023, Apple quietly filed patent US20230225659A1 with the U.S. Patent and Trademark Office. The filing describes a wearable electronic device—an earbud—with multiple electrodes embedded directly into the ear tip and housing. These electrodes aren’t there to improve sound quality. They are there to read our brain, using the same EEG technology doctors use to monitor neurological activity in clinical settings. And because every ear canal is shaped differently, Apple’s patent describes a machine-learning model that figures out which electrode combinations work best for each person’s specific anatomy, then keeps refining that choice over time. The result is a reading that is accurate, continuous, and tailored to each of us personally. The signal is then transmitted wirelessly to our phone—and, per the patent’s own language, to a server, where it can be stored as “historic data” accessible by “another person given permission.”
Read that sentence again.
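For the technically curious, the patent's per-user calibration idea is easy to picture. Here is a purely illustrative toy, not Apple's actual model: score each electrode pair by a crude signal-quality metric, smooth the scores over time, and keep using whichever pairs currently read best. Every name and number below is an assumption for illustration.

```python
# Illustrative sketch of adaptive electrode selection (hypothetical,
# not from the patent): rank electrode pairs by signal quality and
# refine the per-user estimate as new readings arrive.

def signal_quality(samples):
    """Toy quality metric: variance of the sampled signal."""
    mean = sum(samples) / len(samples)
    return sum((s - mean) ** 2 for s in samples) / len(samples)

class ElectrodeSelector:
    def __init__(self, pairs, alpha=0.2):
        self.scores = {p: 0.0 for p in pairs}  # running quality per pair
        self.alpha = alpha                     # smoothing factor

    def update(self, pair, samples):
        q = signal_quality(samples)
        # Exponential moving average: old readings fade, so the fit
        # keeps adapting to this particular ear over time.
        self.scores[pair] = (1 - self.alpha) * self.scores[pair] + self.alpha * q

    def best(self, k=2):
        # The k pairs currently producing the cleanest signal.
        return sorted(self.scores, key=self.scores.get, reverse=True)[:k]
```

The point of the sketch is not the math; it is that a loop like this runs continuously, which is exactly why the data stream never stops.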
What EEG Actually Reveals
This is not science fiction, and it is worth understanding what EEG data actually captures—because it is a lot more than Apple’s marketing department will ever tell you. Brain waves are not background noise. They are a direct readout of our inner life. The alpha, beta, delta, theta, and gamma frequencies each correspond to distinct mental states—relaxation, intense focus, deep sleep, creativity, active learning. Together they paint an individual portrait of our mind that is more revealing than anything we have ever typed into a search bar or whispered to a smart speaker. These frequencies, as Loyola University researchers have noted, are also the same signals measured in polygraph tests—the ones used to determine whether someone is lying. They can reveal our stress levels, our concentration, our emotional state, and potentially flag neurological conditions that have not yet been diagnosed. As one researcher at the Neurorights Foundation put it in a Science Friday interview, neural circuits in the brain create our thoughts, emotions, memories, decision-making, and our very sense of self.
Apple wants that data streaming off our ears into their servers.
Are There Any Upsides?
Fair is fair—applications for in-ear EEG technology are being floated, and it’s worth addressing them. As Neurofounders reports, startups like NextSense are already developing in-ear EEG devices to improve clinical sleep staging. Detecting seizure disorders from continuous passive monitoring is another possibility. Early signals for degenerative diseases like Alzheimer’s may surface in EEG data years before symptoms appear. And researchers have argued that natural-environment EEG collection—on the couch, at work, during real life rather than inside a sterile lab—would produce more accurate data on attention and cognitive states than anything gathered under clinical conditions.
These applications sound compelling on the surface. But step back for a second. Americans are not sleeping poorly because they lack a brain-monitoring device. They are sleeping poorly because they are overprescribed, overstimulated, and undernourished—and the same medical system profiting from that reality is not exactly rushing to fix it. Handing our neural data to Apple is not a solution to a pharmaceutical-created problem. It is just a new layer of surveillance dressed up as wellness. The idea that we should surrender the electrical activity of our brains as the price of entry for better sleep tracking should raise more than a few eyebrows.
Who Gets the Data?
Here is where things get serious. A 2024 Neurorights Foundation report pulled back the curtain on 30 companies already selling consumer neurotechnology devices. What they found should stop you cold. Twenty-nine of the thirty companies claimed unlimited rights to their users’ neural data. Most had quietly written third-party data sharing directly into their terms—buried in the kind of legal language nobody reads until it’s too late. Fewer than half even encrypt the data or de-identify users. There is no federal law in the United States governing how neural data collected by consumer devices can be used or sold. A handful of states—Colorado, California, Illinois—have moved to address this, but protections remain patchwork at best.
As a published paper in PMC bluntly put it, bulk sales of neural data by tech giants to third parties may already be occurring with minimal accountability. Data brokers could soon be cataloging individual “brain fingerprints” on a mass scale—data as uniquely identifying as a fingerprint, and infinitely more revealing.
Apple has its own history of data breaches. As Pearl Cohen’s legal analysts note, the patent describes data transmission to external servers accessible by parties beyond the user. The company that couldn’t keep our Face ID data secure now wants a continuous stream of our brain’s electrical activity.