Scientists Just Taught Lab-Grown Brain Cells to Play Video Games — and People Are Freaked Out

Researchers say they have successfully trained living human brain cells to play the classic video game Doom, marking the latest experiment in so-called “biological computing.”

Australian biotech company Cortical Labs announced the development in a recent demonstration showing neurons grown in a laboratory interacting with the 1993 first-person shooter.

The experiment builds on earlier work from 2022, when the company revealed that clusters of human brain cells grown in a petri dish had learned to play the much simpler video game Pong.

Those early experiments involved “mini-brains” made up of roughly 800,000 to one million living human neurons.

According to the company, the cells demonstrated the ability to adapt and learn basic tasks in real time.

Now researchers say they have taken the technology further by teaching the neurons to interact with the far more complex environment of Doom, a three-dimensional game that requires movement, targeting, and exploration.

To make the system work, engineers translated the digital signals from the game into patterns of electrical stimulation that the neurons could interpret.
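Published descriptions of closed-loop neuron-on-chip experiments suggest a simple pattern behind that translation step: encode a game variable as a stimulation frequency, record the resulting spiking activity, and decode it into an in-game action. A minimal sketch of that loop in Python, with all function names, frequency bands, and decoding rules purely illustrative (the actual Cortical Labs hardware and encoding are not public at this level of detail):

```python
# Illustrative closed-loop sketch: game state -> stimulation -> decoded action.
# All numbers and names here are assumptions, not Cortical Labs' real system.

def encode_state(distance: float, max_distance: float) -> float:
    """Map a game variable (e.g. distance to a target) onto a stimulation
    frequency in Hz: closer targets produce faster stimulation."""
    norm = max(0.0, min(1.0, distance / max_distance))
    return 4.0 + (40.0 - 4.0) * (1.0 - norm)  # hypothetical 4-40 Hz band

def decode_activity(spike_counts: list) -> str:
    """Decode recorded spike counts from two electrode regions into a
    movement command: whichever region fires more 'wins'."""
    left, right = spike_counts
    if left > right:
        return "left"
    if right > left:
        return "right"
    return "hold"
```

In a real system the loop would run continuously, with the decoded action fed back into the game so the neurons receive new stimulation reflecting the consequences of their "choice."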

“So we showed that biological neurons could play the game Pong,” Cortical Labs chief scientific officer Brett Kagan explained in a video announcement.

“This was a massive milestone because it demonstrated adaptive, real-time, goal-directed learning.”

“Doom was much more complex,” he added. “It’s 3D. It has enemies. It needs to explore, it’s an environment, and it’s hard.”

Keep reading

UK Consults on Social Media Age Verification While Directing Parents to Report “Hate Speech” to Big Tech

The British government launched a consultation this week that could require age verification for anyone using social media, gaming sites, or AI chatbots.

The consultation, titled “Growing up in the online world,” opened on March 2 and closes on May 26, 2026. It asks the public whether the government should ban under-16s from social media entirely, impose mandatory overnight curfews on platform access, restrict AI chatbot features for minors, and require platforms to disable “addictive design features” like infinite scrolling and autoplay.

The government says it will respond in summer 2026, and Parliament has already handed ministers new legal powers to act on the findings without waiting for fresh primary legislation.

The Prime Minister announced those powers on February 16, weeks before the consultation even opened. The government can now move faster once it decides what it wants. What the public thinks determines the packaging, not the destination.

Technology Secretary Liz Kendall framed it this way: “The path to a good life is a great childhood, one full of love, learning, and play. That applies just as much to the online world as it does to the real one.”

The actual policy tools being considered are a different matter.

Age verification, as a mechanism, works by proving identity. Every user proves who they are.

A social media platform that must exclude under-16s must verify the age of its over-16s. That means collecting identity documents, linking browsing activity to real identities, or building infrastructure that a government can later compel to serve other purposes.

The surveillance architecture required to enforce a children’s safety law is the same architecture required to surveil adults. It gets built for one reason. It gets used for others.

Then there’s the “Help your child stay safe online” campaign site, which the government launched alongside the consultation. The site includes a page directing parents to report “bullying, threats, harassment, hate speech, and content promoting self-harm or suicide” directly to platforms, with links to the reporting tools of Instagram, Snapchat, Facebook, WhatsApp, TikTok, Discord, YouTube, and Twitch.

Keep reading

Epstein ALIVE? Conspiracy Theories Surge Over Fortnite Logs And ‘Fake’ Prison Pics

The Jeffrey Epstein scandal just keeps expanding and evolving. Recent document dumps have reignited wild theories that the convicted sex offender didn’t kill himself—or perhaps didn’t die at all.

With rampant speculation that both Epstein and Ghislaine Maxwell were intelligence assets, controlled by a “supra government” above elected officials, many are arguing nothing in their case is beyond the realms of possibility.

As we previously highlighted, Epstein has an extensive gaming history; his “littlestjeff1” Fortnite username sparked claims of post-death logins from Israel, amplifying the alive-and-kicking narrative.

As we reported, Epic Games debunked the original account as a hoax rename, but the gaming angle persists as a gateway for deeper conspiracies.

Ben Swann highlighted this in a recent X video, diving into “a shocking theory based on Fortnite activity linked to him.” 

Keep reading

Xbox UK Age Verification Launch Locks Out Thousands of Players

Xbox’s mandatory age verification rollout in the UK was a disaster almost immediately, locking thousands of players out of games, voice chat, and apps like Discord with no clear path back in.

The failures started overnight. Players report being ejected mid-session to complete age verification checks that then took hours, stalled indefinitely, or simply refused to work regardless of what identification they submitted.

Government ID, mobile numbers, live video age estimation: the system rejected them all for many users. Others made it through verification only to find their accounts still restricted with no explanation and no recourse beyond contacting Xbox support.

Microsoft’s support page now carries a notice confirming it is “aware of the issue and working to fix it.” That’s the extent of the official guidance.

The verification requirement exists to comply with the UK’s new censorship law, the Online Safety Act, legislation mandating that platforms facilitating online communication verify user ages. The actual system Xbox built to deliver that compliance forcibly disconnected players from games in progress, stripped away chat functionality with anyone outside their friends list, and blocked access to third-party services.

Users who have held Xbox accounts for over 18 years found themselves flagged for verification anyway. The system doesn’t consider account age, history, or any contextual signal that might indicate an adult user. Everyone gets treated as potentially underage until they hand over documentation.

“The amount of times I’ve tried to do any method of the verification tonight is stupid,” wrote one user. “Can’t change privacy settings on my Xbox to allow me to see mods on games too. Can’t chat on Discord. Utterly broken.”

“Been trying to verify my ID for the past few hours,” added another. “It finally worked but I can’t access anything still. No Discord access at all.”

Keep reading

LA County Sues Roblox Over False Child Safety Claims and Lack of Age Verification

Los Angeles County filed a lawsuit against Roblox, alleging the platform has built a system that leaves children exposed to grooming because it does not go far enough in checking user IDs to prove their age.

The suit names the company for public nuisance and violations of California’s false advertising law.

We obtained a copy of the complaint for you here.

The complaint is direct: “Roblox portrays its platform as a safe and appropriate place for children to play. In reality, and as Roblox well knows, the design of its platform makes children easy prey for pedophiles.”

If you weren’t aware of how big Roblox is, here’s why this matters: the platform serves roughly 144 million daily active users. That’s more than Fortnite and the entire userbase of the Steam platform combined.

The platform also lets people create and play games, chat through customizable avatars, and spend real money on virtual currency.

LA County’s suit argues Roblox has consistently failed to moderate user-generated content, enforce its own age restrictions, or honestly disclose the risks predators pose to children using the service.

There is no doubt the platform’s moderation gaps have attracted scrutiny for years, or that the platform has had issues with grooming of minors. But the LA lawsuit is the latest in a pattern of governments and researchers documenting the same problem Roblox has repeatedly said it’s addressing, and the latest attempt to mandate digital ID checks.

Roblox rejected the suit’s allegations. A company spokesman said the platform was built “with safety at its core” and pointed to existing protections: “We have advanced safeguards that monitor our platform for harmful content and communications, and users cannot send or receive images via chat, avoiding one of the most prevalent opportunities for misuse seen elsewhere online.”

The company added that it takes action against rule violators and cooperates with law enforcement, closing with: “There is no finish line when it comes to protecting kids and, while no system can be perfect, our commitment to safety never ends.”

The false advertising angle is what matters most here. LA isn’t suing Roblox over what it collects or who can see it. The county is suing because the company told parents the platform was safe for kids while allegedly knowing otherwise.

Keep reading

Inside Jeffrey Epstein’s Shockingly Extensive Gaming History

Buried in the Department of Justice’s three-million-page Epstein document dump is an unexpected subplot: Jeffrey Epstein was a gamer.

Not a casual one, either. The files — released Jan. 30 as part of a sprawling DOJ disclosure — paint a portrait of a convicted sex offender who maintained an active presence across multiple gaming platforms for years, who corresponded with some of the video game industry’s most powerful executives about monetizing children, and whose username is now at the center of a viral conspiracy theory alleging he’s still alive and playing Fortnite from Israel.

The saga begins with Xbox Live.

Documents show Epstein received a “Welcome to Xbox Live” email on Oct. 31, 2012. He had been a registered sex offender since 2008, and Microsoft had joined New York Attorney General Eric Schneiderman’s “Operation: Game Over” initiative to purge sex offenders from online gaming platforms six months earlier, in April 2012.

Despite that, Epstein’s account remained active for roughly 14 months.

On Dec. 19, 2013, Microsoft finally pulled the plug. An automated enforcement email sent to Epstein’s “jeevacation@gmail.com” address cited “harassment, threats, and/or abuse of other players,” describing the conduct as “severe, repeated, and/or excessive.” A follow-up email the same day provided the real explanation: “This action is based on the New York Attorney General’s partnership with Microsoft and other online gaming companies to remove New York registered sex offenders from online gaming services to minimize the risk to others, particularly children.”

Keep reading

Dad claims 16-year-old daughter took her own life after meeting a predator on Roblox, slams game platform beloved by kids

Penelope Sokolowski was just 16 years old when she took her own life last February.

Her father, Jason, believes her suicide was the culmination of a grooming process that began on Roblox, the game platform beloved by kids — with some 170,000 users under the age of 13, according to company data from 2023.

“We kind of thought we were covering all the bases,” Jason told The Post, noting that his family had used a third-party app to monitor Penelope’s online activity.

Jason alleges that his only child was contacted by a predator on Roblox who coerced her into cutting his name into her chest and sending videos of herself bloodied from self-harm — and who, ultimately, sent Penelope down a spiral that culminated in her death.

The girl was 7 or 8 years old when she first signed up for Roblox, where players rove around online worlds and can chat with other users.

“I’d come in and sit in the room with her and see what she was doing, ask who those people were,” Jason said, recalling Penelope drawing an anime-style sketch for a friend she’d made on Roblox.

“As a dad I thought, oh, this is nice, she’s artistic, and she’s made artistic friends,” he added. “But I didn’t understand what Roblox was and its effect on her.”

The dad, who works in the film industry in Vancouver, British Columbia, separated from Penelope’s mother and moved out of the family home when the girl was 13.

He recalls how Penelope’s grades began to tumble and, when she was 14, he noticed scars from self-inflicted cuts on her arms, which she had been covering with bracelets and his oversized hockey jerseys. 

Penelope confided that she had been recruited into a self-harm group via Roblox, but assured her father she had moved on.

But not long after her 16th birthday, she took her own life.

Later, when Jason opened up his daughter’s cell phone, he found what he describes as a “crime scene.”

According to the dad, there were messages spanning two years with a person who egged on her self-destruction. Jason believes Penelope met this person on Roblox and then began privately conversing with them over Discord — sometimes for hours.

In one exchange, Penelope sent a photo of her chest, offering to cut herself there but worrying she couldn’t go “too deep.” Minutes later, she followed up with an image of the predator’s Discord user name written across her chest in bloodied letters.

In other images, she had carved the numbers “764” into her body. Jason believes Penelope had been contacted by a member of 764, described by the FBI as a “violent online group” that targets minors and grooms them into committing egregious acts of self-harm and violence.

Members of 764 reportedly troll platforms like Roblox looking for victims they can persuade — via grooming or sextortion — into hurting themselves.

“They are grooming girls to do whatever it is they can get a girl to do, whether it’s nudes or cuts or gore or violence,” Jason said. “[Penelope] was brainwashed all the way through.”

Keep reading

Amelia Victorious: How to Lose the Culture War With a Video Game

There’s something genuinely funny going on in the United Kingdom right now.

The British government’s Prevent office, housed under the Home Office (think Department of the Interior, but allergic to dissent), partnered with a media nonprofit called Shout Out UK (like a PBS focused on preventing “radicalism”) to come up with a clever new way to re-educate British youth.

The concern, as always, was “radicalization.” They thought the solution was inspired: a choice-based video game. Kids like games. Games involve decisions. Decisions shape values. What could possibly go wrong?

Thus Pathways was born, a government-funded interactive morality play designed to gently shepherd British children toward being properly antiracist, properly accepting, and properly enthusiastic about the ever-increasing number of migrants reshaping their country. Civics class, but fun. And digital. And corrective.

As part of this effort, the designers introduced a character named Amelia, a cute, purple-haired, vaguely goth girl who carries a Union Jack and talks about Britain being for the British. She was meant to function as a warning, a living illustration of how nationalism can look attractive, even charming, and yet be dangerous to the impressionable youths of Britain who may not have fully internalized the idea that Brexit is bad and they are to obey their elitist overlords.

What they did not anticipate was that the public would take one look at adorable, charming Amelia and decide she was the good guy.

What Prevent Was Supposed to Be

To understand how Pathways ended up here, you have to rewind to what Prevent was originally meant to do. The program emerged from the post-9/11 security logic that shaped Western counter-terror policy across the board. The target was not opinions or aesthetics. It was violence, and specifically Islamist terrorism and the recruitment pipelines that fed it. “Radicalization” meant movement toward planning or committing acts of terror.

The rationale was simple and, frankly, understandable. Governments have a duty to stop people from blowing up buses and concert halls. Identifying grooming networks, interrupting recruitment, and diverting individuals away from violent ideologies was the job. That’s why Prevent sat under the Home Office in the first place. Bombs and bodies are not abstract problems.

Over time, however, the definition of “radicalization” began to stretch. Then it stretched again. Eventually it stopped describing a trajectory toward violence at all and started describing a trajectory away from approved social and political consensus. The concern shifted from what someone might do to what someone might think, or worse, what they might feel attached to.

This is where Prevent quietly stopped being about prevention and started becoming about management, and specifically the management of populations rather than threats. Cultural signals like flags, language, and other symbols of national belonging were reclassified as early warning indicators. Discomfort with mass migration was treated less as a political opinion than as a diagnostic symptom. Belonging itself became something to be solved.

Once the mission changed, the tools followed.

Keep reading

UK Government Video Game Teaches Teens Questioning Mass Immigration Could Make Them Terror Suspects

Britain’s globalist—and increasingly authoritarian—state has found a new way to ‘fight extremism’: teach teenagers that asking the ‘wrong questions’ about mass immigration could make them terrorists.

According to newly surfaced materials, a government-funded video game now warns schoolchildren that doubting the positive effects of unrelenting mass migration will land them in the crosshairs of counter-terrorism authorities.

The program, called Pathways, is marketed as an “educational” interactive experience for students aged 11 to 18. In practice, however, it functions as a digital loyalty test, funded in part by the Home Office’s Prevent program, Britain’s controversial anti-extremism scheme.

The game goes something like this. Players are placed in the role of a white teenage character named Charlie, newly enrolled in college and navigating modern Britain’s ideological minefield. Every decision—what videos to watch, what opinions to express, even whether to research immigration statistics—is tracked by an in-game extremism meter.

The premise is simple and utterly unmistakable: curiosity is dangerous, skepticism is suspect, and deviation from approved liberal-globalist views carries severe consequences. Choose the wrong dialogue option, and Charlie is flagged for “extreme right-wing ideology,” a category that now appears to include asking basic questions about national identity.

Even the character’s gender is carefully flattened. Regardless of whether players select a male or female avatar, Charlie is referred to exclusively as “they,” a telling detail in a game obsessed with left-liberal ideological conformity.

Early scenarios in the game set the tone. Charlie struggles academically and is outperformed by an Afro-British classmate, after which players are nudged toward ‘correct’ emotional responses while being warned against drawing conclusions about immigration or competition.

At several points, the game introduces online posts claiming the government prioritizes migrants over British veterans for housing. Players are encouraged to scroll past these claims silently. Engaging, questioning, or researching them triggers ominous warnings.

Attempting to “learn more” is portrayed as especially risky. The game depicts Charlie being overwhelmed by statistics, reports, and protest information. Rather than framing this as civic engagement, the game clearly suggests it’s a slippery slope into ideological contamination.

Keep reading

Tennessee Sues Roblox, Says Game is a ‘Gateway for Predators’ Targeting Children

Tennessee Attorney General Jonathan Skrmetti announced a lawsuit against Roblox Corporation last week, claiming the popular game has become a haven for child predators while misleading parents about its safety.

Filed under the Tennessee Consumer Protection Act (TCPA), the suit accuses Roblox of prioritizing profits over child safety, slashing oversight and resources despite repeated warnings about exploitation risks.

Roblox, a massively popular online gaming world that markets itself as a creative playground for children, is described in the lawsuit as “the digital equivalent of a creepy cargo van lingering at the edge of a playground.”

“Roblox is the digital equivalent of a creepy cargo van lingering at the edge of a playground,” said Attorney General Skrmetti. “Roblox invites children into a fantastic online world with the promise of creativity and play, but that wonderland is a trap that lets the company sell sophisticated predators access to those vulnerable kids. Roblox worked to reduce oversight and child safety resources despite repeated warnings, because less overhead meant more profit. And the whole time, the company lied and said safety was its top priority.”

The allegations paint a disturbing picture of how Roblox’s design and features allegedly enable harmful content and grooming.

Keep reading