New York Sues Valve Over Loot Boxes, Calls Them Illegal Gambling

Valve, the maker of Steam and many of PC gaming’s most popular titles, is being sued by New York for its use of loot boxes. New York Attorney General Letitia James filed the lawsuit, claiming that loot box systems enable gambling habits and are particularly harmful for younger people.

The lawsuit specifically cites three games: Counter-Strike 2, Dota 2, and Team Fortress 2. It wants the video game developer to stop using loot boxes in its titles and to pay fines for previously promoting them.

A press release from Attorney General James notes that Counter-Strike 2’s loot box system resembles a slot machine, featuring a spinning wheel that reveals a virtual item. Loot boxes are common in online titles, acting as randomized treasure chests that may provide valuable in-game items.

It explains that valuable items found in loot boxes can be sold on Valve’s Steam Community Market and other third-party stores, indicating they have real-world value. It points to reports of a virtual gun skin within Counter-Strike 2 that sold for over $1 million in 2024.

However, the likelihood of gamers finding a valuable item is low, and the lawsuit alleges that Valve intentionally makes some items harder to win than others to increase value.

“Illegal gambling can be harmful and lead to serious addiction problems, especially for our young people,” said Attorney General James. “Valve has made billions of dollars by letting children and adults alike illegally gamble for the chance to win valuable virtual prizes.”

“These features are addictive, harmful, and illegal, and my office is suing to stop Valve’s illegal conduct and protect New Yorkers.”


‘Roblox’ Programmer Faces 40 Child Porn Charges

Jamie Borne, a 30-year-old probationer, was arrested in New Orleans after probation officers discovered a child-sized sex doll and electronic devices suspected of containing child sexual abuse material during a routine visit to his home. Borne, who identified himself as a programmer for the Roblox gaming platform, was serving probation for a 2023 conviction involving smoke grenades and a firearm.

Louisiana Attorney General Liz Murrill confirmed that ICE Homeland Security Investigations (HSI) and the Louisiana State Police were involved in the arrest. During the visit on February 26, probation officers reportedly found the child-sized sex doll in Borne’s bedroom, along with child clothing, condoms, and electronic devices believed to contain material related to children under 13 years old.

Borne was booked on 40 counts of possession of child sexual abuse material and one count of possession, trafficking, or importing a child sex doll. He is being held on a bond of $50,000 per count. Louisiana state law stipulates severe penalties for these offenses, including hard labor sentences of up to 20 years without parole for possession of child sexual abuse material and additional penalties for trafficking or importing child sex dolls.

Roblox, the gaming platform where Borne worked, has faced scrutiny in recent years for its lack of age restrictions and alleged facilitation of predatory behavior. According to Louisiana Attorney General Murrill, Roblox allows access to millions of games, including some that allegedly feature explicit content. Murrill has filed a lawsuit against Roblox, citing its failure to implement proper safeguards for minors.

The platform reportedly has 151.5 million daily users, with a significant portion being minors. Twenty percent of the user base is under nine years old, and another 20 percent falls between the ages of nine and twelve. Critics argue that this demographic makeup makes it especially urgent for Roblox to address safety concerns.


‘Pokémon Go’ Players Unknowingly Contributed 30 Billion Images to Train Delivery Robots

Nearly a decade after Pokémon Go transformed the real world into an augmented reality playground, the data collected from hundreds of millions of players is being repurposed to help autonomous delivery robots navigate city streets.

Popular Science reports that Niantic Spatial, part of the team behind the popular augmented reality game Pokémon Go, has announced a partnership with Coco Robotics, a company specializing in short-distance delivery robots for food and groceries. The collaboration will utilize Niantic’s Visual Positioning System, a navigation technology trained on more than 30 billion images captured by Pokémon Go users over the years, to help delivery robots navigate sidewalks and urban environments with unprecedented precision.

The Visual Positioning System can reportedly pinpoint location down to a few centimeters by analyzing nearby buildings and landmarks, offering a significant improvement over traditional GPS technology. This crowdsourced mapping effort represents one of the largest real-world data collection projects ever undertaken through a mobile gaming application, and demonstrates how user-generated content can be repurposed years after its initial collection.

“It turns out that getting Pikachu to realistically run around and getting Coco’s robot to safely and accurately move through the world is actually the same problem,” Niantic Spatial CEO John Hanke said in a recent interview with MIT Technology Review.

When Pokémon Go launched in 2016, it became a cultural phenomenon, attracting approximately 230 million monthly active players at its peak. The game prompted players to physically travel to specific locations and point their phone cameras at various angles while searching for virtual creatures superimposed onto real-world environments. While the game’s popularity has declined since its heyday, it still maintains around 50 million active users by some estimates.

The data collection effort received a significant boost in 2020 when Niantic added a feature called Field Research, which incentivized players to scan real-world statues and landmarks with their cameras in exchange for in-game rewards. Additional data reportedly came from areas designated as Pokémon battle arenas. These scans created detailed 3D models of the real world, capturing the same locations across varying weather conditions, lighting scenarios, angles, and heights.


Scientists Just Taught Lab-Grown Brain Cells to Play Video Games — and People Are Freaked Out

Researchers say they have successfully trained living human brain cells to play the classic video game Doom, marking the latest experiment in so-called “biological computing.”

Australian biotech company Cortical Labs announced the development in a recent demonstration showing neurons grown in a laboratory interacting with the 1993 first-person shooter.

The experiment builds on earlier work from 2022, when the company revealed that clusters of human brain cells grown in a petri dish had learned to play the much simpler video game Pong.

Those early experiments involved “mini-brains” made up of roughly 800,000 to one million living human neurons.

According to the company, the cells demonstrated the ability to adapt and learn basic tasks in real time.

Now researchers say they have taken the technology further by teaching the neurons to interact with the far more complex environment of Doom, a three-dimensional game that requires movement, targeting, and exploration.

To make the system work, engineers translated the digital signals from the game into patterns of electrical stimulation that the neurons could interpret.

“So we showed that biological neurons could play the game Pong,” Cortical Labs chief scientific officer Brett Kagan explained in a video announcement.

“This was a massive milestone because it demonstrated adaptive, real-time, goal-directed learning.”

“Doom was much more complex,” he added. “It’s 3D. It has enemies. It needs to explore, it’s an environment, and it’s hard.”


UK Consults on Social Media Age Verification While Directing Parents to Report “Hate Speech” to Big Tech

The British government launched a consultation this week that could require age verification for anyone using social media, gaming sites, or AI chatbots.

The consultation, titled “Growing up in the online world,” opened on March 2nd and closes May 26, 2026. It asks the public whether the government should ban under-16s from social media entirely, impose mandatory overnight curfews on platform access, restrict AI chatbot features for minors, and require platforms to disable “addictive design features” like infinite scrolling and autoplay.

The government says it will respond in summer 2026, and Parliament has already handed ministers new legal powers to act on the findings without waiting for fresh primary legislation.

The Prime Minister announced those powers on February 16, weeks before the consultation even opened. The government can now move faster once it decides what it wants. What the public thinks determines the packaging, not the destination.

Technology Secretary Liz Kendall framed it this way: “The path to a good life is a great childhood, one full of love, learning, and play. That applies just as much to the online world as it does to the real one.”

The actual policy tools being considered are a different matter.

Age verification, as a mechanism, works by proving identity. Every user proves who they are.

A social media platform that must exclude under-16s must verify the age of its over-16s. That means collecting identity documents, linking browsing activity to real identities, or building infrastructure that a government can later compel to serve other purposes.

The surveillance architecture required to enforce a children’s safety law is the same architecture required to surveil adults. It gets built for one reason. It gets used for others.

Then there’s the “Help your child stay safe online” campaign site the government launched alongside the consultation. The site includes a page directing parents to report “bullying, threats, harassment, hate speech, and content promoting self-harm or suicide” directly to platforms, with links to the reporting tools of Instagram, Snapchat, Facebook, WhatsApp, TikTok, Discord, YouTube, and Twitch.


Epstein ALIVE? Conspiracy Theories Surge Over Fortnite Logs And ‘Fake’ Prison Pics

The Jeffrey Epstein scandal just keeps expanding and evolving. Recent document dumps have reignited wild theories that the convicted sex trafficker didn’t kill himself—or perhaps didn’t die at all. 

With rampant speculation that both Epstein and Ghislaine Maxwell were intelligence assets, controlled by a “supra government” above elected officials, many are arguing nothing in their case is beyond the realm of possibility.

As we previously highlighted, Epstein has an extensive gaming history, where his “littlestjeff1” Fortnite username sparked claims of post-death logins from Israel, amplifying the alive-and-kicking narrative. 

As we reported, Epic Games debunked the original account as a hoax rename, but the gaming angle persists as a gateway for deeper conspiracies.

Ben Swann highlighted this in a recent X video, diving into “a shocking theory based on Fortnite activity linked to him.” 


Xbox UK Age Verification Launch Locks Out Thousands of Players

Xbox’s mandatory age verification rollout in the UK was a disaster almost immediately, locking thousands of players out of games, voice chat, and apps like Discord with no clear path back in.

The failures started overnight. Players report being ejected mid-session to complete age verification checks that then took hours, stalled indefinitely, or simply refused to work regardless of what identification they submitted.

Government ID, mobile numbers, and live video age estimation: for many users, the system rejected them all. Others made it through verification only to find their accounts still restricted, with no explanation and no recourse beyond contacting Xbox support.

Microsoft’s support page now carries a notice confirming it is “aware of the issue and working to fix it.” That’s the extent of the official guidance.

The verification requirement exists to comply with the UK’s new censorship law, the Online Safety Act, legislation mandating that platforms facilitating online communication verify user ages. The actual system Xbox built to deliver that compliance forcibly disconnected players from games in progress, stripped away chat functionality with anyone outside their friends list, and blocked access to third-party services.

Users who have held Xbox accounts for over 18 years found themselves flagged for verification anyway. The system doesn’t consider account age, history, or any contextual signal that might indicate an adult user. Everyone gets treated as potentially underage until they hand over documentation.

“The amount of times I’ve tried to do any method of the verification tonight is stupid,” wrote one user. “Can’t change privacy settings on my Xbox to allow me to see mods on games too. Can’t chat on Discord. Utterly broken.”

“Been trying to verify my ID for the past few hours,” added another. “It finally worked but I can’t access anything still. No Discord access at all.”


LA County Sues Roblox Over False Child Safety Claims and Lack of Age Verification

Los Angeles County filed a lawsuit against Roblox, alleging the platform has built a system that leaves children exposed to grooming because it does not go far enough in checking user IDs to prove their age.

The suit names the company for public nuisance and violations of California’s false advertising law.

We obtained a copy of the complaint for you here.

The complaint is direct: “Roblox portrays its platform as a safe and appropriate place for children to play. In reality, and as Roblox well knows, the design of its platform makes children easy prey for pedophiles.”

If you weren’t aware of how big Roblox is and why this is important, Roblox serves roughly 144 million daily active users. That’s more than both Fortnite and the entire userbase of the Steam platform combined.

The platform also lets people create and play games, chat through customizable avatars, and spend real money on virtual currency.

LA County’s suit argues Roblox has consistently failed to moderate user-generated content, enforce its own age restrictions, or honestly disclose the risks predators pose to children using the service.

There is no doubt the platform’s moderation gaps have attracted scrutiny for years, or that the platform has had problems with grooming of minors. The LA lawsuit is the latest in a pattern of governments and researchers documenting the same problem Roblox has repeatedly said it is addressing, and the latest attempt to mandate digital ID checks.

Roblox rejected the suit’s allegations. A company spokesman said the platform was built “with safety at its core” and pointed to existing protections: “We have advanced safeguards that monitor our platform for harmful content and communications, and users cannot send or receive images via chat, avoiding one of the most prevalent opportunities for misuse seen elsewhere online.”

The company added that it takes action against rule violators and cooperates with law enforcement, closing with: “There is no finish line when it comes to protecting kids and, while no system can be perfect, our commitment to safety never ends.”

The false advertising angle is what is most important to note. LA isn’t suing Roblox over what it collects or who can see it. The county is suing because the company told parents the platform was safe for kids while allegedly knowing otherwise.


Inside Jeffrey Epstein’s Shockingly Extensive Gaming History

Buried in the Department of Justice’s three-million-page Epstein document dump is an unexpected subplot: Jeffrey Epstein was a gamer.

Not a casual one, either. The files — released Jan. 30 as part of a sprawling DOJ disclosure — paint a portrait of a convicted sex offender who maintained an active presence across multiple gaming platforms for years, who corresponded with some of the video game industry’s most powerful executives about monetizing children, and whose username is now at the center of a viral conspiracy theory alleging he’s still alive and playing Fortnite from Israel.

The saga begins with Xbox Live.

Documents show Epstein received a “Welcome to Xbox Live” email on Oct. 31, 2012. He had been a registered sex offender since 2008, and Microsoft had joined New York Attorney General Eric Schneiderman’s “Operation: Game Over” initiative to purge sex offenders from online gaming platforms six months earlier, in April 2012.

Despite that, Epstein’s account remained active for roughly 14 months.

On Dec. 19, 2013, Microsoft finally pulled the plug. An automated enforcement email sent to Epstein’s “jeevacation@gmail.com” address cited “harassment, threats, and/or abuse of other players,” describing the conduct as “severe, repeated, and/or excessive.” A follow-up email the same day provided the real explanation: “This action is based on the New York Attorney General’s partnership with Microsoft and other online gaming companies to remove New York registered sex offenders from online gaming services to minimize the risk to others, particularly children.”


Dad claims 16-year-old daughter took her own life after meeting a predator on Roblox, slams game platform beloved by kids

Penelope Sokolowski was just 16 years old when she took her own life last February.

Her father, Jason, believes her suicide was the culmination of a grooming process that began on Roblox, the game platform beloved by kids — with some 170,000 users under the age of 13, according to company data from 2023.

“We kind of thought we were covering all the bases,” Jason told The Post, noting that his family had used a third-party app to monitor Penelope’s online activity.

Jason alleges that his only child was contacted by a predator on Roblox who coerced her into cutting his name into her chest and sending videos of herself bloodied from self-harm — and who, ultimately, sent Penelope down a spiral that culminated in her death.

The girl was 7 or 8 years old when she first signed up for Roblox, where players rove around online worlds and can chat with other users.

“I’d come in and sit in the room with her and see what she was doing, ask who those people were,” Jason said, recalling Penelope drawing an anime-style sketch for a friend she’d made on Roblox.

“As a dad I thought, oh, this is nice, she’s artistic, and she’s made artistic friends,” he added. “But I didn’t understand what Roblox was and its effect on her.”

The dad, who works in the film industry in Vancouver, British Columbia, separated from Penelope’s mother and moved out of the family home when the girl was 13.

He recalls how Penelope’s grades began to tumble and, when she was 14, he noticed scars from self-inflicted cuts on her arms, which she had been covering with bracelets and his oversized hockey jerseys. 

Penelope confided that she had been recruited into a self-harm group via Roblox, but assured her father she had moved on.

But not long after her 16th birthday, she took her own life.

Later, when Jason opened up his daughter’s cell phone, he found what he describes as a “crime scene.”

According to the dad, there were messages spanning two years with a person who egged on her self-destruction. Jason believes Penelope met this person on Roblox and then began privately conversing with them over Discord — sometimes for hours.

In one exchange, Penelope sent a photo of her chest, offering to cut herself there but worrying she couldn’t go “too deep.” Minutes later, she followed up with an image of the predator’s Discord user name written across her chest in bloodied letters.

In other images, she had carved the numbers “764” into her body. Jason believes Penelope had been contacted by a member of 764, described by the FBI as a “violent online group” that targets minors and grooms them into committing egregious acts of self-harm and violence.

Members of 764 reportedly troll platforms like Roblox looking for victims they can persuade — via grooming or sextortion — into hurting themselves.

“They are grooming girls to do whatever it is they can get a girl to do, whether it’s nudes or cuts or gore or violence,” Jason said. “[Penelope] was brainwashed all the way through.”
