Roblox Loses 12M Daily Users After Age ID Check Rollout

Roblox is paying the price for its surveillance push. The platform shed 12 million daily active users between Q4 2025 and Q1 2026, dropping from 144 million globally to 132 million, with the company pinning a meaningful share of the decline on its mandatory age-verification rollout.

Revenue still climbed to $1.4 billion, and year-over-year DAU growth came in at 35 percent, but the sequential numbers tell the story Roblox tried to bury under positive financial framing.

The fall is steeper when measured from the peak. Roblox hit 152 million daily active users in Q3 2025, meaning roughly 20 million people have stopped showing up daily since the company began demanding facial scans and identity checks to access basic chat features. The trajectory inverted almost exactly when the age checks rolled out globally in January.

Roblox’s own language gives the game away. The company says Q1 growth was “tempered by greater-than-expected headwinds” from the age-check rollout, which “slowed new user acquisition.”

Translated out of investor-speak: fewer people want to hand over biometric data or government ID to a gaming platform than Roblox’s models predicted, and existing users who haven’t verified are pulling back from a service that now treats them as second-class accounts.

The verification mechanism deserves a closer look than corporate filings tend to give it. Roblox runs facial age estimation, a system that scans users’ faces to guess how old they are and supplements that with identity verification documents.

Facial scanning of a user base that skews young, with a substantial portion under 13, means the company is processing biometric data from millions of children. Roblox says this is for safety. The system being constructed is a database of face scans tied to platform identities, retained on terms the company has not publicly defined.

Earlier this month, Roblox widened the restrictions to gate game access by age bracket, and it has signaled more changes ahead. The company plans to “implement additional improvements designed to facilitate age-appropriate access to content and product features” over coming quarters, and has openly said its safety push will lower Roblox’s “expectations for topline growth in 2026.”

Full-year revenue guidance dropped to 20 to 25 percent growth, down from 22 to 26 percent. Bookings guidance was cut by nearly $1 billion. Wall Street responded by knocking the stock down a whopping 20 percent.

The verification numbers themselves point to a two-tier platform taking shape. Through the end of Q1, 51 percent of global daily active users had completed age checks, with US adoption running at 65 percent.

The other half of the user base is interacting with a degraded version of Roblox where communication is restricted, certain games are off-limits, and the path back to full functionality runs through a face scan or an ID upload. It’s a tollgate, and the toll is biometric data.

Keep reading

Children pushed to suicide by online grooming network targeting kids through games and chat apps

A grieving British Columbia father is going after an online extremist group after his teenage daughter was allegedly groomed into taking her own life by a disturbing online network that targets children through popular gaming and messaging platforms.

The group, known as 764 or “the Com,” has been described as an international extremist network that preys on children as young as nine through apps such as Roblox, Discord and Telegram. Members are accused of manipulating young users into self-harm, harming pets, committing violent acts and ultimately attempting suicide, often while being watched online.

The father said his daughter Penelope loved amusement parks, zombie movies and creating digital art through games like Minecraft and Roblox. But over time, her behaviour changed dramatically. Her grades collapsed, she stopped attending school and began self-harming.

He later discovered she had allegedly been groomed by individuals connected to the group.

He said members sent him videos of his daughter trying to harm the family cat and that multiple suicide attempts may have been livestreamed. Penelope died in February 2025, three days before her 16th birthday.

Authorities in Canada have reportedly classified 764 as a terrorist organization, with investigations and charges emerging in multiple jurisdictions.

Public awareness remains dangerously low, and this is another reminder that parents should closely monitor children’s online activity. Once vulnerable youth are drawn into these networks, reversing the psychological damage can be extremely difficult.

Keep reading

Pokémon Go — The Largest Mapped Data Collection Ploy in History

When Pokémon Go was released, it appeared to be a harmless game encouraging people to go outside and explore. Beneath that surface was a far more sophisticated system that directed human movement into the specific locations where data was needed most, turning millions of users into mobile data collectors. The placement of Pokémon, Gyms, and PokéStops was not random: they were concentrated around landmarks, businesses, and dense urban corridors, meaning players were repeatedly funneled into high-value mapping zones. Players often returned to the same locations over and over again, capturing them from multiple angles, at different times of day, and under varying conditions, which is exactly how high-quality spatial datasets are built.

For many reading this, particularly those who never played the game, it is important to understand what this actually looked like in practice. This was not some passive background process: it required people to physically walk through neighborhoods, parks, shopping districts, and even residential areas while holding up their phones, actively scanning their surroundings to “catch” virtual creatures that did not exist. The game encouraged users to point their cameras at real-world objects, move around them, and interact with the environment. The system was capturing detailed imagery not just of public landmarks but also of surrounding areas, including streets, entryways, and private homes, all embedded in what appeared to be a simple entertainment experience.

Keep reading

Xbox Now Wants Your Face to Let You Play Games You Already Own in Singapore

Singapore gamers who bought and downloaded Xbox titles years ago are now being told they need to prove they’re adults before they can keep playing them.

Microsoft has started rolling out identity verification requirements across its Xbox and Microsoft Store platforms in Singapore, demanding face scans, government ID uploads, or authentication through the country’s national digital identity system, Singpass.

The price of accessing games you already own is now a biometric selfie or a copy of your passport.

The trigger is Singapore’s Online Safety Code of Practice for App Distribution Services, a regulation from the Infocomm Media Development Authority (IMDA) that took effect on April 1, 2026.

The rule requires app stores to prevent anyone estimated to be under 18 from downloading apps rated for adults, including dating services and content with sexual material. Five storefronts are covered: Apple’s App Store, Google Play, Samsung Galaxy Store, Huawei AppGallery, and Microsoft Store (which includes Xbox).

Each company has chosen its own methods for compliance. The methods vary, but they all share one thing in common: they collect sensitive personal data that didn’t exist in the platform’s records before this regulation.

Microsoft announced its approach on March 17, 2026, framing the verification as optional, while making it mandatory for anyone who wants full access.

“Microsoft users in Singapore will have multiple options to complete age assurance for our stores, giving people flexibility while prioritising privacy,” the company wrote, listing those options as Singpass verification, “secure facial age estimation using a selfie,” or uploading “an official government ID such as a national ID, driver’s license, passport, or residence permit.”

The company describes this as a one-time process. What it doesn’t describe is who processes the data, how long it exists in transit, or what happens if the system holding it gets breached.

Discord learned this lesson last year when its own partner leaked user data. The company that promises to delete your face scan still has to receive it first.

Singapore residents have started receiving emails from Xbox notifying them about the verification requirement, prompting confusion and concern.

Keep reading

FAA Targets Video Gamers to Alleviate Air Traffic Controller Shortage

In an effort to solve the decades-long shortage of air traffic controllers across U.S. airspace, the Federal Aviation Administration (FAA) has announced a new hiring campaign targeted at video gamers interested in new career opportunities.

“To reach the next generation of air traffic controllers, we need to adapt. This campaign’s innovative communication style and focus on gaming taps into a growing demographic of young adults who have many of the hard skills it takes to be a successful controller,” Transportation Secretary Sean Duffy said in a statement on April 10.

The FAA’s new air traffic controller hiring window, announced last Friday, opens at 12 a.m. ET on April 17, allowing interested candidates to apply for what the agency calls “one of the most dynamic jobs in the world.”

The FAA has faced a significant shortage of air traffic controllers since the 1980s, with thousands of retirements during the COVID-19 pandemic exacerbating the deficit. Congress has provided the agency with supplemental funding over the past two years to increase staffing, and the Trump administration said it has thousands of trainees in the pipeline.

The FAA is also not the first federal agency to target video gamers with keen hand-eye coordination and quick decision-making skills for high-stakes positions. Both the Pentagon and the Department of Homeland Security have deployed similar strategies for tech-related roles in complex environments that require hours of focus.

The FAA is rolling out a new YouTube ad with bright and fluid graphics asking gamers, “Are you up for the challenge? You’ve been training for this.”

Keep reading

New York Sues Valve Over Loot Boxes, Calls Them Illegal Gambling

Valve, the maker of Steam and many of PC gaming’s most popular titles, is being sued by New York for its use of loot boxes. New York Attorney General Letitia James filed the lawsuit, claiming that loot box systems enable gambling habits and are particularly harmful for younger people.

The lawsuit specifically cites three games: Counter-Strike 2, Dota 2, and Team Fortress 2. It wants the video game developer to stop using loot boxes in its titles and to pay fines for previously promoting them.

A press release from Attorney General James notes that Counter-Strike 2’s loot box system resembles a slot machine, featuring a spinning wheel that reveals a virtual item. Loot boxes are common in online titles, acting as a randomized treasure chest that may provide valuable in-game items.

It explains that valuable items found in loot boxes can be sold on Valve’s Steam Community Market and other third-party stores, indicating they have real-world value. It points to reports of a virtual gun skin within Counter-Strike 2 that sold for over $1 million in 2024.

However, the likelihood of gamers finding a valuable item is low, and the lawsuit alleges that Valve intentionally makes some items harder to win than others to increase value.

“Illegal gambling can be harmful and lead to serious addiction problems, especially for our young people,” said Attorney General James. “Valve has made billions of dollars by letting children and adults alike illegally gamble for the chance to win valuable virtual prizes.”

“These features are addictive, harmful, and illegal, and my office is suing to stop Valve’s illegal conduct and protect New Yorkers.”

Keep reading

‘Roblox’ Programmer Faces 40 Child Porn Charges

Jamie Borne, a 30-year-old probationer, was arrested in New Orleans after probation officers discovered a child-sized sex doll and electronic devices suspected of containing child sexual abuse material during a routine visit to his home. Borne, who identified himself as a programmer for the Roblox gaming platform, was serving probation for a 2023 conviction involving smoke grenades and a firearm.

Louisiana Attorney General Liz Murrill confirmed that ICE Homeland Security Investigations (HSI) and the Louisiana State Police were involved in the arrest. During the visit on February 26, probation officers reportedly found the child-sized sex doll in Borne’s bedroom, along with child clothing, condoms, and electronic devices believed to contain material related to children under 13 years old.

Borne was booked on 40 counts of possession of child sexual abuse material and one count of possession, trafficking, or importing a child sex doll. He is being held on a bond of $50,000 per count. Louisiana state law stipulates severe penalties for these offenses, including hard labor sentences of up to 20 years without parole for possession of child sexual abuse material and additional penalties for trafficking or importing child sex dolls.

Roblox, the gaming platform where Borne worked, has faced scrutiny in recent years for its lack of age restrictions and alleged facilitation of predatory behavior. According to Louisiana Attorney General Murrill, Roblox allows access to millions of games, including some that allegedly feature explicit content. Murrill has filed a lawsuit against Roblox, citing its failure to implement proper safeguards for minors.

The platform reportedly has 151.5 million daily users, with a significant portion being minors. Twenty percent of the user base is under nine years old, and another 20 percent falls between the ages of nine and twelve. Critics argue that this demographic makeup makes it especially urgent for Roblox to address safety concerns.

Keep reading

‘Pokémon Go’ Players Unknowingly Contributed 30 Billion Images to Train Delivery Robots

Nearly a decade after Pokémon Go transformed the real world into an augmented reality playground, the data collected from hundreds of millions of players is being repurposed to help autonomous delivery robots navigate city streets.

Popular Science reports that Niantic Spatial, part of the team behind the popular augmented reality game Pokémon Go, has announced a partnership with Coco Robotics, a company specializing in short-distance delivery robots for food and groceries. The collaboration will utilize Niantic’s Visual Positioning System, a navigation technology trained on more than 30 billion images captured by Pokémon Go users over the years, to help delivery robots navigate sidewalks and urban environments with unprecedented precision.

The Visual Positioning System can reportedly pinpoint location down to a few centimeters by analyzing nearby buildings and landmarks, offering a significant improvement over traditional GPS technology. This crowdsourced mapping effort represents one of the largest real-world data collection projects ever undertaken through a mobile gaming application, and demonstrates how user-generated content can be repurposed years after its initial collection.

“It turns out that getting Pikachu to realistically run around and getting Coco’s robot to safely and accurately move through the world is actually the same problem,” Niantic Spatial CEO John Hanke said in a recent interview with MIT Technology Review.

When Pokémon Go launched in 2016, it became a cultural phenomenon, attracting approximately 230 million monthly active players at its peak. The game prompted players to physically travel to specific locations and point their phone cameras at various angles while searching for virtual creatures superimposed onto real-world environments. While the game’s popularity has declined since its heyday, it still maintains around 50 million active users by some estimates.

The data collection effort received a significant boost in 2020 when Niantic added a feature called Field Research, which incentivized players to scan real-world statues and landmarks with their cameras in exchange for in-game rewards. Additional data reportedly came from areas designated as Pokémon battle arenas. These scans created detailed 3D models of the real world, capturing the same locations across varying weather conditions, lighting scenarios, angles, and heights.

Keep reading

Scientists Just Taught Lab-Grown Brain Cells to Play Video Games — and People Are Freaked Out

Researchers say they have successfully trained living human brain cells to play the classic video game Doom, marking the latest experiment in so-called “biological computing.”

Australian biotech company Cortical Labs announced the development in a recent demonstration showing neurons grown in a laboratory interacting with the 1993 first-person shooter.

The experiment builds on earlier work from 2022, when the company revealed that clusters of human brain cells grown in a petri dish had learned to play the much simpler video game Pong.

Those early experiments involved “mini-brains” made up of roughly 800,000 to one million living human neurons.

According to the company, the cells demonstrated the ability to adapt and learn basic tasks in real time.

Now researchers say they have taken the technology further by teaching the neurons to interact with the far more complex environment of Doom, a three-dimensional game that requires movement, targeting, and exploration.

To make the system work, engineers translated the digital signals from the game into patterns of electrical stimulation that the neurons could interpret.

“So we showed that biological neurons could play the game Pong,” Cortical Labs chief scientific officer Brett Kagan explained in a video announcement.

“This was a massive milestone because it demonstrated adaptive, real-time, goal-directed learning.”

“Doom was much more complex,” he added. “It’s 3D. It has enemies. It needs to explore its environment, and it’s hard.”

Keep reading

UK Consults on Social Media Age Verification While Directing Parents to Report “Hate Speech” to Big Tech

The British government launched a consultation this week that could require age verification for anyone using social media, gaming sites, or AI chatbots.

The consultation, titled “Growing up in the online world,” opened on March 2nd and closes May 26, 2026. It asks the public whether the government should ban under-16s from social media entirely, impose mandatory overnight curfews on platform access, restrict AI chatbot features for minors, and require platforms to disable “addictive design features” like infinite scrolling and autoplay.

The government says it will respond in summer 2026, and Parliament has already handed ministers new legal powers to act on the findings without waiting for fresh primary legislation.

The Prime Minister announced those powers on February 16, weeks before the consultation even opened. The government can now move faster once it decides what it wants. What the public thinks determines the packaging, not the destination.

Technology Secretary Liz Kendall framed it this way: “The path to a good life is a great childhood, one full of love, learning, and play. That applies just as much to the online world as it does to the real one.”

The actual policy tools being considered are a different matter.

Age verification, as a mechanism, works by proving identity. Every user proves who they are.

A social media platform that must exclude under-16s must verify the age of its over-16s. That means collecting identity documents, linking browsing activity to real identities, or building infrastructure that a government can later compel to serve other purposes.

The surveillance architecture required to enforce a children’s safety law is the same architecture required to surveil adults. It gets built for one reason. It gets used for others.

Then there’s the “Help your child stay safe online” campaign site the government launched alongside the consultation. The site includes a page directing parents to report “bullying, threats, harassment, hate speech, and content promoting self-harm or suicide” directly to platforms, with links to the reporting tools of Instagram, Snapchat, Facebook, WhatsApp, TikTok, Discord, YouTube, and Twitch.

Keep reading