Pokémon Go — The Largest Mapped Data Collection Ploy in History

When Pokémon Go was released, it appeared to be a harmless game encouraging people to go outside and explore. Beneath that surface, however, was a far more sophisticated system that directed human movement toward the specific locations where data was needed most, turning millions of users into mobile data collectors. The placement of Pokémon, Gyms, and PokéStops was not random: it was concentrated around landmarks, businesses, and dense urban corridors, so players were repeatedly funneled into high-value mapping zones. They returned to the same locations again and again, capturing them from multiple angles, at different times of day, and under varying conditions, which is exactly how high-quality spatial datasets are built.

For many reading this, particularly those who never played the game, it is important to understand what this looked like in practice, because it was not some passive background process. It required people to physically walk through neighborhoods, parks, shopping districts, and even residential areas while holding up their phones, actively scanning their surroundings to “catch” virtual creatures that did not exist. The game encouraged users to point their cameras at real-world objects, move around them, and interact with the environment. All the while, the system was capturing detailed imagery not just of public landmarks but of the surrounding areas, including streets, entryways, and private homes, all embedded in what appeared to be a simple entertainment experience.


Xbox Now Wants Your Face to Let You Play Games You Already Own in Singapore

Singapore gamers who bought and downloaded Xbox titles years ago are now being told they need to prove they’re adults before they can keep playing them.

Microsoft has started rolling out identity verification requirements across its Xbox and Microsoft Store platforms in Singapore, demanding face scans, government ID uploads, or authentication through the country’s national digital identity system, Singpass.

The price of accessing games you already own is now a biometric selfie or a copy of your passport.

The trigger is Singapore’s Online Safety Code of Practice for App Distribution Services, a regulation from the Infocomm Media Development Authority (IMDA) that took effect on April 1, 2026.

The rule requires app stores to prevent anyone estimated to be under 18 from downloading apps rated for adults, including dating services and content with sexual material. Five storefronts are covered: Apple’s App Store, Google Play, Samsung Galaxy Store, Huawei AppGallery, and Microsoft Store (which includes Xbox).

Each company has chosen its own methods for compliance. The methods vary, but they all share one thing in common: they collect sensitive personal data that didn’t exist in the platform’s records before this regulation.

Microsoft announced its approach on March 17, 2026, framing the verification as optional, while making it mandatory for anyone who wants full access.

“Microsoft users in Singapore will have multiple options to complete age assurance for our stores, giving people flexibility while prioritising privacy,” the company wrote, listing those options as Singpass verification, “secure facial age estimation using a selfie,” or uploading “an official government ID such as a national ID, driver’s license, passport, or residence permit.”

The company describes this as a one-time process. What it doesn’t describe is who processes the data, how long it is held along the way, or what happens if the system holding it is breached.

Discord learned this lesson last year when its own partner leaked user data. The company that promises to delete your face scan still has to receive it first.

Singapore residents have started receiving emails from Xbox notifying them about the verification requirement, prompting confusion and concern.


FAA Targets Video Gamers to Alleviate Air Traffic Controller Shortage

In an effort to solve the decades-long shortage of air traffic controllers across U.S. airspace, the Federal Aviation Administration (FAA) has announced a new hiring campaign targeted at video gamers interested in new career opportunities.

“To reach the next generation of air traffic controllers, we need to adapt. This campaign’s innovative communication style and focus on gaming taps into a growing demographic of young adults who have many of the hard skills it takes to be a successful controller,” Transportation Secretary Sean Duffy said in a statement on April 10.

Announced last Friday, the FAA’s new air traffic controller hiring window opens at 12 a.m. ET on April 17, allowing interested candidates to apply for what the agency calls “one of the most dynamic jobs in the world.”

The FAA has faced a significant shortage of air traffic controllers since the 1980s, with thousands of retirements during the COVID-19 pandemic exacerbating the deficit. Congress has provided the agency with supplemental funding over the past two years to increase staffing, and the Trump administration said it has thousands of trainees in the pipeline.

The FAA is also not the first federal agency to target video gamers with keen hand-eye coordination and quick decision-making skills for high-stakes positions. Both the Pentagon and the Department of Homeland Security have deployed similar strategies for tech-related roles in complex environments that require hours of focus.

The FAA is rolling out a new YouTube ad with bright and fluid graphics asking gamers, “Are you up for the challenge? You’ve been training for this.”


New York Sues Valve Over Loot Boxes, Calls Them Illegal Gambling

Valve, the maker of Steam and many of PC gaming’s most popular titles, is being sued by New York for its use of loot boxes. New York Attorney General Letitia James filed the lawsuit, claiming that loot box systems enable gambling habits and are particularly harmful for younger people.

The lawsuit specifically cites three games: Counter-Strike 2, Dota 2, and Team Fortress 2. It wants the video game developer to stop using loot boxes in its titles and to pay fines for previously promoting them.

A press release from Attorney General James notes that Counter-Strike 2’s loot box system resembles a slot machine, featuring a spinning wheel that reveals a virtual item. Loot boxes are common in online titles, acting as randomized treasure chests that may contain valuable in-game items.

It explains that valuable items found in loot boxes can be sold on Valve’s Steam Community Market and other third-party stores, indicating they have real-world value. It points to reports of a virtual gun skin within Counter-Strike 2 that sold for over $1 million in 2024.

However, the likelihood of gamers finding a valuable item is low, and the lawsuit alleges that Valve intentionally makes some items harder to win than others to increase value.

“Illegal gambling can be harmful and lead to serious addiction problems, especially for our young people,” said Attorney General James. “Valve has made billions of dollars by letting children and adults alike illegally gamble for the chance to win valuable virtual prizes.”

“These features are addictive, harmful, and illegal, and my office is suing to stop Valve’s illegal conduct and protect New Yorkers.”


‘Roblox’ Programmer Faces 40 Child Porn Charges

Jamie Borne, a 30-year-old probationer, was arrested in New Orleans after probation officers discovered a child-sized sex doll and electronic devices suspected of containing child sexual abuse material during a routine visit to his home. Borne, who identified himself as a programmer for the Roblox gaming platform, was serving probation for a 2023 conviction involving smoke grenades and a firearm.

Louisiana Attorney General Liz Murrill confirmed that ICE Homeland Security Investigations (HSI) and the Louisiana State Police were involved in the arrest. During the visit on February 26, probation officers reportedly found the child-sized sex doll in Borne’s bedroom, along with child clothing, condoms, and electronic devices believed to contain material related to children under 13 years old.

Borne was booked on 40 counts of possession of child sexual abuse material and one count of possession, trafficking, or importing a child sex doll. He is being held on a bond of $50,000 per count. Louisiana state law stipulates severe penalties for these offenses, including hard labor sentences of up to 20 years without parole for possession of child sexual abuse material and additional penalties for trafficking or importing child sex dolls.

Roblox, the gaming platform where Borne worked, has faced scrutiny in recent years for its lack of age restrictions and alleged facilitation of predatory behavior. According to Louisiana Attorney General Murrill, Roblox allows access to millions of games, including some that allegedly feature explicit content. Murrill has filed a lawsuit against Roblox, citing its failure to implement proper safeguards for minors.

The platform reportedly has 151.5 million daily users, with a significant portion being minors. Twenty percent of the user base is under nine years old, and another 20 percent falls between the ages of nine and twelve. Critics argue that this demographic makeup makes it especially urgent for Roblox to address safety concerns.


‘Pokémon Go’ Players Unknowingly Contributed 30 Billion Images to Train Delivery Robots

Nearly a decade after Pokémon Go transformed the real world into an augmented reality playground, the data collected from hundreds of millions of players is being repurposed to help autonomous delivery robots navigate city streets.

Popular Science reports that Niantic Spatial, part of the team behind the popular augmented reality game Pokémon Go, has announced a partnership with Coco Robotics, a company specializing in short-distance delivery robots for food and groceries. The collaboration will utilize Niantic’s Visual Positioning System, a navigation technology trained on more than 30 billion images captured by Pokémon Go users over the years, to help delivery robots navigate sidewalks and urban environments with unprecedented precision.

The Visual Positioning System can reportedly pinpoint location down to a few centimeters by analyzing nearby buildings and landmarks, offering a significant improvement over traditional GPS technology. This crowdsourced mapping effort represents one of the largest real-world data collection projects ever undertaken through a mobile gaming application, and demonstrates how user-generated content can be repurposed years after its initial collection.
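As a rough conceptual sketch only (Niantic’s actual system is proprietary, and every name and number below is hypothetical), a visual positioning system can be thought of as matching the features a camera sees against a prebuilt map of landmark features with known coordinates, then estimating position from the matches:

```python
# Conceptual illustration of visual positioning, NOT Niantic's pipeline.
# Idea: match observed feature descriptors against a prebuilt map of
# landmark descriptors with known map coordinates, then estimate the
# camera's position as the centroid of the matched landmarks.

import math

# Hypothetical prebuilt map: feature descriptor -> (x, y) map coordinate.
LANDMARK_MAP = {
    (0.9, 0.1, 0.3): (10.0, 4.0),   # e.g. a statue corner
    (0.2, 0.8, 0.5): (12.0, 6.0),   # e.g. a doorway edge
    (0.4, 0.4, 0.9): (11.0, 8.0),   # e.g. a signpost
}

def match(descriptor, map_db, threshold=0.5):
    """Return the map coordinate of the closest stored descriptor,
    or None if nothing lies within the distance threshold."""
    best, best_dist = None, threshold
    for known, coord in map_db.items():
        dist = math.dist(descriptor, known)
        if dist < best_dist:
            best, best_dist = coord, dist
    return best

def estimate_position(observed_descriptors, map_db):
    """Average the coordinates of all matched landmarks."""
    hits = [m for d in observed_descriptors
            if (m := match(d, map_db)) is not None]
    if not hits:
        return None
    xs, ys = zip(*hits)
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# Two noisy observations near the statue and the signpost.
pos = estimate_position([(0.88, 0.12, 0.31), (0.41, 0.39, 0.92)], LANDMARK_MAP)
print(pos)  # (10.5, 6.0) -- centroid of the two matched landmarks
```

A production system would use real image features, a 3D point cloud, and geometric pose solving rather than a centroid, but the core loop is the same: more observations of the same place, from more angles and conditions, make the match more reliable, which is why repeat player visits were so valuable.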

“It turns out that getting Pikachu to realistically run around and getting Coco’s robot to safely and accurately move through the world is actually the same problem,” Niantic Spatial CEO John Hanke said in a recent interview with MIT Technology Review.

When Pokémon Go launched in 2016, it became a cultural phenomenon, attracting approximately 230 million monthly active players at its peak. The game prompted players to physically travel to specific locations and point their phone cameras at various angles while searching for virtual creatures superimposed onto real-world environments. While the game’s popularity has declined since its heyday, it still maintains around 50 million active users by some estimates.

The data collection effort received a significant boost in 2020 when Niantic added a feature called Field Research, which incentivized players to scan real-world statues and landmarks with their cameras in exchange for in-game rewards. Additional data reportedly came from areas designated as Pokémon battle arenas. These scans created detailed 3D models of the real world, capturing the same locations across varying weather conditions, lighting scenarios, angles, and heights.


Scientists Just Taught Lab-Grown Brain Cells to Play Video Games — and People Are Freaked Out

Researchers say they have successfully trained living human brain cells to play the classic video game Doom, marking the latest experiment in so-called “biological computing.”

Australian biotech company Cortical Labs announced the development in a recent demonstration showing neurons grown in a laboratory interacting with the 1993 first-person shooter.

The experiment builds on earlier work from 2022, when the company revealed that clusters of human brain cells grown in a petri dish had learned to play the much simpler video game Pong.

Those early experiments involved “mini-brains” made up of roughly 800,000 to one million living human neurons.

According to the company, the cells demonstrated the ability to adapt and learn basic tasks in real time.

Now researchers say they have taken the technology further by teaching the neurons to interact with the far more complex environment of Doom, a three-dimensional game that requires movement, targeting, and exploration.

To make the system work, engineers translated the digital signals from the game into patterns of electrical stimulation that the neurons could interpret.

“So we showed that biological neurons could play the game Pong,” Cortical Labs chief scientific officer Brett Kagan explained in a video announcement.

“This was a massive milestone because it demonstrated adaptive, real-time, goal-directed learning.”

“Doom was much more complex,” he added. “It’s 3D. It has enemies. It needs to explore its environment, and it’s hard.”


UK Consults on Social Media Age Verification While Directing Parents to Report “Hate Speech” to Big Tech

The British government launched a consultation this week that could require age verification for anyone using social media, gaming sites, or AI chatbots.

The consultation, titled “Growing up in the online world,” opened on March 2 and closes on May 26, 2026. It asks the public whether the government should ban under-16s from social media entirely, impose mandatory overnight curfews on platform access, restrict AI chatbot features for minors, and require platforms to disable “addictive design features” like infinite scrolling and autoplay.

The government says it will respond in summer 2026, and Parliament has already handed ministers new legal powers to act on the findings without waiting for fresh primary legislation.

The Prime Minister announced those powers on February 16, weeks before the consultation even opened. The government can now move faster once it decides what it wants. What the public thinks determines the packaging, not the destination.

Technology Secretary Liz Kendall framed it this way: “The path to a good life is a great childhood, one full of love, learning, and play. That applies just as much to the online world as it does to the real one.”

The actual policy tools being considered are a different matter.

Age verification, as a mechanism, works by proving identity. Every user proves who they are.

A social media platform that must exclude under-16s must verify the age of its over-16s. That means collecting identity documents, linking browsing activity to real identities, or building infrastructure that a government can later compel to serve other purposes.

The surveillance architecture required to enforce a children’s safety law is the same architecture required to surveil adults. It gets built for one reason. It gets used for others.

Then there’s the “Help your child stay safe online” campaign site, which the government launched alongside the consultation. The site includes a page directing parents to report “bullying, threats, harassment, hate speech, and content promoting self-harm or suicide” directly to platforms, with links to the reporting tools of Instagram, Snapchat, Facebook, WhatsApp, TikTok, Discord, YouTube, and Twitch.


Epstein ALIVE? Conspiracy Theories Surge Over Fortnite Logs And ‘Fake’ Prison Pics

The Jeffrey Epstein scandal just keeps expanding and evolving. Recent document dumps have reignited wild theories that the convicted sex trafficker didn’t kill himself—or perhaps didn’t die at all. 

With rampant speculation that both Epstein and Ghislaine Maxwell were intelligence assets, controlled by a “supra government” above elected officials, many are arguing nothing in their case is beyond the realms of possibility.

As we previously highlighted, Epstein has an extensive gaming history, where his “littlestjeff1” Fortnite username sparked claims of post-death logins from Israel, amplifying the alive-and-kicking narrative. 

As we reported, Epic Games debunked the original account as a hoax rename, but the gaming angle persists as a gateway for deeper conspiracies.

Ben Swann highlighted this in a recent X video, diving into “a shocking theory based on Fortnite activity linked to him.” 


Xbox UK Age Verification Launch Locks Out Thousands of Players

Xbox’s mandatory age verification rollout in the UK was a disaster almost immediately, locking thousands of players out of games, voice chat, and apps like Discord with no clear path back in.

The failures started overnight. Players report being ejected mid-session to complete age verification checks that then took hours, stalled indefinitely, or simply refused to work regardless of what identification they submitted.

Government ID, mobile numbers, live video age estimation: for many users, the system rejected them all. Others made it through verification only to find their accounts still restricted, with no explanation and no recourse beyond contacting Xbox support.

Microsoft’s support page now carries a notice confirming it is “aware of the issue and working to fix it.” That’s the extent of the official guidance.

The verification requirement exists to comply with the UK’s new censorship law, the Online Safety Act, legislation mandating that platforms facilitating online communication verify user ages. The actual system Xbox built to deliver that compliance forcibly disconnected players from games in progress, stripped away chat functionality with anyone outside their friends list, and blocked access to third-party services.

Users who have held Xbox accounts for over 18 years found themselves flagged for verification anyway. The system doesn’t consider account age, history, or any contextual signal that might indicate an adult user. Everyone gets treated as potentially underage until they hand over documentation.

“The amount of times I’ve tried to do any method of the verification tonight is stupid,” wrote one user. “Can’t change privacy settings on my Xbox to allow me to see mods on games too. Can’t chat on Discord. Utterly broken.”

“Been trying to verify my ID for the past few hours,” added another. “It finally worked but I can’t access anything still. No Discord access at all.”
