ABUSIVE PARENTS SEARCHING for kids who have fled to shelters. Governments targeting the sons and daughters of political dissidents. Pedophiles stalking the victims they encounter in illicit child sexual abuse material.
The online facial recognition search engine PimEyes allows anyone to search for images of children scraped from across the internet, raising a host of alarming possible uses, an Intercept investigation has found.
Search results on PimEyes, often called the Google of facial recognition, include images that the site labels as “potentially explicit,” which could lead to further exploitation of children at a time when the dark web has sparked an explosion of images of abuse.
“There are privacy issues raised by the use of facial recognition technology writ large,” said Jeramie Scott, director of the Surveillance Oversight Project at the Electronic Privacy Information Center. “But it’s particularly dangerous when we’re talking about children, when someone may use that to identify a child and to track them down.”
Over the past few years, several child victim advocacy groups have pushed for police to use surveillance technologies to fight trafficking, arguing that facial recognition can help authorities locate victims. One child abuse prevention nonprofit, Ashton Kutcher and Demi Moore’s Thorn, has even developed its own facial recognition tool. But searches on PimEyes for 30 AI-generated children’s faces yielded dozens of pages of results, showing how easily the same technology can be turned against the people it’s designed to help.