Face Biometrics Getting Deeper into Policing, Sparking Concerns

Those worried about the use of facial recognition by law enforcement have warned that the technology could become entrenched in bureaucracies, growing in use and becoming harder for those outside government to question.

A trio of recent reports, from Germany, the Netherlands, and the United Kingdom, seems to bear that out.

In Germany, a civil rights activist, Matthias Monroy, writing in his own blog, says a facial recognition system used to identify unknown people has grown “dramatically” from 2021 to 2022.

The database reportedly belongs to Germany’s federal police. According to Monroy, it was searched about 7,700 times in 2022, compared to 6,100 times in 2021.

About 2,800 people were identified using the police’s algorithm last year, compared to 1,300 in 2021.

Monroy says the Federal Ministry provided the figures in response to a parliamentary inquiry. He added that, according to the ministry, comparable data has not been received from the German states.

The images are gathered from CCTV cameras and from phones used by police to record the faces of suspects of crimes. Asylum seekers are in the same database.

The number of facial images in the police database reportedly grew by about 1.5 million last year, primarily because only 400,000 images were deleted.

If German police are starting to hold on to photos longer, they might be in good company.

Trade publication ComputerWeekly is reporting that some in the UK feel the government is adopting a biometrics “culture of retention.”

Keep reading

Dystopia Down Under: Facial Mood-Tracking CCTV Cameras Deployed at Mardi Gras Pride Parade

Surveillance cameras with the ability to measure a crowd’s “mood” and track the number of people by counting cell phone frequencies were deployed at the Lesbian and Gay Mardi Gras Parade on Saturday in Sydney, Australia.

The CCTV technology from the Dynamic Crowd Measurement firm was used to monitor Oxford Street in Sydney as the city’s LGBTQ-themed Mardi Gras parade was held for the first time since 2019, having been cancelled for the past three years due to the Chinese coronavirus.

According to a report from the Sydney Morning Herald, the cutting-edge cameras came equipped with the ability to track the mood of the crowd, with software being able to track facial expressions and determine whether they are displaying signs of happiness, anger, or neutrality.

The cameras also come with the ability to measure crowd density by counting the number of cell phone frequencies emitted in a given area. It was estimated that around 12,000 people attended on Saturday.

Keep reading

The Air Force’s Drones Can Now Recognize Faces. Uh-Oh.

The U.S. Air Force now has the capability to use facial recognition on drones that could target specific people. Special operations forces can use the drones to gather intelligence and to aid in other missions, according to a contract first spotted by New Scientist. It’s part of a growing movement to develop automated weaponry that raises legal and ethical questions.

The drone software maker, Seattle-based firm RealNetworks, claims the uncrewed craft will use artificial intelligence (AI) to fly itself and discriminate between friend and foe. The company has said that its software can also be used for rescue missions, perimeter protection, and domestic search operations.

The new Air Force system isn’t the only attempt to put facial recognition on drones. An Israeli company is developing a drone that uses AI to find the best angles for capturing faces.

Keep reading

Another Facial Recognition Snafu Leads to False Arrest, Wrongful Imprisonment; ACLU Asks Lawmakers to Ban Police Use

Instead of enjoying a late Thanksgiving meal with his mother in Georgia, Randal Reid spent nearly a week in jail in November after he was falsely identified as a luxury purse thief by Louisiana authorities using facial recognition technology.

That’s according to Monday reporting by NOLA.com, which caught the attention of Fight for the Future, a digital rights group that has long advocated against law enforcement and private entities using such technology, partly because of its shortcomings and the risk of outcomes like this.

“So much wrong here,” Fight for the Future said Tuesday, sharing the story on Twitter. The group highlighted that many cops can use facial recognition systems without publicly disclosing it, and anyone’s “life can be upended because of a machine’s mistake.”

Keep reading

‘Power Run Amok’: Madison Square Garden Uses Face-Scanning Tech to Remove Perceived Adversaries

Barbara Hart was celebrating her wedding anniversary and waiting for Brandi Carlile to take the stage at Madison Square Garden on Oct. 22, when a pair of security guards approached her and her husband by their seats and asked for the couple to follow them. At first, Hart tells Rolling Stone she was excited, thinking it was some sort of surprise before the concert started. Her excitement turned to anxiety soon after, however, as she spoke with security and gathered that she’d been identified using facial-recognition technology. Then they escorted her out of the venue.

Hart was initially confused, having no idea why she was flagged. She says security informed her that she was being ejected because of her job as an attorney at Grant & Eisenhofer, a law firm currently litigating against Madison Square Garden’s parent company in a Delaware class-action suit involving several groups of shareholders.

Madison Square Garden Entertainment, owned by James Dolan (who has been known to kick out fans who anger him), confirms to RS that it enacted a policy in recent months forbidding anyone in active litigation against the company from entry to the company’s venues — which include the New York arena that gives the company its name, along with Radio City Music Hall, Beacon Theatre, and the Chicago Theatre. The company’s use of facial recognition tools itself dates back to at least 2018, when the New York Times reported on it; anyone who enters the venue is subject to scanning, and that practice now seems to coincide with the policy against opposing litigants.

“This is retaliatory behavior of powerful people against others, and that should be concerning to us,” says Hart, who also spoke of the incident in a sworn affidavit last month, as Reuters reported. Hart recalls that she declined to give MSG security her ID, but that they were able to correctly identify her anyway; she says security mentioned her picture appearing on Grant & Eisenhofer’s website, leading her to the conclusion that facial recognition was involved. “It was a very eerie experience to be on the receiving end of at that moment.”

Keep reading

Facial Recognition Search Engine Pulls Up “Potentially Explicit” Photos of Kids

Abusive parents searching for kids who have fled to shelters. Governments targeting the sons and daughters of political dissidents. Pedophiles stalking the victims they encounter in illicit child sexual abuse material.

The online facial recognition search engine PimEyes allows anyone to search for images of children scraped from across the internet, raising a host of alarming possible uses, an Intercept investigation has found.

Often called the Google of facial recognition, PimEyes returns search results that include images the site labels as “potentially explicit,” which could lead to further exploitation of children at a time when the dark web has sparked an explosion of images of abuse.

“There are privacy issues raised by the use of facial recognition technology writ large,” said Jeramie Scott, director of the Surveillance Oversight Project at the Electronic Privacy Information Center. “But it’s particularly dangerous when we’re talking about children, when someone may use that to identify a child and to track them down.”

Over the past few years, several child victim advocacy groups have pushed for police use of surveillance technologies to fight trafficking, arguing that facial recognition can help authorities locate victims. One child abuse prevention nonprofit, Ashton Kutcher and Demi Moore’s Thorn, has even developed its own facial recognition tool. But searches on PimEyes for 30 AI-generated children’s faces yielded dozens of pages of results, showing how easily those same tools can be turned against the people they’re designed to help.

Keep reading

Nebraska wants to test body and facial scans that work from a distance

The state of Nebraska is planning to test whole-body and facial recognition technology from far-off sensors. The project, funded by the Department of Defense, aims to test the accuracy of AI in identifying subjects from images and videos captured by stationary towers and drones positioned far from the subjects.

The project is backed by the Intelligence Advanced Research Projects Activity (IARPA) as part of its Biometric Recognition and Identification at Altitude or Range, aka Briar, program. The first phase of the three-part program, dubbed WatchID, will run for 18 months.

Researchers from the University of Nebraska’s Omaha and Lincoln campuses, University of Maryland College Park, Resonant Sciences, and BlueHalo Co. will participate in WatchID. The program will require 200 volunteers, who will stand and walk in circles and straight lines in an open space. If the first phase is successful, the program will expand to 600 volunteers.

Keep reading

Staten Island DA Bought Clearview Face Recognition Software with Civil Forfeiture Cash

The Staten Island district attorney’s use of the highly controversial Clearview face recognition system included attempts to dig up the social media accounts of homicide victims and was paid for with equally controversial asset forfeiture cash, according to city records provided to The Intercept.

Clearview has garnered international attention and intense criticism for its simple premise: What if you could instantly identify anyone in the world with only their picture? Using billions of images scraped from social media sites, Clearview sells police and other governmental agencies the ability to match a photo to a name using face recognition, no search warrant required — a power civil libertarians and privacy advocates say simply places too much unsupervised power in the hands of police.

The use of Clearview by the Staten Island district attorney’s office was first reported by Gothamist, citing city records obtained by the Legal Aid Society. Subsequent records procured via New York State Freedom of Information Law request and provided to The Intercept now confirm the initial concerns about the tool’s largely unsupervised use by prosecutors. According to spokesperson Ryan Lavis, the DA’s office “completely stopped utilizing Clearview as an investigative tool last year.”

Yet the documents provide new information about how Staten Island prosecutors used the notorious face recognition tool and show that the software was paid for with funds furnished by the Justice Department’s Equitable Sharing Program. The program lets state and local police hand seized cash and property over to a federal law enforcement agency, whereupon up to 80 percent of the proceeds are sent back for the original state or local department to pocket.

Keep reading

Ukraine is scanning faces of dead Russians, then contacting the mothers

Ukrainian officials have run more than 8,600 facial recognition searches on dead or captured Russian soldiers in the 50 days since Moscow’s invasion began, using the scans to identify bodies and contact hundreds of their families in what may be one of the most gruesome applications of the technology to date.

The country’s IT Army, a volunteer force of hackers and activists that takes its direction from the Ukrainian government, says it has used those identifications to inform the families of the deaths of 582 Russians, including by sending them photos of the abandoned corpses.

The Ukrainians champion the use of face-scanning software from the U.S. tech firm Clearview AI as a brutal but effective way to stir up dissent inside Russia, discourage other fighters and hasten an end to a devastating war.

Keep reading

The creeping authoritarianism of facial recognition

In an effort to lower crime rates, American law enforcement is pushing to combine facial recognition with expanded video surveillance. Politicians worried about their re-election chances amid a perceived crime wave see the expansion as necessary. It’s a sharp swing from 2019 and 2020, when cities like San Francisco and New Orleans were banning, or at least enacting limits on, facial recognition technology due to privacy concerns.

Now, New Orleans plans to roll back its facial recognition prohibition. The Virginia State Senate gave law enforcement a late Valentine’s Day gift by passing a facial recognition expansion bill on February 15 — the Democrats who unanimously approved a ban on facial recognition last year suddenly changed their minds, as did five Republicans. New York City wants to expand its facial recognition program to fight gun violence.

Law enforcement has a long history of pining for any tool that might give it some sort of edge, citizen due process be damned. Supporters avow that the technology will help investigators find violent crime suspects, including those involved in the January 6 storming of the US Capitol. OneZero reported in 2020 that Wolfcom promoted its real-time face tracking software as perfect for police organizations looking to quickly identify suspects with outstanding warrants.

Keep reading