The Air Force’s Drones Can Now Recognize Faces. Uh-Oh.

The U.S. Air Force now has the capability to use facial recognition on drones that could target specific people. Special operations forces can use the drones to gather intelligence and to aid in other missions, according to a contract first spotted by New Scientist. It’s part of a growing movement to develop automated weaponry that raises legal and ethical questions.

The drone software maker, Seattle-based firm RealNetworks, claims the uncrewed craft will use artificial intelligence (AI) to fly itself and discriminate between friend and foe. The company has said that its software can also be used for rescue missions, perimeter protection, and domestic search operations.

The new Air Force system isn’t the only drone effort to use facial recognition. An Israeli company is developing a drone that uses AI to find the best camera angles for capturing faces.

Keep reading

Another Facial Recognition Snafu Leads to False Arrest, Wrongful Imprisonment; ACLU Asks Lawmakers to Ban Police Use

Instead of enjoying a late Thanksgiving meal with his mother in Georgia, Randal Reid spent nearly a week in jail in November after he was falsely identified as a luxury purse thief by Louisiana authorities using facial recognition technology.

That’s according to Monday reporting by NOLA.com, which caught the attention of Fight for the Future, a digital rights group that has long advocated against law enforcement and private entities using such technology, partly because of its shortcomings and the risk of outcomes like this.

“So much wrong here,” Fight for the Future said Tuesday, sharing the story on Twitter. The group highlighted that many cops can use facial recognition systems without publicly disclosing it, and anyone’s “life can be upended because of a machine’s mistake.”

Keep reading

‘Power Run Amok’: Madison Square Garden Uses Face-Scanning Tech to Remove Perceived Adversaries

Barbara Hart was celebrating her wedding anniversary and waiting for Brandi Carlile to take the stage at Madison Square Garden on Oct. 22 when a pair of security guards approached her and her husband at their seats and asked the couple to follow them. At first, Hart tells Rolling Stone, she was excited, thinking it was some sort of surprise before the concert started. Her excitement soon turned to anxiety, however, as she spoke with security and gathered that she’d been identified using facial-recognition technology. Then they escorted her out of the venue.

Hart was initially confused, having no idea why she was flagged. She says security informed her that she was being ejected because of her job as an attorney at Grant & Eisenhofer, a law firm currently litigating against Madison Square Garden’s parent company in a Delaware class-action suit involving several groups of shareholders.

Madison Square Garden Entertainment, owned by James Dolan (who has been known to kick out fans who anger him), confirms to RS that it enacted a policy in recent months forbidding anyone in active litigation against the company from entry to the company’s venues — which include the New York arena that gives the company its name, along with Radio City Music Hall, Beacon Theatre, and the Chicago Theatre. The company’s use of facial recognition tools itself dates back to at least 2018, when the New York Times reported on it; anyone who enters the venue is subject to scanning, and that practice now seems to coincide with the policy against opposing litigants.

“This is retaliatory behavior of powerful people against others, and that should be concerning to us,” says Hart, who also spoke of the incident in a sworn affidavit last month, as Reuters reported. Hart recalls that she declined to give MSG security her ID, but that they were able to correctly identify her anyway; she says security mentioned her picture appearing on Grant & Eisenhofer’s website, leading her to the conclusion that facial recognition was involved. “It was a very eerie experience to be on the receiving end of at that moment.”

Keep reading

Facial Recognition Search Engine Pulls Up “Potentially Explicit” Photos of Kids

Abusive parents searching for kids who have fled to shelters. Governments targeting the sons and daughters of political dissidents. Pedophiles stalking the victims they encounter in illicit child sexual abuse material.

The online facial recognition search engine PimEyes allows anyone to search for images of children scraped from across the internet, raising a host of alarming possible uses, an Intercept investigation has found.

PimEyes, often called the Google of facial recognition, returns search results that include images the site labels as “potentially explicit,” which could lead to further exploitation of children at a time when the dark web has sparked an explosion of images of abuse.

“There are privacy issues raised by the use of facial recognition technology writ large,” said Jeramie Scott, director of the Surveillance Oversight Project at the Electronic Privacy Information Center. “But it’s particularly dangerous when we’re talking about children, when someone may use that to identify a child and to track them down.”

Over the past few years, several child victim advocacy groups have pushed for police use of surveillance technologies to fight trafficking, arguing that facial recognition can help authorities locate victims. One child abuse prevention nonprofit, Ashton Kutcher and Demi Moore’s Thorn, has even developed its own facial recognition tool. But searches on PimEyes for 30 AI-generated children’s faces yielded dozens of pages of results, showing how easily those same tools can be turned against the people they’re designed to help.

Keep reading

Nebraska wants to test body and facial scans that work from a distance

The state of Nebraska is planning to test whole-body and facial recognition technology from far-off sensors. The project, funded by the Department of Defense, aims to test the accuracy of AI in identifying subjects from images and videos captured by stationary towers and drones positioned far from the subjects.

The project is backed by the Intelligence Advanced Research Projects Activity (IARPA) as part of its Biometric Recognition and Identification at Altitude and Range (BRIAR) program. The first of the program’s three phases, dubbed WatchID, will run for 18 months.

Researchers from the University of Nebraska’s Omaha and Lincoln campuses, the University of Maryland, College Park, Resonant Sciences, and BlueHalo Co. will participate in WatchID. The program will require 200 volunteers to stand and walk in circles and straight lines in an open space. If the first phase is successful, the study will expand to 600 volunteers.

Keep reading

Staten Island DA Bought Clearview Face Recognition Software With Civil Forfeiture Cash

The Staten Island district attorney’s use of the highly controversial Clearview face recognition system included attempts to dig up the social media accounts of homicide victims and was paid for with equally controversial asset forfeiture cash, according to city records provided to The Intercept.

Clearview has garnered international attention and intense criticism for its simple premise: What if you could instantly identify anyone in the world with only their picture? Using billions of images scraped from social media sites, Clearview sells police and other governmental agencies the ability to match a photo to a name using face recognition, no search warrant required — a power civil libertarians and privacy advocates say simply places too much unsupervised power in the hands of police.

The use of Clearview by the Staten Island district attorney’s office was first reported by Gothamist, citing city records obtained by the Legal Aid Society. Subsequent records procured via New York State Freedom of Information Law request and provided to The Intercept now confirm the initial concerns about the tool’s largely unsupervised use by prosecutors. According to spokesperson Ryan Lavis, the DA’s office “completely stopped utilizing Clearview as an investigative tool last year.”

Yet the documents provide new information about how Staten Island prosecutors used the notorious face recognition tool and show that the software was paid for with funds furnished by the Justice Department’s Equitable Sharing Program. The program lets state and local police hand seized cash and property over to a federal law enforcement agency, which then returns up to 80 percent of the proceeds to the original state or local department to pocket.

Keep reading

Ukraine is scanning faces of dead Russians, then contacting the mothers

Ukrainian officials have run more than 8,600 facial recognition searches on dead or captured Russian soldiers in the 50 days since Moscow’s invasion began, using the scans to identify bodies and contact hundreds of their families in what may be one of the most gruesome applications of the technology to date.

The country’s IT Army, a volunteer force of hackers and activists that takes its direction from the Ukrainian government, says it has used those identifications to inform the families of the deaths of 582 Russians, including by sending them photos of the abandoned corpses.

The Ukrainians champion the use of face-scanning software from the U.S. tech firm Clearview AI as a brutal but effective way to stir up dissent inside Russia, discourage other fighters and hasten an end to a devastating war.

Keep reading

The creeping authoritarianism of facial recognition

In an effort to lower crime rates, American law enforcement is pushing to combine facial recognition with expanded video surveillance. Politicians worried about their re-election chances amid a perceived crime wave see the expansion as necessary. It’s a sharp swing from 2019 and 2020, when cities like San Francisco and New Orleans banned facial recognition technology, or at least limited its use, over privacy concerns.

Now, New Orleans plans to roll back its facial recognition prohibition. The Virginia State Senate gave law enforcement a late Valentine’s Day gift by passing a facial recognition expansion bill on February 15 — the Democrats who unanimously approved a ban on facial recognition last year suddenly changed their minds, as did five Republicans. New York City wants to expand its facial recognition program to fight gun violence.

Law enforcement has a long history of pining for any tool that might give it some sort of edge, citizen due process be damned. Supporters avow that the technology will help investigators find violent crime suspects, including those involved in the January 6 storming of the US Capitol. OneZero reported in 2020 that Wolfcom promoted its real-time face tracking software as perfect for police organizations looking to quickly identify suspects with outstanding warrants.

Keep reading

IRS To Require Facial Recognition To View Tax Returns

The US Internal Revenue Service (IRS) has partnered with a Virginia-based private identification firm that requires, among other things, a facial-recognition selfie in order to create or access online accounts with the agency.

According to KrebsOnSecurity, the IRS announced that by the summer of 2022, the only way to log into irs.gov will be through ID.me. Founded by former Army Rangers in 2010, the McLean-based company has evolved into providing online ID verification services that several states use to help reduce unemployment and pandemic-assistance fraud. The company claims to have 64 million users.

Keep reading