Amazon patents show new level of surveillance

Amazon has filed 17 new patents for biometric technology intended to help its doorbell cameras identify “suspicious” people by scent, skin texture, fingerprints, eyes, voice, and gait.

The tech giant has been developing its doorbell security camera system since 2018, when it acquired Ring and, with it, the underlying technology. According to media reports, Jeff Bezos’ company is now preparing to enable the devices to identify “suspicious” people with the help of biometric technology, based on skin texture, gait, fingerprint, voice, retina, iris, and even odor.

On top of that, if Amazon’s new patents are anything to go by, all Ring doorbell cameras in a given neighborhood would be interconnected, sharing data with each other and creating a composite image of “suspicious” individuals.

One of the patents, described in the media as a “neighborhood alert mode,” would allow users in one household to send photos and videos of someone they deem “suspicious” to their neighbors’ Ring cameras so that they, too, start recording and can assemble a “series of ‘storyboard’ images for activity taking place across the fields of view of multiple cameras.”
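The patent’s actual method is not public, but the core of such a cross-camera “storyboard” can be pictured as merging timestamped frames from several cameras into one chronological sequence. A minimal sketch, with the `Frame` type and all field names invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Frame:
    camera_id: str      # which doorbell camera captured the frame
    timestamp: float    # seconds since the alert was raised
    image_ref: str      # reference to the stored image

def build_storyboard(frames_by_camera: dict) -> list:
    """Merge per-camera frame lists into one time-ordered sequence."""
    all_frames = [f for frames in frames_by_camera.values() for f in frames]
    return sorted(all_frames, key=lambda f: f.timestamp)

# Two neighboring cameras contribute frames; the storyboard interleaves them.
story = build_storyboard({
    "cam_porch": [Frame("cam_porch", 0.0, "img_001"),
                  Frame("cam_porch", 4.5, "img_002")],
    "cam_nextdoor": [Frame("cam_nextdoor", 2.1, "img_101")],
})
print([f.image_ref for f in story])  # ['img_001', 'img_101', 'img_002']
```

The point is only that once neighboring devices share a clock and a data channel, stitching their feeds into one narrative of a person’s movements is trivial.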

Aside from the possible future interconnectivity among the Ring devices themselves, Amazon’s doorbell cameras, as it stands now, already exchange information with 1,963 police and 383 fire departments across the US, according to Business Insider. Authorities do not even need a warrant to access Ring footage.

Keep reading

You’d Better Watch Out: The Surveillance State Has a Naughty List, and You’re On It

“He sees you when you’re sleeping
He knows when you’re awake
He knows when you’ve been bad or good
So be good for goodness’ sake!”

—“Santa Claus Is Coming to Town”

Santa’s got a new helper.

No longer does the all-knowing, all-seeing, jolly Old St. Nick need to rely on antiquated elves on shelves and other seasonal snitches in order to know when you’re sleeping or awake, and if you’ve been naughty or nice.

Thanks to the government’s almost limitless powers made possible by a domestic army of techno-tyrants, fusion centers and Peeping Toms, Santa can get real-time reports on who’s been good or bad this year. This creepy new era of government/corporate spying—in which we’re being listened to, watched, tracked, followed, mapped, bought, sold and targeted—makes the NSA’s rudimentary phone and metadata surveillance appear almost antiquated in comparison.

Consider just a small sampling of the tools being used to track our movements, monitor our spending, and sniff out all the ways in which our thoughts, actions and social circles might land us on the government’s naughty list.

Keep reading

Australia Traded Away Too Much Liberty

Up to now one of Earth’s freest societies, Australia has become a hermit continent. How long can a country maintain emergency restrictions on its citizens’ lives while still calling itself a liberal democracy?

Australia has been testing the limits.

Before 2020, the idea of Australia all but forbidding its citizens from leaving the country, a restriction associated with Communist regimes, was unthinkable. Today, it is a widely accepted policy. “Australia’s borders are currently closed and international travel from Australia remains strictly controlled to help prevent the spread of COVID-19,” a government website declares. “International travel from Australia is only available if you are exempt or you have been granted an individual exemption.” The rule is enforced despite assurances on another government website, dedicated to setting forth Australia’s human-rights-treaty obligations, that the freedom to leave a country “cannot be made dependent on establishing a purpose or reason for leaving.”

The nation’s high court rejected a challenge to the country’s COVID-19 restrictions. “It may be accepted that the travel restrictions are harsh. It may also be accepted that they intrude upon individual rights,” it ruled. “But Parliament was aware of that.” Until last month, Australians who reside in foreign countries were exempt from the rule so that they could return home. But the government then tightened the restrictions further, trapping many of them in the country too.

Travel within Australia is also severely restricted. And the government of South Australia, one of the country’s six states, developed and is now testing an app as Orwellian as any in the free world to enforce its quarantine rules. People in South Australia will be forced to download an app that combines facial recognition and geolocation. The state will text them at random times, and thereafter they will have 15 minutes to take a picture of their face in the location where they are supposed to be. Should they fail, the local police department will be sent to follow up in person. “We don’t tell them how often or when, on a random basis they have to reply within 15 minutes,” Premier Steven Marshall explained. “I think every South Australian should feel pretty proud that we are the national pilot for the home-based quarantine app.”
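The check-in logic as described is simple to state: a random ping, a 15-minute reply window, a face match, and a location check, with failure escalated to an in-person police visit. A minimal sketch of that decision rule, assuming the 15-minute window from the article and with all function names and inputs invented for illustration:

```python
CHECKIN_WINDOW_SECONDS = 15 * 60  # the article's 15-minute reply window

def evaluate_checkin(ping_time, reply_time, face_matches, at_quarantine_address):
    """Return 'ok' for a timely, verified reply; otherwise 'escalate'.

    ping_time / reply_time are seconds on a shared clock; reply_time is
    None if the person never responded.
    """
    on_time = (reply_time is not None
               and reply_time - ping_time <= CHECKIN_WINDOW_SECONDS)
    if on_time and face_matches and at_quarantine_address:
        return "ok"
    return "escalate"  # per the article: police follow up in person

# Compliant reply 10 minutes after the ping:
print(evaluate_checkin(0, 600, True, True))     # ok
# No reply at all within the window:
print(evaluate_checkin(0, None, False, False))  # escalate
```

Even this toy version makes the design plain: the burden of proof sits with the citizen, on the state’s schedule, with surveillance hardware as the arbiter.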

Keep reading

Army Wants to Install Facial Recognition, Video Analytics at Child Development Centers

The Army wants to use facial recognition and advanced machine learning algorithms to monitor kids at base Child Development Centers (CDCs) and plans to launch a pilot program at Fort Jackson in the near future.

Army contracting officers posted a solicitation to SAM.gov for a vendor capable of developing a facial recognition and video analytics system and integrating that with the Fort Jackson CDC’s closed-circuit television system.

If successful, the system will be used for “monitoring the health and well-being of children in the CDC,” according to the performance work statement.

“The use of close-circuit television video-recording is common in CDCs for security purposes, however these feeds are not continually monitored during all hours of operation in live time,” the solicitation notes. “Instead, CDC staff log scheduled hours by watching the live video feeds periodically throughout the day for the mandated metrics.”

The center hopes that adding video analytics to the CCTV system will allow continuous monitoring of students, “used as an addition to the human CCTV monitoring,” with automatic alerts to staff as situations arise.

Keep reading

Documentary Exposes How Facial Recognition Tech Doesn’t See Dark Faces Accurately

What’s not to love about artificial intelligence (AI)?

  • Millions of jobs being lost
  • Censorship, unwarranted surveillance, and other unethical and dangerous applications
  • Inaccuracies that can lead to life-altering consequences

One documentary reveals more unscrupulous details:

CODED BIAS explores the fallout of MIT Media Lab researcher Joy Buolamwini’s discovery that facial recognition does not see dark-skinned faces accurately, and her journey to push for the first-ever legislation in the U.S. to govern against bias in the algorithms that impact us all.

Modern society sits at the intersection of two crucial questions: What does it mean when artificial intelligence increasingly governs our liberties? And what are the consequences for the people AI is biased against? When MIT Media Lab researcher Joy Buolamwini discovers that most facial-recognition software does not accurately identify darker-skinned faces and the faces of women, she delves into an investigation of widespread bias in algorithms. As it turns out, artificial intelligence is not neutral, and women are leading the charge to ensure our civil rights are protected.

Keep reading

This is how we lost control of our faces

In 1964, mathematician and computer scientist Woodrow Bledsoe first attempted the task of matching suspects’ faces to mugshots. He measured out the distances between different facial features in printed photographs and fed them into a computer program. His rudimentary successes would set off decades of research into teaching machines to recognize human faces.

Now a new study shows just how much this enterprise has eroded our privacy. It hasn’t just fueled an increasingly powerful tool of surveillance. The latest generation of deep-learning-based facial recognition has completely disrupted our norms of consent.

Deborah Raji, a fellow at nonprofit Mozilla, and Genevieve Fried, who advises members of the US Congress on algorithmic accountability, examined over 130 facial-recognition data sets compiled over 43 years. They found that researchers, driven by the exploding data requirements of deep learning, gradually abandoned asking for people’s consent. This has led more and more of people’s personal photos to be incorporated into systems of surveillance without their knowledge.

It has also led to far messier data sets: they may unintentionally include photos of minors, use racist and sexist labels, or have inconsistent quality and lighting. The trend could help explain the growing number of cases in which facial-recognition systems have failed with troubling consequences, such as the false arrests of two Black men in the Detroit area last year.

People were extremely cautious about collecting, documenting, and verifying face data in the early days, says Raji. “Now we don’t care anymore. All of that has been abandoned,” she says. “You just can’t keep track of a million faces. After a certain point, you can’t even pretend that you have control.”

Keep reading

DHS’s Facial/Iris Recognition Can ID Airline Passengers Wearing Masks

It is official: unless airline passengers are willing to wear motorcycle helmets or Daft Punk-style masks, the Feds can use facial and iris recognition to identify nearly everyone.

According to an S&T press release, a pilot program run by DHS demonstrated that facial/iris recognition can be used to identify airline passengers.

The in-person rally, held at the Maryland Test Facility (MdTF), included 10 days of human testing during which six face and/or iris acquisition systems and 13 matching algorithms were tested with help from 582 diverse test volunteers representing 60 countries.

What is DHS’s so-called motivation to ID everyone?

Keep reading

Stanford researchers claim new facial tracking software can determine your political affiliation

Because artificial intelligence wasn’t already frightening enough, researchers decided to teach computers how to identify a person’s political ideology based upon their facial appearance and expressions.

The study was led by Stanford researcher Michal Kosinski, who already caused a stir in 2017 by programming machines that could determine whether you are gay or straight based on your appearance.

Keep reading

DHS Works to Improve Biometric Scanning of Masked Faces

Perhaps the most-worn accessories of 2020, face masks mark an unexpected new constant in people’s lives, providing necessary protection against COVID-19. But they’re also known to pose some trouble for contemporary facial recognition systems.

The Homeland Security Department, one of the government’s biggest biometrics systems users, is now steering research to confront the complexities limiting existing technology and help push forward tools to safely verify people’s identities at security checkpoints in a pandemic.    

Initial results from one recent effort “are actually quite promising,” according to Arun Vemury, director of DHS’ Biometric and Identity Technology Center. 

“We’re getting to the point with this technology, where at least from the preliminary results, it looks like there’s some combinations of biometric acquisition systems, the camera systems and the matching algorithms—when you combine them together, you could match eight or nine out of 10 people without asking them to remove their masks,” Vemury told Nextgov during a recent interview. “This means that for the vast majority of people in airports, they might not have to remove their masks anymore to even go through the security checks, and we could do a really good job of still matching them. So, I think it’s very promising from that perspective. Is it 100%? Is it perfect? No. But it reduces the number of people who potentially have to take their masks off.”

Keep reading