Australia Traded Away Too Much Liberty

Until recently one of Earth’s freest societies, Australia has become a hermit continent. How long can a country maintain emergency restrictions on its citizens’ lives while still calling itself a liberal democracy?

Australia has been testing the limits.

Before 2020, the idea of Australia all but forbidding its citizens from leaving the country, a restriction associated with Communist regimes, was unthinkable. Today, it is a widely accepted policy. “Australia’s borders are currently closed and international travel from Australia remains strictly controlled to help prevent the spread of COVID-19,” a government website declares. “International travel from Australia is only available if you are exempt or you have been granted an individual exemption.” The rule is enforced despite assurances on another government website, dedicated to setting forth Australia’s human-rights-treaty obligations, that the freedom to leave a country “cannot be made dependent on establishing a purpose or reason for leaving.”

The nation’s high court rejected a challenge to the country’s COVID-19 restrictions. “It may be accepted that the travel restrictions are harsh. It may also be accepted that they intrude upon individual rights,” it ruled. “But Parliament was aware of that.” Until last month, Australians residing in foreign countries were exempt from the rule, so they could return to their homes abroad. But the government has since tightened the restrictions further, trapping many of them in the country too.

Intrastate travel within Australia is also severely restricted. And the government of South Australia, one of the country’s six states, developed and is now testing an app as Orwellian as any in the free world to enforce its quarantine rules. People in South Australia will be forced to download an app that combines facial recognition and geolocation. The state will text them at random times, and thereafter they will have 15 minutes to take a picture of their face in the location where they are supposed to be. Should they fail, the local police department will be sent to follow up in person. “We don’t tell them how often or when, on a random basis they have to reply within 15 minutes,” Premier Steven Marshall explained. “I think every South Australian should feel pretty proud that we are the national pilot for the home-based quarantine app.”
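
The mechanics the Premier describes boil down to a simple challenge-response loop: the state texts at a random time, the resident has 15 minutes to return a face-matched, geolocated selfie, and any failure triggers a police visit. Here is a minimal sketch of that loop in Python; every function name (send_prompt, await_selfie, verify_face, within_geofence, escalate) is a hypothetical stand-in, since the app’s actual internals are not public.

```python
import time

CHECK_IN_WINDOW = 15 * 60  # the 15-minute response window, in seconds


def run_random_check(send_prompt, await_selfie, verify_face, within_geofence, escalate):
    """One randomly timed quarantine check, following the workflow described above.

    Every callable is a hypothetical stand-in: send_prompt for the random text,
    await_selfie for the camera capture (returning an object with .image and
    .gps_fix, or None on timeout), verify_face and within_geofence for the
    app's facial-recognition and geolocation checks, and escalate for the
    in-person police follow-up.
    """
    send_prompt("Quarantine check: submit a photo within 15 minutes.")
    started = time.monotonic()
    capture = await_selfie(timeout=CHECK_IN_WINDOW)

    passed = (
        capture is not None
        and time.monotonic() - started <= CHECK_IN_WINDOW
        and verify_face(capture.image)        # is this the registered person?
        and within_geofence(capture.gps_fix)  # are they where they are supposed to be?
    )
    if not passed:
        escalate()  # "the local police department will be sent to follow up in person"
    return passed
```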

Keep reading

Army Wants to Install Facial Recognition, Video Analytics at Child Development Centers

The Army wants to use facial recognition and advanced machine learning algorithms to monitor kids at base Child Development Centers (CDCs) and plans to launch a pilot program at Fort Jackson in the near future.

Army contracting officers posted a solicitation to SAM.gov for a vendor capable of developing a facial recognition and video analytics system and integrating that with the Fort Jackson CDC’s closed-circuit television system.

If successful, the system will be used for “monitoring the health and well-being of children in the CDC,” according to the performance work statement.

“The use of closed-circuit television video-recording is common in CDCs for security purposes, however these feeds are not continually monitored during all hours of operation in live time,” the solicitation notes. “Instead, CDC staff log scheduled hours by watching the live video feeds periodically throughout the day for the mandated metrics.”

The Army hopes that adding video analytics to the CCTV system will allow for continuous monitoring of the children, “used as an addition to the human CCTV monitoring,” with the system automatically alerting staff to situations as they arise.
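
The solicitation’s details aren’t public beyond the performance work statement, but “adding video analytics to the CCTV system” generally means running a detector over every frame of the live feed and raising an alert when a configured rule fires. A generic sketch using OpenCV (not the vendor system the Army is soliciting):

```python
import cv2  # OpenCV, a common choice for reading CCTV/RTSP feeds

# A face detector that ships with OpenCV, standing in for whatever analytics
# model the eventual vendor system would actually use.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)


def monitor(feed_url, alert_staff, expected_min_faces=1):
    """Continuously analyze a CCTV feed and alert staff when a simple rule
    fires (here: fewer faces visible than expected)."""
    cap = cv2.VideoCapture(feed_url)
    while True:
        ok, frame = cap.read()
        if not ok:
            alert_staff("camera feed lost")
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        if len(faces) < expected_min_faces:
            alert_staff(f"only {len(faces)} face(s) detected in frame")
```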

Keep reading

Documentary Exposes How Facial Recognition Tech Doesn’t See Dark Faces Accurately

What’s NOT to love about Artificial Intelligence (AI)?

  • Millions of jobs being lost
  • Censorship, unwarranted surveillance, and other unethical and dangerous applications
  • Inaccuracies that can lead to life-altering consequences

One documentary reveals more unscrupulous details:

CODED BIAS explores the fallout of MIT Media Lab researcher Joy Buolamwini’s discovery that facial recognition does not see dark-skinned faces accurately, and her journey to push for the first-ever legislation in the U.S. to govern against bias in the algorithms that impact us all.

Modern society sits at the intersection of two crucial questions: What does it mean when artificial intelligence increasingly governs our liberties? And what are the consequences for the people AI is biased against? When MIT Media Lab researcher Joy Buolamwini discovers that most facial-recognition software does not accurately identify darker-skinned faces and the faces of women, she delves into an investigation of widespread bias in algorithms. As it turns out, artificial intelligence is not neutral, and women are leading the charge to ensure our civil rights are protected.
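
The technical core of Buolamwini’s finding is disaggregated evaluation: compute the same error metric separately for each demographic group instead of reporting one overall accuracy number. A minimal sketch, using made-up evaluation records rather than her actual audit data:

```python
from collections import defaultdict


def error_rate_by_group(records):
    """records: iterable of (group, predicted_id, true_id) tuples.
    Returns the misidentification rate per group, i.e. the disaggregated
    evaluation at the heart of audits like this one."""
    errors, totals = defaultdict(int), defaultdict(int)
    for group, predicted, actual in records:
        totals[group] += 1
        if predicted != actual:
            errors[group] += 1
    return {group: errors[group] / totals[group] for group in totals}


# Illustrative, made-up records: a system that looks accurate overall can
# still fail far more often for one group than for another.
records = (
    [("lighter-skinned men", "A", "A")] * 98
    + [("lighter-skinned men", "B", "A")] * 2
    + [("darker-skinned women", "A", "A")] * 65
    + [("darker-skinned women", "B", "A")] * 35
)
print(error_rate_by_group(records))  # error rates of 0.02 vs. 0.35
```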

Keep reading

This is how we lost control of our faces

In 1964, mathematician and computer scientist Woodrow Bledsoe first attempted the task of matching suspects’ faces to mugshots. He measured out the distances between different facial features in printed photographs and fed them into a computer program. His rudimentary successes would set off decades of research into teaching machines to recognize human faces.
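
Bledsoe’s method, as described in later accounts, amounted to representing each face as a vector of hand-measured distances between landmarks and picking the mugshot whose vector lay closest. A small sketch in that spirit (not his original code, and assuming every record carries the same set of landmarks):

```python
import math


def distance_vector(landmarks):
    """Turn hand-measured landmark coordinates, e.g.
    {"left_eye": (32, 40), "right_eye": (58, 41), ...}, into a vector of
    pairwise distances. Every face must use the same landmark names."""
    names = sorted(landmarks)
    return [
        math.dist(landmarks[a], landmarks[b])
        for i, a in enumerate(names)
        for b in names[i + 1:]
    ]


def closest_mugshot(suspect_landmarks, mugshot_db):
    """Return the name of the mugshot whose distance vector is nearest
    the suspect's. mugshot_db maps a name to its landmark dict."""
    suspect_vec = distance_vector(suspect_landmarks)
    return min(
        mugshot_db.items(),
        key=lambda item: math.dist(suspect_vec, distance_vector(item[1])),
    )[0]
```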

Now a new study shows just how much this enterprise has eroded our privacy. It hasn’t just fueled an increasingly powerful tool of surveillance. The latest generation of deep-learning-based facial recognition has completely disrupted our norms of consent.

Deborah Raji, a fellow at nonprofit Mozilla, and Genevieve Fried, who advises members of the US Congress on algorithmic accountability, examined over 130 facial-recognition data sets compiled over 43 years. They found that researchers, driven by the exploding data requirements of deep learning, gradually abandoned asking for people’s consent. This has led more and more of people’s personal photos to be incorporated into systems of surveillance without their knowledge.

It has also led to far messier data sets: they may unintentionally include photos of minors, use racist and sexist labels, or have inconsistent quality and lighting. The trend could help explain the growing number of cases in which facial-recognition systems have failed with troubling consequences, such as the false arrests of two Black men in the Detroit area last year.

People were extremely cautious about collecting, documenting, and verifying face data in the early days, says Raji. “Now we don’t care anymore. All of that has been abandoned,” she says. “You just can’t keep track of a million faces. After a certain point, you can’t even pretend that you have control.”

Keep reading

DHS’s Facial/Iris Recognition Can ID Airline Passengers Wearing Masks

It is official: unless airline passengers are willing to wear motorcycle helmets or Daft Punk-style masks, the Feds can use facial and iris recognition to identify nearly everyone.

According to a DHS Science and Technology Directorate (S&T) press release, a pilot program run by the department shows that facial and iris recognition can identify airline passengers even when they are wearing masks.

The in-person rally, held at the Maryland Test Facility (MdTF), included 10 days of human testing during which six face and/or iris acquisition systems and 13 matching algorithms were tested with help from 582 diverse test volunteers representing 60 countries.

So what is DHS’s supposed motivation for identifying everyone?

Keep reading

Stanford researchers claim new facial tracking software can determine your political affiliation

As if artificial intelligence weren’t already frightening enough, researchers have decided to teach computers how to identify a person’s political ideology based on their facial appearance and expressions.

The study was led by Stanford researcher Michal Kosinski, who already caused a stir in 2017 by programming machines that could determine whether you are gay or straight based on your appearance.

Keep reading

DHS Works to Improve Biometric Scanning of Masked Faces

Perhaps the most-worn accessories of 2020, face masks have become an unexpected new constant in people’s lives, providing necessary protection against COVID-19, but they are also known to pose trouble for contemporary facial recognition systems.

The Homeland Security Department, one of the government’s biggest biometrics systems users, is now steering research to confront the complexities limiting existing technology and help push forward tools to safely verify people’s identities at security checkpoints in a pandemic.    

Initial results from one recent effort “are actually quite promising,” according to Arun Vemury, director of DHS’ Biometric and Identity Technology Center. 

“We’re getting to the point with this technology, where at least from the preliminary results, it looks like there’s some combinations of biometric acquisition systems, the camera systems and the matching algorithms—when you combine them together, you could match eight or nine out of 10 people without asking them to remove their masks,” Vemury told Nextgov during a recent interview. “This means that for the vast majority of people in airports, they might not have to remove their masks anymore to even go through the security checks, and we could do a really good job of still matching them. So, I think it’s very promising from that perspective. Is it 100%? Is it perfect? No. But it reduces the number of people who potentially have to take their masks off.”
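
Vemury’s “eight or nine out of 10” is a true identification rate tallied per combination of acquisition system and matching algorithm. A trivial sketch of that arithmetic, with hypothetical per-combination tallies, since the rally’s detailed results aren’t in the press release:

```python
def true_identification_rate(results):
    """results: one boolean per volunteer, True if the combined acquisition
    system and matching algorithm identified them while masked."""
    return sum(results) / len(results)


# Hypothetical tallies for two system/algorithm combinations across the
# rally's 582 volunteers; the real per-combination figures aren't published.
combo_a = [True] * 524 + [False] * 58    # ~0.90, "nine out of 10"
combo_b = [True] * 466 + [False] * 116   # ~0.80, "eight out of 10"
print(true_identification_rate(combo_a), true_identification_rate(combo_b))
```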

Keep reading

Secret Facial Recognition Program Could Cover Every State

America’s law enforcement has been secretly using a facial recognition program that can be used to ID activists and protesters. The first-ever acknowledgement of the program was recently revealed by The Washington Post.

“The court documents are believed to be the first public acknowledgment that authorities used the controversial technology in connection with the widely criticized sweep of largely peaceful protesters ahead of a photo op by President Trump.”

Two things make this so troubling. One, the technology appears to be in use by law enforcement nationwide.

As the Washington Post explains, “the case is one of a growing number nationwide in which authorities have turned to facial recognition software to help identify protesters accused of violence.”

And two, this secret law enforcement facial recognition database contains images of at least 1.4 million Americans.

“The case also provides the first detailed look at a powerful new regional facial recognition system that officials said has been used more than 12,000 times since 2019 and contains a database of 1.4 million people but operates almost entirely outside the public view. Fourteen local and federal agencies have access.”

This new facial recognition program is called the “National Capital Region Facial Recognition Investigative Leads System” (NCRFRILS).

A thousand law enforcement agencies could have access to a billion public records.

Keep reading

Activists Build Their Own Facial Recognition System to ID Bad Cops Who Hide Badges

“Beat them at their own game.” “Flip the script.” “Give them a taste of their own medicine.” Any of these phrases would apply to the absolutely ingenious measures taken by activists who are creating their own facial recognition systems to identify cops who hide their badges.

As TFTP reported during the George Floyd protests, across the country, in dozens of cities, cops were doling out unprecedented violence in the face of angry protests stemming from unchecked police brutality. Though Floyd’s death was the flash point of the unrest, the uprising represented something far deeper: systemic abuse by law enforcement of minorities, the poor, and everyone else not directly connected to the establishment. Countless incidents throughout this unrest involved officers who could not be identified, and because of their anonymous instigation and violence, there has been no accountability.

In multiple states, police have been seemingly taking measures to avoid this accountability by removing their name tags or covering their badges. This is in direct violation of most departments’ policies. But no politicians, mayors, governors, or mainstream media seem to care.

Cops hiding their identification is ominous for two reasons. The first is that they can enact brutality against the innocent, and we do not know who they are in order to hold them accountable. The second is that anyone can dress up like a cop with no badge number and start doing whatever they want, up to and including inciting violence, detaining people, or any number of other unscrupulous acts.

As we reported at the time, many of these departments appeared to have been given orders from the top down to cover their badge numbers and remove their name plates. This is not acceptable, and thanks to a self-taught programmer, Christopher Howell, it no longer has to be.

Howell created a program that identifies cops who were permitted by their supervisors to cover their names while responding to protests.

“I am involved with developing facial recognition to in fact use on Portland police officers, since they are not identifying themselves to the public,” Howell told the NY Times.

Because Portland was moving to outlaw facial recognition, Howell faced potential barriers to using his software. However, as the NY Times reports, Portland’s mayor, Ted Wheeler, told Mr. Howell that his project was “a little creepy,” but a lawyer for the city clarified that the bills would not apply to individuals. The Council then passed the legislation in a unanimous vote.

“There’s a lot of excessive force here in Portland,” Howell told the NY Times. “Knowing who the officers are seems like a baseline.”

With this system, the self-taught programmer has flipped the script on police accountability.
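
The article doesn’t describe how Howell’s program works under the hood. A common open-source approach to this kind of matching, offered here only as a sketch of the general technique rather than his code, is to build face encodings from reference photos and compare them against faces found in protest footage, for example with the face_recognition library:

```python
import face_recognition  # open-source wrapper around dlib's face embeddings


def identify(known_people, frame_path, tolerance=0.6):
    """Match faces in a photo against reference encodings.

    known_people maps a name to a reference photo path. This is a generic
    sketch of open-source face matching, not Howell's actual program.
    """
    known_names, known_encodings = [], []
    for name, photo_path in known_people.items():
        image = face_recognition.load_image_file(photo_path)
        encodings = face_recognition.face_encodings(image)
        if encodings:  # skip reference photos where no face was found
            known_names.append(name)
            known_encodings.append(encodings[0])

    frame = face_recognition.load_image_file(frame_path)
    matches = []
    for encoding in face_recognition.face_encodings(frame):
        hits = face_recognition.compare_faces(known_encodings, encoding, tolerance)
        matches.extend(name for name, hit in zip(known_names, hits) if hit)
    return matches
```

The tolerance parameter is the usual knob in this kind of matching: lower values reduce false identifications at the cost of missing some true ones.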

Keep reading

Robocop Is Here – New Police Helmet Scans for Signs of COVID-19 and Uses Facial Recognition

It took 33 years, but Robocop is now here. Well, not exactly, but the rise of the police state, fueled by advancements in technology, has given birth to a helmet equipped with a heads-up display that is sure to please the most anxious of peace officers. It’s called a “Smart Helmet,” and it can screen airport passengers for symptoms of the COVID-19 virus as well as provide the scanning officer with other vital records.

Public officials in Flint, Michigan, cannot provide clean drinking water to their residents, but travelers to Bishop International Airport can get a glimpse of the new robotic cop helmets, which are currently deployed there.

Under the guise of screening passengers for COVID-19, the Smart Helmet, produced by the Italy-based company KeyBiz, can scan travelers’ body temperatures from over 20 feet away.

But the Smart Helmet is not limited to body-temperature scans, which any laser-guided thermometer can do. Far from it. Facial recognition software is installed that can tell the police officer whether an individual has outstanding warrants or appears on a terror watch list or a no-fly list, and the helmet can read license plates to check for outstanding warrants, stolen-vehicle reports, criminal histories, and more. Even if you are completely innocent, you will be subject to these scans.

Keep reading