Google Veterans Team Up With Gov’t to Fill the Sky with AI Drones That Predict Your Behavior

Imagine, in the near future, a swarm of tiny drones patrolling the skies across the country. These drones are flown by no pilot; they are entirely autonomous, carrying out the directives coded into them during manufacturing: surveil, record, follow, and even predict your next move. Sounds like something out of a dystopian sci-fi flick, right? Well, there is no need to imagine this scenario or to watch it in a movie.

It is already here.

Adam Bry and Abraham Bachrach, the CEO and CTO, respectively, of a company called Skydio, have helped usher in this new reality. The duo started together at MIT before moving on to Google, where they worked on Project Wing.

After moving on from building self-flying aircraft at Google, the pair founded Skydio and have been giving their autonomous drones to police departments ever since. For free.

“We’re solving a lot of the core problems that are needed to make drones trustworthy and able to fly themselves,” Bry told Forbes in an interview this week. “Autonomy—that core capability of giving a drone the skills of an expert pilot built in, in the software and the hardware—that’s really what we’re all about as a company.”

According to Forbes, Skydio “claims to be shipping the most advanced AI-powered drone ever built: a quadcopter that costs as little as $1,000, which can latch on to targets and follow them, dodging all sorts of obstacles and capturing everything on high-quality video. Skydio claims that its software can even predict a target’s next move, be that target a pedestrian or a car.”

Keep reading

POLICE DEPARTMENTS USE “CITIGRAF” TO SURVEIL EVERYONE, EVEN SCHOOL KIDS

Five years ago, law enforcement asked Genetec, a company closely associated with Homeland Security, to help develop a public surveillance program that can monitor anyone at the touch of a button.

As Wired.com warned, Genetec’s “Citigraf” allows police departments to use a combination of surveillance devices to monitor the public 24/7.

“To get a clear picture of an emergency in progress, officers often had to bushwhack through dozens of byzantine databases and feeds from far-flung sensors, including gunshot detectors, license plate readers, and public and private security cameras.”

With the click of the “INVESTIGATE” button, Citigraf gives law enforcement the ability to comb through a city’s historical police records and live sensor feeds, looking for patterns and connections between people.

Keep reading

Chatbots That Resurrect the Dead: Legal Experts Weigh in on “Disturbing” Technology

It was recently revealed that in 2017 Microsoft patented a chatbot which, if built, would digitally resurrect the dead. Using AI and machine learning, the proposed chatbot would bring our digital persona back to life for our family and friends to talk to. When pressed on the technology, Microsoft representatives admitted that the chatbot was “disturbing”, and that there were currently no plans to put it into production.

Still, it appears that the technical tools and personal data are in place to make digital reincarnations possible. AI chatbots have already passed the “Turing Test”, which means they’ve fooled other humans into thinking they’re human, too. Meanwhile, most people in the modern world now leave behind enough data to teach AI programmes about our conversational idiosyncrasies. Convincing digital doubles may be just around the corner.

But there are currently no laws governing digital reincarnation. Your right to data privacy after your death is far from set in stone, and there is currently no way for you to opt out of being digitally resurrected. This legal ambiguity leaves room for private companies to make chatbots out of your data after you’re dead.

Our research has looked at the surprisingly complex legal question of what happens to your data after you die. At present, and in the absence of specific legislation, it’s unclear who might have the ultimate power to reboot your digital persona after your physical body has been put to rest.

Keep reading

SWISS RESEARCHERS DEVELOP WEARABLE MICROCHIP THAT ELIMINATES BODILY PRIVACY ONCE AND FOR ALL

“In people who suffer from stress-related diseases, this circadian rhythm is completely thrown off and if the body makes too much or not enough cortisol, that can seriously damage an individual’s health, potentially leading to obesity, cardiovascular disease, depression or burnout.” – Adrian Ionescu, lead researcher at the Nanoelectronic Devices Laboratory, Swiss Federal Institute of Technology Lausanne (EPFL)

While these devices may be helpful in a hospital setting, technology companies fully intend to integrate them into wearable tech like smartwatches, pushing us closer to a world where everything we do is tracked and recorded around the clock.

“The joint R&D team at EPFL and Xsensio reached an important R&D milestone in the detection of the cortisol hormone,” said Xsensio CEO Esmeralda Megally. “Xsensio will make the cortisol sensor a key part of its Lab-on-Skin™ platform to bring stress monitoring to next-gen wearables.”

These microchips are intended to eventually connect to the ‘internet of things,’ a comprehensive array of devices that track and record us at all times, from our homes to our places of work. Former US intelligence chief James Clapper admitted over five years ago that the government ‘might’ use the internet of things to spy on you.

Keep reading

‘We have a hint… it may be possible’: Controversial stem cell therapy repaired injured spinal cords in 13 patients

Using a somewhat controversial stem cell therapy, a joint team of Japanese and US-based researchers has successfully repaired some of the damage in 13 patients with spinal cord injuries (SCI).

SCIs can often cause permanent loss of movement and physical sensation from resultant nerve damage. While physical rehabilitation programs can partially improve outcomes, actual treatment and recovery of lost mobility and function is nigh on impossible. Until now, perhaps.

According to new results from a phase 2 clinical trial conducted in an experimental collaboration by scientists in Japan and the US, patients treated with an intravenous infusion of their own mesenchymal stem cells (MSCs), harvested from their bone marrow, saw significant functional improvements.

Keep reading

Documentary Exposes How Facial Recognition Tech Doesn’t See Dark Faces Accurately

What’s NOT to love about Artificial Intelligence (AI)?

  • Millions of jobs being lost (see 1, 2)
  • Censorship, unwarranted surveillance, and other unethical and dangerous applications (see 1, 2, 3, 4, 5)
  • Inaccuracies that can lead to life-altering consequences

One documentary reveals more unscrupulous details:

CODED BIAS explores the fallout of MIT Media Lab researcher Joy Buolamwini’s discovery that facial recognition does not see dark-skinned faces accurately, and her journey to push for the first-ever legislation in the U.S. to govern against bias in the algorithms that impact us all.

Modern society sits at the intersection of two crucial questions: What does it mean when artificial intelligence increasingly governs our liberties? And what are the consequences for the people AI is biased against? When MIT Media Lab researcher Joy Buolamwini discovers that most facial-recognition software does not accurately identify darker-skinned faces and the faces of women, she delves into an investigation of widespread bias in algorithms. As it turns out, artificial intelligence is not neutral, and women are leading the charge to ensure our civil rights are protected.

Keep reading

Arizona’s $24-Million Prison Management Software Is Keeping People Locked Up Past The End Of Their Sentences

The Arizona Department of Corrections is depriving inmates of freedom they’ve earned. Its $24 million tracking software isn’t doing what it’s supposed to when it comes to calculating time served credits. That’s according to whistleblowers who’ve been ignored by the DOC and have taken their complaints to the press. Here’s Jimmy Jenkins of KJZZ, who was given access to documents showing the bug has been well-documented and remains unfixed, more than a year after it was discovered.

According to Arizona Department of Corrections whistleblowers, hundreds of incarcerated people who should be eligible for release are being held in prison because the inmate management software cannot interpret current sentencing laws.

KJZZ is not naming the whistleblowers because they fear retaliation. The employees said they have been raising the issue internally for more than a year, but prison administrators have not acted to fix the software bug. The sources said Chief Information Officer Holly Greene and Deputy Director Joe Profiri have been aware of the problem since 2019.

The management software (ACIS) rolled out during the 2019 Thanksgiving holiday weekend, which is always the best time to debut new systems that might need a lot of immediate tech support. Since its rollout, the software has generated 19,000 bug reports. The one at the center of this ongoing deprivation of liberty arose as the result of a law passed in June of that year. The law gave additional credit days to inmates charged with low-level drug offenses, increasing the credit from one day for every six served to three days for every seven.

Qualified inmates are only supposed to serve 70% of their sentences, provided they also complete some other prerequisites, like earning a GED or entering a substance abuse program. That law hasn’t been implemented in the Arizona prison system because the $24 million software can’t seem to figure out how to do it.
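The credit arithmetic the software is failing to handle is simple enough to sketch by hand. Below is a hypothetical illustration in Python, not ACIS’s actual logic (which is not public); the function names and inputs are invented for the example:

```python
def credit_days(days_served: int, qualifies_for_new_law: bool) -> int:
    """Earned-release credits under the old vs. new Arizona formula."""
    if qualifies_for_new_law:
        # 2019 law: 3 days of credit for every 7 days served.
        # Serving 7 days thus retires 10 days of sentence (7 + 3),
        # which is how a qualifying inmate ends up serving 70%.
        return (days_served // 7) * 3
    # Old formula: 1 day of credit for every 6 days served.
    return days_served // 6


def days_remaining(sentence_days: int, days_served: int,
                   qualifies_for_new_law: bool) -> int:
    """Days left on a sentence once earned credits are applied."""
    left = sentence_days - days_served - credit_days(
        days_served, qualifies_for_new_law)
    return max(left, 0)
```

For example, a qualifying inmate who has served 700 days of a 1,000-day sentence has earned 300 credit days and is due for release, whereas the old formula would credit only 116 days. If the software silently applies the old formula to qualifying inmates, the result is exactly the over-detention the whistleblowers describe.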

To be sure, legislation that changes time served credits for only a certain percentage of inmates creates problems for prison management systems. But that’s why you spend $24 million buying one, rather than just asking employees if they’re any good at Excel.

But that’s what has actually happened. With the expensive software unable to correctly calculate time served credits, prison employees are doing it by hand.

Keep reading

“Spot’s Rampage” Event Awakens Us With Reality Of Dystopian World Ahead

The company’s manifesto can be summed up in one line: “See Spot KILL!! Spot is an empathy-building tool, because: Cute and approachable!”

Here’s the manifesto: 

See Spot Run. It tops out at a blistering 3mph.

See Spot Roll Over. Spot is an empathy missile, shaped like man’s best friend and targeted straight at our fight or flight instinct. When killer robots come to America they will be wrapped in fur, carrying a ball. Spot is Rob Rhinehart’s ideal pet: it never shits.

Good Boy, Spot! Everyone in this world takes one look at cute little Spot and knows: this thing will definitely be used by police and the military to murder people. And what do police departments have? Strong unions! Spot is employee of the month. You never need to union bust a robot – but a robot can union bust you.

The manifesto continued, with the group claiming that Boston Dynamics even offered them two free robots to call off the event:

See Spot KILL!! Spot is an empathy building tool, because: Cute and approachable! We talked with Boston Dynamics and they HATED this idea. They said they would give us another TWO Spots for FREE if we took the gun off. That just made us want to do this even more and if our Spot stops working just know they have a backdoor override built into each and every one of these little robots.

See Spot Fall Over And Freak Out. Quite an experience to live in fear, isn’t it? That’s what it is to be a slave. Our saving grace: Spot is evil but not very good at its job.

Boston Dynamics wasn’t thrilled with the stunt. 

Keep reading