President-elect Joe Biden has received applause across the political spectrum over his picks for top foreign policy and national security roles in his incoming administration. But human rights groups and progressives have expressed concern about his choice for director of national intelligence (DNI), Avril Haines.
Haines, a former deputy CIA director who would be the first woman to serve as the top US spy chief if confirmed, played a central role in crafting the legal framework surrounding the Obama administration’s controversial, secretive drone war.
As a widely cited 2013 Newsweek profile put it: “Haines was sometimes summoned in the middle of the night to weigh in on whether a suspected terrorist could be lawfully incinerated by a drone strike.”
“My concerns about her are more my concerns about the Obama administration,” Andrea J. Prasow, the deputy Washington director of Human Rights Watch, told the New York Times. “With these cabinet picks, we are returning to the previous administration instead of making bold and forward-leaning picks.”
Former President Barack Obama still can’t shake his legacy as the “drone president,” given that he holds the record for the number of covert drone assassination strikes ordered.
“There were ten times more air strikes in the covert war on terror during President Barack Obama’s presidency than under his predecessor, George W. Bush,” one prior human rights study found.
“Obama embraced the US drone program, overseeing more strikes in his first year than Bush carried out during his entire presidency. A total of 563 strikes, largely by drones, targeted Pakistan, Somalia and Yemen during Obama’s two terms, compared to 57 strikes under Bush,” the study said.
This infamously included not only the killing of Yemeni-American citizen Anwar al-Awlaki over his suspected al-Qaeda links, but also that of his son, 16-year-old US citizen and Colorado native Abdulrahman Anwar al-Awlaki, in a drone airstrike ordered by Obama on October 14, 2011. The boy was not suspected of any crime at the time of his death; he was killed while casually eating dinner with his friends at a cafe in Yemen.
The Obama administration later claimed the teen’s death was “collateral damage,” and despite lawsuits related to the CIA operation, no US official has ever been held accountable for assassinating two US citizens without trial or so much as the filing of formal charges.
The Air Force’s clandestine flight test center deep inside the Nevada Test and Training Range, known as Area 51 or Groom Lake, among more colorful nicknames, continues to grow as it approaches its seventh decade of operations. Constant construction has grown the remote facility dramatically since the turn of the millennium, including the addition of a massive and still mysterious hangar built at the base’s remote southern end. Now, an even larger extension to an existing hangar facility that is quite peculiar in nature points to the very real possibility that the age of large swarms of unmanned combat air vehicles (UCAVs) has finally arrived.
From self-driving cars to digital assistants, artificial intelligence (AI) is fast becoming an integral technology in our lives. But the same technology that can make our day-to-day lives easier is also being incorporated into weapons for use in combat situations.
Weaponised AI features heavily in the security strategies of the US, China and Russia, and some existing weapons systems already include autonomous capabilities based on AI. Developing weaponised AI further means machines could potentially make decisions to harm and kill people based on their programming, without human intervention.
Countries that back the use of AI weapons claim it allows them to respond to emerging threats at greater than human speed. They also say it reduces the risk to military personnel and increases the ability to hit targets with greater precision. But outsourcing use-of-force decisions to machines violates human dignity, and it is incompatible with international law, which requires human judgement in context.
Indeed, the role that humans should play in use-of-force decisions has been an area of increasing focus in many United Nations (UN) meetings. At a recent UN meeting, states agreed that it is unacceptable on ethical and legal grounds to delegate use-of-force decisions to machines “without any human control whatsoever.”
But while this may sound like good news, there continues to be major differences in how states define “human control”.