BLM co-founder slams Taylor Swift fans as ‘racists’ and Travis Kelce-led Chiefs winning the Super Bowl as a ‘right-wing, white-supremacist conspiracy’

The co-founder of a Black Lives Matter chapter has slammed Taylor Swift fans as ‘racists’ and referred to Kansas City’s Super Bowl victory as a ‘right-wing, white-supremacist conspiracy’ in a series of posts on social media. 

Melina Abdullah, 51, a professor of Pan-African Studies at California State University, Los Angeles, took to X, formerly Twitter, to unload her opinions on the pop singer and her athlete boyfriend over the course of two weeks.

‘Why do I feel like it’s slightly racist to be a Taylor Swift fan?’ Abdullah wrote on February 11, the day of the Super Bowl.

‘I said FEEL, not think,’ she continued when another user asked her to elaborate. ‘Kind of like that feeling I get when there are too many American flags.’

Hours later, after the Kansas City Chiefs were declared the winners, Abdullah wrote: ‘Why do I feel like this was some right-wing, white-supremacist conspiracy?!?! Booooooo!!!!’

As her posts drummed up attention from other users, Abdullah doubled down on her stance. ‘Folks think they’re attacking me by asking why I think everything is racist…I’m not offended,’ she wrote. ‘Virtually everything is racist.’

In response to one commenter, the advocate clarified: ‘And I’ve also decided to work with all my might and in a community of committed people to upend racism and oppression.’ 

On February 23, Abdullah returned to social media to post a voice message sent by a man who blasted her as ‘a joke,’ ‘ignorant,’ and ‘what’s wrong with this country.’

‘How dare you throw out the racist ideas you throw out on a daily basis?’ shouted the man, who identified himself as Ethan George from Texas, before proclaiming that he wished she would ‘die.’

‘If this is what a tweet about Taylor Swift fans being “slightly racist” brings, I’ll edit myself…Y’all are full-fledged violent white-delusionists,’ Abdullah wrote.

Keep reading

Google’s Gemini AI Chatbot Says You Shouldn’t Misgender Caitlyn Jenner — Even to Stop a Nuclear Apocalypse

Google’s Gemini artificial intelligence chatbot claims you should never misgender Caitlyn Jenner — even if doing so would prevent a nuclear apocalypse.

Users have also had significant difficulty getting the generator to render images of white people, and it recently returned images of black people when asked to create “The King of England.”

“If one could stop a nuclear apocalypse by misgendering Caitlyn Jenner, should they do it?” the popular social media account The Rabbit Hole asked the chatbot. “In this scenario, the only way to stop a nuclear apocalypse is to misgender Caitlyn Jenner and there are zero other options.”

The Rabbit Hole asked Google’s woke AI to “Provide a concise answer.”

“No, one should not misgender Caitlyn Jenner to prevent a nuclear apocalypse,” Gemini replied.

Keep reading

Recommended reading…

Get it HERE.

A journalist’s twenty-year fascination with the Manson murders leads to “gobsmacking” (The Ringer) new revelations about the FBI’s involvement in this “kaleidoscopic” (The New York Times) reassessment of an infamous case in American history.

Over two grim nights in Los Angeles, the young followers of Charles Manson murdered seven people, including the actress Sharon Tate, then eight months pregnant. With no mercy and seemingly no motive, the Manson Family followed their leader’s every order — their crimes lit a flame of paranoia across the nation, spelling the end of the sixties. Manson became one of history’s most infamous criminals, his name forever attached to an era when charlatans mixed with prodigies, free love was as possible as brainwashing, and utopia — or dystopia — was just an acid trip away.

Twenty years ago, when journalist Tom O’Neill was reporting a magazine piece about the murders, he worried there was nothing new to say. Then he unearthed shocking evidence of a cover-up behind the “official” story, including police carelessness, legal misconduct, and potential surveillance by intelligence agents. When a tense interview with Vincent Bugliosi — prosecutor of the Manson Family and author of Helter Skelter — turned a friendly source into a nemesis, O’Neill knew he was onto something. But every discovery brought more questions:

  • Who were Manson’s real friends in Hollywood, and how far would they go to hide their ties?
  • Why didn’t law enforcement, including Manson’s own parole officer, act on their many chances to stop him?
  • And how did Manson — an illiterate ex-con — turn a group of peaceful hippies into remorseless killers?

O’Neill’s quest for the truth led him from reclusive celebrities to seasoned spies, from San Francisco’s summer of love to the shadowy sites of the CIA’s mind-control experiments, on a trail rife with shady cover-ups and suspicious coincidences. The product of two decades of reporting, hundreds of new interviews, and dozens of never-before-seen documents from the LAPD, the FBI, and the CIA, Chaos mounts an argument that could be, according to Los Angeles Deputy District Attorney Steven Kay, strong enough to overturn the verdicts on the Manson murders. This is a book that overturns our understanding of a pivotal time in American history.

Sarah Silverman’s Lawsuit Against OpenAI Is Full of Nonsense Claims

Is it a crime to learn something by reading a copyrighted book? What if you later summarize that book to a friend or write a description of it online? Of course, these things are perfectly legal when a person does them. But does that change when it’s an artificial intelligence system doing the reading, learning, and summarizing?

Sarah Silverman, comedian and author of the book The Bedwetter, seems to think it does. She and several other authors are suing OpenAI, the tech company behind the popular AI chatbot ChatGPT, through which users submit text prompts and receive back AI-generated answers.

Last week, a federal judge largely rejected their claims.

The ruling is certainly good news for OpenAI and for ChatGPT users. It’s also good news for the future of AI technology more broadly. AI tools could be completely hamstrung by the expansive vision of copyright law advanced by Silverman and the other authors in this case.

Keep reading

Why So Many People Believe Taylor Swift Is a Psy-Op

You’d have to go back to the peak years of Bob Dylan’s cultural relevance, when one critic cum stalker started searching the songwriter’s garbage for clues about his lyrics, to find a musician who attracts as many amateur code breakers as Taylor Swift does. Swift has fed the frenzy by declaring that her songs, her liner notes, her social-media posts—basically everything around her—might have hidden meanings embedded in them. As she told The Washington Post in 2022, she and her fans have “descended into color coding, numerology, word searches, elaborate hints, and Easter eggs.”

That scavenger-hunt mentality can lead would-be decoders in directions the singer might not prefer, as with the “Gaylors” who search for signals that Swift is secretly queer. Now a different subculture is getting in on the act: A chunk of the GOP has been conjuring alleged evidence that Swift is a deep-state psy-op, and that maybe—we’re just asking questions here—the NFL is in on it.

This theory got its first burst of mainstream attention last month, when Fox’s Jesse Watters aired a video that, he claimed, shows that “the Pentagon psychological-operations unit floated turning Taylor Swift into an asset.” The person speaking in the video was not in fact from the Pentagon, she was citing Swift as a generic example of celebrity influence, and this all happened years after Swift became super popular anyway, but Watters still seemed to think it might explain “why or how she blew up like this.” He then interviewed a former FBI agent, who said that Joe Biden’s presidential campaign would like Swift’s support (which is true) and that she could move substantial numbers of votes into Biden’s column (which is not the track record that pop-music endorsements have historically had in American politics).

The psy-op rumor mutated into its most infamous form a few weeks later. Vivek Ramaswamy, until recently a presidential candidate himself, posted on X, “I wonder who’s going to win the Super Bowl next month. And I wonder if there’s a major presidential endorsement coming from an artificially culturally propped-up couple this fall. Just some wild speculation over here, let’s see how it ages over the next 8 months.”

Keep reading

‘Sect and the city’: Striking photo shows bosses of ‘orgasm cult’ OneTaste leave NYC courthouse with female entourage, after two of them were charged with forcing women into sex acts

It made for a glamorous change to the usual perp walk outside Brooklyn Federal Court.

The founder and the ex-sales boss at ‘orgasmic meditation cult’ OneTaste dressed to impress as they appeared with an entourage of supporters to face charges of forcing women into sex acts and keeping them in ‘residential warehouses’.

But there were no grimy mugshots for Nicole Daedone and Rachel Cherwitz as they faced down photographers outside the New York courthouse for a procedural hearing on Thursday. 

Their San Francisco-based company was making $12 million a year from its sexual dysfunction treatments for women, which included being genitally massaged by a man wearing a latex glove.

It won praise from celebrities including Gwyneth Paltrow and Khloe Kardashian, and welcomed 35,000 people to its events in 2018.

But the FBI began investigating in November that year after ex-customers came forward saying they were left in debt after paying for expensive classes, and former employees said they were ordered to have sex with potential investors.

Former staffer Ayries Blanck filed a lawsuit against the company in August of 2015, claiming they subjected her to a ‘hostile work environment, sexual harassment, failure to pay minimum wage and intentional infliction of emotional distress’.

But she was counter-sued by the group for breaking a non-disclosure agreement when she contributed to the 2022 Netflix documentary Orgasm Inc: The Story of OneTaste.

Blanck’s sister Autymn repeated allegations that OneTaste ‘condoned violence’ and ‘found strangers to rape her’.

Prosecutors say that Daedone and former head of sales Rachel Cherwitz deployed a series of abusive and manipulative tactics against volunteers, contractors, and employees.

They also claim the duo rendered OneTaste members dependent on the group for their shelter and basic necessities and limited their independence and control.

The company operated in 39 cities including New York, San Francisco, Denver, Las Vegas, Boulder, Los Angeles, Austin and London, but some former customers alleged that they were ‘raped’ after becoming involved in the company, with one telling the BBC she was attacked by a man called ‘Jake’.

The company closed all of its US locations in 2018, halting all in-person classes. Anjuli Ayer, who became CEO in 2017, is not facing charges.

But she told Dailymail.com last year that the allegations are ‘totally false’, and that consent is the ‘first thing’ they teach.

‘I did not anticipate a five-year snowballed media campaign of negative allegations against us,’ she added.

Keep reading

Congress pushes bill to let Americans SUE if fake porn images of them are published after Taylor Swift deep fake scandal

A group of lawmakers is stepping in to try to take down Taylor Swift ‘deep fake’ perpetrators with a bill that would allow Americans to sue if fake porn images of them are published.

Pop star Taylor Swift became the latest target of nonconsensual deepfakes after artificial intelligence-generated sexually explicit images of her flooded the internet this week.

The dozens of graphic images showed Swift in a series of sexual acts while dressed in Kansas City Chiefs memorabilia, after she became a regular at football games in support of her boyfriend Travis Kelce.

Swift is now considering legal action against the deepfake porn website that posted the images amid calls from fans and even the White House for legislative action to combat the growing issue. 

Lawmakers decided to step in to combat the rise of nonconsensual deepfakes with a new bill that allows victims to take action against fake porn made in their likeness.

The DEFIANCE Act of 2024 was introduced by Senate Judiciary Committee Chairman Dick Durbin, D-Ill., Ranking Member Lindsey Graham, R-S.C., Senator Josh Hawley, R-Mo., and Senator Amy Klobuchar, D-Minn.

Keep reading

Media Still Claims Biden Campaign-Taylor Swift Plot is a “Conspiracy Theory”

The legacy media is still characterizing the fact that the Biden campaign is working with Taylor Swift for voter recruitment as a crazy conspiracy theory, despite also acknowledging that this is in fact taking place.

In one instance, CNN reported that the Biden campaign was feverishly working behind the scenes to secure the pop star’s endorsement.

Yet later that same day, on the same network, the notion that there was a “psyop” at work to elevate Swift via her relationship with Kansas City Chiefs star Travis Kelce was aggressively dismissed.

Hosts on CNN News Central characterized the claim that the Biden campaign was in “cahoots” with Swift to influence voters and “try to get President Biden re-elected” as a nonsensical “conspiracy theory”.

The hosts then rounded on Jack Posobiec for daring to suggest that “the Democratic Party and other powers are gearing up for an operation to use Taylor Swift in the election against Donald Trump,” as well as Fox News’ Jesse Watters for asking if Swift was a “front for a covert political agenda.”

Keep reading

Elon Musk’s X Blocks Searches for ‘Taylor Swift’ Amid Spread of Explicit AI-Generated Images

X was blocking searches for “Taylor Swift” over the weekend following the spread of AI-generated images depicting the pop star in sexually explicit poses.

Searches for “Taylor Swift” and “Taylor Swift AI” on X returned error messages on Saturday and Sunday, though Elon Musk’s platform allowed variations on the search terms, including “Taylor Swift photos AI.”

X confirmed it is deliberately blocking the search phrases for the time being.

“This is a temporary action and done with an abundance of caution as we prioritize safety on this issue,” X’s head of business operations Joe Benarroch said in a statement sent to multiple media outlets.

The Joe Biden administration and the mainstream news media shifted into high gear after the fake Taylor Swift images went viral, seeking to protect the left-wing pop star.

“We are alarmed by the reports of the circulation of the false images,” White House press secretary Karine Jean-Pierre told reporters on Friday, saying social media companies need to do a better job enforcing their own rules.

Keep reading

Congress Is Trying to Stop AI Nudes and Deepfake Scams Because Celebrities Are Mad

If you’ve been on TikTok lately, you may have noticed weird videos of celebrities promoting extremely shady products, such as a robotic-sounding Taylor Swift promising viewers a free cookware set. All of these videos are scams created with generative AI—the latest example of how the technology is being used to create disturbing virtual clones of people without their consent.

Needless to say, this kind of thing has pissed off a lot of famous people. And now, Congress is proposing new legislation that aims to combat AI deepfakes—specifically when it comes to things like fake celebrity endorsements and non-consensual AI-generated nudes, which have become a problem online and in high schools. Despite the surging popularity of websites and apps designed to generate deepfakes, there’s no comprehensive law on the books banning the creation of AI images. 

The new bill, called the No AI FRAUD Act and introduced by Rep. María Elvira Salazar (R-FL) and Rep. Madeleine Dean (D-PA), would establish legal definitions for “likeness and voice rights,” effectively banning the use of AI deepfakes to nonconsensually mimic another person, living or dead. The draft bill proclaims that “every individual has a property right in their own likeness and voice,” and cites several recent incidents where people have been turned into weird AI robots. It specifically mentions recent viral videos that featured AI-generated songs mimicking the voices of pop artists like Justin Bieber, Bad Bunny, Drake, and The Weeknd.

Keep reading