
Danger, Will Robinson!


You may now be able to discuss coronavirus origins on Facebook – because the current US administration happened to say so – but it appears people are still decidedly banned from discussing their own scientific work, if that work happens to concern how exactly Facebook goes about censoring and canceling people.
That circle is now complete for renowned UK academic Nicholas O’Shaughnessy, an information and propaganda researcher, who has been handed an abrupt but “lifetime” ban from the platform – for posting about his own research, which has in the meantime been cited by the academic community.
On June 26, reports say, this professor emeritus at two University of London colleges and a Cambridge Quondam Fellow was informed that his account had been banned. Facebook was at least “egalitarian” here, in that it gave the noted academic no more explanation of what he had supposedly done wrong than it gives any of its “ordinary” censorship victims – beyond accusing him of violating unspecified “community standards.”
But the note did have a tone of sinister finality to it, reading, “unfortunately, we won’t be able to reactivate it (the account) for any reason. This will be our last message regarding your account.”
Even after failing in his attempts to pry the reason for his banning from Facebook’s cold censorship hands, the professor, like the rest of us, is left guessing why this happened.
Facebook users have recently reported being sent warning messages from the social media giant relating to “extremists” or “extremist content.”
“Are you concerned that someone you know is becoming an extremist?” one of the purported messages read. “We care about preventing extremism on Facebook. Others in your situation have received confidential support,” it adds before offering the button to “Get Support,” which ostensibly leads to another Facebook page about extremism.
Redstate editor Kira Davis, who said she was sent a screenshot of the message by a friend, wrote: “Hey has anyone had this message pop up on their FB? My friend (who is not an ideologue but hosts lots of competing chatter) got this message twice. He’s very disturbed.”
Before losing a recent landmark court case, Facebook attempted to distance itself from the human trafficking taking place on its platform. If only it were as hands-on about child sex crimes as it is about political speech.
The Texas Supreme Court ruled on Friday that Facebook can be held liable for the conduct of pimps and traffickers on its platform – a landmark decision that opens the firm up to further legal action from a trio of teenage trafficking victims.
In the suit, filed in 2018, the three young women accused the company of running “an unrestricted platform to stalk, exploit, recruit, groom, and extort children into the sex trade.” One was 15 when an older man contacted her on Facebook, offered her a modeling job, photographed her, posted the pictures on the now-defunct BackPage website, and prostituted her to other men, leading her to be “raped, beaten, and forced into further sex trafficking.” The other two girls were 14, and reported almost identical experiences, with one openly pimped out for “dates” on Instagram, a Facebook subsidiary.
“Facebook recently announced they’ll be moderating satire to make sure it doesn’t ‘punch down.’ Anything that punches down—that is, anything that takes aim at protected targets Facebook doesn’t want you joking about—doesn’t qualify as ‘true satire,’” Babylon Bee CEO Seth Dillon wrote. “In fact, they’ve made it clear they’ll consider jokes that ‘punch down’ to be hatred disguised as satire.”
Dillon noted that Slate recently published a piece that accused the Bee of punching down.
“This is not a coincidence. Having failed in their effort to lump us in with fake news, the media and Big Tech are looking for new ways to work together to deplatform us. They now hope to discredit us by saying we’re spreading hatred—rather than misinformation—under the guise of satire,” Dillon wrote. “But we’re not ‘punching down.’ We’re punching back.”
Dillon feels “the left’s new prohibition of ‘punching down’ is speech suppression in disguise” and blasted anyone who plays along.
“It’s people in positions of power protecting their interests by telling you what you can and cannot joke about. Comedians who self-censor in deference to that power are themselves a joke,” he wrote.
A number of emails seen by CNN — which uses them to make the case that the platform isn’t censoring enough — show that the Biden campaign repeatedly pressured Facebook to censor posts from the Trump campaign and its supporters about election integrity.
CNN’s own reporting confirms that Facebook changed its policies following the email exchange with Biden officials, yet goes on to quote Democrat activists who complain that the platform is still not censoring enough conservative content.
One post that the Biden campaign tried to have censored during the 2020 election was a video from Donald Trump Jr. in September 2020 calling for supporters to monitor early voting and counting boards.
Biden campaign officials tried to characterize the video as a call for violence because Don Jr. used the term “army” to refer to the volunteer effort – claims that were rebuffed by Facebook.

Emails obtained by CNN reveal how the Biden campaign pressured Facebook to censor President Donald Trump before the 2020 election.
The messages reveal how Biden campaign officials repeatedly insisted that Facebook remove information that they deemed to be “violent rhetoric” – a concern that seemed to be absent during the months of leftist rioting that burned down entire city blocks throughout the summer.
After a deluge of public and private complaints by members of Biden’s team and other Democrats, a former Biden campaign staffer said Facebook “essentially did nothing” in response.
The focus was primarily on the official Team Trump account, with Biden officials infuriated that Facebook didn’t remove enough videos that warned people of upcoming election fraud.
Gee, I wonder why they were concerned about that.
“It was the most frustrating series of conversations,” a Biden aide said. “We went to Facebook with a series of letters, public complaints, private emails and all throughout, they essentially did nothing.”
Naturally, CNN spins the story as an example of how Facebook failed to clamp down on “misinformation,” despite the social network giant banning many of Trump’s most prominent supporters before the election and engaging in industrial-scale levels of censorship of pro-Trump content.
On a press call a few years ago, I asked Facebook’s head of cybersecurity policy, Nathaniel Gleicher, if the company would treat a misinformation campaign orchestrated by the US government the same as it would one from a foreign adversary.
Facebook had organized the call to tout how it had discovered and deleted dozens of Iranian accounts, groups, and pages linked to “coordinated inauthentic behavior”—the company’s term for when people and organizations create fake accounts in an attempt to mislead and manipulate other users and the broader information landscape. The conversation came at a time when Facebook was conducting a spate of such announcements and media briefings championing its work removing phony networks tied to foreign governments.
Recent reporting says US operatives “engage in campaigns to influence and manipulate social media.”
Gleicher’s response to my hypothetical question about whether they would react the same way was quite clear: “Yes. Part of the key of our operations here is that we engage based on behavior—not based on content and not based on the nature of the actor. And that’s been a very intentional choice on our part.”
Facebook announced it’s partnering with leading health organizations, including the WHO, and experts in other industries, to help solve “vaccine hesitancy.”
The coalition, called the Alliance for Advancing Health Online, was announced via a blog post on Facebook’s Newsroom. The purpose of the alliance is “to advance public understanding of how social media and behavioral sciences can be leveraged to improve the health of communities around the world.”
Facebook’s partners in the initiative include the WHO, the CDC Foundation, the Bay Area Global Health Alliance, the MIT Initiative on the Digital Economy, Merck, the Sabin Vaccine Institute, the Vaccine Confidence Project at the London School of Hygiene and Tropical Medicine, and the World Bank.