The UK’s communications regulator Ofcom, which is tasked with enforcing the sweeping online censorship and age verification law, the Online Safety Act, has appointed the members of its “Online Information Advisory Committee” (formerly known as the “Advisory Committee on Disinformation and Misinformation”), which will advise Ofcom on “misinformation” and “disinformation.”
Lord Richard Allan, who was appointed last November as a non-executive director of Ofcom’s Board for a four-year term, now chairs the Committee, which comprises five members – most of whom have prominent track records as pro-censorship advocates.
One is Jeffrey Howard, a political philosophy professor at University College London (UCL), whose website’s research page includes an upcoming article titled, “The Ethics of Social Media: Why Content Moderation is a Moral Duty.”
Howard says the article defends platforms’ “moral responsibility” to “proactively” moderate “wrongfully harmful or dangerous speech” – one of the justifications offered for platforms to censor out of a sense of “moral duty.”
Elisabeth Costa, Chief of Innovation and Partnerships at the Behavioural Insights Team (BIT, which started out as the “Nudge Unit”), is another Committee member.
Costa should feel right at home helping enforce the Online Safety Act, given BIT’s close ties to many governments and international organizations that push censorship tactics such as “prebunking.”