YouTube Shuts Down Channels Using AI To Create Fake Movie Trailers Watched By Millions

YouTube has terminated two prominent channels that used artificial intelligence to create fake movie trailers, Deadline can reveal.

The Google-owned video giant has switched off Screen Culture and KH Studio, which together boasted well over 2 million subscribers and more than a billion views.

The channels have been replaced with the message: “This page isn’t available. Sorry about that. Try searching for something else.”

Screen Culture and KH Studio were approached for comment. They are based in India and Georgia, respectively.

Earlier this year, YouTube suspended ads on Screen Culture and KH Studio following a Deadline investigation into fake movie trailers plaguing the platform since the rise of generative AI.

The channels later returned to monetization when they started adding “fan trailer,” “parody” and “concept trailer” to their video titles. But those caveats disappeared in recent months, prompting concern in the fan-made trailer community.

YouTube’s position is that the channels’ reversion to their previous behavior violated its spam and misleading-metadata policies, resulting in their termination.

“The monster was defeated,” one YouTuber told Deadline following the enforcement action.

Deadline’s investigation revealed that Screen Culture spliced together official footage with AI images to create franchise trailers that duped many YouTube viewers.

Screen Culture founder Nikhil P. Chaudhari said his team of a dozen editors exploited YouTube’s algorithm by being early with fake trailers and constantly iterating on videos.

For example, Screen Culture had created 23 versions of a trailer for The Fantastic Four: First Steps by March, some of which outranked the official trailer in YouTube search results. More recent examples include HBO’s new Harry Potter series and Netflix’s Wednesday.

Our deep dive into fake trailers revealed that instead of protecting copyright on these videos, a handful of Hollywood studios, including Warner Bros Discovery and Sony, secretly asked YouTube to ensure that the ad revenue from the AI-heavy videos flowed in their direction. The studios declined to comment.

Disney properties featured prominently on Screen Culture and KH Studio. The Mouse House sent a cease-and-desist letter to Google last week, claiming that its AI training models and services infringe on its copyrights on a “massive scale.”


Liberals want to control what you watch online

New regulations from the Liberal Government’s Canadian Radio-television and Telecommunications Commission (CRTC) are trying to apply ‘Canadian content’ (CanCon) requirements to online platforms like YouTube and Spotify.

What could this mean for your online experience?

Will content that the Government doesn’t designate as sufficiently ‘Canadian’ disappear from your streaming platforms? Could companies like Netflix decide to pull out of Canada altogether rather than try to comply with onerous requirements?

Host Kris Sims is joined by longtime journalist and former CRTC vice-chair Peter Menzies to discuss what it all means.


YouTube says it will comply with Australia’s teen social media ban

Google’s YouTube delivered a “disappointing update” to millions of Australian users and content creators on Wednesday, saying it will comply with a world-first teen social media ban by locking users aged under 16 out of their accounts within days.

The decision ends a stand-off between the internet giant and the Australian government, which initially exempted YouTube from the age restriction, citing its use for educational purposes. Google (GOOGL.O) had said it was seeking legal advice on how to respond to being included.

“Viewers must now be 16 or older to sign into YouTube,” the company said in a statement.

“This is a disappointing update to share. This law will not fulfill its promise to make kids safer online and will, in fact, make Australian kids less safe on YouTube.”

The Australian ban is being closely watched by other jurisdictions considering similar age-based measures, setting up a potential global precedent for how the mostly U.S. tech giants behind the biggest platforms balance child safety with access to digital services.

The Australian government says the measure responds to mounting evidence that platforms are failing to do enough to protect children from harmful content.


YouTube deletes hundreds of videos documenting Israeli war crimes

YouTube, owned by Google LLC, has deleted more than 700 videos documenting Israeli human rights violations, citing compliance with US sanctions imposed on Palestinian human rights groups cooperating with the International Criminal Court (ICC), according to an investigation by The Intercept published on 5 November.

The investigation revealed that the videos were removed after US President Donald Trump’s administration sanctioned three Palestinian organizations over their work with the ICC on war crimes cases against Israeli leaders.

The organizations sanctioned are Al-Haq, Al Mezan Center for Human Rights, and the Palestinian Centre for Human Rights.

The deletions, carried out in early October, erased years of archives detailing Israeli atrocities in Gaza and the occupied West Bank, including footage of home demolitions, civilian killings, and torture testimonies from Palestinians. 

Among the deleted material were investigations into the murder of Palestinian-American journalist Shireen Abu Akleh and documentaries such as ‘The Beach’, which recounts the killing of children by an Israeli airstrike as they played by the sea.

YouTube confirmed the removals were made in compliance with “trade and export laws” after Trump sanctioned the groups. 

Human rights advocates said the company’s decision effectively aided US efforts to suppress evidence of Israeli atrocities.

“It’s really hard to imagine any serious argument that sharing information from these Palestinian human rights organizations would somehow violate sanctions,” said Sarah Leah Whitson of Democracy for the Arab World Now.

The Center for Constitutional Rights condemned the decision as an attempt to erase war crimes evidence, while Al-Haq described the move as “an alarming setback for human rights and freedom of expression.” 

The Palestinian Centre for Human Rights said YouTube’s action “protects perpetrators from accountability,” accusing Google of complicity in silencing victims of Israeli aggression.

Al Mezan stated that its channel was removed without warning. The three organizations warned that US-based platforms hosting similar content could soon face the same censorship, potentially erasing further documentation of Israeli war crimes.

The Intercept investigation highlighted YouTube’s bias, noting that pro-Israel material remains largely untouched while Palestinian narratives are disproportionately targeted.


Popular YouTube personality ‘Mr. Crafty Pants’ busted for child porn

A popular influencer and YouTube creator known as “Mr. Crafty Pants” was busted for allegedly possessing and trading disturbing sexual images of children on social media, according to reports.

Michael David Booth — who has nearly 600,000 subscribers on his YouTube channel featuring arts and crafts tutorials with smart cutting machines — was arrested Wednesday in Kentucky after repeatedly sharing the sickening images on the social media messaging app Kik, Wave News reported.

The twisted online personality allegedly possessed six explicit photos — three showing children under the age of 12 and the others depicting teens — which he shared more than 15 times with other users between Aug. 4 and Aug. 7, according to an arrest report obtained by the outlet.

Police said they launched their probe into the depraved 39-year-old in August after the social app flagged his account, which was traced to an IP address linked to Booth’s Norton Commons home.

“It’s scary to think what he could have had access to,” his neighbor, Laura Nash, told the outlet, adding that she immediately asked her children if they had ever interacted with the creep after he was arrested.


Researchers expose large-scale YouTube malware distribution network

Check Point researchers have uncovered, mapped and helped set back a stealthy, large-scale malware distribution operation on YouTube they dubbed the “YouTube Ghost Network.”

The network published more than 3,000 videos across compromised or fake channels, luring viewers with game cheats, cracked software, or pirated tools, but instead delivering malware or phishing pages. 

The YouTube Ghost Network

The YouTube Ghost Network is strikingly similar to the Stargazers Ghost Network, a previously uncovered network of fake or hijacked GitHub accounts that served as a malware and phishing-link distribution-as-a-service.

In the Stargazers Ghost Network, different accounts filled different roles. Some accounts directed targets to malicious downloads, others served malware, and others still starred, forked, and subscribed to malicious repositories, in an obvious attempt to make the other accounts appear legitimate to potential victims.

Similarly, the YouTube Ghost Network consists of video accounts, post accounts, and interact accounts.

Video accounts, which are either hijacked or created by the malware peddlers, upload videos that promise something appealing, e.g., a free/cracked version of Adobe Photoshop, or game hacks for popular games like Roblox. The descriptions contain download links or direct viewers to password-protected archives on services like Dropbox, Google Drive or MediaFire, and they often tell users to temporarily disable Windows Defender before installing the downloaded cracked software.

Post accounts publish community posts with the same links and passwords, and interact accounts flood comment sections with fake endorsements, creating a false sense of trust.
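The division of labor across the three account roles can be sketched as a simple data model. This is purely illustrative, not code from Check Point’s report; every class, field, and account name below is hypothetical, chosen only to mirror the roles the article describes.

```python
from dataclasses import dataclass, field

@dataclass
class GhostAccount:
    """One account in the network, tagged with its role."""
    name: str
    role: str  # "video", "post", or "interact"

@dataclass
class GhostNetwork:
    accounts: list = field(default_factory=list)

    def add(self, name: str, role: str) -> None:
        self.accounts.append(GhostAccount(name, role))

    def by_role(self, role: str) -> list:
        return [a for a in self.accounts if a.role == role]

network = GhostNetwork()
# Video accounts upload the lure videos with download links in descriptions.
network.add("hijacked_channel_01", "video")
# Post accounts republish the same links and archive passwords in community posts.
network.add("community_poster_01", "post")
# Interact accounts flood comment sections with fake endorsements.
network.add("commenter_01", "interact")
network.add("commenter_02", "interact")

print(len(network.by_role("interact")))  # prints 2
```

The point of the split, per the researchers, is resilience and false legitimacy: taking down one role (say, a video account) leaves the rest of the machinery intact, and the interact accounts make the lure content appear trusted.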


‘Massive legal siege’ against social media companies looms

Thousands of plaintiffs’ complaints, millions of pages of internal documents and transcripts of countless hours of depositions are about to land in U.S. courtrooms, threatening the future of the biggest social media companies.

The blizzard of paperwork is a byproduct of two consolidated lawsuits accusing Snap Inc.’s Snapchat; Meta Platforms Inc.’s Facebook and Instagram; ByteDance Ltd.’s TikTok; and Alphabet Inc.’s YouTube of knowingly designing their platforms to addict users — allegedly resulting in youth depression, anxiety, insomnia, eating disorders, self-harm and even suicide.

The litigation, brewing for more than three years, has had to overcome numerous hurdles, including the liability shield that has protected social media platforms from facing user-harm lawsuits. The social media companies have filed multiple motions to dismiss the cases on the grounds that Section 230 of the Communications Decency Act prevents them from being held accountable for content posted on their sites.

Those motions have been largely unsuccessful, and courtrooms across the country are poised to open their doors for the first time to the alleged victims of social media. The vast majority of cases have been folded into two multijurisdictional proceedings, one in state and the other in federal court, to streamline the pretrial discovery process.

The first bellwether trial is scheduled to begin in Los Angeles Superior Court in late January. It involves a 19-year-old woman from Chico, California, who says she’s been addicted to social media for more than a decade and that her nonstop use of the platforms has caused anxiety, depression and body dysmorphia. Two other trials will follow soon after, with thousands more waiting in the wings. If successful, these cases could result in multibillion-dollar settlements — akin to tobacco and opioid litigation — and change the way minors interact with social media.

“This is going to be one of the most impactful litigations of our lifetime,” said Joseph VanZandt, an attorney at Beasley Allen Law Firm in Montgomery, Alabama, and co-lead plaintiffs’ attorney for the coordinated state cases. “This is about large corporations targeting vulnerable populations — children — for profit. That’s what we saw with the tobacco companies; they were also targeting adolescents and trying to get them addicted while they were young.”

Matthew Bergman, founder of the Social Media Victims Law Center in Seattle, makes a similar comparison to tobacco litigation in the Bloomberg documentary Can’t Look Away: The Case Against Social Media. “In the case of Facebook, you have internal documents saying ‘tweens are herd animals,’ ‘kids have an addict’s narrative’ and ‘our products make girls feel worse about themselves.’ You have the same kind of corporate misconduct,” Bergman says in the film, which will be available to view on Bloomberg’s platforms on October 30.

Bergman’s firm was the first to file user-harm cases against social media companies, in 2022, after Frances Haugen, a former Meta product manager-turned-whistleblower, released a trove of internal documents showing the company knew social media was negatively impacting youth mental health. The first case, which is part of the consolidated federal litigation, alleged that an 11-year-old Connecticut girl killed herself after suffering from extreme social media addiction and sexual exploitation by online predators.

What set that case apart was how it got around Section 230’s immunity blanket. Bergman argued that his case wasn’t about third-party content, which the federal law protects. Instead, he said it hinged on the way social media companies were intentionally designing their products to prioritize engagement and profit over safety.


YouTube Bows to Trump in Censorship Lawsuit, Will Pay Millions to Avoid Court

And then there were none.

YouTube, a Google subsidiary, became the last of three tech titans to settle a lawsuit brought by President Donald Trump, according to a blistering report from The Wall Street Journal.

The video-sharing platform agreed to pay a hefty $24.5 million to settle the lawsuit Trump filed in 2021.

At the time, the president’s YouTube account had been banned following the Jan. 6 incursion at the U.S. Capitol.

YouTube claimed it had gone to those extraordinary lengths of removing Trump’s channel in order to nix videos that might incite violence.

(The channel was reinstated in March 2023.)

The YouTube settlement is the second-biggest of the lawsuits brought against various tech titans by Trump — and that appears to be intentional.

The biggest settlement Trump had was with Facebook parent company Meta Platforms, which was for $25 million.

“Google executives were eager to keep their settlement smaller than the one paid by rival Meta, according to people familiar with the matter,” The Wall Street Journal reported.

While $24.5 million does come in lower than the $25 million Meta paid, it’s more than double what X, formerly Twitter, paid Trump for a similar lawsuit, as the now Elon Musk-owned platform paid $10 million.

Interestingly, while Trump will “keep” most of this settlement money — $22 million — none of it will actually be going to him.

The Wall Street Journal noted that the money will be immediately rerouted to the nonprofit Trust for the National Mall, tasked with building a grand ballroom near the White House.

The other $2.5 million will be dispersed among various other plaintiffs. There is no mention of attorney fees.

This decision comes months after YouTube was apparently having “productive conversations” with the Trump administration in June, per The Hill.


Critics Accuse YouTube of Dragging Out Return Process for Banned Channels

YouTube is being criticized for what many see as backpedaling on its commitment to free speech, after pledging to restore banned accounts, only to continue removing new channels created by previously banned figures.

The initial assurance came in a letter dated September 23, 2025, addressed to House Judiciary Committee Chairman Jim Jordan.

In that communication, YouTube acknowledged its past enforcement actions, which included terminating channels over election-related and COVID-19 content under policies that have since changed. The company claimed that its current guidelines permit more room for such topics and asserted:

“Reflecting the Company’s commitment to free expression, YouTube will provide an opportunity for all creators to rejoin the platform if the Company terminated their channels for repeated violations of COVID-19 and elections integrity policies that are no longer in effect.”

The same day, YouTube posted a message on X describing a “limited pilot project” that would provide “a pathway back to YouTube for some terminated creators to set up a new channel.”

However, the platform immediately added that this option would only apply to a “subset” of creators.

The vagueness of the commitment raised suspicion, which intensified when two prominent figures, Infowars founder Alex Jones and “America First” host Nick Fuentes, launched new channels that were almost immediately taken down.
