FDACS removes over 85K illegal hemp products in child safety crackdown

Florida Agriculture Commissioner Wilton Simpson announced the results of “Operation Safe Summer,” a statewide enforcement effort that removed more than 85,000 hemp packages found in violation of state child-protection standards.

In the first three weeks of the operation, hemp-derived products were seized across 40 counties for “violations of Florida’s child-protection standards for packaging, labeling, and marketing,” according to a press release from the Department of Agriculture and Consumer Services.

Simpson said the department will continue to “aggressively enforce the law, hold bad actors accountable, and put the safety of Florida’s families over profits.”

The state previously issued announcements advising hemp food establishments on the planned enforcement of amendments to Rule 5K-4.034, Florida Administrative Code, a press release said.

Keep reading

Government REFUSES to release ‘eSafety’ data behind YouTube kids ban

Labor Communications Minister Anika Wells has refused to release the research that underpins the eSafety Commissioner’s push to ban 15-year-olds from using YouTube.

The contentious recommendation, made by eSafety Commissioner Julie Inman Grant, has sparked widespread concern among stakeholders and the public. Yet Wells has declined to release the data informing the advice, citing the regulator’s preference to delay publication.

Sky News reports that the eSafety regulator has repeatedly blocked its attempts to access the full research, instead opting to “drip feed” select findings to the public over several months, even though the Albanese government is expected to make a final decision in just weeks.

A spokesperson for Wells said: “The minister is taking time to consider the eSafety Commissioner’s advice. The minister has been fully briefed by the eSafety Commissioner including the research methodology behind her advice.”

However, the Commissioner’s own “Keeping Kids Safe Online: Methodology” report reveals several weaknesses in the data. The survey relied entirely on self-reported responses taken at one point in time and used “non-probability-based sampling” from online panels, described in the report as “convenience samples”.

Keep reading

Australia Orders Search Engines to Enforce Digital ID Age Checks

Australia has moved to tighten control over the digital environment with the introduction of three new online safety codes, measures that raise pressing privacy and censorship concerns.

These codes, formalized on June 27 under the Online Safety Act, go beyond introducing digital ID checks for adult websites; they also place substantial obligations on tech companies, from search engines and internet service providers (ISPs) to hosting platforms.

Businesses that fail to comply face the threat of significant financial penalties, with fines reaching as high as 49.5 million Australian dollars, or about $32.5 million US.

The codes seek to restrict Australian users’ exposure to material classified under two categories: Class 1C and Class 2.

Class 1C encompasses “online pornography – material that describes or depicts specific fetish practices or fantasies.”

Class 2 covers a broader range of content, from “online pornography – other sexually explicit material that depicts actual (not simulated) sex between consenting adults” (Class 2A), to “online pornography – material which includes realistically simulated sexual activity between adults. Material which includes high-impact nudity” or “other high-impact material which includes high-impact sex, nudity, violence, drug use, language and themes. ‘Themes’ includes social issues such as crime, suicide, drug and alcohol dependency, death, serious illness, family breakdown, and racism” (Class 2B).

Under Schedule 1 – Hosting Services Online Safety Code, companies that provide hosting services within Australia, including social media platforms and web hosts, are compelled to implement six compliance measures.

A core requirement obliges these services to manage the risks posed by significant changes to their platforms that could make Class 1C or Class 2 material more accessible to Australian children.

Schedule 2 – Internet Carriage Services Online Safety Code targets ISPs. It mandates the provision of filtering tools and safety guidance to users and empowers the eSafety Commissioner to order the blocking of material deemed to promote or depict abhorrent violent conduct.

The Commissioner has previously exercised similar powers, as in the directive to block footage of a stabbing circulated on X.

Schedule 3 – Internet Search Engine Services Online Safety Code directs search engine providers to roll out age verification for account creation within six months.

These platforms are also instructed to develop systems capable of detecting and filtering out online pornography and violent material by default, where technically feasible and practicable.

Keep reading

South Dakota Follows Texas with Broader Online Digital ID Law

The Supreme Court’s endorsement of Texas’ age verification law for adult websites has paved the way for a surge of similar online digital ID measures across the country.

South Dakota is the first to follow, as its new statute requiring age verification or estimation for sites distributing adult content takes effect today.

However, the South Dakota law is broader, applying to a wider range of websites, not just those with a large percentage of adult content.

We obtained a copy of the bill for you here.

The law applies broadly to any platform that regularly deals in explicit material, without setting a specific threshold for how much of the site’s content qualifies.

This contrasts with Texas’ approach, where the rule kicks in if at least one-third of a site’s material is deemed pornographic.

Keep reading

Supreme Court Greenlights Online Digital ID Checks

With a landmark ruling that could shape online content regulation for years to come, the US Supreme Court has upheld Texas’s digital ID age-verification law for adult websites and platforms, asserting that the measure lawfully balances the state’s interest in protecting minors with the free speech rights of adults.

The 6-3 decision, issued on June 27, 2025, affirms the constitutionality of House Bill 1181, a statute that requires adult websites to verify the age of users before granting access to sexually explicit material.

Laws like House Bill 1181, framed as necessary safeguards for children, are quietly eroding the rights of adults to access lawful content or speak freely online without fear of surveillance or exposure.

Under such laws, anyone seeking to view legal adult material online (and eventually even those who want to access social media platforms, because they may contain content “harmful” to minors) is forced to provide official identification, often a government-issued digital ID or even biometric data, to prove their age.

Supporters claim this is a small price to pay to shield minors from harmful content. Yet these measures create permanent records linking individuals to their browsing choices, exposing them to unprecedented risks.

We obtained a copy of the opinion for you here.

Keep reading

COPPA 2.0: The Age Check Trap That Means Surveillance for Everyone

A new Senate bill designed to strengthen online privacy protections for minors could bring about major changes in how age is verified across the internet, prompting platforms to implement broader surveillance measures in an attempt to comply with ambiguous legal standards.

The Children and Teens’ Online Privacy Protection Act (S.836) (COPPA 2.0), now under review by the Senate Commerce Committee, proposes raising the protected age group from under 13 to under 17. It also introduces a new provision allowing teens aged 13 to 16 to consent to data collection on their own.

The bill has drawn praise from lawmakers across party lines and received backing from several major tech companies.

We obtained a copy of the bill for you here.

Supporters frame the bill as a long-overdue update to existing digital privacy laws. But others argue that a subtle change in how platforms are expected to identify underage users may produce outcomes that are more intrusive and far-reaching than anticipated.

Under the current law, platforms must act when they have “actual knowledge” that a user is a child.

The proposed bill replaces that threshold with a broader and less defined expectation: “knowledge fairly implied on the basis of objective circumstances.” This language introduces uncertainty about what constitutes sufficient awareness, making companies more vulnerable to legal challenges if they fail to identify underage users.

Instead of having to respond only when given explicit information about a user’s age, platforms would be required to interpret behavioral cues, usage patterns, or contextual data. This effectively introduces a negligence standard, compelling platforms to act preemptively to avoid accusations of noncompliance.

As a result, many websites may respond by implementing age verification systems for all users, regardless of whether they cater to minors. These systems would likely require more detailed personal information, including government-issued identification or biometric scans, to confirm users’ ages.

Keep reading

Senate Pushes Bill That Could End Private Messaging

Under the pretext of strengthening measures against child exploitation online, a controversial Senate bill is resurfacing with provisions that privacy advocates say would gut critical internet protections and compromise the security and privacy of all citizens.

Known as the STOP CSAM Act of 2025 (S. 1829), the legislation is being criticized for using broad language and vague legal standards that could severely weaken encryption and open the floodgates for content takedowns, including legal content, across a wide range of online services.

We obtained a copy of the bill for you here.

The bill’s stated aim is to curb the spread of child sexual abuse material, a crime already strictly prohibited under federal law. Current regulations already compel online platforms to report known instances of such material to the National Center for Missing and Exploited Children, which coordinates with law enforcement.

However, S. 1829 goes well beyond this existing mandate, targeting a wide spectrum of internet platforms with new forms of criminal and civil liability that could penalize even the most privacy-conscious and compliant services.

The scope of the legislation is sweeping. Its provisions apply not only to large social media platforms but also to private messaging apps, cloud storage services, and email providers.

Keep reading

France considers requiring Musk’s X to verify users’ age

The French government is considering designating X as a porn platform — a move that would likely force the platform to implement strict age verification requirements.

Such a designation could effectively ban children from accessing the social media app unless it curtailed adult content. Paris has recently upped its efforts to protect kids online by requiring porn platforms to verify users’ ages.

“X has indicated since 2024 that it accepts the distribution of pornographic content. It must therefore be treated as such,” Digital Minister Clara Chappaz’s office told POLITICO.

Her team has been tasked with “examining the designation of X in the decree concerning pornographic sites that must verify the age of their users.”

The confirmation follows an appearance by Chappaz on French TV show “Quotidien” on Thursday evening, where she said X will soon receive “the same pretty papers as YouPorn” instructing X to ban adult content or implement age screening.

Porn platforms serving content in France are required to implement age verification measures with a final deadline of June 7, although some are protesting.

Failure to comply could see sites fined, delisted from search engines or blocked completely.

Keep reading

‘The View’ Co-Host Sunny Hostin Says Elon Musk’s DOGE Cuts Have Killed 300,000 People — ‘Mostly Children’

The View co-host Sunny Hostin has claimed that Elon Musk’s government cuts have killed over 300,000 people, most of whom are children.

In Wednesday’s episode of the political talk show, Hostin and her fellow panelists reflected on Musk’s legacy after he recently departed his role leading the Department of Government Efficiency (DOGE).

“But the damage that he did was just really incredible,” Hostin declared.

“He slashed 250,000 federal employees, more than 8,500 contracts, more than 10,000 grants, and his cutbacks on medical research cost the lives of — the foreign aid — cost 300,000 lives, mostly children.”

“That’s the damage that Elon Musk did,” she continued. “So I don’t think anyone should be listening to him about anything.”

It is unclear where Hostin sourced this so-called statistic, although her fellow co-hosts did nothing to push back on it.

“Elon knows the 411 on everything,” fellow co-host Whoopi Goldberg chimed in.

“Yeah, he got all that information,” Hostin agreed.

“So Trump should be afraid of him,” added Joy Behar. “He has the receipts on the election, too.”

Keep reading

Texas Ban On Social Media For Under 18s Fails To Pass Senate

Legislation that would have banned anyone under the age of 18 from using or creating social media accounts in Texas stalled in the Senate this week after lawmakers failed to vote on it.

House Bill 186, filed by state Rep. Jared Patterson (R-Frisco), would have prohibited minors from creating accounts on social media sites such as Instagram, TikTok, Facebook, Snapchat, and others by requiring the platforms to verify users’ age.

The measure previously passed the GOP-controlled state House with broad bipartisan support in April, but momentum behind the bill slowed at the eleventh hour in the state Senate this week as lawmakers faced a weekend deadline to send bills to Gov. Greg Abbott’s desk.

The legislative session ends on Monday.

In a statement on the social media platform X late Thursday, Patterson said the bill’s failure to pass in the Senate was “the biggest disappointment of my career,” adding that no other bill filed this session “would have protected more kids in more ways than this one.”

The Republican lawmaker said he believed its failure to pass meant “I’ve failed these kids and their families.”

“I felt the weight of an entire generation of kids who’ve had their mental health severely handicapped as a result of the harms of social media,” the lawmaker said. “And then there’s the others – the parents of Texas kids who’ve died as a result of a stupid social media ‘challenge’ or by suicide after being pulled down the dangerous rabbit holes social media uses to hook their users, addict them on their products, and drive them to depression, anxiety, and suicidal ideation.”

“Finally, there’s the perfectly happy and healthy teens in Texas today, who will find themselves slowly falling off the edge before the legislature meets again in 2027,” he stated.

Patterson suggested he would try to pass the measure again when the Texas Legislature meets in 2027.

House Bill 186 would have prohibited a child from entering into a contract with a social media platform to become an account holder and required platforms to verify that a person seeking to become an account holder is 18 years of age or older before allowing them to create an account.

The legislation would have also required social media platforms to delete accounts belonging to individuals under the age of 18 at a parent or guardian’s request.

Keep reading