The War on (Some) Drugs: Why Are We Still Talking About This?

Prohibition is an awful flop.
We like it.
It can’t stop what it’s meant to stop.
We like it.
It’s left a trail of graft and slime,
It don’t prohibit worth a dime,
It’s filled our land with vice and crime,
Nevertheless, we’re for it.

— “Prohibition” by Franklin P. Adams, 1931.

William Stewart Halsted is known as the “father of modern surgery.” He was one of the four founding physicians of Johns Hopkins Hospital, which opened in 1889, and he is credited with surgical innovations including promoting antiseptic practices and the discovery that cocaine, when injected into the skin, could be used as a local anesthetic. He was also a drug addict.

Halsted’s drug use began with cocaine, and after a few failed attempts at kicking the habit, he switched to morphine. He spent more than 40 years addicted to the drug, all while maintaining one of the most distinguished careers in the history of surgery. According to Sir William Osler, one of the co-founders of Johns Hopkins, Halsted could not get through the day without a minimum of 180 milligrams of morphine. “On this,” said Osler, “he could do his work comfortably, and maintain his physical vigor.”

Halsted’s story illustrates the reality that—while perhaps not desirable—it is possible to both be addicted to narcotics and still function very well in society. Imagine if America had been in the throes of the War on (Some) Drugs in the 19th century, and instead of doing groundbreaking work as a surgeon and helping to build one of the country’s most prestigious hospitals, Halsted had been thrown into a prison cell. Who would have benefited from that outcome?

More to the point: How many Halsteds are rotting away in prison today, and what gifts are we all missing out on as a result?

In Halsted’s day, drug addiction looked very different from what it looks like today. Federal control of narcotics only came about in 1914, with the passage of the Harrison Narcotics Act.

Before that, anyone could walk into a drug store and purchase medicines—and even soft drinks—that contained opium or cocaine. And some did become addicted.

But, as Mike Gray writes in Drug Crazy:

“It was not until the late 1800s that the public began to realize that some of their favorite medicines could be highly addictive. … At that time, the highest credible estimates put the number of U.S. addicts at about three people in a thousand. Others thought it was half that.” (Note: Some estimates put the number as high as one in two hundred.)

“All the leading authorities now agree,” he writes, “that addiction peaked around 1900, followed by a steady drop. The reason was simple common sense coupled with growing awareness.”

Keep reading

The Myth of the “Robber Barons”: James Hill versus the Crony Competitors

Whether we like it or not, the Progressive Era and its mainstream historical interpretation—even when fictional—have virtually defined our last century. The dominant, though false, narrative is basically that unfettered free-market capitalism led to negative outcomes, that “robber barons” monopolized the market to their benefit, and that disinterested federal regulation brought discipline to this system, keeping its benefits while curbing its excesses. For that reason, among others, entrepreneurs and businesses have been maligned, even as society enjoyed the benefits they produced.

Thankfully, important historical work has been done to correct the dominant narrative. One such work is Burton Folsom’s The Myth of the Robber Barons: A New Look at the Rise of Big Business in America. This work—rather than relying on popular, but inaccurate, historical narratives—examines the contributions of several key American entrepreneurs. Unfortunately, rather than learning positively from real-life examples of successful entrepreneurs and the dangers of government intervention and cronyism, “many historians have been teaching the opposite lesson for years” (p. 121). Folsom continues,

They have been saying that entrepreneurs, not the state, created the problem. Entrepreneurs, according to these historians, were often “robber barons” who corrupted politics and made fortunes bilking the public. In this view, government intervention in the economy was needed to save the public from greedy businessmen. This view, with some modifications, still dominates in college textbooks in American history. (pp. 121-122)

Crucially, Folsom makes a useful distinction between “political entrepreneurs” and “market entrepreneurs” (p. 1):

Those who tried to succeed in [business] through federal aid, pools, vote buying, or stock speculation we will classify as political entrepreneurs. Those who tried to succeed in [business] primarily by creating and marketing a superior product at a low cost we will classify as market entrepreneurs.

This distinction is critical because it qualitatively differentiates those who succeed through the production-and-exchange mechanism and those who use the political means and cronyism to gain wealth at the expense of the public. One example, though imperfect, is the main subject of this article—James J. Hill and his Great Northern transcontinental railroad.

Keep reading

The New York Times Wants An America Without Americans

On Tuesday, Leighton Woodhouse wrote for The New York Times that conservatives are “spinning” a “mythology” that is “historically delusional.”

The delusional mythology Woodhouse is referring to? The belief that Americans are a “group of people with a shared history.”

According to Woodhouse, “The founding fathers were an assortment of people from different histories and backgrounds who coexisted — often just barely.” These “different” histories, however, were all rooted in Christianity. But Woodhouse wants readers to believe that this type of variety in Christianity proves America was born out of a multicultural diversity experiment.

Of course it wasn’t. The colonists shared a common language, a moral framework, and, by and large, a lineage. Yet Woodhouse insists otherwise.

The United States isn’t exceptional because of our common cultural heritage; we’re exceptional because we’ve been able to cohere despite faiths, traditions and languages that set us apart, and sometimes against one another. The drafters of the Constitution tried to create that cohesion by building a government that could transcend our divisions.

In other words, Woodhouse is arguing that America is not the product of Americans at all. Rather, it’s just a cosmopolitan conglomerate held together by particular processes but not a particular people. It’s why Woodhouse invokes “Mexican, Korean, Somalian” “ancestries” as examples of American heritage on par with the English, Irish, and Scottish settlers. The implication, of course, is that America would be just as American even without “heritage Americans.”

But that’s not how nations work. As The Federalist’s John Daniel Davidson wrote in these pages, the very premise of the entire American legal and civic culture emerged from the specifically Christian claim that “All men are created equal,” and such conviction “arrived in America by way of settlers and pioneers who came here specifically to establish a nation where they could practice their Christian faith as they saw fit.”

“The only people who ever took that self-evident truth [that all men are created equal] and used it as a foundation on which to forge a new nation were the English colonists in America,” Davidson pointed out. Not Mexicans, not Koreans, not Somalians, but English colonists who created America and thus became the first Americans.

And despite Woodhouse’s best efforts, there is in fact such a thing as a heritage American. They are the descendants of those who settled this land, fought for its independence, and built our institutions. The great statesmen of our nation understood this. They spoke not of a diverse collection of foreigners as tying the nation together, but of a people bound by blood, memory, and the sacrifices of the generations that came before them.

Keep reading

Alaska Schools’ Social Studies Standards Omit Washington, Lincoln, And Christianity 

Alaska’s new social studies standards don’t mention the Nome Gold Rush. They don’t mention the Trans-Alaska Pipeline System. They don’t mention William Egan, the state of Alaska’s first governor, and they don’t mention Sarah Palin, who ran for Vice President of the United States. There’s a lot more that’s missing in the Alaska social studies standards, but you can tell right away that something is wrong when Alaska’s social studies standards leave Alaska’s children ignorant of the headlines of Alaska’s history and the most famous Alaskans.

Education departments in every state are on radical autopilot when they make social studies standards. Americans expect blue states to use their state social studies standards to impose identity politics ideology and action civics (vocational training in progressive activism) on schools and students, strip out factual content, ignore or slander the history of Western civilization and America, and call it “social studies instruction” — that’s what you get in states such as Connecticut, Rhode Island, and Minnesota. But radical activists embedded in state education departments do the same thing in red states whenever policymakers and citizens aren’t looking. That’s what just happened in Alaska.

The Alaska Social Studies Standards (2024), produced by Alaska’s Department of Education and Early Development, avoided the worst of the blue-state social studies standards’ extreme politicization, unprofessional vocabulary, and ideologically extreme content. That’s largely because there’s hardly any historical content at all. Missing are basic facts of American history, much of how our government works, and our foundational documents of liberty. The standards also introduced a substantial amount of new politicized material.

How did Alaska’s Department get its curriculum so badly wrong?

The department outsourced much of the standards to the radical activists who have captured the national social studies establishment. Alaska’s standards take their structure and emphases from the National Council for the Social Studies’ (NCSS) ideologically extreme definition of social studies, as well as from its College, Career, and Civic Life (C3) Framework for Social Studies State Standards. The C3 Framework replaces content knowledge with insubstantial and opaque “inquiry”; lards social studies with identity politics ideologies such as Critical Race Theory; and inserts ideologically extreme activism pedagogies such as Action Civics.

Keep reading

The Left’s ‘stolen-land’ rhetoric threatens private property

Left-wing “land acknowledgements” could be having real-world consequences for property owners in Canada. And the United States may be next.

It began as a polite ritual. Before meetings or ceremonies, institutions began acknowledging that their buildings sit on land once inhabited by Indigenous peoples: “We recognize this is the unceded territory of the [tribe name].” The practice, with roots in Australia as far back as the 1970s, was picked up in Canada following the 2015 Truth and Reconciliation Commission report and spread quickly to left-leaning universities, city councils, and churches in the 2020s. Many saw it as a mere courtesy. But beneath the symbolism lies a deeper political movement that could erode the very foundation of private property.

In Canada, that shift is already underway. A British Columbia Supreme Court ruling this year suggested that even privately owned, fee-simple land might rest on “defective and invalid” title if an Aboriginal title still exists. For a nation built on English common-law property rights, that’s quite a statement. As columnist Kevin Klein warns in the Winnipeg Sun, Ottawa’s silence on the issue is turning Crown land — once considered secure — into “conditional land.” If the Crown’s title is conditional, how long before yours is?

Land acknowledgements may sound harmless, but they prepare the rhetorical ground for these legal arguments. Once governments, universities, and corporations declare publicly that their property sits on “stolen land,” they’ve already accepted the premise that they don’t actually own it. Activists then insist that recognition demands restitution — and suddenly the issue moves from ceremony to court.

That’s what’s happening in Canada, where some judges now treat Indigenous land claims as concurrent with existing titles. For investors, homeowners, and farmers alike, that’s a recipe for uncertainty — and eventually, seizure of land.

The Left insists this is “reconciliation,” not revolution. But the outcome is the same. Private property rights are fundamental to Western liberty. If property is always subject to retroactive moral judgments or undefined shared stewardship, ownership gives way to temporary permission.

In the United States, land acknowledgements have also run rampant, typically in the same academic and bureaucratic circles that look askance at capitalism and private property.

None of this means ignoring history or dismissing past injustices, just refusing to let symbolic guilt erode the legal system. Reconciliation should not come at the cost of the rule of law. Governments must make clear that while we honor history, property rights remain absolute under modern law.

The growing unease north of the border is a warning to America: beware the moral language that undermines legal foundations. Today’s “land acknowledgement” may be tomorrow’s title challenge. And once you concede the premise that your land isn’t really yours, it may not be for long.

Keep reading

The Propaganda of American Schooling: A History of Lies and Indoctrinated Youth

“History is a set of lies agreed upon.” These were the words of the infamous French dictator and military strategist Napoleon Bonaparte.

It is a well-known concept that history is often written by the victor—that when two cultures or ideologies clash, the one that prevails and gains more power and influence is the one whose side of the story the record favors. Yet, despite this being a fairly common adage, it is often overlooked just how profoundly it shapes our understanding of the present—or, more aptly, our misunderstandings.

Many still fail to grasp that the history they cling to so fervently—often as a cornerstone of political or national identity—is a carefully curated fable, designed to secure their allegiance through misbelief. Likewise, few recognize how the formalized education system of the early 20th century was deliberately shaped by the robber barons of the predator class, particularly Rockefeller and Carnegie, not as institutions of higher learning, but as tools for controlling the public and molding the minds of the masses to serve their interests.

Reverend Frederick T. Gates, the business advisor to John D. Rockefeller Sr. who helped him found the General Education Board in 1902, elaborated their vision in his book The Country School Of Tomorrow —

“In our dream we have limitless resources, and the people yield themselves with perfect docility to our molding hand. The present educational conventions fade from our minds; and, unhampered by tradition, we work our own good will upon a grateful and responsive rural folk. We shall not try to make these people or any of their children into philosophers or men of learning or of science. We are not to raise up among them authors, orators, poets, or men of letters. We shall not search for embryo great artists, painters, musicians. Nor will we cherish even the humbler ambition to raise up from among them lawyers, doctors, preachers, statesmen, of whom we now have ample supply.

“For the task we set before ourselves is very simple as well as a very beautiful one, to train these people as we find them to a perfectly ideal life just where they are.”

In the context of modern American society, much of the mythology that makes up the concept of “American exceptionalism” is in fact a fabrication in line with this agenda, creating the docile public of Rockefeller’s vision.

Keep reading

L.A. School District to Ban Fifth-Grade Plays About U.S. History: ‘Culturally Insensitive’

The Los Angeles Unified School District (LAUSD) is banning a celebrated series of fifth-grade musical plays about American history at a local charter school because, the district says, they are “culturally insensitive.”

For nearly three decades, the fifth-graders at Marquez Charter Elementary in Pacific Palisades have performed musicals about crucial periods in the formation of the United States.

These include Miracle in Philadelphia, about the Constitutional Convention; Hello, Louisiana!, about the voyage of Lewis and Clark; and Water and Power, about the Industrial Revolution. (A fourth-grade play, Gold Dust or Bust, focuses on the history of California.)

The musicals, co-written by Jeff Lantos (with music composed by the late jazz pianist Bill Augustine), are so successful in conveying historical details that Marquez students consistently score off the charts in history assessments.

A 2004 academic study of the Marquez plays observed: “Students who attended Marquez Elementary School scored more than twice as many items correctly [on history tests] as did students from other schools.”

Keep reading

Ukraine’s Embrace of Suicidal Nationalism

The recent assassination of the Ukrainian neo-fascist politician Andriy Parubiy is a grim reminder of the far-right origins of the 2014 Ukrainian revolution — a revolution which eventually gave way to the full-scale Russian invasion of February 2022 and a war that has decimated the Ukrainian state.

At two key moments over the past 20 years, during 2004’s Orange Revolution and, a decade later, during the Maidan uprising, Ukraine’s nationalist political elites, at the urging of the American foreign policy establishment, sought to marginalize, stigmatize and eventually disenfranchise the substantial bloc of ethnic Russian citizens living in the country’s east and south.

That such an eventuality was possible (if not likely) was foreseen some 35 years ago by the last decent foreign policy president we’ve had, George H.W. Bush, who crafted a post-Cold War policy based on (1) a refusal to rub Russia’s diminished fortunes in its face and (2) a wariness of reawakening the poisonous sectarianism that so marked the politics of Eastern and Central Europe at mid-century.

Bush’s emphasis was on avoiding creating unnecessary crises within the post-Soviet space rather than provoking new ones (as subsequent Republican and Democratic administrations have chosen to do). As Bush’s secretary of state James A. Baker later wrote: “Time and again, President Bush demanded that we not dance on the ruins of the Berlin Wall. He simply wouldn’t hear of it.”

The nature of the Cold War had changed with Mikhail Gorbachev’s UN Speech of December 7, 1988. Gorbachev announced that the USSR was abandoning the class struggle that for decades served as the basis for Soviet foreign policy. In place of that, Gorbachev declared that Eastern European states were now free to choose their own paths, declaring that “the compelling necessity of the principle of freedom of choice” was “a universal principle to which there should be no exceptions.”

Gorbachev continued:

…The next U.S. administration, headed by President-elect George Bush, will find in us a partner who is ready – without long pauses or backtracking – to continue the dialogue in a spirit of realism, openness and good will, with a willingness to achieve concrete results working on the agenda which covers the main issues of Soviet-U.S. relations and world politics.

Initially, Bush and his team were skeptical of Gorbachev. In his memoirs, Bush’s National Security Advisor Brent Scowcroft dismissed Gorbachev’s overture, writing that the speech “had established, with a largely rhetorical flourish, a heady atmosphere of optimism.” Scowcroft, echoing the analysis offered to him by the CIA, worried that Gorbachev would then be able to “exploit an early meeting with a new president as evidence to declare the Cold War over without providing substantive actions from a ‘new’ Soviet Union.”

The caution with which Bush and his team treated Gorbachev likewise was extended to the newly or soon-to-be independent states in Eastern Europe.

There was to be no dancing on the ruins of the Berlin Wall.

Keep reading

American History’s Stark Warning Against Tolerating Political Violence

In the days since Charlie Kirk’s murder, many have expressed incredulity about the condition of the country. Our circumstances may be unique, but the movements of political societies follow clear patterns. We have been deeply polarized before, and the cause, now and then, is the same: disagreement about the fundamental kind of country we believe we should be.

In May 1856, Charles Sumner of Massachusetts took to the floor of the U.S. Senate to denounce the use of force and fraud to plant slavery, and its inevitable offspring, oligarchy, in the territory of Kansas. Southern statesmen, who composed the interstate oligarchy of the slave states, sought to admit Kansas into the Union as a slave state, expanding their power.

Since at least 1854, Sumner had been among the few who recognized that the fight over slavery had taken on a new character. Not only did the fate of slavery depend on the outcome of that fight, but so did the future form of American government – whether all of America would be republican, as the Founders intended and as the northern states were, or whether America would be converted to an oligarchy, the prevalent form of government in the South.

Sumner’s “Crime Against Kansas” speech was long, direct, and forceful. A few days later, Representative Preston Brooks of South Carolina entered the Senate chamber with his lieutenants, Representatives Laurence Keitt of South Carolina and Henry Edmundson of Virginia, and commenced caning Sumner, who was sitting, his legs locked beneath his desk.

While Sumner could not defend himself from the blows, Keitt (brandishing a pistol) and Edmundson stood by, Antifa-like, and prevented anyone from coming to Sumner’s aid. Brooks beat Sumner over the head nearly to death and left him unconscious in a pool of blood. Southern newspapers, the mainstream media at the time, praised the attack and blamed Sumner’s words for bringing the violence upon himself. Supporters of Brooks feted him in person and mailed him new canes and congratulatory letters.

Ironically, the violence and the approving response to the violence verified exactly what Sumner claimed at the beginning of his speech. Everyone could feel that the country was polarized down to its core. But why? Sumner contrasted ordinary and extraordinary politics, ordinary and extraordinary political disagreements. Statesmen representing the country were not merely debating whether a number on a tariff schedule should be 5 or 10 percent.

Kansas was a flashpoint in a more consequential, extraordinary struggle. Each side was contending for a way of life and form of government abhorrent to the other. The general consensus had broken down; the American political regime was seriously destabilized. The oligarchy of the South rejected the basis of American republicanism: natural equality and fundamental liberties, including freedom of speech. The violence, and the approval of it, proved Sumner’s claim that the Southerners were oligarchic in character.

Both Sumner and Kirk advanced their causes in the manner of American republicanism: they used words; they exercised their freedom of speech to persuade. Their attackers and the attackers’ supporters, on the other hand, showed their contempt for free speech by resorting to force, and thereby showed that they actively rejected the principles and general consensus that had underpinned the American political regime.

Keep reading

How one million white Europeans – many seized on the south coast of England – were sold to the Muslim world and brutally exploited in the slavery scandal the Left DON’T want to speak about

When Englishman Thomas Pellow was 27, he led a slave-hunting expedition to the West African coast. His orders were to plunder the villages, kill the adults and capture the children.

But Pellow was not a mercenary employed in the transatlantic slave trade, which sent millions of its victims across the ocean. He was a slave himself – taken prisoner as a child by the Moroccan Sultan Moulay Ismail. And 300 years ago, he was far from alone. 

The sultan owned an estimated 25,000 European slaves, many seized in raiding expeditions on the south coast of England as well as countries as far afield as Iceland.

Though it is almost forgotten today – suppressed, perhaps, by some squeamish historians – the Muslim trade in both black African and white European slaves was deeply feared for three centuries.

Yet, at the time, dozens of memoirs, many of them bestsellers, were published by former slaves who had escaped from captivity, with horrendous stories of torture, rape and cold-blooded murder.

Now, a book by historian Justin Marozzi unflinchingly reveals the extent of slavery in Arab countries, which was conducted with unequalled brutality.

More shocking still, he shows that it continued in much of the Islamic world well into the 20th century – and, for hundreds of thousands of West Africans born into life as slaves, carries on to this day.

For Marozzi to investigate these stories, let alone publish, is courageous. His book invites an inevitable backlash from Left-wing academics and broadcasters who focus solely on the slave trade triangle between Europe, West Africa and the Americas that operated from the 16th to the 19th centuries.

Keep reading