Smarter, Colder, Faster: Quantum Amplifier Breakthrough Makes Quantum Computing Up to 10x More Efficient

As quantum computing systems scale toward thousands—if not millions—of qubits, the often-overlooked quantum amplifier that listens to each qubit becomes increasingly critical. Researchers in Sweden say they’ve engineered a smarter, ultra-low-power quantum amplifier that could dramatically ease one of quantum computing’s biggest engineering headaches.

A new study from Chalmers University of Technology, in collaboration with Low Noise Factory AB, unveils a cryogenic amplifier that switches on only when needed, reducing both the energy consumption and the thermal noise that threaten the fragile states of quantum bits, or qubits.

The breakthrough, detailed in IEEE Transactions on Microwave Theory and Techniques, could pave the way for truly large-scale, fault-tolerant quantum computers, marking a significant advance in the field.

“This is the most sensitive amplifier that can be built today using transistors,” lead author and Chalmers doctoral student Yin Zeng said in the Chalmers press release. “We’ve now managed to reduce its power consumption to just one-tenth of that required by today’s best amplifiers – without compromising performance. We hope and believe that this breakthrough will enable more accurate readout of qubits in the future.”

X-Ray Telescopes Reveal 23-Million-Light-Year Filament That May Help Solve “Missing Matter” Mystery

A potential solution to the decades-long “missing matter” problem may have been uncovered: astronomers’ recent analysis of X-ray data has identified a filament of hot gas, 10 times the size of the Milky Way, filling the space between four galaxy clusters.

While the discovery does not completely answer the question of where all of the currently unaccounted-for matter resides, the filament does appear to represent a significant chunk of it. Astronomers sourced the data used in the new research from the European Space Agency’s XMM-Newton and JAXA’s Suzaku X-ray space telescopes.

Missing Matter

Current models of the universe have a major shortcoming: they can’t fully account for all the matter that should exist. While dark matter and dark energy—detectable only by their effects—compose most of the cosmos, visible matter accounts for just about 5%. Yet even among that 5%, nearly half of the expected matter remains missing.

One possible explanation is the existence of long, tenuous strings of gas called “filaments.” However, detecting these structures is notoriously difficult, as they are extremely faint and often obscured by brighter cosmic phenomena like galaxies and black holes. The breakthrough in the new research lies in the team’s successful identification and characterization of a hot gas filament connecting four galaxy clusters.

“For the first time, our results closely match what we see in our leading model of the cosmos – something that’s not happened before,” says lead researcher Konstantinos Migkas of Leiden Observatory in the Netherlands. “It seems that the simulations were right all along.”

Identifying the Missing Matter

The four galaxy clusters and the filament linking them are part of the Shapley Supercluster, one of the largest known structures in the universe, containing around 8,000 galaxies. Two clusters sit on each side of the filament, which stretches 23 million light-years diagonally away from Earth.

XMM-Newton and Suzaku’s X-ray data were crucial to mapping the filament’s properties, supported by optical data from multiple sources. Each telescope contributed a unique perspective: Suzaku scanned a broad area of space, while XMM-Newton focused on identifying supermassive black holes within the filament and removing their interference from the data.

“Thanks to XMM-Newton we could identify and remove these cosmic contaminants, so we knew we were looking at the gas in the filament and nothing else,” adds co-author Florian Pacaud of the University of Bonn, Germany. “Our approach was really successful, and reveals that the filament is exactly as we’d expect from our best large-scale simulations of the Universe.”

How China Is Censoring Scientific Research Across The Globe

We all know how serious environmental degradation is in China. Its emissions have skyrocketed, air and water quality have plummeted, and critical habitat and ecosystems have disappeared. That’s why unadulterated research on the topic is critical to better informed policy. But my recent experience shows that China’s censorship model is spreading to the West, hindering that research from taking place.

In 2012 I published an academic paper in the journal Environmental Politics coining the term “authoritarian environmentalism” to describe the way that environmental policy is made in China. This year, I was approached by Lu Liao, a professor of urban planning at Renmin University in Beijing, to submit a paper to a special issue on China in Environmental Policy and Governance, a respected journal published by the major academic publisher Wiley, based in New Jersey.

I suggested reviewing what we have learned about “authoritarian environmentalism” since 2012. “The idea of revisiting the 2012 paper sounds very timely and meaningful,” replied Liao, who sits on the editorial board of Environmental Policy and Governance.

That’s when things went awry. The proposal I sent her included a new research question about whether the policy model in China is flawed by design, a form of greenwashing intended to legitimate one-party rule rather than improve the environment.

After a few days, Liao wrote back to report some “intriguing context from my own position,” as she called it. “Due to current sensitivities around ideology and international relations in China, many Chinese universities are quite cautious about discussions involving certain terms, and faculty are prohibited from publish[ing] work on some sensitive topics.”

I was “invited” to withdraw my submission and seek publication elsewhere. China’s censorship regime was being extended to a Western scholar and to a Western academic journal.

I reached out to the journal’s editor, Andy Gouldson, professor of environmental policy at Leeds University, who has done work in China, seeking clarification. He confirmed that “there are sensitivities for the guest editors of the special issue” and invited me to submit the paper as a regular contribution. I’ll decline. I won’t publish in a journal that bends to China’s censorship regime.

Put aside the irony that my research on authoritarianism in China was sidelined by authoritarianism in China. The bigger scandal here is how Western academics and publishers are willing to allow PRC censorship to dictate the terms of their trade.

Study Confirms Controversial 23,000-Year-Old Human Footprints, Challenging Past Views on Peopling of the Americas

New radiocarbon dating of purportedly 23,000-year-old footprints discovered in a dried lakebed in White Sands, New Mexico, has confirmed their age, reigniting controversy regarding the earliest arrival of humans in the Americas.

Several scientists have questioned the early dating of the fossil footprints, and have noted the lack of artifacts found at the location. However, the scientists behind the newly confirmed dates say the transitory nature of their location supports the idea that the makers of the 23,000-year-old footprints were likely only passing through and did not leave any objects behind.

23,000-Year-Old Human Footprints Appear 10,000 Years Too Early

For much of the 19th and early 20th centuries, archaeologists believed humans had arrived in the Americas only 3,000-4,000 years ago. In the late 1920s, archaeological discoveries at sites like Folsom and Clovis in New Mexico pushed that date back thousands of years, and the most commonly accepted date for human arrival was eventually extended to 13,000 years ago. This date is supported by geological history, which indicates that the land bridge between Asia and North America would not have been passable 10,000 years earlier.

The situation changed in 2019 when researchers from the UK’s Bournemouth University and the U.S. National Park Service unearthed a series of undoubtedly human footprints in White Sands dated to between 21,000 and 23,000 years ago. As noted, those findings, which were published in 2021, remain highly controversial since they seem to go against a relatively well-established timeline.

“The immediate reaction in some circles of the archeological community was that the accuracy of our dating was insufficient to make the extraordinary claim that humans were present in North America during the Last Glacial Maximum,” said study author and U.S. Geological Survey (USGS) research geologist Jeff Pigati in a later statement.

Even Pigati and colleagues’ 2023 follow-up analysis supporting the extremely ancient date, along with a separate study offering evidence of 22,000-year-old transport technology in the same area and the discovery of an alternate, ancient ice-highway route from Asia to North America, still did not manage to settle the debate.

Recently, Vance Holliday, an archaeologist and geologist from the University of Arizona whose 2012 study of the White Sands area just a few yards from the location of the footprints assisted with their initial 2021 dating, returned to perform a new analysis of the footprints. Unlike previous tests that relied on seeds and pollen to date the footprints, Holliday and his team used radiocarbon dating of ancient mud, performed in an independent lab, to confirm the controversial dates.

New Soil Radiocarbon Dates Confirm Ancient Origin

Before returning for a new set of tests, Holliday enlisted the help of Jason Windingstad, a doctoral candidate in environmental sciences who worked as a consulting geoarchaeologist for previous research projects at White Sands.

During several outings in 2022 and 2023, the duo dug a new series of trenches in the dried ancient lakebeds. These efforts included collecting ancient mud samples taken from the beds of a stream where the supposedly 23,000-year-old footprints were discovered. Holliday says even more ancient evidence was likely here at one time, but millennia of wind erosion have left scarce material for his team to study.

“The wind erosion destroyed part of the story, so that part is just gone,” he explained. “The rest is buried under the world’s biggest pile of gypsum sand.”

“We Have Finally Found the Last Piece of the Puzzle”: Scientists Solve a Long-Standing Seismic Mystery

A long-standing seismic mystery has been solved: the anomalous behavior of earthquake waves as they travel almost 3,000 kilometers below ground toward the planet’s center has now been explained with the help of observational data.

ETH Zurich Professor of Experimental Mineral Physics Motohiko Murakami led the new study, in which his team attempted to recreate the extreme conditions of Earth’s deep interior. Their laboratory work demonstrated a unique mode of rock flow, distinct from that of liquid lava or brittle solid rock.

The D” Layer Anomaly

The lowest part of Earth’s mantle, the D” layer, sits 2700 kilometers deep, just above the boundary with the planet’s core. Strangely, earthquake waves suddenly speed up at this depth. Such an acceleration would typically indicate that the waves had passed into an entirely different type of material, a long-standing anomaly that has baffled seismologists.

Murakami made an important discovery over two decades ago, when in 2004 he found that around the D” layer boundary, the primary mineral changes from the perovskite that makes up the rest of the lower mantle. This new “post-perovskite” mineral endures the extreme temperatures and pressures at that depth.

For a few years, Murakami and his team believed that the changeover to this post-perovskite mineral explained the seismic acceleration. Yet in 2007, Murakami uncovered further evidence that the mineral change alone was insufficient to account for the shift in earthquake waves.

It was a complex computer model that provided the researchers with the missing piece of the puzzle: post-perovskite’s hardness changes depending on the direction its crystals point. The acceleration appears to occur when all of the mineral’s crystals become aligned in the same direction, a phenomenon that occurs at depths of around 2700 kilometers.

“We have finally found the last piece of the puzzle,” Murakami recently said in a statement.

Laboratory Pressure

To simulate post-perovskite, the team synthesized pure MgGeO3 orthopyroxene, using an electric furnace to heat a mixture of fine-grain germanium oxide and magnesium oxide at 1000 °C for 104 hours. The resulting substance was then compressed in a diamond anvil cell and heated with a CO2 laser to recreate the intense conditions found in the D” layer. The researchers took high-pressure acoustic velocity and X-ray diffraction measurements, which were analyzed with multiple spectroscopic techniques.

The team’s laboratory work successfully recreated the crystal alignment needed for the acceleration observed at the edge of the D” layer, demonstrating that heat and pressure can align the crystals in one direction, along which seismic waves speed up. This suggests that the anomaly is caused not by a change in material but by a change in deformation.

Solving the Seismic Mystery

Exactly how these crystals manage to align in parallel relies on a type of movement long suspected by geoscientists, yet one that had lacked direct evidence until now. The hypothesis is that a form of convection, similar to the boiling of water, allows the solid rock in the lower mantle to flow horizontally. Murakami’s team’s experiments have finally demonstrated this long-suggested convective motion.

NIH Renews Grants for Harvard Monkey Lab, Fauci’s Beagle and Primate Tests

Despite President Donald Trump’s Department of Veterans Affairs and Navy moving to end cruel animal testing, the National Institutes of Health, under Director Jay Bhattacharya, has renewed millions in funding for controversial experiments, including THC tests on monkeys at Harvard, tick bites on beagle puppies, and Anthony Fauci’s notorious “Monkey Island,” prompting criticism from watchdog group White Coat Waste.

Last month, Gateway Pundit reported how President Trump’s Secretary of Veterans Affairs, Doug Collins, confirmed the department will end primate testing before a 2026 deadline set by Congress.

Following years of campaigning by the watchdog organization White Coat Waste, President Trump’s first administration set the VA on the path to ending testing on dogs, cats, and primates after WCW exposed how the agency was giving puppies heart attacks, injecting monkeys with angel dust, crippling kittens, drilling into cats’ skulls, and much more.

Also in May, Trump’s U.S. Navy banned all testing on dogs and cats. The Navy credited WCW, as well as journalist Laura Loomer, the Department of Government Efficiency, and Senator Rand Paul, “for bringing the issue of animal abuse to our attention, leading to the Navy’s decision to ban medical research testing on cats and dogs.”

But holdovers from the Obama and Biden Administrations appear to be preventing this kind of progress at the National Institutes of Health (NIH).

A Barack Obama-era NIH staffer, Dr. Nicole Kleinstreuer, has been appointed by NIH Director Jay Bhattacharya to be the NIH’s Acting Deputy Director for Program Coordination, Planning, and Strategic Initiatives. Earlier this month, Kleinstreuer told NPR that the NIH has “no intention of just phasing out animal studies overnight.”

The NIH has renewed several controversial animal testing projects initiated by Dr. Anthony Fauci and other NIH staff members.

Gateway Pundit has learned that the NIH has re-upped grant funds for THC experiments on young monkeys at Harvard University’s McLean Hospital that WCW exposed through a Freedom of Information Act request and that Gateway covered in April. The NIH has committed five more years of taxpayer funding to the project, which was initially scheduled to end on April 30, 2025, and has now received nearly $4.5 million.

Physics Demonstrates That Increasing Greenhouse Gases Cannot Cause Dangerous Warming, Extreme Weather or Any Harm

At the outset it is important to understand that carbon dioxide has two relevant properties, as a creator of food and oxygen, and as a greenhouse gas (GHG).

As to food and oxygen, carbon dioxide is essential to nearly all life on Earth, creating food and oxygen through photosynthesis. Further, it creates more food as its level in the atmosphere increases. For example, doubling carbon dioxide from today’s approximately 420 ppm to 840 ppm would increase the amount of food available to people worldwide by roughly 40%, while having a negligible effect on temperature.

As to carbon dioxide as a GHG, the United States and countries worldwide are vigorously pursuing rules and subsidies under the Net Zero Theory that carbon dioxide and other GHG emissions must be reduced to Net Zero and the use of fossil fuels must be eliminated by 2050 to avoid catastrophic global warming and more extreme weather. A key premise stated by the Intergovernmental Panel on Climate Change (IPCC) is that the “evidence is clear that carbon dioxide (CO2) is the main driver of climate change,” where “main driver” means responsible for more than 50% of the change.[1]

The Biden Administration adopted over 100 rules and Congress has provided enormous subsidies promoting alternatives to fossil fuel premised on the Net Zero Theory. The EPA Endangerment Finding, for example, asserts “elevated concentrations of greenhouse gases in the atmosphere may reasonably be anticipated to endanger the public health and to endanger the public welfare of current and future generations.”[2]

On April 9, 2025 President Trump issued a “Memorandum on Directing Repeal of Unlawful Rules” and Fact Sheet stating “agencies shall immediately take steps to effectuate the repeal of any [unlawful] regulation” under Supreme Court precedents, inter alia, where “the scientific and policy premises undergirding it had been shown to be wrong,” or “where the costs imposed are not justified by the public benefits.”[3] We understand the Supreme Court has also ruled in the leading case State Farm[4] that an agency regulation is arbitrary, capricious and thus invalid where, inter alia:

  • “the agency has … entirely failed to consider an important aspect of the problem”
  • “the agency has relied on factors which Congress has not intended it to consider.”

We are career physicists with special expertise in radiation physics, which describes how CO2 and other GHGs affect heat flow in Earth’s atmosphere. In our scientific opinion, contrary to most media reporting and many people’s understanding, the “scientific premises undergirding” the Net Zero Theory, all the Biden Net Zero Theory rules, and the congressional subsidies are scientifically false and “wrong,” and violate these two State Farm mandates.

First, Scientific Evidence Ignored. All the agency rules, publications, and studies we have seen supporting the Endangerment Finding and other Biden Net Zero Theory rules ignored, as if it did not exist, the robust and reliable scientific evidence that:

  • carbon dioxide, GHGs and fossil fuels will not cause catastrophic global warming and more extreme weather, detailed in Part III.
  • reducing CO2 and other GHGs to Net Zero and eliminating fossil fuels would endanger public health and welfare, with disastrous consequences for the poor, people worldwide, future generations, Americans, America, and other countries, detailed in Part IV.

Second, Unscientific Evidence at the Foundation. Unscientific evidence is all we have seen underlying the Endangerment Finding and all the other Biden Net Zero rules, detailed in Part V.

Third Chinese scientist charged with smuggling illegal biological pathogen into US from Wuhan

A third Chinese scientist has been charged with smuggling biological materials into the United States after a University of Michigan student and her boyfriend were caught last week.

Chengxuan Han was arrested on Sunday at Detroit Metropolitan Airport and charged with smuggling goods into the US.

Police allege Han sent four packages which ‘contained biological material related to round worms’ from China to the US.

The packages were sent between September 2024 and March 2025 and addressed to people linked to the laboratory at the University of Michigan. 

Han initially denied sending the packages at all, according to court documents. She later insisted they contained plastic cups, rather than petri dishes. 

According to the documents, she ultimately admitted sending the samples, which she had collected during her research as a Ph.D. student in Wuhan, China.

The charges come less than a week after University of Michigan postdoctoral fellow Yunqing Jian, 33, was charged alongside Zunyong Liu, 34, for attempting to smuggle a weapon of ‘agroterrorism’ into the United States in a sinister plot allegedly tied to the Chinese Communist Party.

Liu arrived in the United States from China in July 2024 carrying four small baggies of Fusarium graminearum, a fungus responsible for causing billions of dollars’ worth of damage to livestock, wheat, barley, maize, and rice globally each year.

All three of the accused have links to the same university laboratory.

Japan Is Developing the World’s First Artificial Womb

Researchers in Japan are developing artificial womb technology, a groundbreaking innovation that could change how we care for premature infants and even reshape the future of childbirth.

This isn’t science fiction—it’s a reality scientists are working toward, and Japan is leading the way.

Let’s explore what this technology is, how it works, and what it means for the world.

What Is Artificial Womb Technology?

An artificial womb is a device designed to mimic the environment of a natural womb. It provides a safe, controlled space for a fetus to grow outside the mother’s body.

The system uses a fluid-filled chamber that acts like amniotic fluid, along with machines to supply oxygen and nutrients through the umbilical cord.

In Japan, scientists have tested this technology on animals like goats and sharks, successfully keeping embryos alive for weeks.

For example, researchers at Juntendo University sustained goat fetuses for up to three weeks in a plastic tank filled with artificial amniotic fluid.

This is a big step toward using the technology for human babies, especially those born extremely premature.

The goal is to help babies born before 37 weeks, who often face serious health risks.

According to the World Health Organization, 15 million babies are born prematurely each year, and 1 million die due to complications.

Artificial wombs could offer a lifeline by allowing these infants to continue developing in a womb-like environment, improving their chances of survival and healthy growth.

Nearly Everything That We’ve Been Told about Genes and Autism Is Wrong

The University of Sydney caps doctoral theses at 80,000 words (excluding references). The theory is that external reviewers don’t want to read more than that (true!). One can apply to the Dean to increase the word limit to 100,000, which is what I did. But my doctoral thesis, as initially written, was closer to 140,000 words. So I had to cut three chapters that I really liked — the political economy of theories of genetic causation, how evidence-based medicine was captured by Big Pharma, and the history of the regulation of mercury.

I believe that some of the information in those excised chapters would be useful to policymakers in Washington, D.C. trying to figure out how to deal with the epidemics of chronic disease in children. So today I am sharing my original (slightly updated), never-before-seen chapter 6, which challenges the entire paradigm of genetic determinism in disease causation.

I. Introduction

In the first chapter, I showed that the rise in autism prevalence is primarily a story of environmental triggers (with some smaller percentage due to diagnostic expansion and genetics). The story of how genetic theories became the dominant narrative in the autism debate thus needs to be explained. The hegemony of genetic theories of disease causation comes at a tremendous cost to society because they crowd out more promising alternatives. This problem is particularly acute in connection with autism, where genetic research swallows up the vast majority of research funding — and has for more than twenty years. So, one of the keys to effectively addressing the autism epidemic will be to demonstrate the flaws in the genetic approach to disease causation and replace it with a more comprehensive ontology that has better explanatory power.

To put this debate in context, I want to recap the genetic argument in connection with autism as I have presented it thus far. In the 1990s, it was routine for scientists, doctors, and policymakers to assure worried parents that autism was genetic. To the extent that anyone ventured a guess, the explanation was that autism was 90% genetic, 10% environmental. Then the state of California commissioned 16 of the top geneticists in the country (Hallmayer et al. 2011) to study birth records of all twins born in the state between 1987 and 2004. Hallmayer et al. (2011) concluded that, at most, genetics explains 38% of the autism epidemic, and they pointed out twice that this was likely an overestimate. Blaxill (2011) argues that the eventual consensus will be 90% environmental, 10% genetic. And in chapter 5, I showed a model from Ioannidis (2005b, p. 700) that suggests that only 1/10th of 1% of “discovery oriented exploratory research studies” (which include nutrition and genetic studies with massive numbers of competing variables) are replicable.

And yet, a disproportionate share of federal research money in connection with autism is going to study genetic theories of disease causation. In 2013, the Interagency Autism Coordinating Committee spent $308 million on autism research across all federal agencies and private funders participating in research (IACC, 2013a). This is a shockingly low amount to spend on research given estimates that autism is currently costing the US $268 billion a year (Leigh and Du, 2015).

When one drills down into how the IACC spent the $308 million, it is largely focused on genetic research (especially if one examines the funding category “What Caused This To Happen And Can This Be Prevented?”) (IACC, 2013b). This is in spite of the fact that several groups of leading doctors and scientists, including Gilbert and Miller (2009), Landrigan, Lambertini, and Birnbaum (2012), the American College of Obstetricians and Gynecologists (2013), and Bennett et al. (2016), have all concluded that autism and other neurodevelopmental disorders are likely caused by environmental triggers.
