Scientists mimicking the Big Bang accidentally turn lead into gold

Medieval alchemists dreamed of transmuting lead into gold.

Today, we know that lead and gold are different elements, and no amount of chemistry can turn one into the other.

But our modern knowledge tells us the basic difference between an atom of lead and an atom of gold: the lead atom contains exactly three more protons. So can we create a gold atom by simply pulling three protons out of a lead atom?

As it turns out, we can. But it’s not easy.

While smashing lead atoms into each other at extremely high speeds in an effort to mimic the state of the universe just after the Big Bang, physicists working on the ALICE experiment at the Large Hadron Collider in Switzerland incidentally produced small amounts of gold.

Extremely small amounts, in fact: a total of some 29 trillionths of a gram.
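These figures are easy to sanity-check with back-of-the-envelope arithmetic: removing three protons from lead (atomic number 82) yields gold (atomic number 79), and 29 trillionths of a gram of gold corresponds to tens of billions of atoms. A minimal sketch, using standard physical constants rather than numbers from the article:

```python
# Back-of-the-envelope check of the lead-to-gold figures.
AVOGADRO = 6.02214076e23   # atoms per mole (exact SI value)
GOLD_MOLAR_MASS = 196.97   # grams per mole of gold

# Lead (Z = 82) minus three protons gives gold (Z = 79).
Z_LEAD, Z_GOLD = 82, 79
assert Z_LEAD - 3 == Z_GOLD

# ~29 trillionths of a gram of gold, expressed as a count of atoms.
mass_grams = 29e-12
atoms = mass_grams / GOLD_MOLAR_MASS * AVOGADRO
print(f"{atoms:.1e} gold atoms")  # on the order of 9e10 atoms
```

Tens of billions of atoms sounds like a lot, but it is still roughly a hundred-trillionth of the gold in a typical wedding ring.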

How to steal a proton

Protons are found in the nucleus of an atom. How can they be pulled out?

Well, protons have an electric charge, which means an electric field can pull or push them around. Placing an atomic nucleus in an electric field could do it.

However, nuclei are held together by a very strong force with a very short range, imaginatively known as the strong nuclear force. This means an extremely powerful electric field is required to pull out protons – about a million times stronger than the electric fields that create lightning bolts in the atmosphere.

Keep reading

Bay Area scientist launches new company with sights on gene-edited babies

Last month, as he announced the launch, Harrington said that Preventive had raised almost $30 million in private funding.

The funding is reportedly coming from some heavy hitters in the tech world, including OpenAI CEO Sam Altman and his husband Oliver Mulherin.

Harrington also said his team included leading experts in the fields of reproductive technology, reproductive medicine and genome-editing.

“Our goal is straightforward,” he wrote, “to determine through rigorous preclinical work whether preventive gene editing can be developed safely to spare families from severe disease.”

Harrington acknowledged the major ethical concerns around the science and the gray areas in the regulatory process, which, he said, have opened the field to potentially detrimental outcomes.

“The combination of limited expert involvement and lack of a clear regulatory pathway has created conditions for fringe groups to take dangerous shortcuts that could harm patients and stifle responsible investigation,” the researchers said, adding, “Given that this technology has the potential to save millions of lives, we do not want this to happen.”

Gene editing can only be used in conjunction with in vitro fertilization, which allows for the first step: genetic testing on an embryo.

“It requires IVF because you have to have the embryo in a dish,” explained Stanford law professor Henry (Hank) Greely, a leading expert on ethical, legal, and social implications in bioscience technologies.

Once a test determines an embryo carries the DNA makeup of a genetic disease, such as Huntington’s or cystic fibrosis, scientists would then use the DNA editing technique known as Clustered Regularly Interspaced Short Palindromic Repeats, or CRISPR, to alter the DNA.

Keep reading

Ancient Cannabis Enzymes Reveal How THC and CBD First Evolved

Scientists are taking a deeper look at the origins of cannabis chemistry by reconstructing enzymes from ancient plants, offering new insight into how cannabis first developed the ability to produce compounds like THC and CBD.

In a recent study published in Plant Biotechnology Journal, researchers at Wageningen University & Research rebuilt molecular structures that existed millions of years ago, revealing that ancient forms of cannabis enzymes were more flexible and robust than those found in modern plants.

The team behind the research says they have successfully traced the evolution of cannabinoid chemistry and identified molecular tools that could improve the biotechnological production of modern medicinal cannabinoids.

The Origin of Cannabinoids

In modern cannabis plants, specialized enzymes are responsible for making individual cannabinoids like THC or CBD. Each enzyme is highly efficient at producing one specific compound. The new study shows that this precision is a recent development in cannabis evolution, rather than something that existed from the start.

Early ancestors of cannabis used versatile enzymes that could create several cannabinoids at once. These enzymes became more specialized over time as gene duplication occurred. This led to the distinct chemical profiles seen in cannabis plants today.

The research team provided direct evidence for this evolutionary process by reconstructing ancient cannabis enzymes in the lab. Their results show that the pathways for creating specific cannabinoids like THC appeared relatively recently and became more specialized over time through natural selection.

Rebuilding Lost Enzymes

The team relied on ancestral sequence reconstruction to study this evolutionary history. They compared DNA from modern cannabis and related species to determine what cannabinoid-producing enzymes looked like millions of years ago.

The researchers synthesized the predicted enzymes and tested their functions in the lab. Many of the reconstructed enzymes converted precursor molecules into several different cannabinoids, unlike the more specialized modern enzymes.

These experiments enabled the team to directly test evolutionary hypotheses that had previously relied solely on genetic comparisons.

Ancient Enzymes as Biotech Tools

The most immediate implications of the study are for biotechnology rather than evolutionary biology. When the researchers expressed ancient enzymes in microbial systems, they found that the reconstructed enzymes were often easier to use than those found in modern cannabis plants.

“What once seemed evolutionarily ‘unfinished’ turns out to be highly useful,” said Robin van Velzen, who led the study with colleague Cloé Villard. “These ancestral enzymes are more robust and flexible than their descendants, which makes them very attractive starting points for new applications in biotechnology and pharmaceutical research.”

Keep reading

New insight into light-matter thermalization could advance neutral-atom quantum computing

Light and matter can remain at separate temperatures even while interacting with each other for long periods, according to new research that could help scale up an emerging quantum computing approach in which photons and atoms play a central role.

In a theoretical study published in Physical Review Letters, a University at Buffalo-led team reports that interacting photons and atoms don’t always rapidly reach thermal equilibrium as expected.

Thermalization is the process by which interacting particles exchange energy until they settle at the same temperature, and it typically happens quickly when trapped light repeatedly interacts with matter. Under the right circumstances, however, physicists found that photons and atoms can instead settle at different—and in some cases opposite—temperatures for extended periods.

Implications for quantum computing

These so-called prethermal states are fleeting on human timescales, but they can last long enough to matter for neutral-atom quantum computers, which rely on interactions between photons and atoms to store and process information.

“Thermal equilibrium alters quantum properties, effectively erasing the very information those properties represent in a quantum computer,” says the study’s lead author, Jamir Marino, Ph.D., assistant professor of physics in the UB College of Arts and Sciences. “So delaying thermal equilibrium between photons and atoms—even for a matter of milliseconds—offers a temporal window to preserve and process useful quantum behavior.”

All quantum computers store and process information using qubits—the most basic units of quantum information and analogous to the binary bits used in classical computers. While classical bits can exist either as a 1 or a 0, qubits have the ability to exist in a superposition of two states at once, allowing for exponentially more complex calculations.
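The gap between classical bits and qubits can be made concrete with a little arithmetic: a classical n-bit register holds one of 2^n values at a time, whereas describing a general n-qubit state requires 2^n complex amplitudes whose squared magnitudes sum to 1. A short illustrative sketch (not drawn from the study itself):

```python
import math

# Describing an n-qubit state takes 2**n amplitudes,
# so the description grows exponentially with n.
for n in (1, 10, 50):
    print(n, 2**n)  # 1 qubit -> 2 amplitudes; 50 qubits -> ~1.1e15

# A single qubit in an equal superposition of |0> and |1>:
amp = 1 / math.sqrt(2)
state = [amp, amp]

# Squared magnitudes are probabilities and must sum to 1.
total = sum(a * a for a in state)
print(total)  # 1, up to floating-point rounding
```

This exponential growth in the description is part of why preserving quantum behavior, even for milliseconds, matters so much for computation.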

Keep reading

Superconductivity Breakthrough Brings Practical Use Closer than Ever, as Team Unveils “Hidden Magnetic Order in the Pseudogap”

In the quest for room-temperature superconductivity, an international team of physicists has uncovered a link between magnetism and the mysterious phase of matter known as the pseudogap, which may finally yield clues to achieving superconductivity above frigid, artificial temperatures.

Because current superconducting technologies rely on artificially cold temperatures, making their use impractical for many applications, the search for new room-temperature superconducting materials is a major goal of applied physics research.

Now, physicists from the Max Planck Institute of Quantum Optics in Germany and the Center for Computational Quantum Physics (CCQ) at the Simons Foundation’s Flatiron Institute in New York City may be bringing scientists closer than ever to superconductivity at practical temperatures, as reported in a recent paper published in the Proceedings of the National Academy of Sciences.

Superconductors

Superconductors are materials that allow electrical current to flow without resistance. However, even in superconducting materials, the property only becomes active below a threshold temperature. This limits technological applications, as the materials require bulky cooling apparatus to maintain the desired temperatures, which are well below typical room temperatures.

Despite the volume of research involving superconductivity, in many ways it remains poorly understood, awaiting insights that will enable the next generation of quantum computing and other applications.

Some superconductors operate at what are considered “high temperatures,” although, in practical terms, these are still well below typical room temperatures, even if far above the near-absolute-zero conditions conventional superconductors require. What is interesting about those materials, however, is that they tend to exhibit a “pseudogap state” in which electrons begin to behave strangely as they transition to a superconducting state.

Understanding how this state leads to superconductivity could be essential to revealing the mechanisms at play and then applying them to produce room-temperature superconductors.

Testing the Pseudogap

Advancing toward resolving this long-standing issue, researchers used a quantum simulator operating slightly above absolute zero to monitor electron spins. They found that the up or down spins of electrons were influenced by their neighbors in a universal pattern.

At the center of the team’s work was the Fermi-Hubbard model, which describes electron interactions in a solid. Rather than studying a real-world material, the team recreated this model using lithium atoms held in an optical lattice of laser light, at temperatures on the order of billionths of a degree above absolute zero. The simulations gave the researchers a level of precise control that would be impossible in real-world experiments.

When a material hosts its full, undisturbed complement of electrons, their spins align in an alternating up-down pattern called antiferromagnetism. Through a process called “doping,” electrons can be removed, disrupting the magnetic order in a way that physicists had long assumed was permanent. Yet in the new observations, the team discovered a hidden layer of organization beneath the seeming chaos at very low temperatures.

Keep reading

Physicists Have Achieved Quantum “Alchemy” by Exciting Electrons to High-Energy States

A promising—and powerful—new engineering breakthrough could soon enable researchers to alter the properties of materials by exciting electrons to higher-than-normal energy levels.

In physics, Floquet engineering involves changes in the properties of a quantum material induced by a driving force, such as high-powered light. The resulting effect causes the material’s behavior to change, introducing novel quantum states with properties that do not occur under normal conditions.

Given its promising applications, Floquet engineering has remained of interest to researchers for many years. Now, a team of scientists from the Okinawa Institute of Science and Technology (OIST) and Stanford University says they have developed a new method for achieving Floquet physics that is more efficient than past methods that rely on light.

21st Century Alchemy?

Professor Keshav Dani, a researcher with OIST’s Femtosecond Spectroscopy Unit, said in a statement announcing the breakthrough that the team’s new approach leverages what are known as excitons, which have proven far more powerful in coupling with quantum materials than existing methods “due to the strong Coulomb interaction, particularly in 2D materials.”

Because of this, Dani says, excitons “can thus achieve strong Floquet effects while avoiding the challenges posed by light.” The team says this offers a novel means of exploring various applications, which include “exotic future quantum devices and materials that Floquet engineering promises.”

Such phenomena could enable materials science applications almost akin to alchemy: the idea of creating new materials simply by shining light on them sounds more like science fiction than even the most advanced 21st-century engineering.

Floquet Engineering

In the past, Floquet effects have remained elusive in the lab, although investigations over the years have demonstrated their promise, provided they can be achieved under practical conditions. However, a major limiting factor has been reliance on intense light as the primary driving force, which can also lead to damage or even vaporization of the materials, thereby limiting useful results.

Normally, Floquet engineering focuses on achieving such effects under quantum conditions that challenge our usual expectations of time and space. When researchers employ semiconductors or similar crystalline materials as a medium, electrons behave in accordance with what one of these dimensions—space—will allow. This is because of the distribution of atoms, which confines electron movement and thereby limits their energy levels.

Such conditions represent just one “periodic” condition that electrons are subjected to. However, if a powerful light is shone on the crystal at a certain frequency, it represents an additional periodic drive, albeit now in the dimension of time. The resulting rhythmic interaction between light (i.e., photons) and electrons leads to additional changes in their energy.

By controlling the frequency and intensity of the light used as this secondary periodic force, electrons can be made to exhibit unique behaviors, which also cause changes in the material they inhabit for the time during which they remain excited.

Keep reading

We Were Told There Is No Scientific Evidence for UFOs. Our Research Says Otherwise

Two months ago, the documentary The Age of Disclosure premiered in theaters and on Amazon Prime Video.

In the film, 34 government officials, including Secretary of State Marco Rubio and senior members of Congress from both parties, reveal what they are able to disclose publicly about unidentified flying objects (UFOs).

Rarely have so many highly credible testimonies been assembled in a single production, which quickly became the most-purchased film on the streaming platform.

We learn not only about UFO sightings, but also about serious allegations of secret government programs studying UFOs, crash-retrieval efforts involving non-human vehicles, and threats directed at whistleblowers.

The implications are enormous: our planet may be visited — or even inhabited — by another intelligent species, far more advanced than ourselves.

The Age of Disclosure has been met with both fascination and skepticism. The skeptics’ central response has been, “Where is the data? Where is the evidence?”

Unsurprisingly, many news outlets have opted for a lighter tone in their coverage, choosing their language carefully to distance themselves from the exotic nature of the claims made in the film.

The topic has long been ridiculed and stigmatized within scientific circles, where engaging with it was considered a near-certain path to career ruin. Media houses and editors often fear publishing pieces that might appear to support such claims, and any articles that do emerge tend to downplay their significance.

But is there truly a serious lack of evidence for UFOs, as skeptics have insisted since the 1950s?

For the past several years, my colleagues and I have analyzed “transients,” intriguing astronomical phenomena which change in brightness – or disappear entirely – over short periods of time.

Our research has zeroed in on hundreds of thousands of bright, star-like short flashes of light, recorded in photographic surveys of the night sky. Importantly, these astronomical observations are from the years before the Soviet Union launched the first man-made satellite, Sputnik, in 1957.

In two papers published recently in respected, peer-reviewed scientific journals, we make a compelling case that at least some of these bright flashes are reflections of the Sun off objects of unknown, but non-natural, origin.

We also find a statistically significant correlation among these bright flashes, historical eyewitness UFO reports, and above ground nuclear tests that were being conducted at that time. Unsurprisingly, our work has garnered significant attention from our scientific colleagues.

Keep reading

Researchers Successfully Reverse Alzheimer’s in Mice: Peer-Reviewed Study

Scientists have reversed Alzheimer’s disease in mice, potentially showing a pathway to treat the illness among humans, according to a Dec. 22 peer-reviewed study published in the Cell Reports Medicine journal.

Alzheimer’s is traditionally considered irreversible. In the study, researchers treated two groups of mice with P7C3-A20, a pharmacologic agent. One group carried human mutations related to amyloid processing, while the other carried a tau protein mutation. Amyloid and tau pathologies are two major early events of Alzheimer’s.

Researchers say that because these mice develop brain pathologies resembling Alzheimer’s, they are ideal subjects for testing how P7C3-A20 might affect Alzheimer’s in humans.

Among the amyloid mice, treatment with P7C3-A20 was found to restore the proper balance of nicotinamide adenine dinucleotide (NAD+), a cellular energy molecule whose decline is a major driver of Alzheimer’s disease. As people age, NAD+ levels fall throughout the body, including the brain. Without proper NAD+ balance, cells are unable to execute critical processes necessary for proper functioning.

The treatment was found to have reversed blood-brain barrier deterioration, DNA damage, oxidative stress, and neuroinflammation, researchers wrote. The blood-brain barrier maintains nutrient and hormone levels in the brain while protecting the organ from toxins and pathogens.

The treatment enhanced synaptic plasticity and hippocampal neurogenesis, a process in which new functional neurons are generated.

Keep reading

Tinker, tailor, publisher, spy: how Robert Maxwell created the academic peer review system

Publication of research results, theoretical propositions and scholarly essays is not a free-for-all. As shown by the dogmatism around climate change and Covid-19, sceptics struggle to get papers in print. The gate-keeper is the peer-review system, which people take for granted as a screening process to ensure rigour in scientific literature.

But it has not always been that way. Until at least the 1950s, the decision to publish was made by the editors of academic journals, who were typically eminent professors in their field.

Peer review, by contrast, entails the editor sending an anonymised manuscript to independent reviewers, and although the editor makes a final decision, the reviews indicate whether the submission should be accepted, revised or rejected. This may seem fair and objective, but in reality peer review has become a means of knowledge control – and as we argue here, perhaps that was always the purpose.

You may be surprised to learn that the instigator of peer review was the media tycoon Robert Maxwell. In 1951, at the age of 28, the Czech émigré purchased three-quarters of Butterworth Press for about half a million pounds in today’s money. He renamed it Pergamon Press, with its core business in science, technology and medicine (STM) journals, all of which adopted peer review.

According to Myer Kutz (2019), ‘Maxwell, justifiably, was one of the key figures — if not the key figure — in the rise of the commercial STM journal publishing business in the years after World War II’.

Maxwell’s company stole a march on other publishers and its influence was huge. By 1959 Pergamon was publishing 40 journals, surging to 150 by 1965. By 1996, one million peer-reviewed articles had been published. Yet despite the increase in outlets, opportunities for writers with analyses or arguments contrary to the prevailing narrative are limited.

Maxwell was instrumental to peer review becoming a regime to reinforce prevailing doctrines and power.

Back in 1940, Maxwell was a penniless 16-year-old of Jewish background who had left his native land for refuge in Britain. His linguistic talents brought him to the attention of the British intelligence services. On an assignment in Paris in 1944 he met his Huguenot wife, Elisabeth. After the war ended in 1945, he spent two years in occupied Germany with the Foreign Office as head of the press section.

Four years later, with no lucrative activity to his name, this young man found the money to buy an established British publishing house. According to Craig Whitney (New York Times, 1991), Maxwell made Pergamon a thriving business with ‘a bank loan and money borrowed from his wife’s family and from relatives in America’.

But how was he able to acquire Butterworth Press in the first place? A clue is given by a BBC video clip (2022) on Maxwell’s links to intelligence networks. While operating as a KGB agent in Berlin, he presented himself to MI6 as having ‘established connections with leading scientists all over the world’. According to investigative journalist Tom Bower, ‘unbelievably what he really wanted was for MI6 to finance him to start a publishing company’.

This point is corroborated by Desmond Bristow, a former MI6 officer, who states that Maxwell asked the secret intelligence service to finance his venture. Seven years after launching Pergamon Press, Maxwell moved into Headington Hill Hall, a 53-room mansion in Oxford, which he leased from Oxford City Council.

Keep reading

Bermuda Mystery Surfaces with Discovery of Massive Underground Structure, Revealing a New Deep-Earth Anomaly

A new seismic analysis has revealed an unusually thick structure beneath Bermuda, a geological oddity that defies conventional models and may rewrite scientists’ understanding of how the island chain emerged.

The unusual feature consists of a 12.4-mile-thick layer of rock beneath the crust, lodged within the tectonic plate beneath Bermuda in a region where ordinary mantle rock is typically found. Scientists have never detected such a thick layer under similar tectonic conditions.

Bermuda Mystery

The 181-island chain of Bermuda has long puzzled geologists. The oceanic crust beneath the islands sits at a higher elevation than the surrounding seafloor due to a mysterious swell. Typically, volcanic activity would account for such uplift, yet geologists believe the region hasn’t experienced an eruption in 31 million years—a discrepancy that has fueled decades of speculation.

The newly discovered structure may help resolve that puzzle. Despite the extreme age of Bermuda’s last known eruption, the massive rock layer suggests that ancient volcanic activity could have injected a significant volume of mantle material into the crust. That slab now appears to be pushing the ocean floor upward by nearly 1,700 feet relative to nearby areas.

Similar mantle quirks may explain the formation of other islands worldwide. At certain locations known as mantle hotspots, rising plumes of hot material generate volcanic activity that builds islands from below—Hawaii being a prime example. In most cases, however, the crust eventually moves away from the hotspot, causing the uplift to subside over time.

Bermuda’s uplift, persisting for more than 31 million years, defies that pattern. What exactly is occurring beneath the island remains the subject of active debate.

Imaging the Bermuda Rock

The team behind the discovery, spread across multiple U.S. institutions, including Yale and Smith College, reported their findings in a new paper in Geophysical Research Letters. They relied on seismic data to make their discovery, drawing from a seismic station located on Bermuda, which collected the data by observing large earthquakes occurring at great distances from the island. 

These observations allowed scientists to image the Earth below Bermuda to a depth of 31 miles. Changes in the seismic signal as the tremors reached Bermuda enabled the team to identify the anomalous rock layer, whose differing density altered the seismic waves.

Earlier research on Bermuda’s geology revealed that the archipelago’s ancient lava was low in silica, indicating that it was produced from high-carbon rock. Further analysis of the material’s zinc content revealed that the lava originated deep in the mantle. Geologists believe that the rock originally entered the mantle during the formation of the Pangea supercontinent some 900 to 300 million years ago.

Keep reading