Premature Babies Being Exposed to Massive Quantities of Gender-Bending Chemicals in Intensive-care Wards

Premature babies are being exposed to massive quantities of gender-bending chemicals in intensive-care wards, according to a new study. The findings are particularly shocking because premature babies are even more vulnerable to the effects of such chemicals than full-term babies.

In 2021, the EU brought into force a new regulation limiting the use of certain endocrine-disrupting substances in medical equipment, but this new study reveals these chemicals are still present in medical equipment and that the most vulnerable populations are being exposed to them at worrying levels. DEHP (di-(2-ethylhexyl) phthalate), for example, was widely detected in the new study, despite being a known endocrine-disruptor and probable carcinogen.

Researchers took urine samples from premature babies (neonates) born before 31 weeks gestational age at the Antwerp University Hospital, Belgium. Repeated samples were taken over a period of ten weeks after birth or until the subjects were discharged from the neonatal intensive care unit (NICU).

Analysis of the samples showed that almost all urine samples contained metabolites of harmful endocrine-disrupting chemicals including phthalates and other plasticizers. These chemicals have been linked to a wide variety of reproductive and health harms, from genital malformation, gender dysphoria and reduced fertility, to obesity and some forms of cancer.

Professor Shanna Swan, a reproductive-health expert at the Icahn School of Medicine at Mount Sinai, has made endocrine-disruptors like phthalates central to her explanation of the global fertility crisis, which, if current trends in sperm counts continue, could see mankind unable to reproduce by natural means within decades.

Keep reading

Study confirms the rotation of Earth’s inner core has slowed

University of Southern California scientists have proven that the Earth’s inner core is backtracking—slowing down—in relation to the planet’s surface, as shown in new research published in Nature.

Movement of the inner core has been debated by the scientific community for two decades, with some research indicating that the inner core rotates faster than the planet’s surface. The new USC study provides unambiguous evidence that the inner core began to decrease its speed around 2010, moving slower than the Earth’s surface.

“When I first saw the seismograms that hinted at this change, I was stumped,” said John Vidale, Dean’s Professor of Earth Sciences at the USC Dornsife College of Letters, Arts and Sciences. “But when we found two dozen more observations signaling the same pattern, the result was inescapable. The inner core had slowed down for the first time in many decades. Other scientists have recently argued for similar and different models, but our latest study provides the most convincing resolution.”

Keep reading

New concrete can turn homes into giant batteries

A new type of energy-storing concrete holds the potential to transform entire homes into giant batteries and supercharge the transition towards renewables, according to its creators.

Researchers at Massachusetts Institute of Technology (MIT) discovered that adding a highly conductive substance called carbon black to a water and cement mixture created a construction material that could also serve as a supercapacitor.

Supercapacitors can charge and discharge extremely quickly but are typically not capable of storing energy for long periods of time. So while they lack the energy density of traditional lithium-ion batteries – which are found in everything from smartphones to electric cars – they are a useful method of storing excess electricity generated from renewable energy sources like solar and wind.

Since first unveiling the technology last year, the team has now built a working proof-of-concept concrete battery, the BBC reported. The MIT researchers are now hoping to build a 45-cubic-metre (1,590-cubic-feet) version capable of meeting the energy needs of a residential home.

Keep reading

20-Year-Old Puzzle Solved: Physicists Reveal the “Three-Dimensional Vortex” of Zero-Dimensional Ferroelectrics

A KAIST-led research team has successfully demonstrated the internal three-dimensional polarization distribution in ferroelectric nanoparticles, paving the way for advanced memory devices capable of storing over 10,000 times more data than current technologies.

Materials that remain magnetized independently, without needing an external magnetic field, are known as ferromagnets. Similarly, ferroelectrics can maintain a polarized state on their own, without any external electric field, serving as the electrical equivalent to ferromagnets.

It is well known that ferromagnets lose their magnetic properties when reduced below a certain nanoscale size. What happens when ferroelectrics are similarly made extremely small in all directions (i.e., into a zero-dimensional structure such as a nanoparticle) has long been a matter of controversy.

The research team led by Dr. Yongsoo Yang from the Department of Physics at KAIST has, for the first time, experimentally clarified the three-dimensional, vortex-shaped polarization distribution inside ferroelectric nanoparticles through international collaborative research with POSTECH, SNU, KBSI, LBNL, and the University of Arkansas.

About 20 years ago, Prof. Laurent Bellaiche (currently at the University of Arkansas) and his colleagues theoretically predicted that a unique form of polarization distribution, arranged in a toroidal vortex shape, could occur inside ferroelectric nanodots. They also suggested that if this vortex distribution could be properly controlled, it could be applied to ultra-high-density memory devices with capacities over 10,000 times greater than existing ones. However, experimental confirmation had not been achieved, owing to the difficulty of measuring the three-dimensional polarization distribution within ferroelectric nanostructures.

Keep reading

Thermoelectric Effect Seen in Liquids for the First Time

Based on physics first observed over 200 years ago, thermoelectric devices can convert thermal energy into electrical energy and vice versa. But in all that time, thermoelectric phenomena had never been observed in an all-liquid system. That is, until researchers recently observed thermoelectricity at the interface between two liquid metals.

It’s an important observation: Liquid thermoelectrics could be used to create new devices for scavenging energy from waste heat, and insights from the research could help improve the design of liquid-metal batteries. The researchers, based at the École Normale Supérieure (ENS) in Paris, published their results today in the journal Proceedings of the National Academy of Sciences.

“Studying the thermoelectric effect at an interface between two liquid metals is one of those ideas that’s so intuitive and elegant that it seems obvious in retrospect,” says Douglas Kelley, a mechanical engineer at the University of Rochester. “But to my knowledge, nobody has done it before,” adds Kelley, who was not involved with the research.

Christophe Gissinger, a physicist at ENS, studies the basic physics of liquid metals and their applications in batteries. He says scientists know almost nothing about how temperature gradients affect the flow of electrical currents in conductive liquids. Gissinger says it occurred to him that the conductive layers in liquid-metal batteries were similar to thermoelectric devices. So he decided to look for thermoelectricity in liquid metals.

Gissinger and his colleagues chose two metals that are liquid at room temperature: gallium and mercury. The experiments were done in a cylinder with refrigerated walls. In the center of the cylinder, the researchers placed a smaller cylindrical heater. The researchers poured dense liquid mercury into the outer cylinder, then topped it with a layer of lighter liquid gallium. They heated the liquids from the interior, and cooled the cylinder’s outer walls, creating a temperature gradient along the interface between the two metals. Wires dipping into the liquid metals measured the resulting electric fields.

Keep reading

How Anthony Fauci Weaponized Science Against America

One of the biggest problems in our government is that people are promoted based on their loyalty (and sociopathy) rather than their competency (and integrity). In turn, the leadership of the federal bureaucracy tends to be infested with individuals who habitually cover up the crimes of the government and who are eager to sell out America to corporate interests—rather than the best minds our country has to offer, who could genuinely move America forward.

Because of this, the employees of the federal government (e.g., our scientists) are often trapped in a position where they want to do the right thing but can't, because a president appointed a boss for them whose only qualification was lifelong loyalty to the corporations that funded the president's election. RFK Jr., for example, has shared that this was what he learned from repeatedly suing the federal government; likewise, numerous CDC employees signed a letter attesting to this; and more recently, when the GAO conducted an investigation, it found that employees in the federal agencies responsible for the pandemic response reported seeing political interference prevent scientifically correct policies from being enacted within their agencies.

In my eyes, this reality is a product of the fact that America is a superpower in decline, one which has amassed such an excess of wealth, power, and propaganda that its government no longer has to produce results to stay in power (e.g., consider how pivotal institutions in the country, such as the current presidential administration, are now filling their ranks with grossly unqualified individuals who meet DEI metrics and are then not held to account for these individuals' horrendous failures).

One of the individuals who best embodies the dysfunctional foundation of our government is Anthony Fauci. Despite being grossly incompetent, Fauci has amassed an unprecedented degree of power over the decades because of his unwavering commitment to the pharmaceutical industry and to covering up the (scientific) crimes of the American government.
Note: prominent figures such as Rand Paul and RFK Jr. have compared Fauci to J. Edgar Hoover, as both were career bureaucrats who amassed an unprecedented degree of power in the government and then leveraged it to force everyone else to go along with their crimes. In Fauci's case, this is particularly unfortunate because he corrupted our national scientific apparatus and transformed it into something that hurt rather than helped the people of America.

Throughout his career, Fauci has done heinous things to the people of America, but by and large, he has escaped accountability for those actions. Yesterday, however, this changed: Fauci was forced to testify in front of Congress, where our leaders formally accused him of his crimes against America. I wrote this article because much of what they accused him of has previously been alleged by many others (including prominent Democrats who are still in Congress).

Keep reading

OPTICAL ENGINEERS INVENT ULTRA-THIN COATING THAT TURNS ORDINARY GLASSES INTO HIGH-EFFICIENCY NIGHT VISION GOGGLES

A team of scientists has created an ultra-thin coating that can provide high-efficiency night vision to any glass surface, including ordinary reading glasses. Designed using something called a non-local metasurface, the plastic-wrap-thin coating also lets through all of the visible light, allowing users to see perfectly during daytime or at night.

Previous efforts to create a night vision coating using a non-local metasurface have shown limited success, suffering from severely restricted image quality. The inventors of this newest coating say they have broken through that barrier, producing a high-definition visible light image whose infrared-to-visible up-conversion rates could rival those of the bulkier, more complex night vision systems already in commercial use today.

“People have said that high-efficiency up-conversion of infrared to visible is impossible because of the amount of information not collected due to the angular loss that is inherent in non-local metasurfaces,” explained Laura Valencia Molina, a researcher from the Australian National University’s ARC Centre of Excellence for Transformative Meta-Optical Systems (TMOS). “We overcome these limitations and experimentally demonstrate high-efficiency image up-conversion.”

Keep reading

Researchers Dumbfounded by High Excess Mortality ‘Despite Vaccines’ — Study

A study published Monday, which analyzed statistics from ‘Our World in Data’, indicated that despite Covid vaccines being hailed as a ‘savior’, excess deaths worldwide still remain high.

“Excess mortality has remained high in the Western World for three consecutive years, despite the implementation of containment measures and COVID-19 vaccines. This raises serious concerns. Government leaders and policymakers need to thoroughly investigate underlying causes of persistent excess mortality,” the study said in the ‘Conclusion’ section.

Interestingly, Monday’s study, which looked at the entire global population, is not the first piece of research analyzing mortality rates in recent years.

A Norwegian study published in January had also found an increase in non-Covid-related deaths.

“There was significant excess mortality (number of deaths) in 2021 and 2022 for all causes (3.7% and 14.5%), for cardiovascular diseases (14.3% and 22.0%), and for malignant tumours in 2022 (3.5%). In terms of ASMR, there was excess mortality in 2021 and 2022 for all causes (2.9% and 13.7%), and for cardiovascular diseases (16.0% and 25.8%). ASMR was higher than predicted in 2022 for malignant tumours (2.3%). There were fewer deaths than predicted from respiratory diseases (except COVID-19) in 2020 and 2021,” the study said in the ‘Results’ section.

Keep reading

Badger culls to continue in England despite lack of scientific evidence

Badger cull licences have been issued by the government despite its own scientific adviser saying there is “no justification” for doing so.

Leaked documents seen by the Guardian show the Department for Environment, Food and Rural Affairs this month issued 17 new licences to continue culling badgers, overruling Dr Peter Brotherton, the director of science at Natural England, the government’s adviser for the natural environment in England.

Badgers are culled to the point of local extinction because they spread bovine tuberculosis (bTB) to cattle, and the disease can wipe out entire herds. Last year, figures released by Defra revealed more than 210,000 badgers had been killed since the cull began in 2013. However, scientific reports have shown that killing badgers is not the most effective way to end the disease.

Brotherton told Defra that while in previous years a cull could be justified, “based on the evidence, I can find no justification for authorising further supplementary badger culls in 2024 for the purpose of preventing the spread of disease and recommend against doing so”.

Defra officials said that in response they were pushing ahead with the cull because farmers who were most affected by bTB would lose confidence in the government if it was ended abruptly.

Sally Randall, Defra’s director general for biosecurity, food and trade, said in a letter to Natural England: “Those most affected by the disease must have confidence in both the process and the trajectory. Changes need to be carefully timed and communicated, whilst balancing a range of potentially opposing views. Any abrupt changes to policy would seriously undermine our ability to engage constructively with the industry on future disease control interventions.”

Brotherton said the badger population was likely to remain low for at least seven years, during which time vaccinations could be deployed to stop the spread of the disease.

He told Defra: “The balance of evidence has shifted. In my opinion it is now clear that badger vaccination can provide an effective alternative to [culls].”

Keep reading

Scientific integrity and U.S. “Billion Dollar Disasters”

Today, npj Natural Hazards, a journal in the Nature family of journals, officially published my new paper, “Scientific Integrity and U.S. ‘Billion Dollar Disasters’.”

The paper shows — irrefutably in my view — that the “billion dollar disaster” tabulation of the National Oceanic and Atmospheric Administration (NOAA) fails to meet the agency’s standards for information quality and scientific integrity.

For reasons I describe in detail in the paper, the “billion dollar disaster” tabulation is not suitable as a “database” (scare quotes — it is not data by any standard) for the detection and attribution of trends in extreme weather. Similarly, the tabulation is not suitable for identifying the consequences of changes or variability in climate on the costs of disasters. The dataset has been widely misused in science, by the media, and in policy.

It is, in a word, misinformation.

Here is how the paper starts:

In the late 1990s, the U.S. National Oceanic and Atmospheric Administration (NOAA) began publishing a tally of weather and climate disasters that each resulted in more than $1 billion in damage, noting that the time series had become “one of our more popular web pages”. Originally, the data was reported in current-year U.S. dollars. In 2011, following criticism that the dataset was misleading, NOAA modified its methods to adjust historical losses to constant-year dollars by accounting for inflation.

By 2023, the billion dollar disaster time series had become a fixture in NOAA’s public outreach, was highlighted by the U.S. government’s Global Change Research Program (USGCRP) as a “climate change indicator,” and was cited as evidence in support of a “key message” of the Fifth U.S. National Climate Assessment showing that “extreme events are becoming more frequent and severe.” The time series is often cited in policy settings, including in federal agencies, in Congress, and by the U.S. President, as evidence that human-caused climate change is increasing the frequency and intensity of extreme weather events and the associated economic damage. In addition to being widely cited in justifications of policy, as of March 2024, NOAA’s billion dollar dataset has been cited in almost 1,000 articles according to Google Scholar.

NOAA’s “billion dollar disaster” tabulation began as a simplistic but clever way to market NOAA and to attract the attention of reporters with a clickbaity listicle. At some point along the way, the “billion dollar disaster” list was somehow transformed into “data”: it came to be used in peer-reviewed research, featured by the U.S. National Climate Assessment as an official indicator of human-caused climate change, and used by the administration of President Joe Biden to justify a wide range of regulations and policy.

It is a remarkable story of how science can get off track and how misinformation can exist in plain sight, just like the emperor’s new clothes.

Keep reading