Johnson & Johnson knew for decades that its baby powder was tainted with carcinogenic asbestos, and it kept that information from regulators and the public. A government-funded study from the mid-1990s found that Johnson’s baby powder caused cancer in rats, and other studies have found an increased risk of cancer in women who used the company’s talc-based products.
What’s more, in 2018 the pharma giant was ordered to pay $4.7 billion to victims who reportedly developed cancer from using Johnson & Johnson’s products. In that case, 22 women alleged that the company’s talc-based products, including its baby powder, contained the known carcinogen asbestos, which caused them to develop cancer. According to reports, there are over 9,000 similar talc lawsuits against the company.
Currently facing several major lawsuits for fueling the opioid crisis in the United States, Johnson & Johnson also has a history of bribing doctors and government officials. More disturbing still, a Reuters investigation found that J&J knowingly sold a baby powder product containing asbestos, which causes mesothelioma.
“In people who suffer from stress-related diseases, this circadian rhythm is completely thrown off, and if the body makes too much or not enough cortisol, that can seriously damage an individual’s health, potentially leading to obesity, cardiovascular disease, depression or burnout.” – Adrian Ionescu, head of the Nanoelectronic Devices Laboratory at the Swiss Federal Institute of Technology Lausanne (EPFL)

While these devices may be helpful in a hospital setting, technology companies fully intend to integrate them into wearable tech like smartwatches, pushing us closer to a world where everything we do is tracked and recorded around the clock.

“The joint R&D team at EPFL and Xsensio reached an important R&D milestone in the detection of the cortisol hormone,” said Xsensio CEO Esmeralda Magally. “Xsensio will make the cortisol sensor a key part of its Lab-on-Skin™ platform to bring stress monitoring to next-gen wearables.”

These microchips are intended to eventually connect to the ‘internet of things,’ a comprehensive array of devices that track and record us at all times, from our homes to our places of work. Former US intelligence chief James Clapper admitted over five years ago that the government ‘might’ use the internet of things to spy on you.
In 2003, terrorism was seen as a more immediate national danger than infectious diseases. Dr. Anthony Fauci’s National Institute of Allergy and Infectious Diseases (NIAID) had just redirected $117 million from infectious-disease research to fund a new anthrax vaccine effort in response to the anthrax attacks that began a week after 9/11.
Those millions were just a small part of the $1.8 billion Fauci had poured into biodefense over the preceding two years; more than half of those funds were devoted to anthrax and smallpox alone. In 2004, Fauci launched the $5.6 billion “Project Bioshield,” the National Institutes of Health’s biggest outlay for a single research issue up to that point.
Some microbiology researchers at the time, however, according to the journal Nature, were concerned that Fauci’s actions would ultimately “distort priorities in infectious-disease research, sucking money away from work to understand and counter natural disease outbreaks that ultimately pose a greater threat to public health.” The 2003 Nature article cited a Stanford University microbiologist saying “that diseases such as influenza and other respiratory-tract infections routinely kill far more people than would die in a bioterrorist attack, and therefore deserve a greater share of the NIAID budget.”
The criticism turned out to be warranted. In 2007, after spending billions under the opposite premise, Fauci admitted that “at the end of the day, you’re not going to kill as many people [with an anthrax attack] as you would if you blasted off a couple of car bombs in Times Square.” His anthrax vaccine effort had failed, having been “sunk by lobbying.”
The anthrax vaccine failure followed on the heels of Fauci’s controversial leadership of the nation’s AIDS response in the 1980s and ‘90s. According to “Good Intentions,” a 1990 book by investigative author and innovation expert Bruce Nussbaum, Fauci started his career as “a lackluster scientist,” who “found his true vocation—empire building” when he took the reins at NIAID in 1984.
To ensure that AIDS would be his exclusive demesne within the federal government, Fauci “started the most important bureaucratic battle in the history of the fight against AIDS,” squeezing out more scientifically competent, but less conniving administrators. According to Nussbaum, if Fauci had not won the battle, “many people who died might have lived.”
The White House has been reaching out to social media companies including Facebook, Twitter and Alphabet Inc’s Google about clamping down on COVID misinformation and getting their help to stop it from going viral, a senior administration official said.
President Joe Biden, who has raced to curb the pandemic since taking office, has made inoculating Americans one of his top priorities, calling the campaign “a wartime effort.” But public fear about taking the vaccine has emerged as a major impediment for the administration.
Since the onset of the pandemic, calls from lawmakers asking the companies to tackle the spread of COVID misinformation on their platforms have grown.
The White House’s direct engagement with the companies to mitigate the challenge has not been previously reported. Biden’s chief of staff Ron Klain has previously said the administration will try to work with Silicon Valley on the issue.