Editor’s Note: Megan L. Ranney, MD MPH, is an emergency physician and dean of Yale School of Public Health. Katelyn Jetelina, MPH PhD, is an epidemiologist, advisor to the US Centers for Disease Control and Prevention, and author of the newsletter Your Local Epidemiologist. The views expressed in this commentary are their own. Read more opinion on CNN.
A physician friend recently shared something shocking. Her patient had been putting off a much-needed colonoscopy, despite a strong family history of colon cancer. The reason? The patient had heard online a false rumor that doctors were surreptitiously administering Covid-19 vaccines while patients were sedated — and she didn’t want to get the jab. No amount of discussion about why this would not happen could convince her.
We are seeing and hearing these types of stories across America. Rates of childhood vaccination against dangerous diseases like diphtheria and tetanus have fallen in recent years — and non-medical exemptions from routine kindergarten vaccinations are rising. Climate change skepticism is growing. Reproductive health misinformation has been named the next “infodemic.”
This illustrates a big problem: a growing mistrust of science, medicine and public health.
This week, a new survey from Pew Research Center substantiated our experience: It reported a decline in the public’s trust in scientists, and a decrease even in the belief that science has a positive impact on society, between 2016 and today. This survey was disappointing but not surprising. Pew, KFF, Annenberg and others have warned of this trend for years. The data is, unfortunately, worse for certain sociodemographic groups than others.
We’re worried about what lies ahead if we don’t turn the ship around on trust in science. Imagine if there’s another pandemic, a climate disaster or a biosecurity hazard like anthrax — and we haven’t fixed this trust problem. Think about what might happen if we, as a country, stop funding the scientific innovation that has allowed our dramatic gains in human health. And what if more Americans refuse to get their kids vaccinated, or if our public health institutions are shut down?
It’s a scary, dystopian vision of the future.
But luckily, it doesn’t have to be that way.
We can point to a small silver lining in the Pew data. Trust in scientists, physicians and science itself — among all demographic groups — is still higher than Americans’ trust in, say, elected officials or journalists. We can lean into this by better training health professionals and scientists — doctors, nurses, EMTs, as well as public health workers, pharmacists and researchers — in best practices for communication. We can teach how to make science tangible, with real stories, told in plain language, of what it means and how it improves human lives.
But the fix for the trust problem has to be broader than just scientists and health-care providers. Medical professionals are already overburdened and facing staggering levels of workplace violence. The public health workforce has declined by half over the past two years, according to research from Harvard T.H. Chan School of Public Health. The wave of mistruths is louder, bigger and easier to find than honest, transparent messages. Scientists and health-care providers alone are insufficient.
Here, there’s more promising news. Different groups trust different messengers. Teachers, businesses and family can be incredibly powerful. In fact, during the pandemic, over 20% of unvaccinated people changed their minds about vaccination because of conversations with family and friends. For gun owners, the military and law enforcement may be more trusted advisers than physicians on issues of personal safety. Cultural congruence matters, too, particularly for groups that have experienced discrimination.
These organic networks can be used for good — regardless of demographics, political party, education level or state — when we collectively take the time to listen to, respect and empower our communities.
During the pandemic, for example, networks of non-health professionals were developed, maintained and funded to disseminate evidence-based information on Covid-19 vaccines — ranging from a state-wide initiative in West Virginia, to hyperlocal initiatives like IHeard in St. Louis, Missouri, to community response teams in Marin County, California. These networks could be adapted to communicate about other public health problems, like opioids, too. Data-sharing efforts like Connecticut’s DataHaven show how urban neighborhoods can be involved in gathering and analyzing health data. Communication partnerships like Yale’s Climate Change Communication project also show promise. We can all learn to understand and share the latest news in engaging and inclusive ways.
Social media can play a part, too. Work by the National Academy of Medicine, in collaboration with the Council for Medical Specialty Societies and the World Health Organization (with which Dr. Ranney was involved), outlined ways for social media companies to identify and amplify “credible health messengers” — both the professionals and the everyday folks who are volunteering their time to create content. We applaud companies like YouTube that have made this work a priority, and hope that more companies will follow.
By combining facts with stories, we can share tangible examples of how science and public health protect us, thereby increasing trust. It’s not just vaccines and therapies, but also clean water, clean air, anti-lock brakes, smoke-free zones, over-the-counter pregnancy tests and MRI machines. Each of these discoveries, policies and technologies helps keep us healthy and safe. And many of us have personal narratives of how and why we’ve been helped.
At the end of the day, if the United States is going to improve trust in science, we have to ensure that we are all public health communicators. Sharing data with the public and building trust with communities is an essential part of science. It’s time for our training and our actions to reflect that.