Amid the ongoing discourse around climate change denial and vaccine distrust, a recent study by the Pew Research Center in Washington, D.C. found that, in general, the American public’s trust in science is actually rising. The majority of respondents believed that scientific research and medical innovations are having a positive impact on society and people’s quality of life. However, the study also found wide disparities in the level of public trust on specific topics, including climate change, genetically modified organisms, and vaccination. Responses were further influenced by the respondent’s ethnicity, gender, age, cultural background, and political ideology. For example, Americans of Hispanic and African descent held less trust in medical professionals and the pharmaceutical industry than Caucasian respondents, for reasons that are complicated and historically rooted. What is shaping public trust in science, and what recent changes in social norms are driving this phenomenon?

Insufficient Promotion of Science News

Based on a 2017 survey by the Pew Research Center, only about one in six Americans actively seek out and consume science news on a weekly basis. The most popular sources are news outlets and social media, followed by documentaries, podcasts, and science museums. Non-scientists rarely read about the latest scientific developments in primary sources, such as peer-reviewed scientific publications. While news outlets can help bridge the knowledge gap between the lay public and scientists, some inaccuracies inevitably arise from the difficulty of translating technical data into lay language. Findings may be reported before they have been rigorously validated; writers may lack training in the relevant scientific fields; and reports may oversimplify or exaggerate results, with potentially harmful consequences. For example, non-evidence-based reports and advertisements about stem cell therapy contributed to a boom in people undergoing stem cell transfers, which has led to financial loss and severe medical emergencies, including death. In turn, these incidents have fed a growing distrust in, and misrepresentation of, scientists and medical professionals.

To address these problems, we need media platforms that report new findings in more accessible, simpler, and more entertaining ways. Some existing examples are magazines like Scientific American, whose reports are simplified but still accurate; TV shows like Last Week Tonight with John Oliver, which presents serious topics in a fun yet informative way; and YouTube channels like Kurzgesagt – In a Nutshell, which effectively translates and visualizes dense scientific topics into concepts a lay audience can easily understand.


The Spread of Fake News

The spread of misinformation and pseudoscience on social networks is another serious issue fueling public distrust in science. Social networks like Facebook and Twitter have made learning and exchanging knowledge easier than ever before, but they have also given fake news and conspiracy theories room to flourish. The daily flood of information makes it incredibly challenging for consumers to distinguish fact from fiction, and the fast pace of the information age leaves little time for critical analysis of the knowledge we consume. Furthermore, online groups and forums foster group polarization, enabling antivaxxers and flat Earthers to congregate and propagate dangerous ideas. At a March 2019 US congressional hearing on vaccination, Ethan Lindenberger, an 18-year-old famous for getting vaccinated against his antivaxxer parents’ wishes, pointed out that social media giants like Facebook and Pinterest were the main culprits feeding false ideas about vaccination to his mother. These companies’ failure to fact-check and curb the spread of false information contributes to a growing belief in pseudoscience and damages the public’s trust in reliable, evidence-based research.


A Reproducibility Crisis and Research Misconduct

It is easy to point the finger at social media and the public for this growing crisis of trust in science, yet we (scientists, medical professionals, and pharmaceutical companies) should also be held accountable. According to a survey conducted by Nature, over 70% of researchers across all fields have tried and failed to reproduce another scientist’s experiments. Recently, we have witnessed the exposure of serious scientific misconduct implicating some of the most prominent scientists in their respective fields. For example, Dr. Piero Anversa’s discovery of alleged heart stem cells was once hailed as revolutionary and brought hope to patients with heart disease. Yet Harvard Medical School and Brigham and Women’s Hospital have accused him of fabricating and falsifying research spanning 31 publications. His malpractice not only wasted millions of dollars in funding and the hard work of numerous researchers, but also deepened the public’s skepticism toward scientists. Similarly, in psychology, one of the most famous and compelling studies, the Stanford Prison Experiment, has been found to be fraudulent: the researchers coached participants on how to behave in their roles as “guards” and “prisoners”. In the forensic sciences, it was recently exposed that research on the accuracy of many widely used techniques, such as bloodstain pattern analysis and bite mark analysis, is lacking. Studies have found that about half of the cases in which people were convicted but later exonerated by DNA testing involved the misapplication of unvalidated techniques. As a scientific community, if we cannot reproduce or trust our colleagues’ work, how can we expect to win the public’s trust?


Biases in Medical Research

Long-standing biases in study design have also entered the awareness of the general public. For decades, basic research and clinical studies have made insufficient effort to address sex, racial, and age differences in study design, data collection, and analysis. In basic science, numerous observations and conclusions have been drawn from studies using animal models of a single sex at a specific age. Clinical trials have long underrepresented women, ethnic minorities, and the elderly. This can have severe consequences, as many factors critical to the development of novel therapeutics, such as disease symptoms, side effects, and dosage, vary substantially across race, sex, and age. For example, for certain drugs, women are more likely than men to experience serious side effects, even when taking the recommended safe dose. In fact, in 2013 the FDA cut the recommended dose of a widely used sleeping pill, zolpidem (Ambien), in half for women after it was discovered that women clear the drug more slowly. Even though scientists and institutions are now increasing the number of sex- and race-specific clinical studies, progress has been limited. First, fear of discrimination has made women of certain ethnicities less willing to participate in clinical trials. Second, fear of exploitation, rooted in historically unethical research conduct such as the infamous Tuskegee experiment, also contributes to distrust and reluctance to participate in experimental clinical studies.

Conclusion

The scientific community relies on the support of the general public, from participation in clinical trials to voter support for research funding. New scientific discoveries and medical innovations require the participation of volunteers from diverse social groups, and public opinion on scientific topics like climate change shapes policy making and decisions on research funding. Since our ultimate goal is to use scientific knowledge to build a better and more equitable society, we cannot turn a blind eye to the ongoing crisis of public trust in science. More thorough and rigorous studies will be needed to identify how to tackle the major causes underlying this crisis.


References

  1. https://www.journalism.org/2017/09/20/science-news-and-information-today/
  2. https://www.nytimes.com/2019/10/04/health/fda-descovy-truvada-hiv.html

Mengdi Guo

Mengdi is a PhD student in the Department of Immunology at the University of Toronto. She is interested in reading, learning, and writing.
