Opinion
Social media is the new public health frontline. Let’s treat it that way.
The results of the 2024 presidential election have ushered in a new era of uncertainty for public health. With Donald J. Trump soon back in the White House and his choice of Robert F. Kennedy Jr.—notorious for backing some public health conspiracy theories—as a key figure in the health sector, the stakes are immense.
Kennedy’s history of spreading disinformation about vaccines threatens to undermine decades of scientific progress and public trust. As the nation starts to grapple with the implications of this seismic shift, the role of accurate, evidence-based communication that actually reaches people has become more urgent than ever.
In this new landscape, social media creators have emerged as the frontline of public health communication. Often trusted more than traditional institutions, these creators wield significant influence over how health information is disseminated and understood, not only by mass audiences but also by the hardest-to-reach populations. Yet many creators have told me they lack the tools and training to verify and translate health information and instead rely on quick internet searches, which can inadvertently spread inaccurate content. This lack of accessible science creates a void that both unintentional misinformation and deliberate disinformation readily fill. Equipping creators to combat mis- and disinformation is no longer optional. It’s essential.
Health misinformation disproportionately affects youth, people of color, and low-income communities, who often rely on social media for accessible health information. A recent study from the Centers for Disease Control and Prevention revealed that misinformation has significantly decreased vaccination rates in some communities, contributing to the resurgence of preventable diseases like measles and whooping cough.
Additionally, social media users are more likely to encounter and believe health misinformation. In a 2023 poll, a majority of people who use social media for health advice reported hearing and believing at least one false claim about COVID-19 or vaccines, compared with only four in ten of those who don’t rely on social media for health advice. These vulnerable groups often lack the resources to verify the information they see online. Even well-intentioned creators—those not spreading deliberate disinformation—struggle to simplify complex, jargon-heavy science for their audiences. How can we empower them to share accurate, impactful health messages?
While tools like commercial AI can summarize content and fact-checking services can identify false claims, they often fall short of offering creators audience-specific, evidence-based material that’s ready for sharing. The need for a more effective solution has spurred new approaches to simplifying complex research: AI tools that summarize and organize academic papers have cropped up in the past year.
An organization I founded, Science to People, is working on a tool called VeriSci that uses AI to transform peer-reviewed health studies into usable content, with a language model specifically fine-tuned to best practices in science communication. This fall, YouTube announced it is working on something similar for creators in its partner programs. With social media companies’ commitment to fact-checking and removing misinformation and disinformation now waning, the need for independent, publicly accessible tools that offer scientific information in digestible language is clearer than ever.
The demand for accessible and trustworthy health information is clear. For example, a recent experiment demonstrated the power of providing accurate messaging on mental health on TikTok, where videos tagged #mentalhealth have drawn more than 44 billion views. The researchers offered influencers digital toolkits that contained evidence-based mental health content in everyday language across several topics. They found that the creators who received the toolkits were significantly more likely to include mental health content supported by research in their videos. The impact was substantial: in the treatment groups, TikTok videos featuring the provided content attracted more than half a million additional views after the intervention. A follow-up study of the comments on these videos showed that viewers had improved mental health literacy.
With support, creators could expand this impact across many health topics, reaching millions with accurate, culturally relevant information. Imagine a mental health advocate sharing evidence-based strategies to manage anxiety, or a sexual health educator presenting reliable birth control options tailored to their audience. Providing creators with science-driven information has the potential to improve health literacy and make a measurable difference in underserved communities.
As we enter this new era, it’s time to recognize that the frontline of public health has shifted to social media, where creators are leading the charge in sharing health information. Supporting these creators with innovative, research-backed resources is essential for combating misinformation and protecting public well-being.
These digital communicators have become our new public health allies. Empowering them with the right tools can make a significant difference in reaching diverse and often underserved audiences.
Written by Brinleigh Murphy-Reuter. This article originally appeared in Harvard Public Health magazine.