For colleagues working in public health, education, or clinical care: this compilation grew out of the time I’ve spent speaking about health misinformation and the digital information environment in undergraduate and graduate classrooms. Over the last few years, I’ve collected specific examples that consistently spark interest and debate among young people. These cases offer a rare opportunity to discuss digital ecologies in a way that is genuinely relatable, reflecting the messy, high-stakes ways young people experience digital conversations and relationships every day.
A central part of this conversation is what Theresa (Terri) Senft calls Digital Influence Literacy. In her article Who Cares How Information Feels? A Call for Digital Influence Literacy, she argues that we need to look beyond just “facts” and focus on how information feels. She defines this literacy as our capacity to recognize, analyze, and emotionally regulate the feelings (moods, sentiments, or “vibes”) that platforms generate and monetize. For young people, health information isn’t just factual data; it is an emotional experience that moves through their digital ecosphere. Senft also suggests concrete additions to educators’ curricula.
Critical health literacy is now a vital skill for navigating this landscape. It goes beyond finding the “right” information to understanding the power dynamics and technical designs that shape our choices. If you are working in this space, there is a timely open call for papers at Health Promotion International that I encourage you to check out: Critical Health Literacy Call for Papers.
To guide any discussion with young people, I suggest using these five lenses. They help move the conversation from “what information is wrong?” to “how is this information affecting us?”:
- contrast “commercial” vs. “caring”: analyze whether the source is offering genuine support or monetizing insecurity. Look for the revenue models behind the “advice,” such as affiliate links for supplements, paid “masterclasses,” or ad-driven engagement. Is this person providing a service, or are they monetizing your insecurity?
- focus on the “why” (the empathy filter): people rarely follow misinformation because they are “uneducated.” Shift the focus from what is wrong to why it feels right. People often turn to these spaces to solve real-world problems like chronic stress, isolation, or a desire for bodily autonomy. What “void” was this information filling for the person in the story? Did it offer a sense of control, a community, or a simpler answer to a complex health problem?
- the “relatability” test: different platforms have different “vibes” that affect our skepticism. Discuss how the medium affects the message. Does a private WhatsApp note feel more “truthful” because it’s a secret? Does a TikTok “vibe” make extreme advice feel like a shared lifestyle choice? Which of these digital spaces (TikTok, WhatsApp, AI chat) do you trust most? Have you ever seen health advice there that made you feel your own life was “unhealthy” or “impure” compared to the aesthetic on screen?
- the incentive trap: in the digital economy, “engagement” is a currency. Even if someone isn’t selling a pill, they may be “selling” a radical opinion to gain followers. Does the platform’s algorithm reward this person for being accurate, or for being shocking? Is the goal to help you, or to keep you scrolling?
- the authority shift: there is a growing gap between “institutional authority” (the doctor) and “relatable authority” (the person who looks like you). Examine why a “real person” on a screen often feels more trustworthy than a doctor in a lab coat. What is the medical establishment missing (time, validation, or shared lived experience) that influencers are successfully providing?
I’ve begun organizing my collection of 20+ real-world cases drawn from long-form journalism into thematic “sidecar” articles to make them easier to use in practice. You can find the initial deep dives with facilitator notes here, covering:
- the private trust gap: why we believe a voice note from a cousin on WhatsApp more than an official infographic from the WHO.
- the optimization trap: exploring the manosphere, “bigorexia,” and how digital masculinity is being medicalized through unregulated supplements.
- the aesthetic-to-filler pipeline: from “Sephora Kids” to “Cortisol Face,” investigating how digital filters transform normal biology into a “flaw” that requires a purchase.
- algorithmic vortexes: when the feed identifies vulnerability before a clinician does, pushing users into loops of self-harm or eating disorders.
- the ai mirror: how sycophantic chatbots mirror our worst impulses and the high risks of forming emotional bonds with statistical models.
Appendix: Practitioner’s Toolkit & Resources
The following tools and readings are gathered from the original discussion framework and updated research to help bridge the gap between theory and practice in health communication.
Commercial vs. caring resources
- Quackwatch: A massive, long-running database that investigates health fraud and “quackery.” It helps you verify if a specific supplement, “alternative” therapy, or influencer claim has a history of misleading marketing or regulatory warnings. Explore Quackwatch.
- Use the Meta Ad Library or the TikTok Creative Center. You can search for a specific influencer or brand to see if they are running paid ads for the “miracle cure” they are talking about. It’s a reality check to see if their “caring advice” is actually a paid marketing campaign.
- The SIFT Method (Stop, Investigate the source, Find better coverage, Trace claims) by Mike Caulfield is a simple four-step guide that is widely used in schools to help people spot commercial motives.
- Further reading/watching: Check out the documentary The Social Dilemma (Netflix) for a deep dive into how “caring” interfaces are designed to maximize profit.
- The “De-influencing” Trend (2024): Search TikTok or Instagram for “#deinfluencing” + a health product. This creator-led movement specifically calls out over-hyped health products and reveals how sponsorship “caring” language is crafted.
Empathy & “the why” resources
- The News Literacy Project’s “Brain Gains” planner is a downloadable tool you can save, print, and share, built around weekly “exercises” students can do to strengthen their ability to spot health and science misinformation and identify credible sources.
- MisinfoRx: A resource specifically for health communicators and clinicians to help navigate difficult conversations about misinformation. It focuses on strategies to understand a person’s underlying values and build trust through collaborative partnership rather than just correcting facts. Explore MisinfoRx.
- The game “Cranky Uncle” is a free app designed by scientists. It uses cartoons to teach you the “techniques” of science denial (like fake experts or logical fallacies) in a way that feels like a puzzle rather than a lecture. https://crankyunclevaccine.org/
- Harmony Square: A 10-minute game where you purposefully sow division in a quiet neighborhood to learn how “us vs. them” narratives are rewarded by algorithms. Harmonysquare.game
- The MLN Resource Library: a collection of resources and tools, for classroom, home, community, or constituent settings, to help you advocate for media literacy education.
Relatability & aesthetics resources
- Use Google Lens on a phone: take a photo of a “miracle ingredient” or a “perfect aesthetic” post, and it will show you where else that image exists. Often, you’ll find it’s a stock photo or comes from a different context entirely.
- How to Spot a Deepfake: https://detectfakes.kellogg.northwestern.edu/ .
- Practice “Lateral Reading”: instead of staying on a person’s TikTok profile to see if they look “real,” open a new tab and search “[Name] + controversy” or “[Name] + credentials.” This takes 30 seconds and breaks the “vibe” of the app.
- Inside TikTok’s Algorithm (WSJ): A visual investigation using “automated bots” to show how the platform identifies a user’s vulnerabilities and pushes them into rabbit holes.
- Zombies, Run! (2026 Narrative Update): This is, in my view, the most fun example of “Epic Meaning” gamification. Instead of tracking calories (which can trigger orthorexia), it uses an immersive audio drama to make exercise a “mission” for survival. It shifts the user’s identity from a “consumer” to a “hero.” zombiesrungame.com
- Tular Nalar MIL Boardgame (Indonesia): This is a physical and digital board game released by Mafindo in late 2025. It gamifies the “vibe” of social media, forcing players to choose between “Clout” (getting likes for shocking info) and “Truth” (keeping their community safe). It’s perfect for youth workshops. mafindo.or.id/tular-nalar
- Sure and Share Center (Thailand): A YouTube-based organization that uses a “Checklist” format for cultural health claims (like traditional herbal “cures”). They focus on how these claims are packaged in “relatable” Thai cultural aesthetics to bypass skepticism. Sure and Share YouTube
Incentive & algorithm resources
- Checkology: An interactive platform for young people that has specific modules on how to tell the difference between “branded content” and actual expertise. Visit Checkology.
- Go Viral! Game: A 5-minute game from the University of Cambridge that lets you play the role of a misinformation creator to see how fear and outrage drive clicks. Play Go Viral!.
- The Attention Economy Youth toolkit: A youth-focused toolkit explaining how features like infinite scroll are engineered like slot machines to keep you scrolling. humanetech.com/youth
Authority shift & global resources
- Who’s Really Shaping What You Believe? An interactive tool to examine how diverse your close network and social media feeds really are, and what that means for your information environment.
- Viral Facts Africa (WHO-hosted initiative): A first-of-its-kind initiative by the WHO and a network of African fact-checking organizations. It produces short, visual, and highly shareable “myth-buster” videos and infographics specifically designed for African social media contexts. https://afro.who.int/aira/viral-facts-africa
- Chequeado & LatamChequea Education: Based in Argentina but serving all of Latin America, this network provides “Media and Information Literacy” resources specifically for teenagers that focus on “reflection and action”—a model rooted in Latin American educational philosophy. https://latamchequea.com/en/latamchequea-education/
- BOOM Live’s Teen Fact-Checking Network (India): India’s leading fact-checking outlet runs a dedicated program where teenagers are trained to debunk medical and corporate misinformation. Their YouTube channel is an excellent reference for seeing how young people in Asia are tackling “digital arrests” and AI-generated health scams. https://tfcnboomlive.in/
- Sure and Share Center (Thailand): Thailand’s primary fact-checking hub, based at the Thai News Agency. They produce a “Verify Before Sharing” checklist series on YouTube that specifically addresses cultural health claims, such as traditional herbal “cures” for chronic diseases. https://www.youtube.com/c/SureAndShare/videos
- Fatabyyano (Jordan/Middle East): The leading independent fact-checking platform in the MENA region, Fatabyyano provides Arabic-language resources and verification tools. They are a great reference for how health misinformation is addressed in an Arabic-speaking context, particularly during regional health crises. https://fatabyyano.net/
- Fátima by Aos Fatos (Brazil): An AI-powered WhatsApp chatbot from one of Brazil’s top investigative outlets. Users can send suspicious links or images directly to the bot for instant verification. It’s a gold-standard example of a “native” tool designed to fight misinformation inside encrypted spaces. https://aosfatos.org/fatima/
- Africa Misinformation Portal (AMP): A tool developed by the Africa Infodemic Response Alliance (AIRA) to aggregate and analyze real-time misinformation trends across the continent. While some parts are for professionals, the “Infodemic Trends Reports” are compelling reading for anyone wanting to see the “why” behind health rumors in diverse African communities. https://www.afro.who.int/aira/amp