Join Emmy award-winning journalist, poet, physician, and author Dr Seema Yasmin as she provides up-to-date insights on communication issues facing the medical profession. Amid the sheer volume of information out there, misinformation and disinformation circulate alongside accurate information, and sorting it all out can be daunting. Tune in to learn how to communicate effectively with patients and colleagues, support accurate information, and avoid misinformation and disinformation, especially during the COVID-19 pandemic.
To help improve the quality of its educational content and meet applicable education accreditation requirements, the content provider will receive record of your participation and responses to this activity.
Stanford Medicine offers CME on a variety of topics that is evidence-based, references best practices supported by scientific literature and guidelines, and is free of commercial bias.
Dr Ruth Adewuya: Hello, you're listening to Stanford Medcast, Stanford CME's podcast where we bring you insights from the world's leading physicians and scientists. If you're new here, consider subscribing to listen to more free episodes coming your way. I am your host, Dr Ruth Adewuya. This episode is part of the COVID-19 mini-series addressing up-to-date insights on COVID-19. In today's conversation, I'm joined by Dr Seema Yasmin. Dr Seema Yasmin is an Emmy award-winning journalist, poet, medical doctor, and author. She trained in journalism at the University of Toronto and in medicine at the University of Cambridge. She was a finalist for the Pulitzer Prize in Breaking News in 2017 with a team from the Dallas Morning News, and a recipient of an Emmy for her reporting on neglected diseases. Seema, thank you for chatting with me today.
Dr Seema Yasmin: Thanks so much for having me, Ruth.
RA: You have a unique experience in medicine, epidemiology, and journalism, so I think it's very timely that we are having this discussion. As you know, over the course of March 2020, the everyday life of most people around the globe has changed from normal to extraordinary. And with this crisis, there has been a massive flow of information about COVID-19: local and national news coverage, as well as statements by political leaders, kings, religious leaders, and clinicians. So to me, it's been clear that communication is, and has been, a key factor during this COVID-19 pandemic. I want to start off our conversation by asking you about your thoughts on the evolving role of physicians around communication, especially during a public health crisis.
SY: So as you said, the information ecosystem is so important, right? There's not just a pandemic, but there's an infodemic happening at the same time. And yet we really relegate communication to the lowest ranks. When we think about our scientific training, the way that we design medical education, communications is a soft science. It's like a thing that you're either inherently good at or maybe you can be trained a little bit, but we don't focus on it so much as an intervention. And actually, it's been studied for decades. You know, the stuff that I teach about storytelling even, or how to engineer messages so that they are efficient, how to have conversations about very polarized topics, I'm drawing on evidence from sometimes the '60s, the '80s, the '90s, the early 2000s.
There are scholars doing this work. It's just that we've labeled all of this a soft science and something that's not as important as, for example, teaching a resident how to do a lumbar puncture. And yet we see, time and time again, and there's evidence for all of this, how often failures in life in general, and in science and medicine especially, come down to communication failures. And we're seeing now with the pandemic that the role of the physician in all of this isn't just as the person who is diagnosing COVID-19 or treating COVID-19. It's being the person at the front lines of the pandemic and the infodemic. And especially when I look at the situation back in my home country of the UK, it's very divided right now, with some of the top doctors in the country saying the government is doing too much in terms of interventions and lockdowns, and then other high profile, just as well-trained, doctors saying the government's not doing enough.
And of course, all of this is highlighting the role of communication, the role of the physician, and I hope it's teaching us that we need to have better training on information communication, because we're often communicating these life and death issues and doing it in the context of not just an infodemic, but a misinfodemic. So there's not just an information overload in general of accurate information. There is that, but then there's all that information that's actually inaccurate, and we can talk about definitions, but it can be misinformation or disinformation. And our role, I believe, is also helping the public navigate that.
RA: You mentioned the term infodemic and I think personally, actually I think that's the first time I've heard that, and I think it's in line with my next question around this massive flow of health information that is happening. And I did hear a talk that you did before around the concept of information disorder and the different categories. Could you elaborate? What do you mean when you say information disorder? What are the components of that?
SY: Sure. I've given these talks for many years now, because none of this is new, Ruth. I think it's new to a lot of people, right? But these are not new phenomena. So whether you're thinking about headlines that are misleading, you know, we've had yellow journalism, as we refer to it, since the 1800s, where you have misleading journalism, misleading headlines that try to politicize and weaponize information. We've had false news spreading. And as someone who trained as a journalist and belongs to that profession, as well as being a physician, a profession of journalism that's been so attacked by the president and by the highest office in the country, I very deliberately do not use the term fake news, unless I'm specifically talking about the way that that term has been used to demean journalists and to hurt journalists.
But when it comes down to these definitions, I think they're really important because, on the one hand, we don't want to say fake news because that's what the president says, and it's helpful, maybe the science part of me as well finds it really helpful to have specific language for things. So misinformation is false information that's spread without the intent to cause harm. That could be your neighbors saying, “Hey, I heard that if you gargle with salt water, that even if you're exposed to somebody with COVID-19, you won't get sick,” right? They're not doing that, I hope, to hurt you. They're your neighbor. They care about you and they think they're spreading something accurate, but it's not. It's false. Then you have disinformation, which is false information that is spread with the intention of causing harm.
Could be individual harm. It could be harm to a democracy. It could be harm on a society level. Then you have mal-information, which is actually accurate information that should be kept in the private domain, and it's made public with the intention of causing harm. Collectively, we refer to these as information disorder, and probably some of the talks that you've heard me give are about the parallels between information disorder and disease spread. Because, you know, I went to medical school, but then I trained as an epidemiologist and served as an officer in the Epidemic Intelligence Service at the CDC, where my singular focus, every time I got sent into the field to investigate an epidemic, was stop the spread of a pathogen. And yet, what I saw every single time in the hot zone was, wait, it's not just a virus that's spreading. It's not just a fungus that's spreading. It's not just a bacteria.
It's anxiety, it's emotion, there's emotional contagion and there's information contagion. Sometimes it helps; people are spreading information and they're helping to get my case definition out there. Other times it's really harmful. It's preventing me from doing my job. I can't find my index patient or my patient zero because of these rumors, these medical myths and health hoaxes that are spreading. So information disorder: these are terms that have come from First Draft News, based at Harvard. Those are scholars of information and journalism. For me though, as an epidemiologist and someone who studies infectious disease, I find it very helpful to think about information disorder, because I think there are many medical parallels and metaphors. The parallels between how a virus spreads and how information spreads are so many. So in some of my training, I worked at UCLA with Dr Sally Blower, looking at mathematical modeling.
How can you model, and I think this has become much more part of the vernacular now because we're all living through this pandemic. But mathematical modeling was a tool I learned to think about, well, there's this epidemic happening. If we apply intervention A versus intervention B, how might things differ? So long story short, I learned some basics of SIR models and how you plug data and assumptions into these models. And then when I later went to journalism school and started studying information and comms, I was blown away that people who were studying information contagion were using the same mathematical models that I used as a disease modeler. It's the same, and even when you think about super spreading events, you can do that for a disease, but you can also look at how you have super spreaders of misinformation or super spreaders of disinformation.
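To make the parallel Dr Yasmin describes concrete, here is a minimal sketch of a classic SIR (Susceptible-Infected-Recovered) compartmental model. It applies equally well to a pathogen or to a rumor, where "infected" means actively spreading the rumor and "recovered" means having lost interest. The parameter values, function names, and numbers below are illustrative assumptions, not anything discussed in the episode.

```python
# Minimal SIR model, integrated with simple Euler steps.
# s, i, r are population fractions; beta is the transmission rate,
# gamma is the recovery (or loss-of-interest) rate. Illustrative values only.

def sir_step(s, i, r, beta, gamma, dt):
    """Advance the SIR fractions by one time step of size dt."""
    new_infections = beta * s * i * dt   # contact-driven spread
    new_recoveries = gamma * i * dt      # removal from the spreading pool
    return (s - new_infections,
            i + new_infections - new_recoveries,
            r + new_recoveries)

def run_sir(s0=0.99, i0=0.01, r0=0.0, beta=0.3, gamma=0.1, dt=0.1, steps=1000):
    """Run the model and return the final state plus the epidemic peak."""
    s, i, r = s0, i0, r0
    peak_i = i
    for _ in range(steps):
        s, i, r = sir_step(s, i, r, beta, gamma, dt)
        peak_i = max(peak_i, i)
    return s, i, r, peak_i

if __name__ == "__main__":
    s, i, r, peak = run_sir()
    print(f"final susceptible fraction: {s:.3f}, peak infected fraction: {peak:.3f}")
```

Comparing intervention A versus intervention B, as described above, amounts to rerunning the model with different parameters, e.g. a lower `beta` to represent distancing (for a virus) or prebunking (for a rumor).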
Whether that's word of mouth, somebody who has a big platform in the community, or whether it's someone who has loads of Twitter followers, or has loads of Pinterest followers, you know, you see these very similar metaphors. Then it comes full circle in that we have a theory called inoculation theory, which again draws this metaphor between medicine and information spread. This goes back to a social psychologist, William McGuire, in the '60s at Harvard, who said, "You can immunize people against rumors in the same way that we do in medicine." Usually, and I'm oversimplifying here, we give you a weakened dose, a weakened version of the thing we're trying to protect you from. And McGuire in the '60s was saying, you can give someone a weakened version of a rumor in the hopes of immunizing them against it when they are then exposed to that rumor on a bigger scale.
So I'm really simplifying, condensing it here, but this is what I studied. This is what I find so interesting. So yeah, this pandemic's happening and all my worlds are colliding. And I think a lot of us feel that we've been talking about this for years. Yeah, so there is that frustration, but maybe the silver lining will be that we don't repeat our mistakes and that the status of communication will be elevated in the sciences and in medicine.
RA: I think it's very interesting. I can understand why, you know, just your worlds colliding in this pandemic must be an interesting phenomenon just for you. But I'm curious about the concept that you said about misinformation spreading so quickly and immunizing people, like giving them a weakened version of that rumor, because I would agree with you that, from the outside looking in, it seems that misinformation seems to spread faster than accurate information, right? Is that correct, or do I just feel that?
SY: No. There are studies that back up your sentiment, Ruth. Just one from 2018, from a group at MIT, found that false information travels faster and farther than accurate information. So that totally backs up what you're saying and what many of us feel, right? Like, why is it that this piece of BS spread really widely, but us trying to counter it, that didn't seem to take off? What I find more interesting than that depressing takeaway from the study is those scholars' analysis of why. Why is it that false information might travel faster and farther than accurate information? And I enjoy that part of the digging because I think it's really interesting, but also because it gives us some clues as to how we might mitigate the spread of false information. For example, we've learned, and again, this is not a new phenomenon, it's been studied for a decade, that part of the reason false information spreads so widely is because it has the novelty factor.
And there is status involved with being a person who spreads information that appears to be novel and appears to be new, right? As opposed to, oh, someone just texted me this meme that says vaccines don't cause autism, which is accurate information, it's, oh my God, someone said that there's this scientist in London and he's discovered that vaccines not only cause autism, but they also cause Parkinson's disease in older people, right? And you're like, what? I've never heard that before. That's so new. Oh my gosh. That should trigger lots of red flags, but that same novelty is what helps these things spread, along with the status that's linked to being the person who spreads something that is false. That's just one of many factors involved. But I find those interesting, like I said, because they give us some pointers as to how we might mitigate the spread of the BS.
RA: Yeah, absolutely. And then you mentioned also the fact that this has been studied for years now, and learning from it. So this is obviously not our first public health crisis, and so it's been studied. What can we learn from how information was handled in prior public health crises? And are you seeing that we're learning from it, or have we learned from prior public health crises? And my secondary question would be, what can we learn from how we are handling information right now?
SY: I don't think that we have learned, and I think this has been very detrimental to public health and to the pandemic response. The anti-vaccine movements are not new. The anti-science movements are not new. The flat earth movements, you know, who knew those would be back? But those aren't new either. What we have learned from studying those movements and the responses from medical and public health institutions is that we have a lot of hubris, and we believe that because we are armed with the facts and the best evidence, that that's going to counter the false information. And it doesn't. It doesn't work. So why do we keep rolling out the same dry bullet points and the same boring leaflets when those who are spreading the novel information, you know, the misinformation, the disinformation, are doing it in really sophisticated ways, ways that trigger emotion, which is how humans respond and how we learn and how we put down memories too?
They're doing it in ways that are just really compelling. They tell good stories. They'll have a parent who will say that, you know, she's convinced her child was injured by a vaccine and that her child has developmental delay and will never be the same again. You don't counter that kind of viral YouTube clip with a pamphlet that says vaccines don't cause autism, and yet that's what we do time and time again. There's so much evidence that that's what we do in public health and medicine. It doesn't work and we should stop repeating it.
RA: I was going to ask you this later, but I think now seems to be a good time. What should we be doing?
SY: So the first place to start is we should not lump people together and assume that they are all coming to the same broadly anti-scientific or anti-climate change or anti-vaccine sentiment from the same history or from the same context. So I know it's really frustrating for many of us who believe that vaccines are one of the best public health interventions ever, or one of the best human inventions even ever, right? We're like, they work, they work. And so when we hear about people not getting vaccinated or not getting their kids vaccinated, we start referring to them as anti-vaxxers, we talk about anti-vaccine movements and anti-vaccine campaigns. That's fine, except that kind of language and that kind of mentality and approach falls flat when you're thinking about how to communicate with those groups, because they are groups within groups. Anti-vaxxers, if you want to use that term, and some say we shouldn't, they're not a monolith.
Among them, for example, are Black Americans, who have a very recent history and a current experience of dismissal and exploitation and unethical experimentation by the medical establishment. And so the messaging that you might use, the kind of messaging you might employ to communicate with people of that background, and again, you don't want to make assumptions, I'm kind of generalizing here to make the point, might be different from that for a white person who has a different history of interaction with the medical establishment and might be coming to that decision of I'm not going to vaccinate my kid from a very different context. But what we do is we have one-size-fits-all, broad messaging, and we think that that will help everyone. And it often antagonizes, and it often causes a backfire effect and causes people to double down. And again, it's been studied.
So we see where this falls flat, and yet, it's a bit frustrating that we just keep rolling out the same, oh, you think vaccines are unsafe? No, they're not, and here are 16 studies of 10 million data points that demonstrate the safety of vaccines. And it's like, yeah, those studies have been around for at least 25, 30, 35 years or longer. People aren't responding to that. And so before you even have that, I'm the physician, I'm the health care professional, I know the facts and I know what you need to hear, you actually have to do something that's really, really difficult, especially for doctors, which is you have to shut up. When we do this training in ACES, you know, at Stanford, the Advancing Communication Excellence, we ask people, like, on average, how long do you think a doctor lets a patient speak for before they interrupt?
And people will say, like, a minute, 3 minutes, and the studies show on average a doctor, specifically a doctor, not so much nurses or anyone else, lets patients speak for 11 seconds before interrupting them. Yep. And when you ask people, why do you think we do that? And I'm saying this as a physician myself who probably talks too much, right? It's because, oh, we're so worried about the garrulous patient. In fact, I learned the word garrulous in medical school, in a clinical exam, where one of the stations was: how would you deal with a patient who talks too much? And I had to ask what garrulous means so I knew how to deal with the patient encounter. But that's my association with that word: oh, it's a patient that talks too much. I've learned this from having these conversations with physicians especially: oh, but if I don't interrupt, they'll talk for so long.
And you say, okay, how long do you think the patient will talk for if you let them? And they're like, 5 minutes. That's a really long time, and actually there are studies that show, when you let patients talk, they don't go on for that long. But actually, your clinical encounter becomes a lot more efficient and has better outcomes when you let the patient tell you what's going on. And this applies to these frustrating conversations, where we're frustrated before the conversation has barely started. I have in my fridge a vaccine that I can tell you can save your child's life. Why would you say no to this life-saving intervention? But if I'm not going to listen to you and find out where you're coming from, or what your sixth cousin on your mother's side said to you about her toddler who got vaccinated, we're just going to have a really frustrating conversation where neither of us has our needs met.
It's about shutting up, the doctors especially, listening and building empathy and finding consensus. Because the thing that we can agree on, and I'm just using an example of a parent that's bought in their kid, is we care about the kid. The parent really cares. They want to be a good parent. You know, most of the time that's the case. And we care about the kid too. There's your consensus. Start with that. That is the thing that everyone can agree on. Have empathy, listen, and then work from there. As opposed to I have the facts, I'm going to use a message that I use for all of my patients, but not all your patients are the same. So treat them as individuals.
RA: You bring up a good point with that physician-patient interaction, and I wonder, layering on COVID and the pandemic and how health officials have come out with guidelines for the public, when you talk about, you know, shutting up, building empathy, and creating consensus, do you see an opportunity when we talk about the social distancing guidelines that a lot of people have flouted, or some people have flouted, I should say, and the mask guidelines? How would you apply those three concepts in the pandemic situation?
SY: It's a tough one, because what's happening is that there's a lot of blame ascribed to people who are seen, for example, not physical distancing, not wearing masks. But when you think about the leadership and the people often constructing, or at least disseminating, these messages not modeling that ideal behavior, it makes it very difficult for other physicians and scientists to have that communication. But I think it comes back to, I think, the trickiest thing, which is having compassion and empathy and understanding and listening. Again, you may not want to, because masks work. We have the evidence. Except if that's where your conversation is starting with somebody who does not believe that, you're not going to have a successful outcome from that conversation. And we know this. Again, it's been studied in social psychology, it's been studied in comms: where you have a polarized environment, where you have a polarized conversation, sprinkling facts onto that conversation is like pouring kerosene onto a fire.
It does not help you find consensus. It does not help you reach any agreement. So it comes back to the listening. And again, I'm saying this in the context of so many of us who follow the guidelines, who've made personal sacrifices being like, but why do I even bother? I just saw people having a party, you know, near a pool I walked past. I wouldn't engage with those folks to have a conversation in that setting. So again, that comes into it too, but where we do have the opportunities to educate, I think we have to come back to those basics, which are very often dismissed. And again, because we're not often taught that in medical school or we're not taught that in scientific training. We're certainly not taught it enough. We are taught a very, I guess, patriarchal, paternalistic way of medicine, which is, you are the doctor.
You hold the knowledge, you will disseminate this knowledge, and people better listen to you. Except the world doesn't work like that. People listen to their Facebook timeline. People listen to the WhatsApp messages in their family group chat. And they may spend a lot less time with their doctor, and actually may not even be willing to begin that conversation with their doctor about vaccines or about masks or about whatever, because they already have assumptions about our reaction. And often that plays out to be true, because we believe that we practice evidence-based medicine. We want to lead with that evidence. Of course we do, but if you're not trained and you're not taught how to communicate that science, then it all goes belly up. And I think we've seen that play out over the last few months, because we've had to not only teach people about COVID as we learned, but we've had to try and teach health and science literacy at the same time.
And we haven't peeled back that curtain well enough before. So what it looks like to many of the public that I've been speaking to is, well, one doctor said this, but then I read a report where these other doctors and scientists said the opposite. So if they can't agree... And it's like, yes, because that's actually how science works. Science is not just facts. Science is a process of forming hypotheses, testing hypotheses, and then coming to conclusions. But I don't think we've done a good job of teaching people that, so then it becomes really confusing in the context of a life-threatening pandemic that's really scary, that's affecting your kid's schooling, affecting your job prospects, affecting your 401k. We're trying to play catch-up on something that we should have been doing for decades.
RA: I want to switch gears from our discussion around misinformation and information disorders to a discussion about accurate information. Because I imagine that if accurate information is not shared correctly, that would have a negative impact as well, probably equal to, or maybe not equal to, but you can tell me, that of misinformation. What are your thoughts on that concept?
SY: I find this the most interesting, because when I teach case studies of this, it's quite obvious, like, oh, this was a piece of disinformation. It was disseminated widely. Look at the chaos it caused. And there's lots to be learned from that. There's lots to be learned from misinformation and mal-information too. But I think the most fascinating examples are where accurate information has had really dire unintended consequences. I'll give you two really quick examples. One is during the Ebola crisis of 2014 to 2016, when CDC and others went into West Africa. I went to Liberia to report on this. I was a full-time journalist at the time, and I spoke to people who said, well, of course, I didn't go to an Ebola treatment unit. And I said, why didn't you go, because you had the symptoms? And they said, because those American doctors, they came and they said, Ebola is real.
It causes these symptoms, and if you have it, you need to come to us, and there's no treatment, no vaccine, and no cure for Ebola. At the time that was accurate. We have a vaccine now. And so that was accurate information that the CDC shared. Of course, there was no vaccine at the time, there was no cure, no treatment, and the consequence was that people decided, well, why would I go to you then? I'm going to go to a traditional healer who says they can do something for me. And so people went to those healers, and that fueled new clusters and new outbreaks. Another really quick example is from about 20 years ago, when thimerosal was removed from a lot of childhood vaccines in the US.
That backfired spectacularly because of the way medical institutions, including FDA and CDC and AAP, communicated why thimerosal was being removed. I almost have memorized the statement because it's so interesting, but it's a very, very confusing statement. And by the end of it, you're left thinking, wait, so you're saying this is safe, but you're removing it from vaccines anyway. And now you're saying they're safer, but if it was safe, so what happened? Why? It's so confusing. What happened from that, and again, this has been studied so we have the data, is that there were anti-vaccine groups that formed off the basis of that action. And there was a backfire effect in hospitals where, even though physicians were given guidance about Hepatitis B vaccine, they still, some physicians, I think one in 10 hospitals or something, stopped giving it to newborns and we saw children die.
And that comes down to a communication failure. And it wasn't that they were saying anything actually inaccurate when you read it, because it had been vetted by the top scientists, the same as the messaging the CDC was sharing in Liberia about the treatments, or the lack of treatments, for Ebola. It was accurate, but it had deathly consequences.
RA: One of the last questions I have for you is how can clinicians ensure that they are sharing accurate information and in a way that has positive impact? I know you talked about kind of those three things that I have written down, which is build empathy, shut up and listen, you know, and also kind of create consensus, but around accurate information, do you have any tools or strategies to ensure that's happening?
SY: We have to think really carefully about what we mean by accurate information too, and how that information was created. And when I think about this, again, I'm thinking about the historical context of medicine and of science. The fact that there are gynecological procedures that are done to this day that were invented by people who abused enslaved Black women, right? That's our history of Western medicine. And that might sound so esoteric, so far removed from modern-day problems around misinformation and disinformation, but it's completely intertwined. And I think it plays into the fact that even our accurate information has these bloody roots. You know, some of the reason that we know the natural history and the signs and symptoms of untreated syphilis and late-stage syphilis is because American doctors purposefully infected people with syphilis and denied them treatment. That's how we got the pictures in our textbooks. That's how we got the information in our textbooks.
And again, I know to some people, this is just so frustrating. I get these email complaints all the time whenever I talk about this, people tell me, medicine is neutral. That science by design is neutral. That's complete codswallop. None of it is neutral. I am not neutral. You are not neutral. There is bias inherent in all of this. If we're not really learning about the history of our profession and our craft, and we're not atoning for some of that, I don't care whether you were involved or not, I wasn't, we still need to atone for it. We're not going to build those bridges between us and the communities that we serve. We're just going to fail over and over again, whether it's at debunking misinformation and disinformation, whether it's just building relationships with our patients and getting across the most basic information and making sure that our patients adhere to treatment programs and that they have good outcomes.
We're just going to keep failing if we don't atone for the history, learn the history, and really have some, what's the opposite of hubris? Humility. Have some humility about who we are, our role in society. We are not the doctors of the Victorian era who prescribed opium and everyone loved us and listened to everything that we did, because people don't just get their information from us and from legitimate sites. They're getting that information from Facebook, from Twitter. I wrote a New York Times op-ed recently, which was about health care providers believing misinformation and disinformation and saying, but I saw it on Facebook, so it must be true. COVID is a hoax, or there is a cure for COVID. So that's where the humility comes in too. Yes, we need to learn those tools and techniques for having these conversations with our patients, but don't you ever think that you are fully resilient to this stuff, because we can be vulnerable too.
RA: I appreciate your insights on this topic. Thank you so much for stopping by.
SY: Some food for thought. Thank you so much for having me, Ruth.
RA: Thanks for tuning in. This podcast was brought to you by Stanford CME. To claim CME for listening to this episode, click on the Claim CME button below, or visit medcast.stanford.edu. Check back for new episodes by subscribing to Stanford Medcast wherever you listen to podcasts.
All Rights Reserved. The content of this activity is protected by U.S. and International copyright laws. Reproduction and distribution of its content without written permission of its creator(s) is prohibited.
AMA CME Accreditation Information
Accreditation: The Stanford University School of Medicine is accredited by the Accreditation Council for Continuing Medical Education (ACCME) to provide continuing medical education for physicians.
Credit Designation Statement: The Stanford University School of Medicine designates this Enduring Material for a maximum of 0.50 AMA PRA Category 1 Credit(s)™. Physicians should claim only the credit commensurate with the extent of their participation in the activity.
Financial Support Disclosure: This CME Activity is supported in part by educational grants from Novartis.
The Stanford University School of Medicine adheres to ACCME Criteria, Standards and Policies regarding industry support of continuing medical education.
There are no relevant financial relationships with ACCME-defined commercial interests for anyone who was in control of the content of this activity.
Ruth Adewuya, MD
Associate Director, Education Development
Seema Yasmin, MD