Towards Resilience: A Conversation with Kate Starbird About the Future of Online Political Discourse
Justin Hendrix / Dec 8, 2024

Audio of this conversation is available via your favorite podcast service.
Kate Starbird is a professor in the Department of Human Centered Design & Engineering and director of the Emerging Capacities of Mass Participation Laboratory at the University of Washington, and co-founder of the University of Washington's Center for an Informed Public. Last week, I interviewed her about her team’s ongoing efforts to study online rumors, including during the 2024 US election; the differences between the left and right media ecosystems in the US; and how she believes the research field is changing.
What follows is a lightly edited transcript of the discussion.
Kate Starbird:
Kate Starbird, I'm a professor at the University of Washington in the Department of Human-centered Design and Engineering.
Justin Hendrix:
Kate, I'm talking to you on Monday, December 2nd. It's the end, I'm sure, of a busy season for you of monitoring election rumors in the United States. I want to spend a little time this morning talking about what you learned this cycle, how it has contributed to and changed the way you think about these topics that you study, and perhaps given you some reason to revisit some of your theories about how information moves in the current online environment.
I just want to start with your big picture reflections. If you cast your mind back over this period, I think most people expected there to be slightly more of an interregnum before we would know the result of the 2024 election, but in fact it came swiftly. What do you think we've learned observing how information and rumors flow during this election cycle in the United States?
Kate Starbird:
Yeah, let me couch this a little bit in what I know and what I don't know, because our research was really focused on rumors about election administration, about processes and procedures, when to vote, how to vote, and election results. We were operating sort of in the shadow of similar kinds of things that we were doing in 2020, where that became a very big story, where the big lie developed and led to a contestation of election results and eventually the events of January 6th. And so we did a similar kind of research, live research or real-time research, in 2022 and then again in 2024, just kind of trying to help draw attention to and resolve rumors about when and where to vote, how to vote, those kinds of things. I don't know that we were surprised by anything.
I don't know that we're going to revisit our understanding of those kinds of rumors. There was definitely a sort of low drumbeat of rumors about, "Oh, this name isn't printed right on the ballot" or on election day, "Oh, the seals on these machines were tampered with, so we can't trust the results." At some point on election day, Donald Trump said something was happening in Pennsylvania and it needed to be stopped, and so we did see that happening up to election day, on election day, but it resolved pretty quickly, especially on the political right. The political left did pick up some rumors after the election because they lost in ways that they weren't expecting and that kind of searching for an explanation can lead to conspiracy theorizing and we saw that.
The big sort of difference between right and left in 2024 versus 2020: what we saw on the right in the US, among supporters of Donald Trump, was that they really laid down an infrastructure to spread these rumors, to amplify these rumors, and to potentially mobilize on these rumors. We actually did see, in the lead-up to the election, that they used rumors to mobilize lawsuits to try to contest different parts of the process: when certain ballots should be counted, extending hours in some places, reducing them in other places. We saw that kind of activity. We expected that infrastructure to be used. If Donald Trump had lost, or the results were close and there was uncertainty, we would've expected a lot more amplification of those kinds of rumors on the right after election day. But because Donald Trump won, a lot of that resolved.
On the left, there just wasn't that same kind of infrastructure. People were worried about things on election day and even leading up to election day, they shared rumors of concern about things that they thought they saw that might have been interference, and certainly after the election, a lot of people on the left participated in conspiracy theorizing, but it wasn't picked up by political leaders. It wasn't picked up by mainstream media, or the media that a lot of people on the left turn to that follows journalistic standards. And so it didn't have legs in the same kind of way as we saw on the right in 2020 after Donald Trump lost. So the big story for us was really that the infrastructure and the legitimization of that kind of conspiracy theorizing just didn't happen for the Democrats after they lost in 2024 in the same way it did for pro-Trump Republicans in 2020 after he lost.
Justin Hendrix:
After 2020, and even in advance of the 2020 cycle, I remember you developing this kind of participatory model for how, in some cases, false claims kind of percolate up and also move back down, how they're kind of constantly in this sort of reflexive relationship between individuals, actors, elites, media, the rest of that. Did you observe the same phenomena again this cycle? There's been a lot of hand-wringing, I know, among Democrats about the kind of deficiencies of the media environment that most folks on the Democratic side of the political aisle inhabit. What did you learn about your model and how it works on both left and right?
Kate Starbird:
Yeah, I think you've got two questions here. So let me start with the first one, which was really about the sort of participatory nature of information, disinformation, rumors; we can talk about it using different words. I want to go back to 2016. Coming out of 2016, there was this sort of conversation about disinformation, and the real understanding of disinformation was as top-down, coordinated manipulation by foreign governments or others; there was this idea that disinformation was sort of organized, coordinated, and top-down.
After 2020, our research was looking at things and seeing, this is not top-down coordination. There is disinformation here; I would still characterize, and others have as well, the election denialism in 2020 as a disinformation campaign, but it wasn't simply top-down. You could see Donald Trump and others of his supporters saying that the election was going to be rigged, and yet the individual stories about how the election was rigged, or how they thought the election was rigged, came from audience members, came from just everyday people who went to vote or went to their mailboxes, saw something that they thought was evidence of fraud (usually it wasn't), and shared that, and it got amplified by influencers.
And so there's this kind of cyclical process that we started talking about as participatory disinformation that really has three groups: elites in media and politics, who are sort of setting the frames or the expectations or the theme, setting the drumbeat; audience members or content creators, who can actually generate content to support those kinds of themes or frames; and influencers, who really work, often, by amplifying content that they see that matches the theme of the day, whether that is claims of a rigged election or claims that immigrants are voting or claims that immigrants are eating pets or whatever it is. So you've got these kinds of participatory information flows.
In 2020, I would characterize the big lie as disinformation. 2024, it's so messy. We really talk about it as sort of propaganda, collective sense-making, rumoring, a lot of different kinds of words, but the content is not what's necessarily important here. It's that process of how the news is made, and especially on the right, it is a very participatory model where, yeah, influencers and political elites have influence, have the ability to share their messages, but audiences actually have a lot of influence as well. They can nudge their leaders, they can push them to talk about things, they can create new content, new storylines, and that interaction, that participatory nature, especially on the right, makes their politics and their media very reactive to their audiences. They're giving their audiences what the audiences want, the audiences are telling them what they want, and that can be very exciting. It can be very motivating to be an audience member.
I also think, and some others might too, that these kinds of processes can spin out of control a little bit, because you can pick up grievances or things from the audiences that start to lead into really dark places, and I think there's a worrying trend there as well. You can see that in some influential people who start to get radicalized by their audience, and we can see that online in a lot of cases.
Justin Hendrix:
There's been a lot of talk about the substrate of what some people are still calling an alternative media ecosystem, although I wonder sometimes whether in this country the right-wing media ecosystem is itself, in fact, the most prominent media ecosystem.
Kate Starbird:
There's two media ecosystems, but I don't think one is alternative and one is not. I think it is an established media ecosystem on the right. It just doesn't follow the same logics of journalism or media that we might have considered 15 or more years ago.
Justin Hendrix:
Well, the observation is that the substrate of that ecosystem is less concerned with fact, that there are a lot of false claims that spread, percolate, become essentially the sort of de facto framework in which the politics of that ecosystem operate. I know there's been various polling after the fact about whether people believe certain facts or understand certain facts about either the state of the economy or figures on immigration, and how these related to people's decisions about which candidate to vote for. I don't know. How do you think about that in the context of your study of online rumor? There was this sense after the election, you saw it in certain analyses, that misinformation or disinformation didn't play as much of a role in the outcome this time. And it seems to me that that's true in some acute sense. Nothing happened like what happened in 2020. There wasn't a stop the steal movement. But more fundamentally, false claims and misinformation still were an important substrate that was potentially determinative of the outcome.
Kate Starbird:
It's almost impossible to measure the impact of certain sorts of false claims and other kinds of things. And I think the reflection this time around is that operationalizing the problem as bad facts or misinformation is probably not going to be helpful here. Certainly we can look at and say, that person might've been misled; they're saying they voted because of this, but that doesn't make sense. But it turns out that right now it's not really about facts, it's about stories. I have a PhD student that I'm working with, Steven Prochaska, who has used this frame of deep stories to look at all of our work, starting with the 2020 election denialism and going into similar rumors in 2022. The concept of deep stories comes from Arlie Hochschild, who did her work on conservatives in the United States, and it's this idea that political groups have these animating stories, these stories that represent their hopes and their fears and their values, who they are and what they believe in.
And those stories aren't true. They don't have to be true. In fact, whether or not they're true doesn't even really matter. Parts of them can be true, but it's not about the particular elements of the story, the particular facts within it. It's the story. And so if you think about the idea of a rigged election in 2020 as not a single claim but a story, that means that for any one little single claim, you can go back and say, oh, that's false. And you're like, okay, whatever, I don't care, because I believe the story. And so you start to get these stories of a bad economy, you get these stories about immigration being harmful to the United States. You get these stories, and they're going to be partially true, they're also going to be false in other ways, but it's the story that's so powerful. And I think on the right, they knew how to make these stories into things that motivated people, animated people, got them to vote.
And the left didn't really have a story. They had a defense of facts. They said, "Oh, that's not true. This isn't true," or some other kinds of things, but they had a hard time developing a story to get people to believe in and to motivate them to go vote. Now that's a little bit simplistic in the sense of right versus left, but I would say I think Arlie Hochschild and Steven are correct in this idea that right now it's really not about misinformation, bad facts; it's really about these deep stories that are very powerful. And I do think that the design of these media ecosystems is very effective for building and reinforcing those stories. I've been talking about it as frames in a similar kind of way, but I think it's really effective to think of this as sort of participatory storytelling, deep storytelling with their online audiences, and the right is very good at that right now.
Justin Hendrix:
Can we dig into that just a little bit? What are some of the features of the information ecosystem at the moment that you think sort of favor one set of stories over another, or one way of storytelling over another?
Kate Starbird:
Well, I mean, this is going to be really simplistic, but one of the differences is that, and you could see it if you're on Bluesky, as I have been for a few years, you could see people complaining that the New York Times wasn't telling the story: "The New York Times isn't telling the story. We're mad at the New York Times." The New York Times and other sorts of media that a lot of people on the left have come to rely on are journalistic outlets that have commitments to these balanced kinds of ways of telling their stories, and that's how they tell them. So the left is relying on all this mainstream media that tries to tell these balanced stories and get both sides of a story out there, and we can be critical of that, and I'm happy to go have another conversation about it.
Meanwhile, the right has an explicitly partisan media environment that's highly participatory. They've got all of these different outlets that often restate the same thing in different words. In fact, a lot of times they share each other's content; they don't even worry about it. It's just very different logics that they're operating on. And they're not stuck having to put both sides in the story. They can really just work on developing their story and getting their message out. And that's social media, it's podcasts, it's cable news. It started years ago with AM radio, which was also very reactive and participatory compared to other kinds of media flows. And so I think the right has just, for generations now, been operating in a world where they didn't think that the mainstream media represented them, and so they developed an alternative, and that alternative is really bearing fruit.
Justin Hendrix:
I want to ask you about how you're thinking about your research agenda going forward. As you pointed out earlier, it sort of seems like there's been a kind of consensus forming, as you say, that thinking about some of the problems of the information ecosystem as about misinformation, as about fact or fiction, about falsehood, may not always be the most useful way to think about these phenomena. There's also quite a lot of concern in the "disinformation" research community and the information sciences about the field. Certainly some of that comes from the types of political attacks that the field has been targeted with over the last few years. You yourself have been a prominent target over the last couple of years, but it does seem like, in a way, the field is sort of moving into a different phase. There's a phase shift happening here.
Kate Starbird:
Well, I would say, I mean, it's been in my slide deck since 2017: the language of how to talk about this problem. I would have a slide with 17 different words on there, from information operations, disinformation, misinformation, information disorder, these kinds of things. It was always contested, it was always dynamic, and we're still trying to figure out what the problem is. And the way that misinformation especially was operationalized, and has continued to be operationalized in the research, is not always that helpful. I don't find a lot of that work that insightful, often because it's so reductive in how they have to talk about what the problem is that they're actually not measuring what we've often thought the problem to be.
In our work, if I just look back, starting in 2022 we kind of threw off the word misinformation and went back to rumors, which is what we've been studying since 2013. That opened up the ability to start talking about things in ways where you don't have to get caught up in whether they're true or false or intentional or not; you're really talking about how people make sense of the world around them.
We've also used a collective sense-making framework that we've been building upon in the last 18 months or so to think about how people are collectively trying to make sense of the world and how that's shaped by the structure of our information environment, by our beliefs, identities, other things that are happening in the world. And so we're finding those perspectives and the deep story perspective to be a more effective, or a more productive, way for us to understand what the problem is. I think a lot of us feel like there's a problem. There are people living in separate realities. We've got conspiracy theorists being nominated to run our national intelligence operation in the next administration. We've got conspiracy theorists being nominated to run some of our public health apparatus. And so there's a problem there, but is it just misinformation and bad facts, or is there something else going on?
And so I do think we'll probably see in the coming years a reorientation onto the structure of the information environment, onto some of these kinds of different frameworks for thinking about what's going on there. And this is in the research, maybe not in the public discourse, but I do see the research field moving into that. We've been talking for a couple of years about the structure of online influence and how that's something we should be looking at rather than misinformation, disinformation, and then around these sort of participatory dynamics, which I think are all sort of intertwined. And I think other people coming from a different academic perspective are going to look at that differently. I'm not a psychologist, I'm not a political scientist. I come from the field of human-computer interaction. We look at how people use technology and how that shapes how they interact with each other and all sorts of other things that happen in the world.
And so I'm naturally going to be thinking about the structure of the information ecosystem and how people are using it, but I do think that stepping back from the veracity of certain content... An example here is all of the rhetoric around AI going into 2024: "Oh, deepfakes, deepfakes, deepfakes." I'm like, no, we don't need deepfakes. People are misled without deepfakes. But we did see AI, just not in deepfakes. It was used to create propaganda, propaganda that didn't really have a truth value, like images of Donald Trump running with pets, ostensibly saving them from immigrants, from this horrifying claim that immigrants were harming pets. The AI was used to create not deceptive, true-looking things, but propaganda.
And so really kind of stepping back and thinking more about, like, no, they're here to build stories, to reinforce these stories, to have people thinking certain ways about the world. They don't care whether or not it's true. They care whether or not it resonates with their sense of how the world works and how the world should work and what their identities are. And I think if we don't like the political outcomes of what we're seeing, and one group is using these kinds of logics very well, the other side or other sides need to think about how they can more effectively leverage some of these logics without giving up their values. And it's a challenge. It's going to be a challenge. It's a hard task.
Justin Hendrix:
Is part of that also thinking more about economic and financial incentives? It strikes me that, in the underlying kind of reality of this environment, there's just an enormous amount of money to be made peddling certain types of false claims, certain populist claims, certainly racist or otherwise oppressive material. That seems to be a big part of this. Maybe that goes less explored, less understood, less addressed.
Kate Starbird:
Yeah, I think it's hard to unwind the profit motives from the ways that these technology-enabled pieces of this media ecosystem have developed. So social media, search, other kinds of things. So I mean, I think you're absolutely right. Those systems have developed in an advertising-based paradigm where the content that generated the most interaction spread the furthest, and that often was content that generated outrage; well, all sorts of emotions, but outrage, self-righteousness, making people feel good about themselves and angry about others. That just happens to be what we like as humans often, and we interact with that content more. And I think the platforms were also optimizing on short-term attention; had they optimized on long-term attention, maybe something different would've happened. But if you're optimizing on short-term engagement numbers, it turns out you develop systems, and the systems evolve, to spread content that is outrageous. And it turns out that that outrageous content was often false, among a number of other things that are potentially harmful.
And so I think you're right there. If we want to think of the political overlay on that, it's that certain political actors and parties, and it's not just happening here in the United States, it's happening all over the world, certain parties have embraced that logic, and certain parties have sat back and yelled at that logic and pointed their finger and said, "Oh, that's bad." And it turns out that just sitting back and saying "that's bad" didn't stop the folks who were savvy enough to use those logics to their advantage, and then to have them locked in, so that now saying we should change these logics gets called biased. There's been an effort by those who have gained power through those kinds of toxic attention economies and social media information logics, the people who are effective at using them, to lock them in.
Justin Hendrix:
We've had a lot of pieces on Tech Policy Press about access to data from social media platforms, how difficult that's become, certainly since 2020, and even more since 2022. Of course, the Twitter firehose, which used to be very valuable to you and many other researchers in this space, is not entirely unavailable, but certainly access has changed. What do you make of the environment for getting data these days? And I want to kind of put a sub-question here. You mentioned Bluesky; Bluesky has made much of the fact that it's an open platform. It should be possible to study at a much more granular level of detail. What do you make of data access across the board, and is Bluesky beginning to register for your research efforts?
Kate Starbird:
Yeah, I think there are two things happening simultaneously here. One is that the information space itself is becoming fragmented. There are so many different platforms. If you wanted to study what was going on during, say, a breaking news event 10 years ago, you could just grab a ton of Twitter data and kind of get a pretty good idea of what was happening. You didn't know what was happening everywhere, but it was a pretty good sort of finger on the pulse of what was happening. Now, activity is really more fragmented across any number of things. Part of this is the changes that happened as Twitter became X, but part of it was external to that as well. You've got whole movements into more video-based content, short-form video especially. We've got TikTok, Instagram, and also now a movement, mostly by folks on the left, though there are different kinds of populations and demographics in different places that are moving, onto Bluesky from X. It's a really interesting time.
I mean, as a researcher, if I were a junior researcher, there would be diminishing returns for studying Twitter; just stepping away from the politics and all that stuff, just as a researcher, diminishing returns. We were also over-anchored on that platform 'cause the data was free and there was so much of it. And so in some ways it's actually a good change, in the sense that we've been forced to start looking at other platforms, at a time when other platforms are even more important. Yeah, we've lost access to Twitter for the most part, but there are ways to see things that are still above board and not going to get you sued. And we found a way in 2024; we were able to do similar stuff to what we had done for our rapid research. We couldn't possibly do what we call big-R research, like our peer-reviewed research, based on the data we'd collected; it just wasn't robust enough and had too many limitations.
Certainly, absolutely. We've got three people right now, probably this morning, looking at how to build our infrastructure for Bluesky. We already have some feeds coming in, but we're trying to make that more robust. We've got research on TikTok, we've got research on Telegram. I'm probably leaving a bunch of things out. We've got some access through a third-party vendor to a whole bunch of long-tail sort of alt platforms. And yeah, it's actually a really interesting time, because all of our papers are based on different platforms now, and we're seeing a lot more variety, and understanding how different kinds of platform affordances, the designs and features, shape interactions there.
I think personally, just from a nerdy researcher being curious, watching Bluesky develop is going to be really interesting. You've got people that have already gone through the Twitter era, many of them, they understand both what worked and what they thought didn't work. You've got platform designers that have taken some of that into account. You've got some new features like the starter packs, which is a very different way of doing recommendations than we saw on Twitter back in the day. And so watching sort of a redevelopment of a social network, and also people already kind of know who their influencers are, but not really. And so watching the redevelopment there is going to be fascinating. So I'm along for the ride and very curious and I'm really excited that that platform is going to be open data 'cause I think there's a lot to learn.
Justin Hendrix:
One of the things I keep thinking about watching this kind of fragmentation, the dynamics that you've described in our conversation this morning is that maybe it's time to put aside some of the thoughts we may have harbored in the field that a lot of these phenomena could "be fixed". I find myself thinking more about words like resilience, words like imagination, alternative ways of thinking beyond where we're at at the moment. I'm not sure I see very far into the future. I don't know if you can, if you have a kind of vision of an information ecosystem in 2030 or 2040, but how do you think about the future now, especially given what you've learned over this last close to a decade studying this stuff?
Kate Starbird:
I remember thinking in 2016, 2017 that we had about four or five years to fix whatever was happening there before it became locked in, in the sense of the proliferation of low-quality content, people becoming really invested in conspiracy thinking or conspiracy theorizing, and I think we're past that point. As for defending the information ecosystem against that, just look at the nominations for the next cabinet; a lot of these people are conspiracy theorists who have amplified or even created some of the conspiracy theories that are so prominent. And so I don't think we're in a place where there's a defense and we're going to roll things back. That's over. We've got to close the book on that. I do think we have to think on a much longer timeline. If you care about these problems, if you want shared realities or people making decisions based on evidence-based understandings of the world, I do think we're going to have to extend our timeline out.
I still think it's really good to invest in information literacy. I think it's going to be really important. So I don't think we should give up on those kinds of things. And instead of thinking about fixing existing platforms, that's done, they're not going to fix themselves, I think it's about developing new platforms, voting with your feet, getting to places where you want to be. And if you have political values that don't align with those that are gaining power using one media ecosystem, you need to start thinking about how do you build a similar kind of information environment for the support of the kinds of values that you have and the stories that you want to tell about the world. And you hope that those are anchored in reality. I hope that they're anchored in reality, but I think there has to be some acknowledgement that we're in a different moment. We're going to have to think more about building and less about defending, if that makes sense.
I'm starting to go back to sports metaphors, which was my first career, and I never used sports metaphors; I try not to use them almost at all. But you cannot win a game just playing defense. Now, if you're way ahead, you can play defense and maybe try to run out the clock, but it turns out that the clock of history doesn't work that way. It just keeps going. Once you're behind, you can't just play defense. You've got to keep playing defense, and you've got to defend your values and hopefully help to prevent some of the worst things that are happening. But you've also got to figure out how to get your message out. You can't just point and say, okay, their message is wrong; what is your message? What is your story? And so I do think folks on the left, if they want to compete with the right, are going to have to stop just playing defense.
Justin Hendrix:
I do feel like people have been saying this about the right-wing media ecosystem for more than a decade. I mean, this is a phenomenon that's been going on a really long time. Probably it's time to put aside the idea that some sort of well-behaved Mark Zuckerberg and the traditional media are going to save the left.
Kate Starbird:
One of the huge mistakes I think the left made was to not realize how far behind they were and to have this idea, "Oh, look, they're doing these things. Oh, no one's going to fall for that. Okay, it's a little problem, whatever." And not realize that, yeah, 10 years ago they could have said, oh, we need to do this better. But they were way far behind and they're only further behind.
And I do think one of the biggest things that has to happen is just an adjustment of how far into the future folks need to be thinking for making these changes. 10 years was not enough to catch up, and in fact, I still think everyone thought, oh no, let's make the things we have better. I'm guilty of that too. Let's defend these platforms. Let's make these platforms better. I was very invested in Twitter. I'd been on there since 2009. I did almost all my research on there. I was very invested in trying to say, no, let's make this a better place. And I think there's a recognition now that, no, folks on the left need to go build other, better places with different kinds of logics.
Ethan Zuckerman's been talking about that a lot for years, and I often was like, oh, okay, whatever. And now I'm like, no, okay, yeah, this is what we need to do. And I don't know if there's a script for exactly how that works. People need to really get into rooms and think about it, but the rooms that I remember being in were all about how do we defend these existing platforms. I haven't seen a lot of energy around the other rooms, and I do think there need to be those other conversations going: how do you start building within the logics that we have, but thinking about the logics that are 10 years out?
Justin Hendrix:
More room for revolution, less room for reform. Kate Starbird, thank you very much.
Kate Starbird:
Yeah, thank you. It was super fun.