The Past, Present, and Future of the US Information Integrity Field
Dean Jackson / Nov 16, 2025

Audio of this conversation is available via your favorite podcast service.
From 2016 to 2024, ‘information integrity’ was a significant priority for Western governments, large foundations, tech companies, and global civil society. Since January 2025, however, the field has been in disarray. The closure of USAID and the State Department's pullback from democracy programming have created a funding gap so large that it can't realistically be filled, and technology companies have either stalled or reversed their own efforts. Politicized attacks on academics and others working in this field have also taken a serious toll.
To discuss the past and future of information integrity work, I spoke to Adam Fivenson, who was formerly a senior program officer for research and conferences on these issues at the National Endowment for Democracy and is currently a nonresident fellow at the American University Center for Security, Innovation and New Technology (CSINT), and Samantha Bradshaw, an assistant professor and director of CSINT, where I am also a nonresident fellow.
What follows is a lightly edited transcript of the discussion.
Dean Jackson:
This is a bit of a special panel for me personally because I previously held Adam's former portfolio at the NED, and Sam is a longtime collaborator in this space, having done early work in the field while finishing her PhD at Oxford. Today, I'm lucky to count both of them as colleagues at American University's Center for Security, Innovation, and New Technology, or CSINT, which Sam directs and where Adam and I are nonresident fellows.
In July, CSINT convened a workshop on the future of information integrity, which I was fortunate enough to be able to attend. Sam, Adam, I wanted to start with a bit of a reflective question. At some point in 2015, we were all probably sitting in offices reading an article about Russia's information war against the West. A decade later, I really have to reflect on where we were then in the world and the state of affairs now and ask: did we lose that war?
Samantha Bradshaw:
Yeah, I think that's a really great question, Dean. When it comes to using the framing of an information war in this battle that we've been facing against another nation, I sometimes think that framing is not necessarily helpful, because it invites us to think of the challenges around information integrity only through a military lens, as something that we can win or lose with very clear adversaries fighting on the front lines. But I think what's happened over the past decade isn't just about Russia or foreign disinformation, it's about how our entire information environment has really evolved. We've seen the rise of things like algorithmic amplification and what scholars have been calling affective polarization: the idea that people don't just disagree, but increasingly distrust and even dislike people who hold political opinions and views different from their own.
And I think these dynamics are now driven as much by domestic actors and platform incentives as they are by any foreign campaign. So to come back to this idea of winning or losing the information war, I wouldn't say we've lost by any means. I'd say we've actually been transformed by what's been happening. The battlefield has really shifted inwards: into our social relationships, into how we use and consume media online, and even into our shared sense of reality. And I don't think this is something that we can solve through deterrence or through countermeasures alone, the methods we often reach for when we put the problem into these kinds of militarized frameworks. For me, the real challenge is about rebuilding a lot of the connective tissues of trust and empathy, the things that make democracy possible, and designing information systems that can foster understanding rather than outrage.
Because at the end of the day, to me, it's not a security problem to win, but it's more of a social problem to heal.
Dean Jackson:
Adam, what do you think?
Adam Fivenson:
Thanks so much, Dean. And thanks Samantha, that was an excellent way to reframe the question. I want to take it on a little bit more directly, and I would just say, "No, I don't think so." Just to echo what Sam was saying, I think what we've learned since those days is that this isn't Russia's war on the West, but perhaps Russia and its authoritarian allies' war on the very idea of citizens working together in good faith to rule themselves, to make decisions about their societies, and to hold the powerful accountable, basically what we call democracy, right? So in my mind, that particular moment, 2013, 2014, 2015 in Ukraine, was an early entry in what has become a series of unprecedented escalations, perpetrated in part by the Russian state, but also in part by its fellow authoritarian powers and allied movements around the globe.
We talk about Putin having launched the first land war in Europe since World War II. These escalations have been enabled by democracies' collective inability both to perceive the threat, in large part, I would argue, because of the massive manipulation of the information environment, and to deter future aggression as the bottom fell out of our democracies. So I really agree with Sam that in terms of how we define the problem, if we talk about it as a war, a hybrid war, then it puts us in the mode of thinking about military responses. And we've seen from history, especially in this country in recent years, looking in particular at 9/11 and the response from the United States, that when there's an opportunity to shift things in a military direction, that'll often happen.
Dean Jackson:
Let's follow that thread a little bit, because we're on this podcast talking about the future of information integrity work, and that's a term with a history and multiple definitions. And what you described, the militarization of both the phenomenon and the response, kind of happened in a way, right? We spent billions of dollars on military contracts to work on countering adversarial disinformation, to say nothing of the public health response around mis- and disinformation related to the COVID-19 pandemic. Could you tell us a little bit more about how you define information integrity and how it's different from, or similar to, terms like disinformation, influence operations, or propaganda? Do these semantic debates matter?
Adam Fivenson:
Yeah, this is the question, and it feels like over the last few years we've been on a sort of non-stop treadmill where we're changing our terminology based on what's acceptable politically or how we view the problem in different ways. To me, in some ways it's quite simple. The idea, the concept, is that citizens in a democracy should be more likely to encounter high quality information than low quality information when they go out and search for answers to their problems. And that might be as simple as what to do when your kid gets a cold, or as complex as understanding how the climate is changing and how that's tied to human action. That term, high quality information, is obviously subjective. What it means to me is that information is evidence-based, it's factual, and where relevant, it leverages scientific or expert knowledge to inform people's decisions.
I feel my thinking on this is really informed by Maria Ressa, the Filipina journalist and winner of the 2021 Nobel Peace Prize, who reminded us during her Nobel speech that without facts, as a society, we can't have truth. Without truth, we can't have trust. Without trust, there is no democracy. And without democracy, we can't solve the major challenges that face humanity, whether it's infectious diseases like COVID or whatever might come next, whether it's climate change, or whether it's the challenges to democracy itself. So that's kind of how I think about it.
Dean Jackson:
Is the quality of information really obviously subjective? A lot of the work in this field when you think about fact checking or initiatives like NewsGuard or the Journalism Trust Initiative, is kind of based on the premise that there are objective facts and that some sources of information can objectively be rated as more or less trustworthy. And I'm not saying that's true or that you're wrong, but could you say a little bit more about why you say it's obviously subjective that some sources of information are of low or high quality because that seems to be very much contested.
Adam Fivenson:
Yeah, yeah. I guess for me, there are some things that either happened or they didn't. I handed you this dollar, you took the dollar from me: that happened. The sky is blue, it's raining. These are things that maybe we can't contest. But when we're talking about the interpretation of real world events, especially at a high level, a societal level, a historical level, that requires interpretation. It requires someone with history, knowledge, and experience, who understands the demographic and geographic vagaries, the distinctions between aspects of the challenge and aspects of responses and solutions. And that tends to be where this question of truth or untruth, and who to believe and who not to believe, becomes much more contentious.
So it's less about the basics that nobody can really deny, although you will find authoritarians, in almost all contexts, denying even those facts, because in some ways that's the gateway to the larger lies that are required for maintaining that sort of control over a society. At least that's my view. But generally speaking, my sense is that it's not the basic stuff. It's really the more contentious and challenging issues around politics, around current events, around what's happening in the moment and how those stories are told, because those stories help us shape, and essentially build, a version of history starting from where we are and looking to the future. That is what the information integrity sector has really struggled with: trying to find a way to set that baseline of history, that baseline of agreement and truth.
Whether you're a fact-checker, a journalist, or whoever else operating in this space, you're trying to figure out how to thread that needle and maintain trust in a society at a time when that trust is clearly being undermined, rapidly and at scale, for so many reasons.
Dean Jackson:
What are some of the consensus markers of information that has high integrity or low integrity? And could you talk a little bit more about the consequences of low integrity information? Why is work to improve the integrity of information or the information environment so important?
Adam Fivenson:
I think this is part of the challenge that we have, because my argument is that so much of the effort the US government in particular was supporting over the last five, ten, or more years in this sector carries the markers of what I would describe as very traditional media. It is: how do we revive the journalistic business model? How do we build trust in certain voices over time in a society? And what we're used to looking at is someone who's on TV, with chyrons at the bottom of the screen and graphics over their shoulder. These are very traditional markers. Going back to maybe the '70s, '80s, '90s, these things made a lot more sense, because as an organization you needed more resources, more funding, to set these things up and make them work. You needed to work with the government to get your broadcast license.
Of course, these days it's much easier. And part of the problem I would argue that we have is that if you're the New York Times, if you're the Traverse City Record Eagle, where I'm from, your presence online in essence is a small bubble immediately right next to the bubbles of a number of outlets that may not be fact-checking their information, that may bring a very clear ideological perspective to their work that's not evidence-based or intended to inform, but to convince and to influence. And so for me, part of the challenge we're in is that these markers that we used to have of high quality information have in essence been totally flattened in the context of our digital ecosystem. So what do you look for these days? To me, I think you have to do your research on the individuals.
I think you have to know the history of the platform itself. Tools like NewsGuard are really useful for that. There are a number of online rating systems. Of course, they've come under attack in recent times because of their efforts to call out liars and manipulators, and liars and manipulators don't like being called out, as we know very well. And then, what are the consequences of that blurring? Imagine it's 1996 and you're in the checkout aisle at the grocery store, and to your right is the National Enquirer claiming aliens just abducted the governor, or whoever. It's as if the entire information ecosystem is now the National Enquirer: not focused on helping people understand the challenges they're facing in society and what to do about them, but focused on delivering them to advertisers, delivering them to certain political forces, and in essence manipulating them.
It's a fine line. You could say that all advertising is manipulation. As someone with some background in that field, I'm not sure that's totally accurate, but certainly we can draw some lessons from how commercial actors have operated in the information environment. And of course, I'm thinking of Shoshana Zuboff and her argument about surveillance capitalism there. But yeah, I digress and pass it back to you.
Dean Jackson:
I want to throw out a phrase that someone used during the workshop in July that really stuck with me, which is the idea that some sources of information, be they academic experts, newspapers, or government agencies, rely on a self-anointed claim to authority in order to receive public trust. And you just described processes like having a fact-checking department, or having to work to get government licensure. And some of that makes sense, right? Because to achieve a certain scale, you usually can't be totally fly-by-night, or at least in the past you couldn't. But you also said that there's a fine line, and there have been several high-profile cases where the question of who's right and whom audiences trust really balances right on that line. I'm thinking about things like the COVID pandemic, when science was moving very quickly and mistakes were made by public health officials and by journalists.
And so being able to say, "This person's right and this person's wrong because this person has credentials and this person doesn't," became very difficult, because sometimes the people with credentials would be wrong and would have to backtrack. How do we separate information integrity work from censorship or propaganda? I'm someone who's asking this question from inside the tent. I was an information integrity expert and professional, and I still consider myself one in many ways. But in an era where this is still politically contested, I think we have to have a digestible answer to that question, and so I put it to you here: how do you separate the type of information curation work that the integrity space has done from much more authoritarian practices?
Samantha Bradshaw:
Maybe it's helpful to think about information integrity as a process and as a system that can help people access reliable information. So things like transparency, accuracy, and accountability, where improving information integrity means doing things like supporting local journalism, improving people's general digital literacy skills and creating mechanisms that also help audiences understand where information is coming from, how it's shaped, and the different kinds of biases that might affect how stories are being told. Whereas when we're thinking about other kind of terms like censorship, like propaganda, like these things you mentioned, I kind of see that more as content problems that are about control. Censorship is much more about limiting what people can see or say often under the banner of protecting the public or protecting national security.
But it's something that ultimately narrows the space for open debate. And propaganda can do similar things, but instead of removing information, it's much more about flooding the space with selective or manipulative content to push a particular agenda. There are differences there around the kinds of effects these things have. So while information integrity, censorship, and propaganda all deal with information, we can think about their different effects: information integrity has much more to do with empowering citizens to make better or more informed decisions, while censorship and propaganda are used to disempower people and take away a lot of their agency to evaluate information.
Adam Fivenson:
I think Sam is spot on. Censorship is when the government uses its power to silence its critics. What the information integrity community has been doing is highlighting manipulation and lies, using our First Amendment rights and our freedom of speech and freedom of expression to call out people who are trying to manipulate, and I would argue trying to harm, the public more broadly. You could argue, as Sam is saying, that we're simply trying to help citizens hold the powerful to account and protect their freedom of expression and their freedom of speech in this broader ecosystem. And I know that, to some, that sounds self-anointed and rather sanctimonious. But I think we can flip that perspective on its head by saying that these are people, whether they're fact-checkers or journalists, who have taken on tremendous responsibility and tremendous risk, again, to protect freedom of speech and freedom of expression in democratic societies.
Samantha Bradshaw:
Yeah, and I would just add that what's interesting today is that a lot of governments are starting to take a very hands-off approach to content moderation under this banner of protecting free speech. But I think it's also left a big power vacuum. Platforms have been and will continue to be the de facto regulators of speech. And the decisions that companies make with regards to what kind of content goes viral, what accounts or content get removed, these kinds of things shape our entire information ecosystem. And when we're taking this laissez-faire stance to regulation and to content moderation, the things like propaganda, disinformation, harassment, these manipulative practices, they can really flourish. Because a lot of these information integrity mechanisms that would help audiences navigate our information environment simply aren't there anymore.
So just to come back to the main point for me, I really think information integrity, it's not about suppressing speech in any way, it's about governance and it's about design. It's about asking how do we build systems that support open expression while also safeguarding a lot of the conditions that make democratic deliberation possible?
Dean Jackson:
One of the most important takeaways I took from the workshop was the idea that content moderation is essential to public discourse and free speech on the internet: you can't really express yourself meaningfully if there's a constant parade of bad actors marching through the space where you're trying to speak. You can't express yourself safely and confidently in the middle of a neo-Nazi demonstration. And that's what so much of the internet becomes without content moderation. It becomes a space where really motivated people with bad intentions and extreme views can mobilize and drown out the rest of the public. I wondered if we might try a thought experiment. Imagine that I'm the median American news consumer, whoever you think that is, and it's 2020.
I am trying my best to follow the news about the COVID-19 pandemic while also maybe taking care of young children, struggling to adapt to work from home or struggling to go into a workplace as an essential worker. And in this environment, I have to make sense of the world and I'm getting conflicting information from the government, from news sources, wear a mask, don't wear a mask, the virus spreads by hand contact, the virus spreads through the air and it's difficult to tell up from down. And at the same time, I'm bombarded by other sources of information, maybe neighbors in a community Facebook group, maybe it's an online influencer whose videos I really like, maybe it's simply a relative who watches different news sources than me and suddenly my social media posts, particularly on Facebook or Twitter, start getting labeled, taken down.
Maybe I've repeated some things that ended up not being true. How would you explain to me what information integrity is and why it matters?
Samantha Bradshaw:
I think if we're looking at the COVID pandemic, this was a time when people were trying to make the best decisions for their family with the information that they had. And suddenly their posts might get labeled or taken down, depending on what they were sharing and the kinds of information and communities they were engaging with. And it could feel like platforms were doing a lot to police what they were saying online. And I think that broader experience, and the sense of uncertainty and mistrust it created, these are why processes of information integrity matter. Because when we're talking about information integrity, again, we're trying to create the conditions where people can actually trust information. And of course, during COVID, the science itself was evolving.
Experts were learning as they went and as we learned more about the virus, but our communication systems weren't built for this kind of uncertainty. Social media often rewards speed, it rewards emotion, and it rewards virality. It doesn't reward accuracy or context or slowing down in an uncertain environment. And I think there's also a broader tension, where information integrity is in direct competition with the business models that monetize misinformation. We all know misinformation spreads faster, it keeps people scrolling longer, and it drives engagement to generate more advertising revenue. When we have an uncertain information environment and all this competing information is being structurally supported by social media business models, it makes it even more challenging to get the right information to people.
I think this is a little bit of a failure of information integrity because platforms haven't always been transparent about their decisions. It can feel very arbitrary and very political, which can fuel perceptions of censorship. We can't just leave information integrity to the goodwill of companies or to the markets to self-correct. We need rules, we need independent oversight, and we need better accountability mechanisms so that incentives for accuracy, for safety, for transparency are going to outweigh incentives for profit and for people to profit from outrage and from confusion.
Dean Jackson:
I want to pivot to ask about the field itself. What has happened to the people in this field since the dramatic US government funding cuts in the beginning of 2025?
Adam Fivenson:
Yeah, happy to address that. I think there are two different sides to it. One is the community of folks based here in the US who were working on and supporting this issue, both domestically, going back to 2022, and, more recently, through the internationally focused efforts to support journalists, fact-checkers, narrative researchers, and other groups all around the globe. Personally, I know a little more about what's happening domestically, but there is some interesting research going on that would tell us more about what's happening in other contexts. To me, the story of the moment is this tragically fascinating natural experiment of asking what happens when you remove the vast majority of the money that was going to support information integrity in contested places and contexts like Sub-Saharan Africa, East Asia, and Latin America, and obviously particular countries, not that these regions are homogenous in any way.
And so we know that a massive number of journalists in those places, journalists who were working to hold power to account, to uncover corruption, to counter disinformation, and to ensure that citizens have high quality information, are out of work; they're out of a job. And we're going to see over the coming months and years what that means for democracy and for citizens in those contexts. It's a fascinating question to ask. Let's fire all the journalists and the fact-checkers and the folks who are worried about and working on this issue, whom the US government in particular has been able to support in that work over the last however many years, and let's see what happens if you let the bottom drop out. And I fear that the results will be quite terrifying, not just for people in those contexts.
But if we understand American security to be related to stability and networks of trade all over the world, all of those relationships that the US and Europe, and frankly other countries, have are going to be under significant strain. And I fear what that will mean in terms of conflict and competition around the globe. Here domestically, I would say it's a much broader grab bag in terms of what's happening with the folks who were doing this work. Some folks have gone into consulting, some folks have managed to find a new role, and some are finding support out of Europe in some ways. But on the whole, there is no longer a professional community that's deeply engaged in working on this issue domestically.
And you could argue that some of the changes, frankly the attacks on this community from 2022 onward to undermine, for example, the Election Integrity Partnership's 2020 work, are part of the reason why things went the way they did in 2024, although of course there are many reasons. It's a community that's scattered and trying to figure out: where do we go from here? If in a future context there's another chance to do this work, what will we do differently? What do we understand, as a community, to have been our mistakes, the things we were doing poorly that we could do better? And if we were to start over with a blank slate, or at least a more informed blank slate, what would it look like to do this work in a different context? Which of course leaves out the broader question of the fact that this work will not be done in the current context. So trying to sort through that as well is a challenge.
Dean Jackson:
I wanted to ask you about the international fallout as well, because of course, USAID and the State Department were a tremendous force of support for civil society and journalists around the world. And in Europe there are some domestic sources of support ramping up to replace that, but in many countries there's simply no replacement to be found. What do you expect for civil society next year as these funding cuts become more deeply felt?
Adam Fivenson:
It's very challenging. The US was not the only funder in this space, but as you suggest, we were a massive and important one. There are some folks who have been doing survey work to look at the state of civil society and understand a bit of what's been lost. One of them is Diya, the executive director of the Tech Global Institute, which has been doing research on the impact on internet freedom, not the same community, but an adjacent one, of course, finding that a huge number of organizations have lost funding. Accountability Lab has also done some survey work on this, broadly speaking, looking at the impact of some of these cuts on civil society around the globe, and found similar results. Kourtney Pompi is getting ready to publish a piece on this as well that folks should keep an eye out for.
But yeah, I don't think the situation is good. This was the US approach: to support civil society. That's bread-and-butter American culture and history, politically and historically speaking; it's how we've operated as a country, and by and large, it's how we've framed our democracy support around the globe, by supporting independent, nonpartisan organizations focused on developing civil societies around the world. The American public has invested that money over time because we understood that having democracies elsewhere meant less war, less conflict, more trade, and better outcomes on the whole. And clearly we're seeing things move in the opposite direction here. So I'm worried, because many of those people are my friends. They're out of a job, and that's tough; they may have families to feed, rent to pay, et cetera.
But at a broader scale, I'm tremendously concerned about what the impact will be on those societies, on people who've not just been receiving US funding, but US support and US engagement. Someone mentioned at the workshop you've been describing, the one back in July, that historically, if there was a threat to civil society in their country, they could very quickly go to the US Ambassador and say, "We're facing this threat. Can the US step in and say something, let this bad actor know that there's an eye watching?" And now that's no longer the case. And of course, I'm clear-eyed, honest, and open about the fact that there are many facets of US influence in many contexts. But in the democracy sector, I view the loss of US support as potentially catastrophic, and I think it will have significant impacts on Americans' prosperity, safety, and security.
Dean Jackson:
Participants at the workshop suggested that cuts in US government funding mean that the civil society actors able to remain in the space will be free to experiment with new strategies they couldn't try before, when their primary donor was the United States. Other participants in the workshop called for a sort of ruthless prioritization: they really need to drill down to the most important objectives and the most dire threats in the information integrity space. And these aren't mutually exclusive suggestions, right? You can experiment in pursuit of very ruthlessly defined priorities. But it does open up a line of questions: What new strategies should we embrace? What should we prioritize, and what should we deemphasize?
Adam Fivenson:
That's the key question of the moment: where do we go from here, or where would we go from here given the opportunity to revive this work? And I think, depending on who you talk to, you're going to hear different answers. My view is that there are a few broad areas we need to be looking at. The first one, for me, is to refocus our effort on real world harms, and in doing so, to expand the coalition of actors and voices who are concerned about this. Part of what I mean by that is that when we're talking about somewhat nebulous concepts like democracy, it's very easy for the salience to be lost on the average consumer. Why does it matter that democracy is going away? I think that's a big deal. You think that's a big deal. We agree that's a problem, but what does it actually mean for the average American?
What does it mean for the average person in El Salvador, in Botswana, in Thailand? Refocusing our conversation around real world harms means a focus on areas like health, like climate, like safety and security. These are areas where I think we have a real opportunity to expand the aperture and the conversation. And I would argue that our immediate closest partners, the sectors where we have voices and perspectives that would align, are areas like health, climate, and anti-corruption; those are the rather obvious ones. I think that our community has to do a much better job of making this case to less adjacent communities that we haven't been effective in talking to about the importance of truth, trust, and integrity in the information environment. And to me, that means the business community, and it means the national security community.
We've got to open up those conversations in ways that are relevant to everybody. Secondly, we have to reclaim the free speech mandate. We can't be the censors. And again, I would argue we are not, nor have we ever been. We're people using our First Amendment free speech rights to talk about what's happening, who's who in the information environment, and who's lying. We have to be able to tell that story effectively. And then thirdly, I would point to the need for a much more distributed and resilient infrastructure in support of this work. What I mean by that is that, at least internationally, so much of the effort the US has funded around the globe comes through our very traditional, standard procurement models. And that generates competition between players that should be collaborating.
Competition has its role. It can be really helpful in getting the best results and answers when you have a commoditized market. But in a market where you don't have as many players, you don't have as many voices, you really have to find a way to make sure that all those boats can rise. And I don't think we have done that very effectively. Add to that the fact that our methods of engagement, even our fastest methods, such as the National Endowment for Democracy's grantmaking process, can still take three, four, five, six months to reach people. Meanwhile, Russia is doling out money through outlets like African Stream, $20,000, $30,000, within days. It's not just a question of strategic competition; it's understanding that our efforts to support this ecosystem around the globe have been piecemeal, they've been siloed.
And they've not generated the kind of collaboration that our partners really need to have impact at a time when we know that authoritarians, both international and domestic, are collaborating very closely. They're collaborating in terms of technical knowledge of how to manipulate and control an information environment, they're collaborating by sharing expertise and funding, and they're also directly amplifying each other by using the retweet and the like button in ways that we are not. And so when we talk about resilient, distributed infrastructure, it's more support, more direct support, to those local organizations, and it's helping them be better connected to one another, sharing data, sharing tactics.
And then I would argue, maybe most importantly, it is joint amplification. To borrow, or maybe manipulate, a term from Facebook, call it "coordinated authentic behavior": what we need is everyday people leveraging their shared values to amplify messaging and ideas that help support their communities. And we've really gotten away from that. So those are just a few high-level thoughts. I hope we'll have opportunities in the future to dig into this even further, because that's certainly the question: what would we do if we had a chance to do this again? What would we do differently?
Dean Jackson:
One long-running talking point in the space, and a common suggestion for what we might do differently, is to embrace the changed information ecosystem that you and Sam have both alluded to a few times throughout this conversation. Rachel Kleinfeld and Renée DiResta in particular have a new piece out from the Carnegie Endowment about the changing shape of the information environment and what experts and institutions need to do to influence public debate in this changed environment. They say institutions should recognize that audiences today play an active role in shaping conversations on social media, and so it's essential to engage new mediums like short-form video and new partners like online influencers. And I guess the question I'm left with after hearing versions of this for so long is: why has this taken so long to grasp, and why aren't we all on TikTok?
Samantha Bradshaw:
I think this is a great piece, Dean. I do agree in a lot of ways that today's media landscape is different in part because it's participatory. Audiences are no longer passive consumers of information delivered from the top down; we actively shape and co-create a lot of narratives. And I think it's really smart to think about doing citizen and civic journalism in new spaces like TikTok or YouTube Shorts. But I also think that we have to be careful not to simply buy into the same media logic that has created many of the problems we're facing in today's information environment. Platforms like these are built around engagement metrics, around clicks, likes, shares, and outrage, not around accuracy, reflection, or civic dialogue.
And when we're adapting our messages to fit this logic, we risk reinforcing an attention economy that rewards speed and emotion over depth and truth. For me, the question isn't just about how we're using new media tools, but it's also about the values we're reinforcing when we use them. Are we reaching people in a way that's actually building understanding and critical thinking, or are we just competing in the same race for attention that's already eroding public trust? So for me, I'd rather see us invest in alternative models, things like slower journalism or community-based storytelling, things that don't just play by the existing rules, but also try to rewrite them. Because at the end of the day, I think if we're serious about changing the information environment for the better, we can't do it by reinforcing and reproducing the incentives that broke it in the first place.
Dean Jackson:
Sam, I think that's such a thoughtful answer, and it reminds me that I was recently watching a congressional hearing and chatting about it with someone, and they were frustrated at the way a particular question had been framed. And I said, "I think that substantively you are right in your critiques of this question, but it is going to reach the audience, make a much bigger impact, and get the point across to many more people than a more nuanced version would." And I was conflicted in that moment, because you want to have impact, you want to reach people, but you also want to be telling nuanced, complex stories about nuanced, complex things. And I think your answer really reinforced that trade-off for me. Is that trade-off unavoidable?
There have been so many different approaches and suggestions for reforming journalism over the years from solutions journalism to slow journalism to explainer journalism, and I find myself wondering, are these an elite fascination that'll ultimately fail to compete in the new media environment? And if not, what would it take for us to really challenge the way that the media ecosystem works, as you said?
Samantha Bradshaw:
Yeah, I do think that there is growing demand for slower content, for spaces for people to pause, to reflect, and to engage more meaningfully. And I think this is playing out so clearly in a lot of the conversations about addiction and compulsiveness when it comes to social media use. People are really starting to recognize the harms that these broader technologies and platforms can have not just on our democracies, but on our abilities to internalize information, to think critically, to read, to focus, and to be present in our daily lives. At the end of the day, these are the things that matter the most, and I think a lot of people are feeling an increasing disconnect because of the way that technology has been driven by the underlying logics of attention economics.
And so I think we're poised to make bigger, broader systemic changes in how we consume information and in the ways we tell stories. We just need people to make those investments, to get a lot of these initiatives around community-based storytelling off the ground, and to put these kinds of platforms and alternative visions for how news and information can be shared out into the world. At the end of the day, they need funding and they need resources. And I think this is a moment when we can start to think creatively and get real buy-in from people who are increasingly feeling not only the weight of this very polarized, very toxic, very attention-demanding information ecosystem, but also the need to slow down, reflect, and disconnect from our digital technologies and from what is happening in influencer world.
Dean Jackson:
Adam, do you want to weigh in on that?
Adam Fivenson:
Yeah, big time. I think Sam is spot on in her vision of a healthier, slower, more thoughtful information environment. But I think the challenge is how we get there. We're in a moment when expert influence and expert ideas are at a nadir. Societally, we're used to an information environment and a society in which expert ideas are translated to the public through journalism. And that's the reason why, in many senses, authoritarians attack journalists: because they are an external source of power, knowledge, and influence in a society. And we're in a moment where that critical linkage has effectively been severed. Not for everyone, not all the time, obviously not universally. But broadly speaking, at scale, our ability as experts to reach the public is at an all-time low.
And so my sense is, if we want to be in a position where we can help influence policy in ways that are informed by history, by science, by ongoing knowledge of geography, demographics, et cetera, then I actually think Renée and Rachel's argument is very strong, to the degree that I would argue our sector, particularly the information integrity sector, should be training people on vertical video. We should be helping them get over some of those barriers. Frankly, you asked why we aren't doing it. I think we've all been very comfortable and incentivized in very traditional ways, such that if your report or your perspective gets covered by the media or picked up for a podcast, those are the signals of success. But do they reach the broader public in ways that build influence and build salience around democracy?
I think we have a long way to go there. I don't think that vertical video is the only answer; I think it's part of it. The other half is going back to in-person engagement, which is just harder to do at scale. But even if only 5% of our sector were to get onto social media, onto TikTok, in ways that reach people and are effective in bringing this perspective, I think that's at least part of what can bring us toward regaining some salience, and toward convincing folks that we have a problem that's only going to be solved if we're all working together and rowing in the same direction around our democratic values and around what the threats are to our societies, our democratic societies.
So I totally agree with the long-term vision. But given the way authoritarians have infiltrated and undermined open information spaces, to get there we now have to descend into these uncomfortable spaces to regain salience, and use that salience and influence to move us back in the other direction.
Dean Jackson:
During the workshop, we also talked a lot about the partnerships needed to carry on this work post-2024, and how we can build greater solidarity between efforts that should be partners. But this is hard. In the United States, philanthropy is rebalancing its priorities, universities are facing deep threats to their independence, and businesses are falling back on realpolitik and their bottom lines. In Europe, policymakers are reconciling tech regulation with trade negotiations. And in Brazil, muscular efforts to defend democracy have been controversial, especially among free expression advocates. There are a lot of fault lines in what could be natural coalitions for information integrity. And I'm wondering how you see this field building new alliances on top of those fault lines.
Adam Fivenson:
Yeah, yeah, absolutely. I think it's tremendously important. I talked a little bit about this earlier in terms of immediately adjacent communities that are the low-hanging fruit in this context. For me, that's the climate community, the health community, the anti-corruption community. But if we want to have real impact and societal salience, again, it's all about the business community. What I've noticed is that there are a number of organizations, especially private sector groups, that are actually engaging businesses around this question of information integrity. They just typically do it through the lens of reputation management, monitoring, and response.
And one of the challenges is that many businesses seem much more interested in understanding threats to their brand, and in discussing how they might respond to a particular threat with their own information attacks or the like. It's not necessarily an ecosystem-level conversation. And that is one of the mismatches between our communities: when we talk about this set of issues, it's all about the ecosystem. It's all about the environment and how we can change the incentives in that space to improve things. It's not that those kinds of initiatives don't exist in the business community; of course they do. This is just a tremendously politicized issue. And my sense is that for us to better engage the business community, we've got to figure out where the entry points are, where the issues they care about align with the issues we care about.
I think it's rather obvious in the sense that if you have a contract signed with another business, and there's no basis of truth or trust in a society, then either side can violate that contract, and frankly, it's hard to say whether there will be consequences. So to me, the stakes are quite clear. But translating that environment-level challenge to an individual organization's profit and loss sheet, and giving them a business rationale for investing in information integrity broadly, is a puzzle we have not yet figured out how to solve. But there are lots of folks thinking about that, and I think there's real hay to be made there for our sector. As far as alliances that matter, I would only add the national security community.
We need to know what's happening in the world so that we can defend Americans, and so that other democratic societies can defend their citizens. And so there's a real clear case in that context for truth-based communication and for verification of information, empiricism, science, et cetera. So figuring out how to bridge that gap, I think, is the big challenge for our community in the immediate future.
Dean Jackson:
Your last point is really well illustrated by the situation in Ukraine, which is where we started this conversation. And just to take it back there: if we misunderstand what's happening on the ground and what's really at stake, and make our national security decisions based on those misunderstandings, we could really come to regret it in a short five or 10 years, if not before. Sam, last year you wrote a piece for the Harvard Misinformation Review that said misinformation research is at its best when it speaks to specific, provable harms to the public, noted that this research can often be very abstract, and called for more specificity and grounding in harms to communities. I wonder if you might say a few more words about how you think that could help this field find a way forward?
Samantha Bradshaw:
A lot of misinformation and disinformation research tends to measure misinformation, fake news, disinformation, malinformation, information operations. We have so many terms, and so many different definitions of the kinds of harms these turn into. There's also a very narrow focus on only a couple of mainly Western case studies where we've tried to assess how these kinds of campaigns actually affected people's attitudes, opinions, and behaviors. And I think there have been a lot of broad generalizations, from differing definitions to these bigger questions of how social media is affecting democracy. We don't have a lot of really good evidence in the field that is very specific in identifying different kinds of harms and different kinds of impacts.
And I think we need more of that to be able to develop very clear policy responses to different kinds of threats. We need to know what kinds of communities or individuals might be at greater risk of being misinformed than others. We need to know more about how media diets, internet habits, and other online practices affect how people consume and internalize news and information and then go out into the real world with political beliefs, values, and behaviors. And we need to have more of these studies done across broader geographies and contexts to really come up with strong policy solutions.
Dean Jackson:
Sam, Adam, this has been a really interesting conversation about a very important topic, and I know one that all of us have spent a lot of time working and thinking on, so I'm really pleased you were able to take the time to be here with us today. Thank you so much.
Samantha Bradshaw:
Thanks, Dean. It's been so nice to chat with you and Adam about information integrity and where we are today and where we hope to be in the future.
Adam Fivenson:
Yeah. Thanks so much, Dean. Really appreciate the opportunity to reconnect with you and to speak with the Tech Policy Press audience. It's a really important conversation, and it's tremendously important that we have these discussions, and that we keep having them, about this topic in particular.