Assessing the Relationship Between Information Ecosystems and Democracy's Woes

Justin Hendrix / Jun 1, 2025

Audio of this conversation is available via your favorite podcast service.

Earlier this year, an entity called the Observatory on Information and Democracy released a major report called INFORMATION ECOSYSTEMS AND TROUBLED DEMOCRACY: A Global Synthesis of the State of Knowledge on News Media, AI and Data Governance. The report is the result of three research assessment panels comprising over 60 volunteer researchers, all coordinated by six rapporteurs and led by a scientific director, that together considered over 1,600 sources on topics at the intersection of technology, media, and democracy, ranging from trust in news to how mis- and disinformation are linked to societal and political polarization. I spoke to that scientific director, Robin Mansell, and one of the other individuals involved in the project as chair of its steering committee, Courtney Radsch, who is also on the board of Tech Policy Press.

What follows is a lightly edited transcript of the discussion.

Robin Mansell:

Robin Mansell, London School of Economics and Political Science, professor emeritus and scientific director for the first cycle of the Observatory on Information and Democracy.

Courtney Radsch:

I'm Courtney Radsch, I'm the director of the Center for Journalism and Liberty at Open Markets Institute and a non-resident fellow at Brookings Institution.

Justin Hendrix:

So just to start this morning, what is the Observatory on Information and Democracy?

Robin Mansell:

They were established in 2018 with a mission to investigate all aspects of the role of information ecosystems in public discourse: in shaping the way in which people understand the world around them and the way in which they debate issues, particularly in light of increasing evidence of a very destabilized environment for that public discourse, impacting on democracy. They're a non-profit organization. They have 55 member states at last count around the world, and they attract money from many different sources, mostly CSOs, NGOs, and a few governments, and no big tech companies.

Justin Hendrix:

This document starts out with a simple statement. "Democracy is troubled. There is no dispute about this. What is controversial is the role of information ecosystems in contributing to the fragility of democracy and to the viral spread of mis and disinformation."

I understand this project to essentially be trying to draw a line under all of the research we have to date on problems of information integrity, problems of the way that tech intersects with democracy. Can you generally describe the methodology for producing this 321-page tome?

Robin Mansell:

Thankfully, we have an online world, and the observatory reached out with multiple messages to attract... At one point, there were over 300 people from around the world, some scholars, a few policymakers, a lot of activists and NGO members, and they were asked a set of specific questions by a steering committee, which was chaired by Courtney. There were 39 questions, and they were asked, "Do you have expertise on any of these issues around troubled democracy and information? And if you do, let us know."

They set up three separate groups, if you like, around media and trust; AI, technological change, and its impact; and data governance, and whittled it down to around 60 active participants who sent us amazing amounts of information. In the end, we accumulated more than 3,000 entries in our database. They were all scanned by at least two people: around 60% of them were academic sources, some were policy documents and gray literature, and some were government documentation. That was all sifted through, and then we had six rapporteurs across those themes who wrote text, and all that text came to me as scientific director, and I had the pleasure of trying to configure it into the report that you see.

Courtney Radsch:

And maybe I could just add a little bit about the endeavor underway. There is a worldwide interest in understanding our information ecosystems, and particularly disinformation, misinformation, malinformation, whatever you want to call it. There's a widespread recognition that things are not going well and that there is a link between information and democracy. So we wanted to understand what the research actually shows, and to make sure that we were acknowledging how that research is produced, why it's produced, where it's produced, and what it tells us given where it is based and why it is done. For example, I think one of the major contributions this report makes, in addition to the incredible work that Robin and her team did to draft this really in-depth explanatory report, is that it created a database of all the sources that were used, so that we can visualize what we know about these different intersections of issues within the information ecosystem.

Quantitative versus qualitative research; was this done only in English or in other languages; in the Global North, the Global South, or the global majority? We really wanted to put front and center that our knowledge is also political: how we create knowledge and what gets done. We also wanted to recognize that while there are other efforts that have tried to do a similar kind of thing, like the International Panel on the Information Environment, that one was restricted only to peer-reviewed academic work. But what we know is that so much of the research, and so much of the actual empirical evidence about what is happening, is created by institutions that might never get into a peer-reviewed journal, because there's a whole Global North, Global South linguistic politics to publishing. And so much of this is being done on the front lines by non-profit organizations, by NGOs, by the development community, or through observations by journalists who are interviewing people or looking at the data on platforms, et cetera.

We really wanted to cast a wide net. So in addition to sending out and asking for sources, there was also a specific effort to look beyond the traditional sources, at a very wide range, to make sure that we were really understanding what we actually know, because there sure are a lot of assumptions about things. And we also wanted to go beyond this techno-deterministic emphasis on what people posted and how people reacted, to get to the more structural and ecosystemic issues.

Justin Hendrix:

This is coming at a moment... It's almost like we could draw a line around the time of the publication of this report: everything that came before it and everything that appears to be set to happen after it. We've got an enormous retaliation underway against this type of research in the United States. The companies have made themselves more opaque and have done their best to put up obstacles to researchers who might like to look at questions like those discussed here. And then, I guess, at another moment in time, this is also looking backward at a social media and information ecosystem almost at the precipice of the rollout of generative AI. We're still in, I'm sure, a bit of a lag in studying the effects of generative AI on the information ecosystem; there's still lots to be done there. But I'll just raise these timing issues here. I want to put them to you: how do you think about this artifact as a reflection on time?

Robin Mansell:

It's a benchmark. One of the aspirations was to do an IPCC-style report, which is a very big thing. It looks at the data, it evaluates it critically, and it puts it out there for others to use in order to influence opinion. I think it's important to note that we were not just looking for what we know, but also what we don't know and what is disputed and conflicted. And that conflict has a history. So looking back, our report actually brings together a lot of that conflicted history, with different voices having different things to say.

Courtney already mentioned that this is true in the Global North versus the global majority world, but it's also true in different political situations and contexts. So looking back, mis- and disinformation has been a core theme, regardless of what you call it, propaganda or whatever, for a very long time, in the old media as well as in the newer media. And AI also comes along, but it's also been around for quite a long time, and there's quite a lot we know if we don't just get fixated on that new thing called generative AI. So our report contributes to a context for thinking about these developments, which are always happening. Of course, when we published on the 15th of January, it was just before Trump took over. I think he takes over in November-

Courtney Radsch:

Just five days before. Yeah.

Robin Mansell:

Yeah. Wouldn't it have been nice if we could have coincided exactly with Trump taking over, and then the first set of executive orders, which have destabilized the world big time: they have obviously begun to attack the media, they are very hell-bent on having America win the AI race come what may, and, as you said, they are making it virtually impossible for people to do independent research on questions of the harms associated with mis- and disinformation. All of that is happening. So I think our report can speak to that, but it doesn't pinpoint the exact reasons why that is happening or what can be done about it. And I would argue no report of this kind could, because it matters where you are and when you are. The biggest message coming out of this report is that it is wrong to generalize from the experience of any one country at any one time and say, "That applies everywhere." It simply doesn't, and that's what the research evidence shows.

So for the specifics of what's happening in America now that is impacting all the rest of the world, we need to take into account that it is a very specific set of values being promoted by the current government in America, just as, if you took Hungary, it's a very specific set of values taking place in a particular thing called a democracy; I'm not sure that word applies nowadays. And that, in turn, has a knock-on effect on how discourse happens in the public sphere and how it doesn't: who feels surveilled, who feels oppressed, who gets locked up, and, in the case of journalists, who gets killed. I think our report speaks to all of that. And when we speak of harms, this is not just some nebulous construct. It's a real, tangible impact that these information ecosystems are having on people's everyday lives and their livelihoods.

Courtney Radsch:

I think this report came out actually at the perfect time. It came out right before Trump took office, just less than a week before inauguration. It came out as Meta decided to stop fact-checking on its platforms. It came out as a host of important platforms and media organizations decided to get rid of efforts to focus on diversity, equity, and inclusion throughout their information ecosystems: throughout their staffing, their content production, all of these aspects, deciding that, with the political winds shifting, that was no longer manageable. And, of course, the attacks on disinformation researchers, as they've come to be known, precede the Trump administration, but are certainly propelled by it. So I think this was a very perfectly timed report, because it is the product of over a year of intense scientific research, while recognizing that science and scientific research take place in different forms. And it was published before the politicization that could have occurred had it been published after, say, the various executive orders or the dismantling of democracy that the Trump administration is currently leading.

And what I think it does, very importantly, is stand in contradiction to a lot of the focus of the second Trump administration, which equates disinformation research with censorship, which equates content moderation with censorship, and which treats the regulation of information ecosystems or digital markets as a form of censorship, a restriction on free speech, or anti-Americanism. And we actually have this evidence. We now have a very robust document that draws together a lot of the evidence about what the issues are and what we actually know about content moderation and censorship. And it's very interesting right now, because the Federal Trade Commission is issuing requests for information about what it perceives, inaccurately, to be the politicized suppression of Republican or conservative voices and perspectives on social media and internet platforms. In fact, when you look at the evidence, what you find is that neither claim is supported: there is no evidence that this is a politicized effort, nor that these people are somehow overwhelmingly censored.

What we actually have now is a body of evidence showing how content moderation works and how it is linked to the politics of the day; for sure, we know that COVID led to different content moderation decisions. And I think it really shows that the complexity of that topic goes far beyond "it was a guy in the government calling up some guy at the platform to take down a piece of content." This is a really important antidote to the factless, baseless, fictitious world that is now being created by the Trump administration's doublespeak and erasure of information. I'm really thankful that we did this research before so much information produced by the US government, for example, was erased or taken down. So I think this will actually be a very important contribution, and something to build on as we strive to understand the broader impacts of the Trump administration's crackdown on freedom of expression, on human rights, on democracy, and what this rising authoritarianism is all about.

Robin Mansell:

A big contribution of this report is that it not only looks at the information content side of things, which, rightly, it needs to, but it looks at why this is happening in the wider context of the whole system, if you like, from infrastructure to applications, and at why we are seeing the spread of a data economy which privileges big tech profits, data surveillance, and various kinds of top-down data governance. Those governance efforts go some way to protect people's privacy and to protect information, which now the DOGE initiative is trying to take away from everybody in the United States, but that's been happening in other countries too. And the report begins to address whether or not there are alternatives to the current configuration of information ecosystems. So it takes a really strong stand on saying these decisions are people's decisions.

They're not inevitable; there are alternatives, there are grassroots initiatives happening all over the world, and that is one area that we simply don't know enough about. We need to join up those kinds of initiatives, whether they have to do with cellular phones and providing them in a less expensive way, or with new frameworks for collective governance of all the data that we generate online. All of those kinds of things are alternatives, and they need to be discussed, they need to be understood, and we need to push forward with them, because otherwise we're basically stuck with the current framing of things, which is either a US-led, big tech-led position or one which says, "Ah, we just need to get the regulation and the governance right, as the EU is trying to do." But we can see that, at the same time, as the EU does that with European values, they also get politicized and start vacillating on whether or not they can really tackle the big tech companies, because of trade issues, for example.

Justin Hendrix:

Cory Doctorow has a piece in the Financial Times essentially arguing that what's going on right now, the Trump administration's effort to dismantle the global system of trade, potentially creates an opportunity for countries to draw firmer lines and to emphasize the desire to compete.

I know, Courtney, this is what you work on day in, day out, but I guess, as the flip side of looking back on the moment before the Trump administration, maybe looking ahead: have the events of the last few weeks and months clarified some of the problems in this report? Is it possible that the types of crises spinning out of the present moment create a little of the room that you're talking about to try new things?

Robin Mansell:

I've always been a believer, a hopeful person. I think it does do a couple of things. One, it brings together coalitions of people who might not always see themselves as being interested in the same things. So you're beginning to see, certainly here on the European side, a lot of collaboration between NGOs, some of whom are interested in hate speech, some in trade, some in human trafficking, some in children's rights online and the associated harms. They're coming together in a way that they perhaps wouldn't have been before, because they're galvanized by the fact that something really dramatic is happening in a very influential part of the world, which is the US. I think that is an opportunity, and you never know what's going to come from these opportunities.

And I'm a Canadian, so I hear big time about the need to fight. It's time to fight, and not just about not buying US products on Canadian shelves. It's time for all sorts of discussions to go on across the provinces in Canada about how they can rebuff and resist the incursion, whether it be Musk or whether it be [inaudible 00:20:08] refusing to publish news in Canada, and how they can find alternatives to create a wider space for a democratic public discourse. And as we know, the far right in Canada is not missing; it's there. It's a problem, just as it is in other countries. So it seems to me, in this age of populism and polarization, because of the huge presence of the American changes, it's on people's minds, and it galvanizes action in a way that might not otherwise have been the case.

Courtney Radsch:

I think that the crisis that the Trump administration has provoked by completely revising the American role in geopolitics, international relations, trade, and democracy, literally a massive, multi-front triggering of a crisis, could in many respects be an opportunity. A couple weeks ago, I wrote a piece about resisting American techno-fascism that makes the same point as Cory made in his more recent op-ed, which is that this is a wake-up call for other countries: they cannot just sit around and allow American corporate technology companies, owned by unaccountable billionaires and now fused with the power of the American state, to overwhelm, overrun, and dominate their digital markets and their information ecosystems. For too long, these companies have played an outsized role. And when I say these companies, I'm talking about big tech companies. And let's remember that we've never called something "big," as in big pharma or big tobacco, because it's good; it's because it needs to be resisted, right?

So these big tech firms dominate the way that people inform and communicate with each other, how they share information. They have completely reshaped politics in many cases because of the algorithmic amplification and monetization of extremism, fear, and polarization that their business models produce. I think that there has been wide awareness of this, but a lack of ability to do anything about it, because we didn't have this crisis, and because a lot of these corporations do create some economic value in the markets where they operate. They have resisted the jurisdiction of smaller markets, for example South Africa and Brazil, refusing to abide by domestic laws and regulations. And so this crisis, I think, has been a wake-up call. What I hope we see is a united front by countries like those in the EU, Canada, and Brazil to stand up for their regulations and their laws. That means their digital markets laws; there's a really good AI bill in Brazil; and the UK's Competition and Markets Authority is now doing investigations into what it calls the strategic market status of companies.

And it turns out that many of these companies are American. Unfortunately, that is now being weaponized by the Trump administration to claim that regulation is anti-American. But, actually, these companies have been allowed to garner illegal monopolies. They have been able to self-preference, to use data across different lines of business, to disregard data privacy laws for many years (or exploit the fact that we didn't have any), to combine products, and to ignore copyright law. They've turned lawbreaking into a competitive advantage, and they have insinuated themselves through their free services in order to achieve market dominance. And this limits the ability of alternatives to be created, and of local economies to operate according to a different logic than surveillance capitalism, a different logic than whatever it is that American big tech corporations are trying to put forth. And we see that now with AI innovation. We hear so much about AI innovation. Innovation towards what? Are we innovating towards a better, healthier, more just, equitable world, or are we innovating towards more profits for big tech, less human dignity, and fewer human rights? We're not having those discussions.

So this is an opportunity for countries to stand up for the fact that they should have sovereignty over their digital markets, and they need sovereignty over their information systems. Because every media organization and every political actor is using these platforms, and while you have the choice to use some of them (I wish we saw more people opting out of using X, or Twitter, for example), others you don't have an opportunity to opt out of. There are only two browsers. There are only three main email systems. There are only two ad tech providers. The whole system is monopolized at multiple different levels. So we need to break up those monopolies, and we have that opportunity now, because the oddly tiny little bright spot in the crisis we're in is that there is actually bipartisan agreement that big tech companies are too powerful in how they control and shape our information ecosystem and content moderation.

Robin Mansell:

I agree with what Courtney has said, but I think when we look at the evidence on breakups of this kind in the past, what we sometimes see, and I hope we don't see it this time around, is that companies compete to the bottom. So what we could see, if we break up some of these companies, is the same business models being duplicated, but by smaller companies, which actually raises the bar for any successful regulation, because you have more of them. And if you have more of them, you need more resources to go into tackling each one. So that's a problem.

The other thing that I think is really important to take into account is that digital inclusion, and inclusion in AI, and all of these kinds of things can end up being unequal for a lot of countries in the global majority world. And that is not just a matter of whether the American companies come in and invest; it is also a matter of the fact that the discourse, the ideology, and the business models are so ubiquitous that it's very hard for people to even begin to imagine alternative ways of doing things. That is where a real stumbling block is, because we see time and time again, and there are reports on this coming from the global majority world, the tendency, for example, of the African Union to think, "Okay, we're going to govern differently. We're going to do this differently. We need to," and to commit to that. And then you look at the language of their new laws, and they're basically copy-and-paste, with a few exceptions, quite often from the European Union, and they don't have the resources to implement them, et cetera.

So we need to be careful about assuming that the top-down type of approach to legislation, whether it be the Digital Markets Act or whatever [inaudible 00:27:59], is actually going to be enough of a solution to push back against this bigger problem of not just big tech, but also big government, all of them getting the idea that there's one way forward, and that is to datafy, digitalize, and surveil people in every moment of their waking life, and some of the time their sleeping life too. That model, that notion that we must be automated, that we must be surveilled, and that we will then have more effective and more efficient services, whether public or private, that is the zeitgeist of today. And unless that is penetrated, and we can do that with some of the research that is available, we can't really imagine that the [inaudible 00:28:46].

Courtney Radsch:

I 100% agree with Robin on that. And that's what I mean when I say we're not questioning this idea of innovation. Right? AI innovation is premised on more and more data, more and more datafication, and therefore more surveillance of every aspect of our lives: our biometrics, our thoughts, our interactions with the world. And whether you're talking about the American big tech companies that are behind the foundation models and have the biggest chips, or China, right? They are really the AI leaders right now, but they both have the same concept of what we need to do, which is exactly what Robin said, and that raises massive concerns that are unlikely to be dealt with effectively solely through regulation and laws, but that I think must be dealt with. Right? We can see that one reason we have the development of surveillance capitalism and massive, pervasive datafication is that the US has no privacy law, and therefore the companies based here have created and perpetuated this logic, because they were in a context where we had no, and have no, data privacy law. Right?

And so that has been very detrimental to the rest of the world. Or you look at China. Similarly, China has zero interest in data privacy. They have helped many governments create smart cities and datafy their systems in order to better provide services, but also to track and surveil their citizens. So this is at the core of innovation, and the ideas for alternatives and the resistance are going to come from the bottom up. But I would disagree with one thing that Robin said. We have never had a breakup of this kind in recent history. There was the Microsoft case, two decades ago, in a very different era of the internet. And even then, it was not a very ambitious breakup: the administration that continued the case ended up watering down the remedies. So I would propose that this is actually a unique and critical opportunity to break up the companies and to think about how we should govern them as critical intermediaries, as gatekeepers, as common carriers akin to public utilities.

And we actually have a whole toolbox of laws and regulations, throughout at least the democratic world, that we have used to govern and regulate these types of intermediaries, whatever we call them, platforms or intermediaries: the things that other businesses and people depend on to reach each other or to do business. Think about telecoms; think about railroads. I think you had my colleague, Barry Lynn, on before to talk about some of these. Think about Canada, for example, and the fact that one of the most important social media companies in that country, with lots of users on Facebook and Instagram, was allowed to just decide not to carry news, to censor and ban one type of user, news publishers, in order not to have to be covered by a new law, the Online News Act. And that's very odd, because, traditionally, we don't allow these critical gatekeepers in our information system to just decide not to carry really important information.

For example, in broadcast we have must-carry provisions: you must carry local news. In Canada, there are content requirements for French-language content, for local content, for Canadian-produced content. Why aren't we applying these to other information platforms, and why are we allowing them to discriminate against one category of users? It's as if a business said, "You know what? I don't want to build a ramp because it costs money, and I don't want to have to comply with the Americans with Disabilities Act, so I'm just not going to allow people in wheelchairs." It's very odd that we have not resurrected these frameworks, and especially that other countries that also have common carriage, must-carry, gatekeeper types of regulatory frameworks have not applied them to these types of platforms.

Justin Hendrix:

I want to step back for a minute, maybe just switch gears a little bit, because one of the things this report does is that it doesn't just criticize the commercial tech ecosystem, and it doesn't just think about the interaction between government and the tech platforms, et cetera. It is also very critical of the research enterprise itself and of all the effort at gathering science: the types of assumptions that researchers make, whether researchers seem to fit into some of those narratives, Robin, about datafication or even about innovation, or make normative assumptions about the technology in their work. And then, of course, there's the general criticism, or I suppose statement of reality, that a lot of this work comes from the, quote-unquote, Global North, and that concerns about the global majority are drastically underrepresented. But I don't know: sitting back and thinking about this report now, if we gave you the proverbial magic wand and you could go out and fix the global research agenda and infrastructure for covering these issues, what would you do? Where would you start? What would be the things that would first come to mind?

Robin Mansell:

I suppose I'd start with education. One of the big problems is the training that is predominant in the sciences and, to some extent, in a lot of the social sciences, which says you need to be able to measure effects, you need to be able to count things, and if you can't count them, they're not meaningful. Translate that into the world of the harms and effects of mis- and disinformation and propaganda in the information space, and it means you need to look for how technology causes those changes. And that is the paradigm which is best funded. It produces a lot of interesting insights about how different configurations of social media, online content, moderation, et cetera, impact people, young and old, and some of the harms from an ethnic or racial point of view, et cetera.

But what it doesn't do is answer any why questions. It doesn't tell you why this is happening, how people are resisting it, how people are thinking about it. And that takes qualitative research, not just quantitative research. If I had a magic wand, I would change the education system, which predominantly teaches quantitative methods and privileges them because we have so many tools and so much data that we can quantify everything that moves. What gets neglected and underfunded is the more qualitative, experiential side: how people think about the world around them, how they interact with information, et cetera, and why the patterns evolve as they do, why populism and polarization are such a big deal. And that brings in politics, and it also brings in the economics that Courtney was talking about a minute ago, the reasons why the companies are developing in the way they are.

Yes, I would recommend a much stronger emphasis on multi-method approaches. I would recommend not always starting with information and content, but asking why this information is being produced and in whose interest, whether political actors, economic actors, or others. It is capitalism which is driving the whole system at a very high level, but beneath that level there are a whole lot of structural factors which mean that we have the world we do have, and you cannot get at them, at least in my opinion, by simply looking at complicated causal models and trying to quantify the impacts of media or the impacts of disinformation, for example.

Courtney Radsch:

I am 100% with Robin on this. I think one of the reasons we face this challenge in understanding the situation we're in is that there is an overemphasis on quantitative research at the expense of qualitative research. It is also easier and less expensive to get information when a platform has an API, as Twitter did, so you see a lot of studies on Twitter data and many fewer on other types of platforms, YouTube, for example, because that data analysis is much harder and more expensive. And so that skews what we think we know. Even at its most popular, Twitter was never at the scale of any of Google's or Meta's properties, and yet we know a lot about different types of content flows on it because it was easy to know. And that can, I think, have the perverse effect of reinforcing the power of that platform as an influential platform.

I would also say another thing we need to do more of is non-platform-based research on other information providers. I'm very happy that this report really took on the news industry as a critical part of the information ecosystem, what I call a keystone species, because it is an important way people get their information. And so the economics of that, as well as the platformization of journalism, has an important impact. I would have loved to do more on entertainment media. Most people don't actually consume journalism, as we know; news consumption is way down. There are so many different ways that people get information. In my conceptualization of information ecosystems, you have the media system [inaudible 00:40:15] within that, you have entertainment media, keystone species which are nonfiction media, and then user-generated content. All of those shape how the system works, how the algorithms work, how politicians interact with different media, how people shape their beliefs. It's a very complex and intertwined system.

So I'd love to see more research on things that aren't just online. And one of the challenges to doing this is that qualitative research is very expensive. For my doctoral research, I went to Egypt and looked at the micropolitics of practices and discourse and what that revealed about how the shift in the primary modes of communication, from state-owned television to blogging and social media, would favor certain species of thought or organization or authority or truth over others, so that we could better understand how ICTs are implicated in processes of political change. It is very expensive to have somebody go be embedded, and we don't fund that research. Not to mention if you're trying to do that research at a university in a country or city that doesn't have a lot of resources or the funding to plug into those broader research networks. And that's another thing. We haven't done this so much in this report, but I think there's a lot of interest in looking at who funds the research: how is the knowledge that is being created funded, and by whom?

And we know, for example, that a lot of the quantitative research on platforms was either funded by or done in institutions or organizations that receive money from tech platforms, or had to be done in collaboration with tech platforms in order to get access to data, which raises all sorts of issues around how we know what we know. That is not to disregard the excellent work that many colleagues in this field have done despite the constraints, but it means it is very important for us to ensure that we have diverse ways of funding research, so that we can ask questions, for example, about the political economy of the information ecosystem, about business models and monopoly issues, questions that are not going to get funded by corporate tech, or by philanthropies that are also getting money from corporate tech. So there are a lot more dynamics there, and I'd love to see more research into that aspect.

Justin Hendrix:

You've drawn this line under the research, and you've set out a set of recommendations and ideas about what should happen going forward. This is an excellent resource that I intend to use in the future syllabus of the class I teach, called Tech, Media and Democracy. So I'm certain people will learn from this effort. But what happens next? You mentioned you want to do more education, but is this a project where, like the IPCC, we can imagine you coming back and giving us another report at this scale at some interval? This is a huge undertaking.

Robin Mansell:

I can't speak for the observatory, although I do know that they intend to take a more focused approach, at least in the near term, and tackle specific issues. One of the big questions we were asked all the time while doing this report was, "So what? Does it tell us really tangibly what policymakers can do?" And we've been talking about some of those things today. But in a report like this, it's in the nature of the research beast that there will always be people saying on the one hand and on the other hand, not least because they're starting from different places and making different assumptions. So you're never going to find, I think, the research community coming forward and saying, "We have empirical evidence that once and for all establishes X." We have the same problem with climate change. And there's always going to be somebody who can invent a project that says it's not as serious a problem as we think.

But on the other hand, we all know that there is a major problem in this space, and therefore doing smaller, more focused studies can help. And doing a big one in two or three more years would, I think, be really helpful in seeing what has changed during that time, and a lot will have changed. So I think there are opportunities to do both: benchmarking on the big scale, but also more focused studies. I also think that the research resource that's been mentioned a couple of times is useful for people in many walks of life, whether they're students or faculty who are saying, "I need a baseline to get started with, and I'm going to move on from here." And I think that can be helpful. I think our report might've been the last one that was done without the aid of AI.

Because if you imagine that we had sourced our information using AI, the danger is that it would've been horribly skewed towards the material that is captured in the Web of Science and other sources like that, which, again, would've left out much of the global majority world, for its own reasons. Going forward, I think one of the biggest lessons is to continuously make the point that there is no such thing as value-free research. People start with a set of working assumptions and values. They might be about human rights, justice, or equality, but they start with that as a rationale for why they want to know about this space of information and communication. And then if you take [inaudible 00:46:39], which is socio-technical, combining the social and the technical, as well as the political economy and the structural issues, and put that together, then you can begin to make progress. And I think that is what the research community needs to do and can do.

Courtney Radsch:

Yeah, and I would also add to that: if I recall correctly, this is really like the IPCC's first report, right? The Intergovernmental Panel on Climate Change was created to provide policymakers and those working on these issues with regular scientific assessments of the problem and its implications, its future risks, and what types of policies are needed. I think that is similarly the goal of this report. Having a massive global report is important for establishing what is known, what is not known, and where the holes are. I would love to see this used as a guide by universities, researchers, funders, philanthropies, and anyone interested in this field, to figure out where we need more research and information, and to think about how they can be more strategic in contributing to this body of knowledge. And I think the next step, as Robin said, is to do deeper dives and then connect that more explicitly to policy, because we're not just doing this for the fun of it, although I will say it is sometimes fun to do these things.

I did a report for UNESCO about a decade ago on world trends in media development and freedom of expression, which was a massive undertaking. And you realize, to establish a trend, you need at least three points, right? You need three data sources. We were similarly concerned about making sure it was not dominated by Global North or Western or certain other perspectives, and so we tried to double-source and draw on a variety of sources. That's very challenging. Since then, UNESCO has done a series of deep-dive reports on specific elements. So I think this is a foundational report, and maybe in five years or a decade we need to do another big, massive review. But in the meantime, let's translate this into action. Let's translate this into policy. Let's do something with it, because I don't think we have 10 years to sit around and wait to see what happens. We have to shape the future that we want.

Justin Hendrix:

Well, I'm very grateful to you both for taking the time to speak to me about this report. I will just note as well that there are so many Tech Policy Press contributors who were part of this that I couldn't name them all. But everybody from folks like Theo Lenoire to Emily Tucker, Luca Belli... I'll name a few more from the list: Sonja Solomun, Nicole Turner Lee. These are all folks who have written for Tech Policy Press, and I'm sure I'm missing many. Going down the list of extraordinary citations and imagining the extraordinary amount of work and effort that's gone into this by so many people across the globe over so long, I definitely recommend that folks check out this document, which you can find if you Google "Information Ecosystems and Troubled Democracy: A Global Synthesis of the State of Knowledge on News Media, AI, and Data Governance," from the Observatory on Information and Democracy. Robin and Courtney, thank you so much.

Courtney Radsch:

Thank you, Justin.

Robin Mansell:

Thank you.

Authors

Justin Hendrix
Justin Hendrix is CEO and Editor of Tech Policy Press, a nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & Inno...
