Looking Ahead to 2024: Mitigating Election Mis- and Disinformation
Anya Schiffrin / Apr 27, 2023
Anya Schiffrin is the director of the media and technology specialization at Columbia University’s School of International and Public Affairs.
On Monday, April 17th, I moderated a discussion on efforts to mitigate election mis- and disinformation in the run-up to 2024, a year that will see key elections in dozens of countries, including the US, Mexico, India, Indonesia, South Korea and Ukraine. Expert panelists included:
- Laura Zommer, the General Director of Chequeado and Cofounder of Factchequeado;
- Sarah Lister, Head of Governance in the Bureau of Policy and Programme Support, United Nations Development Programme; and
- Laleh Ispahani, Co-Director of the Open Society Foundations' U.S. Program (and Executive Director beginning in July).
What follows are highlights from the panel discussion.
In 2024 there will be elections all over the world. What are you seeing?
I study election mis- and disinformation targeting Latino communities. There are some narratives that are in English and then translated to Spanish, but we also see some intended to create fear or anger in the Latino communities and affect which party they vote for. One example is the narrative related to inflation. An inflation rate of 9% is a lot for many Americans, but for people coming from Venezuela, who remember an inflation rate of 500% a year, talking about inflation makes them feel much more uncomfortable and concerned about their present and the future.
Another narrative that is targeted and micro-targeted to Latino groups (through videos, text, graphics and memes) presents the idea that Biden is not a communist or a socialist, but a “controlist.” The implication is that the state is not trying to protect or support people but will create problems with the IRS or taxes, for example. We're also seeing a lot of misleading or false narratives and some scams related to immigration. Where there are gaps in information, there is room for targeted mis- and disinformation.
What we do at Factchequeado, with more than 40 media partners in 17 states and Puerto Rico, is create trust and build community. We also use a chatbot on WhatsApp so that people can send us questions about content they’ve received. The chatbot helps us research the answers in our own archive. In some cases we prepare a piece, send it back to the person who contacted us, and ask that person to share it in the same group in which it was received.
How about the US elections?
2024 is a banner year for elections in major democracies – India, the European Union and the US, among others, will all hold elections that are important for the fate of democracy. Mis- and disinformation will loom large in all of them.
Hate actors organize and recruit online. Events like the January 6th insurrection came about in part because there is no policy or actor that can successfully de-platform most forms of hate. Nor is there any successful model or action for disrupting violent online organizing, or changing major platforms’ business models (so that these issues are less prevalent online). It’s a question of the broader information environment, because much of the problem, and the solution, lies in the interplay of media outlets and social media platform companies.
For example, false or misleading content originates offline, then gets amplified online, and this cycle can repeat indefinitely, generating ever more media coverage. Attention-driven business models incentivize outrage, ignore nuance and fail to inform the public. The death of high-quality local news outlets leaves many people in the U.S. without reliable and responsive sources of information. Of course, cable news has a huge role to play in all of this too.
In the US, there's no silver bullet. We don't have a GDPR – a general data protection regulation – like the EU countries do. Even when there's greater understanding of online dangers and hate, there's little public will to do anything about it. The large tech companies, which are all headquartered in the US and operate under US law, maintain enormous public policy and lobbying shops in Washington, DC. Fifteen years ago Google had just a few people there; it now has hundreds.
We can't get data privacy at the federal level, so we're looking at the state level. California actually has a consumer privacy rights law. Washington state just passed a narrowly targeted health data privacy bill, partly in response to the Supreme Court striking down reproductive freedoms in the Dobbs decision. States are beginning to at least ensure health data is private (likely so that, for example, providers of abortions aren’t attacked, or the families that drive people across state lines aren’t harmed).
Challenging the monopoly power of tech companies is difficult. They stifle innovation, they harm kids, and they’ve run much of the traditional news media out of business. If you limit their monopoly power, less harmful models would have a chance to exist, and perhaps to flourish, and decisions made by individual tech companies would each carry lower stakes. You can simultaneously build trust in credible sources, including in media outlets that are owned and operated in the public interest. You can experiment with new models of media ownership and organizing. We've helped facilitate the purchase of weeklies and radio stations that serve primarily Black and Latinx audiences. We have invested in local news outlets that fill news voids in communities that have lost journalism outlets.
Litigation and advocacy against bad actors are finally beginning to have an impact. The Dominion lawsuit against Fox News, and the lawsuit against the major conspiracy theorist Alex Jones, were very successful (even if Jones doesn't ultimately pay up). There have been one or two successful civil society boycotts, including the one in 2020 that targeted Facebook advertisers in an effort to force the platform to change its practices.
Sarah, what is the UN doing on this issue?
We have about 32 active electoral projects at the moment globally. I was having a look at the list earlier – from Liberia to Mali to the Solomon Islands to Uganda to Vanuatu, across all regions. The UN provides electoral support of many types and has done so for many years, from procuring ballot boxes to supporting voter registration and so on. We are seeing huge demand from the countries that we support for ways of dealing with this problem. We would very much take the same approach as Laleh – you have to take an ecosystemic approach to it. You have to look at supply and demand and different actors, and you need to bring multidisciplinary and multi-stakeholder coalitions together.
Globally we are seeing 'Dark PR' – the rise of new affordable tactics that almost anybody can buy from wherever they are to disrupt elections through information. This kind of manipulation for hire doesn't just affect the US or the UK; it's also prevalent in other countries. We’re also seeing the rise of gendered disinformation and hate speech. And, as micro-targeting becomes more sophisticated, this has an effect on women's political participation at all levels. It's not just the superstars that we've all heard of. Electoral officials who are just trying to do their job, and journalists who are not well known are all routinely subjected to this type of online violence. That’s serious in terms of democracy as well as the rights of women and gender equality.
The business models of the technology companies affect which contexts they prioritize, and this has been worsened by the effects of the latest layoffs. Senior people in institutions in the countries in which we work cannot even get the big technology companies to answer the phone.
I would love to know more about what’s working.
There are models with the potential to transform the spaces they're in – with better information, with more factual information, with information about civic engagement, democracy and elections. Wikipedia has become one of the most trusted sources for countering misinformation because it's one of the only places online where you become more moderate as a result of what you read. It has really worked to build democratic accountability. There are also lower-profit models such as Vermont's Front Porch Forum, which has a community moderation model and non-surveillant advertising.
So while today in the US we don’t have remedies via federal policy, we have some affirmative state policy, and the development of models like these that are bright spots.
We create evidence-based content and try to reach people in the US who don't currently have good quality content in Spanish in the places where they get information. But we are not necessarily going to change their voting behavior. We create channels and give people better tools to navigate the disinformation ecosystem so that they can make their own decisions – and, whether we like it or not, people in some cases decide to vote for liars.
Misinformation is always going to be with us, but you need a culture of accountability and an effective, robust and even adversarial media that'll push our leaders to do better.
The other limit we should acknowledge in the US – though it's often also a positive – is the First Amendment. That makes it very different from other countries; simply lying or spreading false information is generally not a crime here. That's a strong and important protection, but it can also make it harder to take aggressive action against misinformation. Ultimately we have to try.
The governance of societies matters too. Do you have a rule of law that is functioning? Do you have parliaments that are able to hold politicians to account? Whole democratic governance structures need to be brought to bear on this issue.
If there is a bright spot, it's that the conversation has matured and become a conversation about the information ecosystem and the different actors and parts that are needed, including the role of independent media. I'm not saying that we're winning, but I think we are getting better at putting this problem into an appropriate context.