Graphic Content, Trauma and Meaning: A Conversation with Alexa Koenig and Andrea Lampros

Justin Hendrix / Sep 24, 2023

Audio of this conversation is available via your favorite podcast service.

The ubiquity of cameras in our phones and our environment, coupled with massive social media networks that can share images and video in an instant, means we see graphic and disturbing images with great frequency. How are people processing such material? And how is it different for people working in newsrooms, social media companies, and human rights and social justice organizations? What protections might be put in place to guard people against vicarious trauma and other harms, and what is the ultimate benefit of doing this work?

In their new book, Graphic: Trauma and Meaning in Our Online Lives, University of California Berkeley scholars Alexa Koenig and Andrea Lampros set out to answer those questions, drawing from their work at the Berkeley Human Rights Center Investigations Lab.

What follows is a lightly edited transcript of the discussion.

Alexa Koenig:

So I'm Alexa Koenig. I am co-faculty director of the Human Rights Center at UC Berkeley School of Law, and also co-founder, with Andrea Lampros, of the Investigations Lab at the Human Rights Center.

Andrea Lampros:

And I'm Andrea Lampros. I am the communications director at the Berkeley School of Education at UC Berkeley. I co-founded the Investigations Lab with Alexa and worked on the resiliency piece of the lab when I was at the Human Rights Center.

Justin Hendrix:

You are the authors of this new book, Graphic: Trauma and Meaning in Our Online Lives. This book is a sort of mixture of primary interviews and also secondary research and a kind of synthesis of ideas. How long have you been working on it?

Alexa Koenig:

Really, we've been working on it in a conceptual way since the launch of our lab in September 2016, almost immediately, as we were beginning to partner with Amnesty International and other organizations to train students from across the Berkeley campus in how to find information on social media that is relevant to human rights issues, and how to verify the accuracy of that information. Some of our partners who'd been working as journalists and human rights researchers pointed out to us that a big side effect of engaging so intimately with social media content is its psychosocial ramifications.

We also had a member of our advisory board who'd been a longtime war crimes correspondent, and she came to us once she saw what the lab was doing. She said, “Alexa and Andrea, what these kids can do is incredible in terms of finding this information online, but how the heck are you going to keep them safe?”

And I think up until then, we'd really thought about safety in the human rights and social justice space as physical safety, and maybe underappreciated the extent to which we also need to be thinking about psychological safety.

Andrea Lampros:

And that was there from day one of the Investigations Lab, really. When we launched the lab, we had partnered with Amnesty International and Sam Dubberley, who'd done a lot of this work around the potential for secondary trauma related to viewing graphic content. So really from the beginning, as students in the very first projects were looking at video after video of what was happening in Syria, for example, we did our first trainings alongside that and looked at what some of the research said about how to build awareness around this. It's evolved quite a bit from that time, but I would say this work has developed along with the work we've done with the Investigations Lab, through its many iterations until now.

Justin Hendrix:

This book starts with four very graphic images, and I'll just read them. “Bodies wrapped in black plastic bags are thrown into a makeshift mass grave by stunned and stone-faced men, landing with thuds, each upon another.”

“A man enters a hospital carrying a baby in a blood-stained blanket, one tiny limp hand poking out; a distraught mother follows just behind.”

“A woman plays her piano in a bombed-out home, a note of elegance amid the destruction around her.”

“A person riding their bike on a lonely suburban street turns a corner, dismounts, and is suddenly felled as armored vehicles fire multiple rounds.”

These are all images from the invasion of Ukraine by Russia just in the last couple of years. And I'm sure some folks listening to this hear those words and can imagine them.

Many of those images are sort of seared into our minds. I guess I want to ask you a big-picture question just to start. When we gave everyone a camera, put that device in everyone's pocket, and connected them almost ubiquitously to the internet, how did that change the way that we as humans interact with this type of trauma that's happening all around us?

Alexa Koenig:

I think it's really impacted us in a number of ways. One aspect of it, of course, is what some people have called the democratizing aspect of having cameras in everybody's pockets.

We're getting perspectives on war and conflict in a diversity of ways that we've never seen before. When you have dozens of videos coming from a site of atrocity, it's a very different experience as a viewer than when you had one very carefully curated picture that was selected, say, to put in an article in The New York Times or The Washington Post, et cetera.

You know, I also think that when you think about the information that's shared on social media, you've got individuals who are trying to get the international community to respond. And so they're often sharing the most emotional clips that they can, the most graphic content they can possibly capture, to try to drive that attention and compete with the many videos and photographs that are already out there.

That's very different from journalistic packaging of imagery, where you have journalists who've been carefully trained in ethics and are trying to minimize the potential harm and get people to engage with the content, as opposed to simply respond or react. So I think those are two things that immediately come to mind.

Andrea Lampros:

It's important that we're able to know what's happening in places in the world where there are no journalists or reporters on the ground. The changes that we've seen, in terms of technology and smartphones in people's hands and connection to the Internet, mean that information, videos, and photographs can get out from places where news wasn't coming, where human rights violations could be committed in the dark, hidden from the world. So this is a moment in time when that is important and positive: we can see what's happening. And that's why this book is an attempt to look at this different moment, not just for journalists and for investigators and for lawyers and for others who are engaged with it on a daily basis, but for everybody.

How do we think about it, and how do we start to think about keeping ourselves safe so that we can continue to stay engaged? That's one of the reasons we wrote this book. We didn't write it to say, oh, this is negative or positive; this is really just where we are, and how do we grapple with it in ways that are healthy and sustainable, so that people don't either become completely numb or retreat from the world?

Justin Hendrix:

Well, this is a different moment. You do take effort to situate it against a long history, a literature of documenting and witnessing trauma. You mention images, for instance, that have been pivotal in understanding everything from the Civil War to World War II-era Warsaw. You pause on the 1955 image of Mamie Till above her son Emmett in Chicago, images from 9/11, from Vietnam, from Abu Ghraib, on and on. What does that history and literature tell us? How does that help us understand this moment and how we ought to think about what to do with the flood of images on social media?

Alexa Koenig:

I think it tells us a few different things.

One, just how powerful imagery can be in igniting conversation. Certainly with Emmett Till's murder, the photo of his mother looking on, having refused to close his casket and asked journalists to take pictures of what had happened to her son, has largely been credited as igniting, or helping to ignite, the Civil Rights Movement. So really grappling with the power that imagery has, which is sometimes a very different kind of power than written text, is important for us to think about.

The second thing that really jumped out at me as we were writing that segment of the book was how much responses to these really iconic images can differ between people. One moment that I really remember from when we were starting the lab is that we were doing a lot of research into crimes potentially being perpetrated in Syria, and we had a young man in the lab who came up to us after about a month of work on that project and said, “I just can't do this anymore. It is too emotionally upsetting. My family has largely supported the regime. I just, I need to be on a different project.”

And so we instituted a no-questions-asked policy: we'll change you to a different project if you find you can't do it.

That contrasted with another young woman who came up to us around the same time. She was working on a project related to Iran, and she said, “My whole life, I've heard what my family has suffered in Iran, and I was helpless to do anything about it. Actually engaging with that material, doing something with it, and letting the world know what's happening has been the most meaningful thing I've ever worked on.” And so our reactions can be so different as individuals.

And it's so based on our identity, our experiences, our own histories, that I think that awareness and not taking for granted that your response is the response of other people is a really important thing to keep in mind.

Justin Hendrix:

I think one of the extraordinary things to think about is just the expansion of the total number, the total universe, of people who, either professionally or for important personal reasons, are investing themselves in looking at potentially traumatic imagery on the Internet. That's partly about the volume of that material, but partly also about what's happening in the world. I think of some of the people I've talked to just in the last few years who've worked on projects like the Disinformation Defense League, a group of organizations working on combating mis- and disinformation targeting especially communities of color in the United States.

I think about even my own experience after January 6th. I looked at hundreds of hours of social media video from the attack on the Capitol, and I think I got a hint of some of the types of experiences that you describe in this book. It was important work to do; some of the clips and things that we were able to find ended up referenced in the second impeachment trial of Donald Trump and later by the January 6th committee. But that physical and psychological impact of being hunched over that material for so long, I feel that a little myself. What did writing this book tell you about the biological and psychological impacts of perpetually looking into these various abysses?

Andrea Lampros:

I think that one thing that's super important in what we've really learned is understanding that you can have some of the same physical and psychological reactions to viewing something secondhand as you can to experiencing it firsthand.

Now, of course, we're not trying to compare that to firsthand trauma and the suffering of what happens to somebody directly. But what we do know is that this stuff does affect us. Through time, journalists and human rights workers have taken more of a tough-it-out attitude: my suffering in looking at and engaging with something so difficult doesn't compare at all to what someone experiences firsthand, so I don't need to take care of myself. I can just have a drink at the bar later, or I can just struggle through. And as you just said, the sheer quantity of this now, and how ubiquitous it is in our work and in our lives, does raise the question of how we are managing it and whether we can do it differently. Do we have to have this same tough-it-out, more macho mindset around it? Or can we think about how to have some balance so that we can continue to do this for the long haul?

Because that's what it's gonna take. It's not going away. We're not gonna go back in time. And so we need to think about, how do we do this? How do we manage it for ourselves so that we can keep doing it and keep ourselves safe along the way.

Alexa Koenig:

We've never lived through this kind of phenomenon in human history before, where you're just bombarded with this much content from all over the world. Think about how algorithms are feeding us information that we're not even looking for: you may have gone on TikTok for a few minutes of enjoyment or entertainment, and suddenly you're being fed potentially very graphic material.

The same thing happens on YouTube, Twitter, et cetera. So as we were working in these communities of practice and really seeing what people were struggling with, we realized there was a lack of awareness in the general population, too, about how to actually minimize some of the harms, which I'm sure we'll talk about in a little bit.

Andrea Lampros:

And I just wanted to add one more thing about what people are looking at: it's not only the graphic nature of things, it can also be the hateful and racist and just plain mean commentary that we're seeing in that flood as well.

I just wanted to share one story. In one project we had in the lab, a few years ago, students were pulling hate speech against immigrants off the Internet. The kinds of things the students were coming across were just absolutely heinous.

After a while, even though the content wasn't graphic in nature, they were having a visceral reaction to it. It was emotionally really taxing. So I think that's something to keep in mind as well: it's not only the imagery, it can be the words as well.

One thing we did in that project is we got all the students together in a circle to talk about it; we often did that. They were asking: how can we find some words to counter the words that we're seeing here? What do we know to be true about immigrants? Who are they? They're our family members, our fathers, our aunts and uncles. They're people we love. And I think holding onto that, rather than the hateful words we were documenting, mattered. So that's just one example of some of the ways we've tried to counter it.

Alexa Koenig:

I also think that one thing that jumps out to me is that what we're seeing in the social media space is very similar to what they were seeing with drone operators, where drones were being used for extrajudicial or targeted killings back around 2010, 2011. There was a feeling that drone operators were going to be much, quote unquote, safer than people who would traditionally go on bombing runs, because instead of actually being in an area of combat, they would be thousands of miles away, say, in New Mexico or Arizona, basically taking individuals out remotely.

But what was underappreciated at the time, and what the social science has shown, is that they were actually sometimes having equal or even more extreme psychological reactions to their jobs than the people who were part of military units in conflict areas. And one of the big reasons, or at least one of the theories around this, is that they didn't have the same kind of social support.

They didn't have other people who really understood what they were going through. At the end of the day, instead of having, say, a drink or hanging out with other people who'd had similar experiences, they were maybe going home to a partner or a child whose very different day was completely unrelated to war.

And there's something about that community, or that isolation, that can be either protective or harmful. We had a woman come to visit us who was very high up in the human rights movement. She saw a sampling of what our students had been doing on our investigations, and she said, “This is incredible. And you're so lucky, because whereas the people who report to me go boots-on-the-ground and are in danger every day, your teams are relatively safe.” I think that was a really big eye-opener: that's sort of the default assumption. For a lot of people who are first starting out using social media content for human rights or war crimes purposes, you quickly realize that it's just a different kind of risk and a different kind of safety.

Justin Hendrix:

Let's talk a little bit about those protective forces that you've identified because you've got several. You've already mentioned the role of identity as potentially one shield against certain types of material in certain contexts. But what are these other protective forces? How would you characterize them?

Andrea Lampros:

One of those protective forces is community. Again, we've seen this a lot with the Investigations Lab: this community of practice of students going through this together, collective viewing, collective grieving, really. We knew the power of that anecdotally, because on a basic human level, being able to turn to somebody and tell them how you feel, or to know that they too have seen that same image, matters.

Maybe they go home to a housemate who has no idea what they do in that Investigations Lab and the kinds of hard things they're seeing, but they have that community with the people they're working with. That we knew anecdotally. But then we looked into it more from a research perspective, looking at the positive psychology of Martin Seligman and what he has to say about the power of resilience, and his well-being theory, which really has to do with community and engagement and connection and meaning, all of those things that connect us together and make it possible for us to get through difficult situations. So I would say a main force for protection is our sense of community, whether in the work that we're doing or even as individuals in the world viewing this content.

Alexa Koenig:

And I would add another one: analysis. Actually engaging with the material can be very protective. We interviewed a number of activists, and we also interviewed professional investigators who'd worked on some of the most notorious incidents of my lifetime and before, from the killing of John F. Kennedy to the kidnapping of JonBenét Ramsey, et cetera. What we heard repeatedly was that not just passively looking at these images, but actively engaging with them and using your brain, can itself be protective. I think it's because it takes it a little bit out of the emotional realm and into the realm of logic, where we can actually think about what we can do with this imagery or this video to actually have impact.

That's in direct contrast to the notion of doomscrolling, just sort of passively flipping through image after image and letting it wear on you and accumulate over time. The second thing I'd say, which is somewhat related, is something we've heard repeatedly from our students, and I think it's a phenomenon across the board: a sense of guilt for having a life that's maybe a little less hard than, say, people on the ground in Ukraine or Bangladesh or Myanmar, et cetera.

We heard about one woman, here in California, who was brushing her teeth after she had just put her kids to bed, and she said, “I just feel so guilty. I had just been looking at what was happening in Ukraine and I don't know what to do.” And one conversation Andrea and I have begun to have is: is there a way to switch from guilt to gratitude?

So instead of this notion of guilt for having something when others don't, having gratitude for what we do have, and using that to empower us to think about what we can actively be doing to try to use the privileges that we have to support people who may be less fortunate.

Justin Hendrix:

I'll just point out that this book makes reference to a lot of social science. On this subject, you cite a study by Sarah Knuckey, Margaret Satterthwaite, and Adam Brown: if it has purpose, it's less distressing. This is the idea that if you can see an end, an answer, or some kind of positive step that can be taken in the work of looking at traumatic imagery or material, that somehow lessens the emotional toll.

Andrea Lampros:

Their research was really a backdrop for us in this work. They've actually done two studies, that one and another around a culture of unwellness, and both have been super informative in thinking about how to grapple with this, and with that very question of what your purpose is in doing it. The meaning of the work is an essential protective force in it. And I just wanted to say something about that.

We had a lot of conversations for the book with content moderators and others who were sort of thrown into this, and it's an extreme example of how social media companies have treated people who are just looking at terrible content.

Often what they would say to us is: we were thrown into this, and we still wouldn't know what the outcome was; we didn't have agency in it. Yes, look at this, take this down, but we don't know, is anything going to happen?

There's actually a really great piece that was done at The Verge, and it includes interviews with these content moderators. One of them was just crying because he'd had a real reaction to the harm being done to animals on social media, and he just never knew: did something happen? Did anything come of that to protect anybody? So that's why it's been really important for us, with students, to make sure that they know the impact of their work. Maybe it's just in bearing witness, just in documenting, but there's a value in that, in being intentional and listening to someone's story and relating it and documenting it.

And so that may be the only outcome you see. You may not see justice at the International Criminal Court, but you might know that you contributed in that way and that has meaning in and of itself.

Alexa Koenig:

With regard to purpose, another body of literature that I think is really important comes from psychology.

There are two psychologists in Turkey who've tried to support people who have gone through mass atrocities, whether as the result of earthquakes or war. One of their big theoretical observations is that anxiety often stems from feeling like you lack control over what's happening, and that depression comes from a lack of hope that things are going to be better in the future.

So having a purpose in our exposure to some of this graphic or upsetting imagery can really help, I think, with both of those phenomena. First with anxiety, because if you do have a way to respond to what you've witnessed, that gives you some sense of control, even if it's a small one, over the world around you and what has happened and is happening.

And second, if you're able to do something with what you've witnessed, that may give you some small spark of hope that something can change; feeding that hope can be so protective.

Justin Hendrix:

I just want to pause on content moderators as a kind of maybe slightly separate class from those who are doing this work primarily for the purpose of documenting human rights abuses or other types of injustices.

Clearly, we've learned a lot about how content moderators operate and how things go wrong for them, and you spend quite a lot of time thinking about how to improve the lot of content moderators. Having done this, let's say you're speaking to a Silicon Valley executive who sits on top of one of the giant outsourcing contracts to some firm that employs thousands of content moderators on his or her behalf.

What would you tell them is most important to protect these individuals from the types of trauma that Casey Newton, in his reporting at The Verge, and academics before him, like Sarah Roberts, have referred to? What would you tell them they should do, and how would you imbue their work with meaning? It's different, isn't it?

I mean, they're working for a big company and the primary reason to take a lot of that stuff down a lot of the time is just because it sort of drives advertisers away.

Alexa Koenig:

Right. I have a few things, so many things, I'd love to add to that. First of all, I think the very model of working with contractors is something that we heard from content moderators was problematic.

So if you're going to have to use some kind of contracting model, are there ways to still make those people feel like they're part of a team and included in your company? I think it's when they're really divorced from it that it becomes deeply problematic. Again, this ties back to the notion of identity, whether you can have an impact in the world, and who your community is. A second piece is to listen to them. They often have really good insights about what they need to do this work in a sustainable way. One thing we heard that I was so impressed by is that so many of these moderators, or those who managed content moderators, came up with creative ways to help minimize some of the psychological impact, like making sure they weren't doing it for more than a set number of hours in a window before having a break, and making sure there were opportunities for therapy or collective conversation.

They also thought about how you might share content with someone else: maybe they take a certain kind of content that you know is particularly upsetting for you, but that they may have a very different relationship to. And then the thing we also heard was just to pay them fairly. There's a dignity that comes with feeling you are fairly compensated for your time and your labor.

Unfortunately, I think someone, probably Sarah Roberts among others, used the term “garbage collectors of the Internet” for this idea that we devalue this work when it is ultimately so critical to everyone's day-to-day experience. And, you know, I think they have found meaning.

Some of the people we talked with had left the field or were thinking of leaving it, but we also talked to one woman who was coming back into it, in part because they do recognize that they perform such a critical function for society in helping us to have a safer operating environment when we go online.

Justin Hendrix:

Let's talk a little bit about some of the contradictions in the motivations of social media companies. You spend a bit of time looking at the role they play in deciding, of the human rights atrocities and events that get uploaded to their sites, which ones are available to the human rights community and which ones get stamped out, perhaps by automated systems, before they ever see the light of day. There's a kind of tension here, right? Between keeping these platforms safe for people and doing the sort of work that you're doing there at Berkeley.

Alexa Koenig:

Absolutely. I mean, from an investigation standpoint, we'd often like more access to the very gruesome information that lets us know what kinds of violence are being perpetrated around the world.

I think the companies are right to remove that content for violating their terms of service, and to set parameters around what can be shared openly. However, they also have a lot of competing interests that come into play in what they keep up and what they take down.

Everything from financial interests, what will drive eyeballs to their platforms, which sometimes incentivizes keeping dangerous information up, to commitments to freedom of expression and access to information, which also cut in favor of keeping things up. Privacy considerations, though, weigh in favor of taking things down.

There also might be IP considerations. So one thing I've grown more appreciative of over the past decade is just the range of factors that go into what is shared, who it gets shared with, and when it's removed. Now, I would love to see the companies take a different calibration in terms of what they leave up and take down.

For example, I know some companies have been overly deferential, in my opinion, to political speech, but we saw how that went really wrong in Myanmar, where the military, who were circulating hate speech, were able to keep their posts up, while the activists and others who were being harmed had their content removed.

So when you're thinking about the human rights and dignity of people, remember that the human rights movement was set up to limit the abuses of states against everyday individuals. When you prioritize the content of politicians, when those politicians may not have the best of intentions, you're putting a thumb on the side of a scale that's already very heavily weighted against the everyday person. And I think part of what we would argue for would hopefully help bring that a little better into balance.

Justin Hendrix:

So let me ask you about the role of automation. You spend a bit of time in the book focused on that. A lot of folks, of course, would look at this problem and say, well, aren't we near to having AI be able to do this work, so we don't have to have the humans? You dismiss that as, you know, science fiction.

But you do allow that automation is perhaps one answer in some contexts.

Alexa Koenig:

I would love to see automation used less for pushing content to people to try to manipulate them into making purchases or some other behavior that they weren't necessarily going onto platforms for in the first place, and more to create a pro-social environment in these online spaces, as one person argued in our interviews.

One clear example of how automation could be better deployed: if we're going to keep this content up, why not have AI detect violence? We know we can train machines to do that, and to automatically blur, or otherwise signal or flag, that certain content is potentially problematic.

I think there are a lot of tools that we could put into place. One challenge we're facing right now in an investigation is that there are over 300,000 pieces of content that have been collected that are potentially relevant. We just can't go through that many videos and photographs.

It's not human scale, so we're going to need to use machines as tools to help us identify the signal in all that noise and make the tasks of investigators and others more manageable; machines are indispensable to that. So can we start using them to channel the information needed by investigators to the investigators, steer the stuff that's harmful away from the everyday public, and give us more control over our operating environment, so we can choose more effectively whether to look at something, whether to take off the blurring, or whether to disregard it?

Andrea Lampros:

It takes a long time to write a book, and things move fast with technology. As we look at where we are today versus when we started writing, things have, of course, exponentially sped up. The big takeaway for us, really to underscore what Alexa was saying, is to use this capacity for good.

But that means we have to challenge the profit motive. We have to think about social media and the digital environment as our public infrastructure. How do we think about strengthening it for the public good rather than using it to make money for some? It's critical to the very foundation of our democracy, and democracies around the world, that we think about that and realize that we are in, as the journalist Maria Ressa said, the last two minutes of this with technology. This is crunch time, given AI, given disinformation, and given everything that's happening there.

If we don't right now start to challenge what tech companies are doing on this global scale, then we're really in trouble as a society.

Justin Hendrix:

In that context, is there a message in this book for policymakers, for lawmakers? Is there something you hope they would take from your work?

Alexa Koenig:

Just that a better operating environment for all of us in these social media spaces is possible; what's lacking is the willingness. We spoke with, for example, Emiliana Simon-Thomas of the Greater Good Science Center.

If you haven't checked out some of their research, I highly recommend it. In talking with her about the conversations she's had with these companies, it's clear that small tweaks to design can radically change people's experiences online, in ways that aren't going to decimate their profits, but may make a few fewer people into billionaires.

At the same time, we want these social media companies, if they can survive, to survive. They've proven very valuable to a lot of people's lives in a lot of different ways. But some low-hanging-fruit changes, I think, are possible, and they just need to be speaking to the right people, who've been thinking about this for a very long time.

Justin Hendrix:

I suppose the last question I have is about the collection of this material for the purpose of human rights investigations. Clearly we've got the experience of Syria, Myanmar, other atrocities that have taken place in the world. And I don't know exactly where it stands at the moment, but I would suspect that the volume of collection in Ukraine must be on a whole other order of magnitude compared to the type of work that's happened in the past.

Is it going to make a difference in outcomes? Will some of these atrocities ultimately result in the people who commanded them being held to account? Are you starting to see evidence that that's the case, or at least evidence that it's reached a threshold where all of this effort, and the trauma that you're describing, is a worthwhile investment?

Alexa Koenig:

I'm optimistic. I think we're already beginning to see cases bear fruit, around atrocities in Syria, for example, in places like Germany and other countries that have been willing to prosecute people who end up within their borders. An important thing to remember is just that the arc of justice truly is long.

It took 30 years for trials to open for the abuses perpetrated by the Khmer Rouge in Cambodia. You often need an entire generation to pass, because initially the people who have hurt so many are still in positions of power. They're very well protected by people in other countries, let alone in their own.

So it sometimes really takes time for their power to diffuse enough that they can actually be apprehended and brought to justice in a court of law. I think the other thing that happens generationally is that when people have just gone through a mass trauma, they sometimes, and understandably, are more focused on their day-to-day needs: rebuilding schools if they're past the moment of conflict, figuring out how to put food on the table for their families, and beginning to normalize and stabilize their lives again.

Legal justice maybe isn't at the fore of what they're thinking about. So sometimes it's the children of the people who directly experienced the conflict who, when they discover what their families have suffered, take up the cause and call for prosecutions; often, by then, the time is right. I do think that in the context of Ukraine, we've gone from one or two platforms being relevant to a diversity of platforms, which makes the data collection that much more difficult.

But at the same time, it allows for greater triangulation and corroboration across people's voices and experiences, in ways that I think will prove to be quite powerful.

Andrea Lampros:

I'm optimistic as well. I think that it is such an important moment that we now have so much information from around the world about what's happening.

It just speaks to the importance of having more people corroborating that information, verifying it, and making sense of it, because the volume is so great. For it to be usable by journalists and war crimes investigators and others, it really requires a certain number of people, and a certain amount of skill and attention, to make sense of it so that we can really agree on facts, because that's going to be essential to any kind of justice in our world. We have a great opportunity right now, but it really does require a lot of people putting attention to making sense of the information we're seeing, translating it, and making it more usable so that it can be a factor in promoting justice in the world.

Justin Hendrix:

It does strike me that in decades ahead, we're likely to see a lot more conflict.

And of course, a lot of that will be collected on the internet. And so this set of practices that you were describing here, and the approaches to doing it humanely and hopefully preserving folks' humanity in the process, is a very useful set of lessons for everyone. The book is called Graphic: Trauma and Meaning in Our Online Lives, from Cambridge University Press, just out September 14th. Alexa, Andrea, thank you both for joining me.

Alexa Koenig:

Thank you, Justin.

Andrea Lampros:

Thank you for having us.

Authors

Justin Hendrix
Justin Hendrix is CEO and Editor of Tech Policy Press, a new nonprofit media venture concerned with the intersection of technology and democracy. Previously, he was Executive Director of NYC Media Lab. He spent over a decade at The Economist in roles including Vice President, Business Development & ...
