Tech'ed Up

Profiles in Tech: Fighting Online Hate • Imran Ahmed

Niki Christoff

The CEO and Founder of the Center for Countering Digital Hate, Imran Ahmed, joins Niki in the studio to talk about what his recent win against Elon Musk means for the proliferation of hate speech online. He and Niki debate who has moral responsibility for online hate but both agree the current climate is untenable.

“CCDH advocates a market regulatory solution to a market regulatory failure.” -Imran Ahmed

[music plays]

Niki: I'm Niki Christoff and welcome to Tech’ed Up. Today I'm joined in the studio by Imran Ahmed, the founder and CEO of the Center for Countering Digital Hate. His non-profit brings attention to the proliferation of hate speech on social media, work that got under the skin of one Elon Musk. The billionaire megalomaniac sued CCDH, and they just beat him in court.

Imran, thank you for coming into the studio today. 

Imran: It's such a pleasure. 

Niki: I was really intrigued - anytime somebody pushes back on a bully like Elon Musk, I appreciate it!

So, I'm glad we've been introduced. 

Imran: Well, I'm glad I could be of service! Elon Musk is someone who has taken great pleasure of late in weaponizing the law courts to, y'know, to silence his critics and I think it's important when someone's bullying you to stand up to it.

I'm really proud that we could stand up to him because I think that's given courage to a lot of people who have been cowed by the sheer awesomeness of his, of his economic might. 

I'm also really, really proud of the way that my team worked on it because the truth is that it was a very expensive process. It cost us half a million dollars to defend that. 

Niki: And you're a small non-profit.

Imran: We're a small non-profit. We became bigger last year thanks to Elon Musk, is the truth.

Niki: [chuckles] He drew a lot of attention to the work you do. 

Imran: Yep, he drew a lot of attention and he identified us as being y'know, as being a threat. [Niki: Mm-hmm] And [pause] we are a threat to the impunity with which social media companies have operated in the past few years.

For the first time, I think that they are feeling real meaningful checks and balances on their activity. 

I'm really proud to have been a check and a balance on Mr. Musk. 

Niki: So, I am no Elon Musk fan. 

I do like one thing he's doing, which is subsidizing space travel. I'm for that. [Imran: Me too] And I think sometimes you do need some brashness to break through with new technologies, but his brashness around Twitter has been an absolute abject failure.

They've lost, I think, 75 of their top 100 advertisers. I'm on Twitter because reporters are on Twitter and I often need to see what they're doing and saying, but I very rarely tweet. And part of the reason is I sort of intuitively know that it's become just a cesspool of hateful, ugly speech. 

Let's get to the crux of the issue that upset Elon Musk about the work you guys are doing. 

Imran: The studies that we did that annoyed Mr. Musk were things like our study of the quantum of hate speech and the increase in that after he took over. Now my argument is that the social mores, the norms of attitude and behavior of a platform, are defined by a variety of different things: the rules on that platform; whether or not they're enforced, and therefore whether or not people feel those are meaningful rules; whether or not there are sanctions attached and how those sanctions are triggered, so that people see justice being served, that if people misbehave, sanctions are attached; but also, y'know, the signal sent by the owner of the platform.

Now, most social media platforms to date have said, “We want this to be a healthy and tolerant place for discussion and debate.” He came, came along and said, “Nuh-uh!” A bat signal goes up on the top of his building: “Welcome back, all the racists, all the misogynists, all the homophobes, all the disinformation spreaders, all the foreign disinformation actors. Come back in! Twitter's open for business.”

And he literally released tens of thousands of people, uh, who'd been previously banned by the old owners of Twitter and said, “Welcome back to the platform.” And we wanted to know, well, what does that mean? How does that change the way that people behave? Is there a marked difference in the behavior of people on that platform?

And what we found was that there was a massive increase in the use of hate speech.

For example, the most offensive term that you can use about African Americans: daily use of that term increased 202 percent in the week after he took over compared to the year before.

So people thought “This is now a free speech zone for using the n-word.”

Niki: Right. I would like to say something. 

So he took over, as you said, put up a bat signal. Also this very weird, flickering giant neon X, it looks like a porn shop sign, over the Twitter building [Imran: Yes] They took the little bird down, in San Francisco, which I actually loved. Put this X up, brought all these people onto Twitter, there's more of this speech coming up, but then we're also seeing this misrepresentation of the First Amendment.

So, this is going to be annoying, but I'm going to mansplain the First Amendment to our listeners, because it's something that people get wrong constantly and it drives me nuts.

You don't have a right to say anything you want wherever you want. You can't do it in the workplace, and platforms, private companies, can say, “This isn't allowed here. It's against our terms of service. We're not doing it.”

 The First Amendment protects you from the government censoring you. So, when Elon Musk is constantly talking about First Amendment rights, I'm like, I don't even think he totally understands it. 

Imran: Yeah, no, it's really, really clear he doesn't. Ironically, he does understand it when it comes to his own business.

So, if you were critical of him on the internal Slack, he'd fire you immediately. [Niki: mmh] So there are consequences for speech, which, y'know, which he feels is critical of him. And of course, we've also shown that because he sued us, and we defeated him on First Amendment grounds, among others, in court. 

The courts have now proven, and said definitively, that Mr. Musk is the kind of person who tries to take away other people's rights to free speech by using lawfare, SLAPP suits as they're called, to try and silence us. 

So, yeah, I don't think he gets the First Amendment at all. 

The First Amendment is about encouraging speech and saying the government should not stick its nose into what we say or think, so that we can have healthy discourse. But there are other rules and there are ways that we conduct speech in life. There are unwritten rules, there are, y'know, behavioral standards. 

Try and go on the street and start shouting, y'know, the sort of racial slurs that we were tracking and you'll find out that the First Amendment doesn't protect you [Niki: No!] from having someone come up to you and go, “What the hell do you think you're up to?”

Niki: Right, social norms! And one of the things I really like about the work that you're doing is you're essentially creating a market solution for a market problem. 

Imran: It's a market and regulatory solution to a market and regulatory problem. No, I agree. And like, it is, it's a market and regulatory failure. We've got two components to how we respond to it.

There's the regulatory part, and that's really about enabling the market part of our solution set, because we don't want government to be telling you, “Here's a list of what you can and can't say.” But what we do think the government should do is give us transparency of algorithms, of content enforcement policies, of advertising.

We think that there should be meaningful accountability, which means that we should have institutions, whether it is select committees, y'know, congressional committees, or a regulator.

And then where there is negligence, and where that does cause harm, and where that reaches the point of being tortious and justiciable, y'know, extreme negligence to the point where they could be held liable, I think that they should be able to be held liable.

And I think that's where, y'know, CCDH steps into an area which I know is controversial, Section 230. We have a stance on that which isn't going to please everyone, but I do think that what the government has done right now is not regulation, it's anti-regulation. They have sucked away our ability to hold these companies responsible when they truly are negligent in the way they behave.

And so, that's why I say that CCDH advocates a market regulatory solution to a market regulatory failure.

Niki: So, you mentioned Section 230. This is the very small but very powerful statute that protects platforms from liability for what's said on them. You and I are on different sides of this issue, which is actually okay.

I had a guest, Previn Warren, who's an attorney who's leading some multi-state lawsuits against a bunch of social media companies for harms to kids. We met in line at the Supreme Court waiting to go in. I was like, “Oh, are you here for the bride or the groom?” And it turned out we were on opposite sides of the case, but I wanted to hear what he had to say. Y'know what the legal arguments were. 

They're having a lot of success in the states in this area of, as you said, tortious harm, real-world harm. And how will this be adjusted? 

Can Section 230 be amended? I don't know. 

Should it? Again, we might be on different sides. But one thing I like about what you're doing is, even just by raising awareness of the hate and ugliness itself and the impact that's having on us as a society, people are kind of checking out of these platforms.

Twitter has lost 15 percent of its users.

Imran: And they're killing themselves. That's the irony of it. [Niki: That’s right] Of course, what they're doing is destroying their own long-term profitability, y'know, in search of short-term profitability, but also because they believe that if they ever show any degree of compassion, if they ever show any degree of feeling, guilt, or responsibility for the harm that's been created in our society, they will start the erosion of what they currently have: our inability to hold them liable. 

And I think that's so problematic. I mean, you saw Zuckerberg, y'know, you saw the sort of tension on his face as he's asked to apologize to the parents who've been harmed by his platform.

And I thought it was quite an important moment when he did apologize. Y'know, I used to work in politics, you used to work in politics. [Niki: I did too, yeah] So when I advised politicians, I always used to say, never apologize for anything. Never. [Niki: Right] Because the second that you apologize, you do two things.

First of all, you acknowledge that you were responsible, and second, you acknowledge that you did harm. Bad situation. It's really hard to get out of that. 

Niki: I'm a big fan of accepting accountability, but not apologizing. [chuckles]

Imran: Well, I mean, the thing is, like, y'know, I'm someone who really believes in a meaningful apology.

I apologize. I mean, I'm British. I say sorry when I, like, y'know, when I, when I breathe, um, “Ooh, sorry. Excuse me. [Niki: laughs] Pardon.” Native Alaskans have got, like, hundreds of words for snow. British people have hundreds of words for how guilty we feel just for existing. [Niki: laughs] 

I think it was an important moment because it started to reverse what I think has happened over the last 28 years, which is that legal invulnerability has translated itself into a belief that they should be morally invulnerable, that they cannot be held liable morally for the damage their platforms cause.

And my view is that's BS. There are so many examples from our work and from the experiences I've had where I've taken things to them that any decent person would have said, “We've got to do something. We've got to fix this.” And they've done their best to make sure that they don't.

Niki: It's hard, y'know, so obviously I worked in tech for a long time. We were talking about this right before we got started. 

I was at Google, and, y'know, copyright infringement has never been an issue. There's a clear law. They take it down. There's never been a problem with it. 

There was a moment when we were asked to start taking down revenge porn. And I remember the debate within the Google legal department was, “We hate it. It's sick. It's not right for people to be posting non-consensual sexual images of, y'know, usually their ex-girlfriend, but it could be anyone, revenge porn. But if we start to take it down,” 

Imran: [protesting sounds] Can I…the slippery slope! 

Niki: Yes. I know!

Imran: It's like, y'know, it's like if you ran a deli, saying, “If we ever listed the ingredients for our stuff, you never know, y'know, in the end everyone will be asking to inspect our kitchen with every, every purchase.” 

It's just ridiculous.

Niki: So, I worked at Google, I'm a total Google apologist, at least certainly for when I was there. I left in 2015, so we probably didn't cross paths. But when I was there, I did feel like we had these debates, and it wasn't like anyone was saying, “No, leave revenge porn up, like, we gotta, y'know, don't curtail anything.”

Imran: But who won those debates? 

Niki: We did start taking revenge porn down. We started taking it down that year. 

Now the legal team was frustrated with us, but we did start taking it down. And then we started taking down more things, and then we started having OneBoxes. 

Imran: Thank you. 

Niki: If somebody, if somebody searched for suicide, you've got a OneBox popping up. If they searched for anything related to child sexual abuse imagery, we went right to NCMEC and to the Feds. 

So, we did start to do those things internally. I think that, having been in those rooms, I know at Google we struggled. We might have, y'know, flown in formation publicly, but certainly internally we had a lot of discussion.

Imran: And I have no doubt. And, y'know, a few weeks ago I was in London, I gave a speech to a conference of trust and safety professionals from tech companies, saying, “Look, I'm on your side. Like, I'm here to give you the evidence you need to win those arguments internally. I'm not going to pretend for a second that I think you're the answer, the complete answer to it all. Because if you were the complete answer to it all, I don't think we'd have the problems that we have now.”

Y'know, we did a series of studies looking at TikTok. Within 2.6 minutes of setting up an account as a 13-year-old girl on TikTok, those accounts were being served self-harm content. Within eight minutes, eating disorder content, every 39 seconds on average.

And what was TikTok’s response? First, to call us liars, even though we had video of it all, 'cause we'd recorded the first half hour of the “For You” feed that we'd been studying. And second, they removed the ability to see how many times hashtags have been viewed from their Creative Center. So they, they made their services less transparent. 

 I cannot believe that this country makes it impossible for me to hold liable someone who has done that and knows they do that and knows it's a problem. 

And I cannot understand how you cannot hold those platforms responsible. Just yesterday, y'know, my colleagues and I sat and talked to Kristin Bride, a mom who lost her son. And again, when she went to see a lawyer, the lawyer said, “Sorry, Section 230 means you can't hold them liable.”

I just cannot accept that that system is just. I, I just think it's unfair.

Niki: You are starting to see these lawsuits, but you're also starting to see Congress paying an awful lot of attention on both sides of the aisle.

Parents are done with their kids feeling distracted, sick, isolated, fed this terrible content. When you see that, and when that messaging starts to get out, and Attorneys General see it, and members of Congress see it, and parents are feeling it and know it to be true, you do see accountability somehow.

Maybe the courts are where it happens. Maybe Congress is where it happens. Or maybe just social norms. People are just done. 

Imran: I think you're right, and I think people are, people are kind of done with it. The interesting thing is that our polling shows that the majority of the American public, a significant majority, on a bipartisan basis, every age, every ethnicity, everything, agree that online harms cause offline harm and that social media companies are the ones who are responsible, but very few of them have any hope that anything can be changed.

Niki: Yeah, there's a, there's a fatalistic view about it. 

Imran: Right, and I think that's in part because these are enormous companies. Y'know, two of the world's richest men, Musk and Zuckerberg, in the top five of the recently released Forbes super-billionaire index, own these companies, and they know that they hold enormous political power. 

But I do think that we are at a, we are at a tipping point at which people are starting to ask the question. I'm a campaigner, I'm an advocate and I'm interested in the policy, but I'm happy to play the politics on this.

I think Section 230 is going to become, for a lot of people in this country, as well known as, y'know, Prop 8 or other things that have come along that have been seen as fundamentally unjust. 

More and more people will see themselves as victims of something that they never heard of before called Section 230, and more and more parents are getting annoyed about it. 

I just think that they should all operate to fair rules. I don't think it's unfair to ask companies to be transparent if they have algorithms which reshape reality for significant numbers of American people.

 I think you should be transparent about the way that your algorithms work. I think you should be transparent about your content enforcement policies.

What we also want them to be transparent about is the advertising, because advertising clearly has a significant impact on distorting the lens through which they present the world. And I think it's important that they're clear about that. People have the right to know when they're being advertised to and when they are just experiencing the service as it is. 

And then accountability. We want people to be able to hold them accountable and to be able to hold them responsible where they have behaved in an unjust way.

And that's what we call our STAR Framework. We say that transparency, accountability, and responsibility will lead to safety by design, but because that spells TARS [Niki: laughs] and TARS doesn't sound as good as STAR, we put safety by design first. 

Niki: You had to redo your acronym! 

Imran: Yeah.

Niki: That's a true policy person at heart, re-naming the acronym [chuckling].

Imran:  But I mean, I keep being told that we're trying to censor people, and I'm like, “What are we doing that's censoring people? What are we saying that's so anti-American?” It's actually profoundly American, the solution set that we have. 

And if you do that, and if you have a vibrant civil society, then we can have a discourse about whether or not companies should profit from hate, from disinformation, from lies. That's how America's meant to work! 

As people become aware that there is a problem, as they're aware whose fault it is, they're going to start to get very, very annoyed that Congress, the White House, our politicians are doing nothing about it. [Niki: Yeah] 

Who's got the parents' backs? Who's got our backs in society?

Niki: Well, I'm probably one of those policy people who's tied up in knots over, y'know, the statute itself and the letter of the law. But I do think we're in a moment where social media is feeling like cigarettes. 

I think people are feeling it. [Imran: yep] They're looking for something. And yes, there is a sense of being overwhelmed by just how addictive these algorithms are, but I don't think, especially as we go into this incredibly important election season, not just in the U.S. but globally, that people are going to just sit back and let the apps control us.

I think that there's a movement afoot. 

Imran: Here's the thing with politics, like, I've been told a thousand reasons why it's impossible that Section 230 will be reformed, and why tech companies will never be held accountable. 

And I've been in politics for long enough to know that those same commentators who talk a lot and give you a lot of reasons why something's impossible will, when it happens, be the same people who write, y'know, BuzzFeed lists of a thousand reasons why this was inevitable.

Things can change, things do change, and they change in unpredictable ways. 

I think it would be so poetic if the legacy of Musk's attempt to silence organizations like CCDH, the ADL, and Media Matters for America is that he becomes the reason why statutory transparency and data access pathways are legislated for in the United States. 

Wouldn't that be [Niki: A delicious outcome] so perfect? 

Niki:  I'd be for it. We agree on that. 

Thank you so much for coming in today. I know that I'm one of these people who probably drives you nuts on Zoom calls, but it's unacceptable to have this discourse happening the way that it does.

So, whether that's transparency, whether it's changes in social mores, whether that's Capitol Hill doing something about it, or maybe just voters, when there's an incident and something happens, I do think we're going to see a change. 

Imran: Thank you!
