Washington-based litigator Previn Warren joins Niki in the studio to talk about his law firm’s multi-district litigation against social media giants Facebook, Instagram, Snapchat, TikTok, and YouTube. They discuss the growing epidemic of social media addiction among teens and tweens, and its impact on kids’ mental health, developing brains, and bodies.
**This episode contains some serious subject matter including mentions of suicide, eating disorders, and violent sexual social media content.**
Niki: I’m Niki Christoff and welcome to Tech’ed Up.
Our guest today is Previn Warren. He’s a litigator and his latest multi-district lawsuit against social media companies is going to be one to watch. If you are interested in tech, regulatory policy, or you just want the online world to be a safer space, this is for you.
This show is a bit of a departure from our usual fare. We talk about some really serious stuff, including suicide, eating disorders, and, yes, rape, but that’s because that serious stuff is being consumed by minors - not just teens but even tweens - on social media.
I’m really happy he was willing to come on the show for this healthy discussion about social media and tech responsibility. We don’t agree on everything, but we do agree that there are real-world impacts of an algorithmic feed that is curated to hold our attention…no matter the cost.
Niki: Today in the studio my guest is Previn Warren. Previn. Welcome!
Previn: Thank you very much for having me.
Niki: So, you are a litigator, a trial attorney based here in Washington, DC and we have a very Washington, like, meet-cute. [Previn: laughs] Which is…
Previn: I don't think I've heard the term “meet cute” before, but it, it was a meeting and it was cute.
Niki: So that's a, it's, I think that's like a movie term when people- it was a platonic meet-cute, [Previn: chuckles] to be clear. So, and also a totally nerdy one, which I will describe. So, in Washington, if you're a member of the Supreme Court Bar, which we both are, you can watch oral arguments. So, before a very big case, which we've covered on this podcast, Gonzalez v. Google, all these nerd lawyers who work in [chuckling] tech policy and tech law are lined up.
And I sort of pathologically speak to any strangers around me. So, I turned around and struck up a conversation with you. [Previn: Right] And then my other, like pathology is being like, “You gotta come on the podcast and talk about the lawsuit you're bringing. Here's my QR code.” [Previn: Right] “It's not malware.”
So, here we find ourselves.
Previn: Yeah, that was, that was quite a morning.
Niki: It was quite a morning. And this is, I, part of the reason I'm so grateful you're coming in is [long pause] it's a little bit of a lion’s den. I mean, we're a cool, friendly lion’s den, but this is kind of a tech podcast that's cautiously optimistic about, but often defensive of, big tech. [Previn: Sure] And I've worked in big tech.
So, when we were standing in that line, I was like, “You're here for the bride or groom?” and you were like, “I'm not here for Google.” And I was like, “I am here for Google.”
So, I wanna talk about what you're working on because you're bringing a very big novel lawsuit against a bunch of social media companies. And, and it brings up some really important and, and frankly, really dark issues around kids.
So let's start with that. Let's start with the case itself.
Previn: Sure, well, first of all, again, thank you for having me. It is a big case. It is a sprawling case, but it is a very important one that I think even folks who are apologists of tech should be able to get behind because of the nature of the injuries and how severe and how systemic a lot of these problems seem to be. So, the case is what we call a multi-district litigation, and that means that lots of different individuals have sued all around the country and the federal court system basically vacuums them all up together and lets them proceed in a joint fashion.
And so, along with some other attorneys, we are leading that effort to find justice for kids and their parents who really feel like they were victimized by social media and its addictive impacts.
Niki: So, just to recap, these are individual families, mostly [Previn: Mm-hmm] who are plaintiffs who've been joined together. And you're working with other attorneys on this lawsuit that has one main theme, which I wanna dig into, which is adolescents and kids that are harmed in real ways by social media.
Previn: Yeah, I, I will tell you, it, it's sad that this is true, but a lot of our cases follow a very similar pattern in terms of what happened.
So, you'll have a kid, boy or girl, or non-binary, who was a happy, bright, socially engaged kid with a lot of extracurricular interests and activities around the age of maybe 10 or 11, maybe 12. They get a smartphone. For the kids that we represent, they find themselves really deep down a rabbit hole where they are kind of persistently and compulsively using one of these apps, unable to sort of pull themselves away: bringing the phone to bed, staying up all night on the apps. If the parent takes the, the phone away, they'll pretend to go to sleep, sneak out, and get the phone. If the parent forcibly takes the phone away, they'll, they'll act violently and exhibit really what are withdrawal symptoms that you would have for, for frankly a drug addict.
Y’know, we have clients who wind up using these apps, y’know, up to 11 hours a day. The symptoms that they wind up experiencing are severe loss of self-esteem, severe body image issues, anxiety, and depression. Many, many of our clients have attempted to take their own lives. Many of our clients have succeeded in doing that, and we represent their families and their estates, and y’know, it, it, it, it is a national tragedy. [sighing]
Niki: And we’re talking specifically about kids and kind of kids' brains on social media. And so you guys have, as part of this litigation, you've got data that I'm not sure parents are aware of. I know I wasn't even aware of it the first time we spoke about this.
Anecdotally, it's obvious that social media is not healthy for us. I just made us take 75 photos [Previn: laughs] and asked why my arm doesn't look thinner.
which is, and I'm, y’know, I didn't grow up with any of this stuff, right? I didn't even have a cell phone until I was out of law school. [Previn: Right] So, I’m still influenced by it. It matters what I look like online, then, y’know, multiply that times a jillion [Previn: Right] for kids.
So we sort of anecdotally know, and there's been some research [interrupts self] actually, the very first, the very first episode of this podcast was with Cecilia Kang, who wrote a book on Facebook. [Previn: Mm-hmm] And we, we taped it the week that Frances Haugen was speaking on Capitol Hill as a whistleblower about, “They do know. These companies do know what's going on.”
But just to back up a little bit, cuz I wanna get to what the companies know and sort of what your remedies are: some of the data that you guys have uncovered about what, what social media really does to brains.
Can you dig in a little bit more on the specifics of what you found that really is like fortifying the case?
Previn: Yeah, sure. Well, I would say generally speaking, when you have a whole bunch of observations that are similar, there's usually some science behind it and some reason why there's that pattern, and that's true here.
I think the social media apps that we're suing which are Instagram, Snapchat, TikTok, Facebook, and YouTube; they have some really shared design features that exploit the neural circuitry of children. And there's a few that we discuss in the, in the lawsuit.
So one of the main ones is called Intermittent Variable Rewards, and that traces back to the fifties. There was this psychologist at Harvard named B.F. Skinner who experimented on mice and was able to demonstrate that mice more compulsively press a lever to get food if they don't actually know if the lever will deliver the food.
So, if there's a pattern to when the food's gonna be delivered, they exhibit addictive tendencies, but if there's not a pattern, they exhibit even more addictive tendencies, right? So, that randomization of delivering a reward creates compulsion. And that's kind of what happens when you're on a feed, right?
You're scrolling through your phone and you know every x number of times you land on something that's really interesting to you, or you find really stimulating, or could even be something that you find scary. But it triggers your attention, right? The feed that, the algorithm that powers the feed knows exactly the schedule to keep you on to maximize your engagement.
And it's not a predictable schedule. And it's not even a random schedule. It's data-driven, right? They're, they're ingesting billions of points of data about your usage and what's going to most engage you. And they're structuring your feed to deliver rewards on a schedule that's totally unknown to you to make sure that you're as hooked as possible.
Niki: And it's, and it's not just so you could draw the analogy to slot machines. Right? [Previn: Right] And in fact though, this, I'm not sure people do know, these big tech companies hire engineers who work on slot machines.
Previn: Right! Yeah. I, I think, I think there was a real consciousness around what they were doing because, look, they make money through ads. I mean, these are ad-driven businesses, as Mark Zuckerberg famously explained in front of the Senate, [chuckles] y’know, “We make money by selling ads, Senator.” Right?
Niki: [chuckles] Okay. That was one of the few times I felt for Mark Zuckerberg.
Previn: Right! That was the question from-
Niki: Right? “I don't actually build iPhones, and I, if you wanna hand your phone to me for me to fix it, [chuckling] I will.”
Previn: Yeah, right. Exactly! So, so, y’know, look, more engagement is more ad revenue, and it's really that simple. And so, there was a race to figure out the way to, the best way to keep you glued. And look, for kids who have developing brains - and that's a neurological fact, it's not just me kind of moralistically saying “kids are immature,” it's just neurologically true - they have less developed prefrontal cor-cortexes.
I don't know if I said that right. Cortex. Cortices? Y’know what, who knows? That's not even a word. I don't know.
Niki: We know what you mean!
Previn: You know what I mean, right?! The part of your brain that, that balances risk and reward is less developed for kids, and so they're just a lot more susceptible to getting sucked down these rabbit holes.
Niki: And even as adults, I mean, I have mentioned, I think before on this podcast, I got one of those kSafes, which stands for Kitchen Safe. [Previn: Okay] To put my phone in to lock it away from me because I have impulse control issues around my app usage, [Previn: right]
And I, again, I don't have, you know, my whole social life isn't online. I don't have those cues that kids are getting from other peer groups. [Previn: right] And my brain, you know, this is as good as we're gonna get on impulse control. So-
But I can sense that myself. I wanna read a book and I reach for my phone, [Previn: right] And so, if I'm doing that and you're 12, 13, 14, and maybe more susceptible because maybe you're isolated or maybe you've been through Covid. [Previn: right] But these aren't eggshell plaintiffs. This is pervasive across-
Previn: [interrupts] It is, it is. And you see them from, y’know, all different demographics, all different walks of life, socioeconomic statuses. Y’know, and it's, it's not just the getting hooked, it's the marrying of that with the social comparison features, right?
It's the “How many likes did I get on this post? I didn't get enough likes, I'm a bad person,” or “My post wasn't good, or I didn't look cute enough.” Right? And the notifications about, y’know, whether you got the likes keep you coming back to check, like, “What's my, like tally?” Right?
The filters, which, again, it's not like this is third-party content. These are features of the products, the filters that let you manufacture your appearance to look a certain way. A lot of people, a lot of young kids, start to think if they don't look like that in real life, then they look bad.
Niki: Yeah, right. And nobody looks like that in real life.
Previn: Of course, of course not! It's it's, it's literally made up.
Niki: Yeah. It's made up. And it's sort of, even though you might know that it's made up intellectually and kids know that too, it doesn't stop the emotional drive.
Previn: It doesn't. And, and listen, eating disorders are really, really, powerful and scary and they're physical, and our clients have been hospitalized for months and months at a time in residential programs, outpatient programs, dealing with these issues because once you develop an eating disorder, it's, it's a really, really difficult thing to shake.
Niki: Yeah, it's a lifetime. It's usually a lifetime chronic issue you're dealing with. [Previn: Exactly]
And I wanna talk a little bit about parents because I think they're, I actually think that the American public and maybe the global public is starting to realize, “Actually, it's not parents' fault that these kids are addicted to their phones.”
There's, you and I spoke once about the illusion of control that parents have, [Previn: Right] That they might feel like they're putting some restrictions on, but they're, I think the words you used were like, they're just totally outgunned.
Previn: That, that, that's right. They are totally outgunned. I mean, y’know, [sighs] in theory, you're not even supposed to be on the platforms unless you're 13 or older without verifiable parental consent.
But what mechanisms are there on these apps to actually prevent that from happening? I mean, kids just lie about their age. It's that simple.
Niki: Yeah. They're not dumb! They can subtract.
Previn: Right, exactly! And if, if, if you don't succeed, try, try again. I mean, if you put in your right age and get bounced, you open up a new email account, and you're done. Right? And you just log in using that one and, and fake your age.
So, y’know, look, I know the defendants in these cases are, are actively scrambling to try to find new features and parental controls and, and whatnot, probably in response to this lawsuit in, in large measure, but they're just not there.
And the, frankly, the, the, the powerful force that is these algorithms is, is just too much for parents to police against.
Niki: Yeah. One thing I've, I've, so first of all, it, I'm not sure that I'm always intellectually consistent, but I do think I'm intellectually nuanced when it comes to the platforms. [Previn: laughs] I'll just say, so that's what I, [Previn: There you go. Sure.]
So obviously, I worked at Google for a long time, [Previn: Right] And when I was at Google, we didn't have any social media platform, and YouTube, even at the time, did not have a ton of engagement. [Previn: Right] So, the idea at Google was to get you the quickest result you want through search results. [Previn: mm-hmm] And get you moving. They wanted you to go through the content really fast, not stay forever on Google, but click the ad and go buy something [Previn: Right], which is very different than, than the attention economy, which is what we've shifted to and what you guys are working on. So, [Previn: Yes] in 2007, when I started in tech, we, yes, it was ad-driven, but it wasn't attention-driven.
Previn: That's right. And so it's attention-driven and it's data-driven. And the two are really married to each other because the more data you get about your user, the better you understand how to capture their attention. Right?
And, and, and listen, Instagram didn't used to be like this. [Niki: Nope] When it launched, it was like a chronological feed of latte art and cool travel photos. Right. I mean, kids were not getting hooked to that and experiencing these symptoms.
But after the acquisition by Meta in particular they really changed strategy and it became a race to the bottom with their competitors on changing the way the feed is organized to really maximize user attention. And that's where you start to run into some of these really negative psychological and society-wide consequences for kids.
Niki: Yeah. And so I think when I think about your defendants that your plaintiffs are suing, I think of Google sort of on one end, YouTube, and you're right, YouTube very much is in a race with competitors that it wasn't when it started. [Previn: right] Instagram, I loved OG Instagram [Previn: chuckles] and I am really in general not for the FTC blocking mergers, but man, oh man, if we could go back and just get old Instagram.
Previn: It was cool! I liked it too back in the day when the photos were all square.
Niki: The photos were square! Yeah, exactly. So, miss those days. And I think that, you know, my observation is, it's a little bit, it skews a bit older on Instagram. It's not, it's not that kids aren't on it, but then you get to Snapchat.
I had dinner recently with a 16-year-old and I'm like, “Are you streaking?” [Previn: right] Y’know those streaks? Like, how long have you been engaged on the app? If you wanna explain what that is.
Previn: Yeah, I know! I mean, that's a great example of something that's not the feed but is like a collateral feature that just amplifies these harms. So it's called a snapstreak. And basically, it's like a, it's a score that you can rack up between you and one or more of your friends. And, like, if you keep responding within a certain amount of time, your snapstreak continues. But if you don't respond within a certain amount of time, you break the snapstreak.
So there's immense social pressure from your friends to keep responding to this thread so that the snapstreak goes up and up and up. Right? And it's, it's just this manufactured thing to keep kids using the platform. Right? But because of the combination with social pressure and these are kids, they really feel compelled to do it.
Niki: Yeah. And it's, it's a gamification of their attention and their time.
Previn: Yes! That's exactly right.
Niki: So then you've got Snapchat, which every sort of tween and teen that I know is on. That's their main messaging app, from what I can tell. [Previn: Mm-hmm] I mean, I have no kids, but I poll them constantly. [Previn: laughs] Just like I talked to you in line, I'm constantly, like, interrogating teenagers.
Previn: Yeah. No, I've got kids. I've got a 10-year-old and, and a now eight-year-old. Actually, today was, is my now eight-year-old’s birthday. [Niki: Happy Birthday]
So we had donuts this morning, so I'm totally hopped up on sugar, [Niki: laughs] which may explain some really long answers. But yeah, my 10-year-old in particular, he's in, he's in fifth grade and, and it's starting, y’know. The sixth graders that he talks to on the bus, they're all on Snapchat. That's how they communicate.
Niki: Yeah, and then I would put, then for me and again, I'm very against TikTok. It is set up in a slightly different way, which leads with the “for you” page, [Previn: Right] So it's not who you're following, it's what they know you want to see more of.
So it's really sticky. [Previn: Yes] It's like you hop on TikTok and you're seeing stuff you didn't even know you wanted, but man, do you want it! [Previn: Yes]
Previn: Well, and just notice how the existence of TikTok has had consequences for the whole social media industry. I mean, YouTube launched YouTube Shorts [Niki: Right] about a year ago, and it's the same vertical short video format.
It's not, not like regular YouTube. It's a lot more like TikTok. Instagram launched Reels, y’know, a few years ago for the same reason, to try to nip competition from TikTok and others in the bud. And so, it's been a race to the bottom.
Niki: Yeah, and I think for TikTok also, one of the controls, and I just have to fully disclose, so there's a, there's a sort of boutique law firm in Washington who I've done some press work for. [Previn: mm-hmm] I would do this pro-bono, that's how strong I feel about TikTok, but I was paid to help them with some of their communications. And one of the things that they brought to light is that parents can think they're setting controls, but in fact, that's not really what they're controlling.
And in TikTok’s case, they sort of have their fingers crossed behind their back. [Previn: mm-hmm] And are like, “Sure, sure, sure. This is only content for kids 17 and up.” But then, even if you put in that you're 13, you're still getting terrible content - rape play. [Previn: [sadly] Yeah] Like, that is not something anyone thinks a 13-year-old should be seeing; I think I can universally say nobody thinks 13-year-olds should be seeing violent imagery like that.
But they do because the, either the platform can't control it or they just don't control it, [Previn: Right] but that was part of the case I was working on. And I also think that there's sort of turning a blind eye from, y’know, the app stores [Previn: Yeah] because they need people downloading these cuz they're the most popular apps in America!
Previn: Well, that's, that's moving the, the chain of liability upstream, but I, I think you've got a good point there. Y’know, wait, why should kids even be able to download these things in the first instance? And, and look, you're, you're not wrong about things like “rape play”, and it's horrifying to even have to say those two words [Niki: Yeah] in that order [Niki: Exactly] but, but there is an insane darkness to what's circulating on these platforms.
And again, our lawsuit isn't about the fact that there is certain kinds of content on the platforms. It's about the fact that the platforms are defectively designed in a way that keeps the kids using it with such intensity and frequency that it winds up steering them towards that stuff. Right? [Niki: Right]
And y’ know, the addiction alone is a harm, to be clear, right? It, it forces kids out of their extracurricular activities. It causes them to become disengaged with their family and friends. Their grades suffer, but with the addiction come all these other injuries.
Niki: Yeah. It's shaping brains. And actually, just so that people know how this happened, so the, the law firm I was working with took burner phones, [Previn: right] said they were a 13-year-old boy. So they were able to sign up for TikTok and then just typed “OnlyFans,” which by the way, I'm a super fan of OnlyFans. [Previn: chuckles]
There are a lot of themes on this podcast. One is I hate TikTok. I'm for OnlyFans, but not for 13-year-olds. [Previn: Yeah]
So, they just typed it, and then that's when TikTok would start presenting this certain kind of content [Previn: Yeah] that is frankly, completely, [interrupts self] parents don't know their kids are seeing that. Like, I would say the vast majority of parents don't even realize that that's being pushed out to their children.
Previn: No, I, I think that's right. And, and then look, I will, y’know, pair this with another problem that we're suing over, which is direct messaging and geo-locating features that allow kids to come into contact with adult strangers.
And y’know, when, when you're down this rabbit hole looking at this kind of stuff and you have the ability for strangers to get in touch with you, that's really scary. And so, one of the internal studies from Meta itself that we uncovered, and cited in our complaint, found that every single day, there are half a million underage Instagram accounts that have an inappropriate interaction between a child and an adult.
Previn: Every single day!
Niki: From their internal data? So, you found that out through a whistleblower?
Previn: We found that out through discovery. [Niki: Ohhhh! ] We got that, we got that through discovery and, y’know, we, we quote that internal study in the complaint. And that's just from the horse's mouth.
Niki: Yeah. I think what's been interesting is seeing the data that is known internally about these things. And then you see, I think, some good faith efforts - like hiding like counts and, maybe, having a timer - but sometimes those things just don't even work.
Previn: Right. Oh, I mean that, well, [huffs] that’s funny you should mention that!
I mean, another thing that we learned in discovery is that Meta actually understood that its timer didn't work and, y’know, rolled it out anyway [laughs ruefully], which is kind of astonishing. Y’know, one of their own, of their own employees said, and I'm quoting here, “The tools we currently have aren't effective at limiting time on the app.” Right? And, and said that the data that the tool was reporting was incorrect, and yet they put it out there [Niki: Yeah] and held it out there as, as a feature that would limit addictive use.
Niki: Yeah, I think this is sort of part of it. It's like, it's not like people don't know. It's just, where do you get to? And one of the things you're highlighting is these really profound harms that lead to eating disorders, suicide, brains being changed, withdrawal symptoms.
And I know, I mean, there are a lot of reasons parents just give in. I mean, I don't, again, I don't have kids, but you see it, [Previn: Right] It's just, like, not worth it! You're already fighting so many battles. [Previn: Right] So you just give them the phone. [Previn: Yeah] And, and they feel that the phone is essential to their social lives now. [Previn: Right]
So I guess my question is, what are the remedies you're seeking? What do you, what do you want to see happen from this?
Previn: [sighs] Well, there's a lot of different things. I think the first thing is that we want our clients to, to have justice, right?
I mean, we, they, they are owed compensation for their harms. I mean, in some instances, y’know, thousands of dollars of out-of-pocket medical bills. Right? In some instances, they've had to send their kids outta state to be able to have the kind of residential treatment that they need. They shouldn't have to pay for that.
And, and, y’know, look in, in the context of the big tech economy, maybe numbers like that don't feel so staggering. But for, y’know, for families across America, that's a ton of money and they should get it back. Right?
And then there should be some financial consequences too to these companies. Y’know, they, they shouldn't be able to get away with this kind of behavior with zero accountability and that's kind of what they wave Section 230 around to say they deserve zero accountability, right?
I mean, we are here to prove that ordinary people who are users of these platforms can stand up and fight back. It's not just a question of sending an email out into the ether that says, “Please block my kids' account,” and never getting a response back. Right?
Niki: Oh man, [sadly] “Please block my kids' account.”
Previn: Yeah, I mean, it, it just goes nowhere and we wanna show American families that they have a voice in this fight.
Niki: Yeah. Which sort of leads us full circle to how we met. So, we covered this on, I think, a really good episode that we did with Emily Birnbaum, who's a Bloomberg reporter.
Previn: I listened to it. It was great!
Niki: Thank you. Yeah, it was exactly about that day we were sitting there [Previn: Yeah!] and sort of how, how each justice was reacting to Section 230, [Previn: Right] which is very powerful, although very short as a statute, [Previn: Mm-hmm] that essentially protects tech companies from all liability. It is sitting in front of the Supreme Court, unfortunately, with sort of a weak sauce plaintiff's case, in my opinion.
Did you work on that? Sorry if you did.
Previn: I didn't work on the case. I did help coordinate some of the amicus filings.
Niki: Okay. Oh, good! Well, some of those were really good! [Previn: Some of them were.] And you saw Justice Jackson [chuckling] trying to drag 'em into the arguments. [Previn: Right] She's like, “Can someone please cite these?” [Previn: Yeah] But that's, that's key because it's at the Supreme Court.
I was a litigator for about two years of my life. But this could go on for a really long time for you. [Previn: right] And I think, unfortunately, we're just gonna continue to see the harms piling up, but I also think that's gonna bring about a lot of public pressure to do something.
Previn: Yeah. I, I think there will be a lot of public pressure. I mean, you're even starting to see, y’know, corporate actors get in the game. So, Dove, the soap company, [Niki: Yeah] just released this really powerful commercial that is kind of going viral about body image issues and social media, right? It's, it's really well done. I encourage people to, to watch it. But y’know, that's a company that doesn't really have skin in the game in this fight, but they feel so compelled by these harms to say something.
Niki: Right. Yeah. We'll actually drop that ad campaign in the show notes. And one thing that, that Cecilia Kang, New York Times reporter, said on this podcast is she said, “People will get to the point like they are with cigarettes and just say like, “We're done. [Previn: Right] Like, we're done.”
Previn: I, I hope that's right. I mean that, y’know, a little plug for my firm, Motley Rice. I mean, we were kind of at the forefront of fighting the tobacco wars and preventing cigarette companies from marketing to kids. Right? So, we won that fight in the nineties. And to me, this is really the sequel.
Niki: Yeah, this is the sequel. And I will say, this is my personal opinion. I tend to be, listen, you're over 18, you're over 21, you wanna smoke cigarettes, that's fine. [Previn: Right] Very, very different for kids. [Previn: Yeah] Very, very different, obviously, for kids. Duh! I mean, I was from an era where we got candy cigarettes in our Halloween bags, which is unbelievable to me. That's not okay [Previn: Right]. You would never do that now. And I think that's where we might get to, because what's happening to their brains is gonna impact them for a lifetime. [Previn: Right] And it's, it's not okay.
So even though I started by saying, like, I'm a tech apologist or defender, when it comes to children and minors, I just don't agree with that at all. [Previn: Yeah] They have to be protected.
Previn: That's right. You can, you can be a libertarian, you can be an industry proponent, you can be hardcore conservative or Republican, and you can still agree with this issue. And honestly, we have a pretty growing bipartisan consensus. Y’know, I was at a hearing on Section 230 reform in front of the Senate Judiciary Committee maybe a month ago, and Senator Josh Hawley was a huge proponent of this cause, right, and, and was in real agreement with Senator Blumenthal, who's, y’know, really far on the other side of the aisle. So, I think you're starting to see some public pressure mount. I hope!
Niki: Yeah, I think so too. So, what we might do is check in with you, like, a little bit later and see what's happening [Previn: I’d like that!], but I'm really grateful for you coming on. I'm so glad we met and that you agreed to pop by the studio. Cuz I, if we weren't just standing there, I don't think I would've had someone who is a litigator suing
my former employer, and it's just been eye-opening to read about and talk to you about what's really in this case and, and what's happening to kids.
So, thank you.
Previn: Thank you very much and, and thanks for being open-minded enough to, to have me in the lion’s den. And thanks to your open-minded audience.
Niki: Join us for our next episode featuring Jessica Powell, CEO and Co-Founder of Audioshake, and oh yeah, my old boss at Google. She’s super funny, she’s super smart, and she recently coined the phrase Deep Fake Drake for a Medium title. Which I thought was pretty catchy! We’re gonna talk about AI and the music industry and probably tell a story or two about early Google. It’ll be a fun one, I promise!