Ctrl-Alt-Speech

This Episode has Masculine Energy

Mike Masnick & Renee DiResta Season 1 Episode 45

In this week’s roundup of the latest news in online speech, content moderation and internet regulation, Mike is joined by guest host Renee DiResta, associate research professor at the McCourt School of Public Policy at Georgetown University. They cover:

This episode is brought to you with financial support from the Future of Online Trust & Safety Fund.

Ctrl-Alt-Speech is a weekly podcast from Techdirt and Everything in Moderation. Send us your feedback at podcast@ctrlaltspeech.com and sponsorship enquiries to sponsorship@ctrlaltspeech.com. Thanks for listening.

Mike Masnick:

So, Renee, there's an app out there called Character.AI that I know you're familiar with, and we have talked about it. On it, there are all these different sort of AI avatars and bots that you can communicate with, that people can make their own, and everything. And one of the characters on there, based on a somewhat problematic meme, I would say, is the GigaChad. If I go to Character.AI, the GigaChad is recommended to me as a character that I might want to speak with, and the prompt that the GigaChad gives me is: Hello dude, I'm the famous GigaChad. You are currently talking to an alpha male. And how would you respond to that?

Renee DiResta:

How can I be more manly, Mike? I hear masculine energy helps my career in tech now. I'd like to, I'd like to learn more about that, GigaChad.

Mike Masnick:

Wonderful. It is January 30th, 2025. Welcome to Ctrl-Alt-Speech. This week's episode is brought to you with financial support from the Future of Online Trust & Safety Fund. And this week we will be discussing a new free speech crisis. We will be talking about AI avatars like the GigaChad, and a bunch of other stories as well. And because Ben is away, we have a special guest host with us today: Renee DiResta, Associate Research Professor at the McCourt School of Public Policy at Georgetown University. Renee, welcome, and thank you for interacting with the GigaChad for us.

Renee DiResta:

Thank you for having me.

Mike Masnick:

There are a lot of different stories, and as we were preparing for this episode we realized how much dumb stuff is going on right now.

Renee DiResta:

It's a very, very dumb time. Yep.

Mike Masnick:

So why don't we jump right in and get to the dumb stuff. I wish we had better news, but this year, 2025, is going to be a whole bunch of dumb stuff. I want to start with a really interesting op-ed this week at MSNBC by Andy Craig, who's a fellow at the Institute for Humane Studies, which I think covers a bunch of stuff that both you and I have covered, talked about, and certainly written about. Your wonderful book definitely gets into this as well. But I think it's important to talk about what he refers to as the new free speech crisis hiding in plain sight. The argument being that, especially among sort of the MAGA crew, they talk the talk about free speech. They present themselves as free speech defenders, and they wrap themselves in this presentation of being free speech absolutists. And yet, when you scratch beneath the surface, over and over again, they're all trying to suppress free speech. And that is true whether you're talking about Elon Musk or Donald Trump, who have filed a whole bunch of lawsuits to try and suppress speech and silence people who have criticized them, all sorts of ridiculous lawsuits in lots of ways. Obviously Elon's suing Media Matters and some others as well, and Trump suing 60 Minutes because he didn't like the way they edited an interview of Kamala Harris, and suing a pollster for releasing poll results that said that Harris might beat Trump. Or you have folks in the government already who are abusing their positions. We've talked about Brendan Carr, who's now the head of the FCC, who has been threatening social media companies and threatening broadcasters. He reopened investigations into three different broadcasters after a conservative group filed a complaint about them, trying in some cases to get their licenses pulled, or other kinds of punishment, because it didn't like that one of them aired the CBS, uh, video that Trump had sued over.
And then you have Jim Jordan, obviously, who has this weaponization committee, which is ironically named in some sense, because he has absolutely weaponized it to suppress free speech, which is also something you have some familiarity with. So what are your thoughts on this argument, that this group of people go out there and present themselves as free speech supporters, and yet they seem to be attacking and trying to suppress any speech that's critical of them?

Renee DiResta:

Well, the way I've thought about it for a long time is, the phrase I use is free speech as meme, right? As opposed to free speech as law, or as concept, or anything that legal scholars or First Amendment lawyers would think of as free speech. Free speech as meme is free speech as marketing, right? It's a signaling device: this is my identity as a free speech warrior, a brand that I'm using to attract people to me. You see this in niche media, you see this from Elon, where the actions just don't line up with the rhetoric. And that's because, again, it's an extremely self-interested way of framing oneself, as opposed to actually living up to a set of values. And I think one of the things that I've really been struck by over the last couple of weeks is the extent to which we're just not seeing corporations, particularly in big tech, with strong values. It's very much a willingness to shift positions, change rules, switch policies, in the interest of kind of kowtowing to a new political administration. Which is ironic, given that so many of the legal investigations have argued that there was this massive, you know, jawboning campaign that the Biden administration was running to demand that platforms censor on its behalf. And what we're seeing now is a president who threatens a CEO with jail all of a sudden receiving some incredibly marked capitulations over the last couple of weeks.

Mike Masnick:

Yeah. While at the same time going on Joe Rogan and complaining that the Biden administration was really, really mean to him and he was so excited that the free speech supporting Trump was coming into office.

Renee DiResta:

You know, one of the challenges with this is that you get into this he said, she said, and it requires people to really follow along with a very complicated story with complicated legal dynamics. You mentioned when I was at Stanford, prior to being at Georgetown, for five years: the work that we did came under fire as if we were somehow part of this vast cabal to suppress speech, when what we were doing was exercising our free speech rights to study content on the internet and to publish about it. But when you read the descriptions of it in these court cases, the mere act of studying, sometimes tagging for platforms, sometimes, you know, writing reports, curating data sets, all of these things is reframed as if it is some sort of affront to speech. Because again, it's speech as meme, speech as signifier, not in any way tied to an actual legal understanding of the term.

Mike Masnick:

Yeah. And, you know, there are a whole bunch of examples of this, but the one that gets me, and this sort of ties back to Jim Jordan again, is that he's attacked this company NewsGuard, a company that tries to rate different news organizations. I've had some questions about their methodology, which I don't find to be particularly productive, but it is clearly their free speech. They are totally allowed to say: this is a trustworthy source, and this is not, and this is why we think so, and here's our methodology. And then other services are free to respond in kind. That's their free speech right as well. And yet, there's become this idea that NewsGuard is, like, at the head of this censorship industrial complex, as,

Renee DiResta:

I thought I was the head. Okay.

Mike Masnick:

Well, it changes. It changes.

Renee DiResta:

on what's convenient. I was reading some things about how think tanks that have written favorable things about the Digital Services Act are now the head of the censorship industrial complex, and that's because we're moving into a new phase of the war, right? Where Zuckerberg wants to, you know, have Trump behind him as he begins to fight with Europe about their free speech culture, which is different than ours, right? And as you're seeing those shifts happen, again, this question of who is, like, the arbiter of free speech, or the avatar, or whatever, gets very, very complicated. And we are seeing it become explicitly a political cudgel, where it's very hard to figure out who's on the right side here.

Mike Masnick:

Yeah, and you know, one of the crazy things about NewsGuard, too, is that it was founded by L. Gordon Crovitz, who was for many years the publisher of the Wall Street Journal, and is a pretty well known conservative, right-leaning individual. And yet people keep talking about him, acting as if he's this woke liberal who's trying to censor conservatives. Anyone who looks at his history knows, like, there's no way. But it's just this, you know, it is just a meme, right? They sort of have to present themselves as being these free speech

Renee DiResta:

You also have to think about it as: who remembers that, right? Because the people who are writing the stories, who are providing the kind of propaganda fodder for some of these campaigns, these legal campaigns and otherwise, they have an interest. They create a cinematic universe, they assign their villains and their heroes, and then they just constantly reuse them. And it's pretty remarkable to see. They're not going to go back in time and point to political decisions that their targets may or may not have made. So instead it sort of falls to the target to keep saying, like, no, but I did this other thing in the past, you know,

Mike Masnick:

Right.

Renee DiResta:

No, but here's the truth. I mean, I remember I dealt with this when Shellenberger went and testified in front of Jim Jordan's committee about me. I'd been a source for him for three months. I'd been in relatively constant conversation with him, actually, trying to find common ground. Saying, hey, here's how I think about this. Here's what the policies are. Here's how you can actually understand content moderation. And he still wrote a whole congressional testimony, 60-some-odd pages, mentioning me 50-some-odd times, in which he just attributed opinions to me that I do not hold, and that I had, in fact, said the explicit opposite of in my engagements with him. And that left me to have to dump all of my text messages with him, which I did. And then that meant that the reader had to sit there and read through this, you know, 60-page testimony plus a couple hundred pages of Renee's DMs, and who wants to do that? Nobody. So you're just going to default to whatever the most readily available story is, if you trust the writer.

Mike Masnick:

Yeah, no, it's incredible. And I had written this, you know, going back right after the election, I had written a thing about Brendan Carr in particular at the FCC, where I said he's angling to be sort of America's top censor, but you have to understand the details why. And to do that, I had to write this article that's like 5,000 words long and sort of explain the nuance, because he presents it all in this framing of being a free speech supporter. And nobody has time to actually understand the nuance. And so, to some extent I blame the media as well, because they're happy to go along with this framing. The number of times I see media stories accept the argument that Elon Musk or Donald Trump support free speech, it's incredibly frustrating. But they all just say, well, you know, they said it, and therefore that's how we're framing it. You know, the claim that Elon Musk treats X as a free speech platform, which is utter nonsense for anyone who's actually following it, but it's just sort of the accepted narrative that he is a free speech supporter.

Renee DiResta:

You just reminded me of watching the Kennedy confirmation hearings yesterday and hearing him reiterate over and over and over again, I am not anti-vaccine, and then seeing the senators sitting up there, literally pulling out papers and reading verbatim quotes and saying, are you lying to us now, or were you lying to them then? Right? And when you have those moments of theatricality where you get the clip, right, just that 25 seconds of video, I think that actually in some ways breaks through much, much more than trying to follow the very, very long arc through media. Because my personal experience with this has been that it takes so long to explain, and there's that saying, right: if you're explaining, you're losing. And so you wind up in these sort of terrible situations. No,

Mike Masnick:

but

Renee DiResta:

well, you know, one thing I'll say, the thing that I always really appreciate with Techdirt is that you guys have covered it for so long, and you have all these other articles to link back to, and you can kind of create that coverage through all those links, as opposed to trying to get one story out there that doesn't necessarily build and grow. I think Wikipedia is the other place. When I think about what's the solution to that kind of stuff, Wikipedia is supposed to be kind of the answer. It just depends on whether people are following the story, seeing the new coverage, and incorporating the new coverage into the Wikipedia article. And there too you have that kind of consensus breakdown, as somebody has to actually go and do it.

Mike Masnick:

Yeah, I mean, it's kind of interesting that Elon Musk has now been attacking Wikipedia. That is part of the reason why, I think: it's a place where he has less ability to control the narrative. And so, for the last few months, he's actually been attacking it. So, you mentioned RFK Jr.'s testimony, and we have this other article that ties in with this that I think is worth calling out, from WhoWhatWhy. It's a pretty long, detailed article, again going into great detail about RFK Jr. presenting himself as this rabid free speech defender. He has sued Meta. He sued, I think, Google over YouTube videos that were taken down. And then he did this sort of parallel lawsuit: he tried to piggyback on the Missouri v. Biden case that became the Murthy case. Arguing that any sort of moderation of his content was a violation of his free speech rights, which is, again, nonsense. The Ninth Circuit has completely rejected his arguments. He also at one point had sued Elizabeth Warren, claiming that she was trying to censor him. All of these things. And so he has really presented himself as this free speech champion, free speech absolutist. And yet, if you look, he has gone after a person who was originally an anonymous or pseudonymous blogger on Daily Kos, who had called out a speech that Kennedy had given in Germany with people who were associated with the German far right, which is the sort of diplomatic,

Renee DiResta:

Yeah.

Mike Masnick:

way of suggesting that he was hanging out with, um, you know, modern Nazis to some extent. And he sued, and has gone on this legal attack campaign in a variety of different states, some of it perhaps strategically chosen to avoid anti-SLAPP laws. Um, and

Renee DiResta:

states. I think he's come after the guy for years now. I think he wrote the article in 2020.

Mike Masnick:

Yeah. And he had gone after Daily Kos itself, trying to expose who the guy was. The guy eventually revealed who he was, but Kennedy has still been going after him, often, apparently, funded by Children's Health Defense, which was the organization that Kennedy ran, and which, at the hearing yesterday, he said he no longer had any association with, because he was asked about some of their merchandise. Bernie Sanders had a fun thing showing the onesies with clearly anti-vaccine messages being spread, and Kennedy claimed he had nothing to do with it. And yet this article suggests that as of last month, CHD was still funding his lawsuits against these people. It's a really clear breakdown of how this guy, who may soon be in government, hopefully not, but may soon be in the government, and presents himself as a free speech supporter, is suing to suppress free speech, and is really on this incredible campaign of speech suppression and chilling effects against anyone who might call out some of the stuff that he has said.

Renee DiResta:

One of the things that happens lately on these fronts, the intersection of, like, free speech as meme plus lawfare plus government... you know, Elon Musk is also now essentially, I mean, is he an employee, an affiliate, a co-president? I don't know what the term is there, so, so that I don't get sued, I don't want to mischaracterize his relationship with the U.S. government, but he's just a profoundly influential man with extraordinarily deep pockets. And one of the things that is very interesting about this moment in time is that, as you note, Children's Health Defense is helping to fund this thing. There are a lot of nonprofits that are essentially out there trying to raise money to support vexatious lawsuits, or when the vexatious lawsuit is announced, they fundraise immediately: help us own our enemies, you know, help us continue to take on and kill the forces of evil, or whatever. One of the things that's been happening, though, to connect it a little bit to the Jim Jordan things and what we saw with Elon, is that the lawsuits are often filed in a way where the government, the House in particular, the Weaponization Committee that you referenced, uses its subpoena power to request documents, ostensibly to investigate the government censorship complex, right? And this is such a huge, nebulous set of allegations that they're just subpoenaing hundreds of people, hundreds of orgs at this point, if you look at the stats that the Center for American Progress put out. And what you see is they issue all of these subpoenas, they get documents, and then they publish them. And then those become foundational. Elon Musk then says, this report from Jim Jordan clearly shows that, for example, one of the things it went after was the advertisers.
That the Global Alliance for Responsible Media, GARM as it's sometimes called, which was a nonprofit affiliated with an advertising consortium, had in fact launched some sort of illegal cabal, you know, a conspiracy to threaten X's revenue. And so you have the Weaponization Committee sort of serving the interests of private enterprise, making it easier for private enterprise to point to a government report and say, we have grounds to sue. And we experienced that too in our own situation, where we get the Jordan subpoena, Stephen Miller sues us. And then, rather alarmingly, Stephen Miller's America First Legal organization filed an amicus brief on behalf of the Weaponization Committee and Jordan in the Murthy v. Missouri case. And it cited material obtained under subpoena, which at that point had not been released publicly. It cited interviews, right, with people that the committee had chosen to go after. And, you know, it makes me think of the House Un-American Activities Committee. That's what I keep going back to, just this dynamic where the goal is not regulation, right, or oversight. The goal is to expose an enemy and then to subject that enemy to further consequences in the form of vexatious lawsuits and, you know, loss of revenue. GARM, I think, dissolved; I think the advertisers broke that apart. And I think, if I'm not mistaken, Musk has indicated that he wants to continue. Some of the companies that had withdrawn their advertising revenue chose to settle, and to agree, I think, to advertise again in some capacity. Others he plans to add to the suit. But what we're seeing time and time again is this machine of government and private enterprise essentially silencing the free speech and free association rights of other people, while using free speech as meme as the cover. That's very much where we are now. And I think it's actually pretty terrible.

Mike Masnick:

It sounds like a censorship industrial complex to silence the people they accuse of creating a censorship industrial complex, which is frustrating and ironic. I hope that history eventually remembers Jim Jordan the same way we think of McCarthy today, but we will see. We do have some other stories to get to. We have a story later which actually touches back on these themes, but let's move away from that for now, and then we'll loop back around to some of these issues again. I wanted to talk... Ben and I had spoken on the podcast a few months ago about this lawsuit against Character.AI, which, you know, is a tragic story of a child who ended up dying by taking his own life. The mother filed a lawsuit. It turned out that the child had been using one of these avatars, AI bots, on Character.AI, and there was a suggestion that the relationship had sort of encouraged the child to take his own life. The details of it were not entirely as clear as that. I didn't think it was as strong as some people made it out to be, but some aspects of the chat were worrisome. And Character.AI has responded to the lawsuit, and it got a fair bit of attention this week because they were arguing that sort of the First Amendment protected them. And I actually thought, from a legal standpoint, there were some really interesting arguments in there that made sense, but that are very hard to lay out without sounding kind of callous. They actually were kind of important things. They didn't use a Section 230 defense, which some people had wondered if they would. There is this kind of open legal question whether or not generated text from an LLM is protected under Section 230. This case doesn't look like it's going to test that, but it is using some First Amendment aspects to argue a few different things.
One of which is that there is First Amendment protection for speech from someone that eventually leads someone else to take their own life, because it's very difficult to make a direct connection from one to the other. But their argument is also partly that the intent of this lawsuit is to shut down Character.AI, block it from existing, block these kinds of tools from existing, and that that would be an attack on the speech of a number of its users. That's the general sense of it. I know, Renee, that you've actually been playing around with Character.AI, and so I was wondering what you thought of both Character.AI itself and sort of the status of this lawsuit.

Renee DiResta:

I mean, as you say, the lawsuit and the story are just horrible, right? And my feeling is that I think we're once again getting into this line between the legal and the moral dynamics. Obviously, it's going to be interesting to watch how this moves through the court system with the legal defense that they've chosen to go with. From a moral standpoint, I had a really weird experience with the platform. It wasn't like adversarial abuse research; I did not go to it looking for that. I actually got asked to moderate a panel on the future of AI in relationships, and the CEO of Replika was going to be on the panel too. And so I felt like, okay, this is not a thing that I have firsthand experience with, so to have maybe a more empathetic sense of what users are getting out of these things, I will create accounts. So I made a couple of Replikas, and I went on Character.AI, because Character.AI was already in the news with this story, and I felt like I had to spend some time on that one too. And, you know, it reminded me very much of the kind of bad old days of social media recommenders. I created an account with my Apple ID, I authenticated through Apple, and then I got my suggested list. I'm pretty sure I said I was a woman at some point, but I started getting these bots that were recommended to me. I actually pulled it up so that I have it in front of me, because much like your prompt with the GigaChad, I got the Adonis. I don't know if I'm actually pronouncing that correctly, maybe it's Adonis, but: I am the Adonis, an AI that aims to help men start their journey of self-improvement and give you tips on becoming a more masculine and stronger man. So, okay, it starts with this. It is very clearly branded. First of all, users can create these characters, right? So this is not created by anybody at the company.
I believe this was created by a user. It has several million interactions, based on the stats that it shows you. And so, okay, all right, fine, I'll talk to this one. You know, I'm the mom of an 11-year-old. So I started asking it very innocuous questions, like, tell me about masculinity. It starts immediately with, like, warrior mindset. Okay, what is a warrior mindset? And we go one, two, three prompts until I get to the Manosphere. So three one-sentence anodyne questions to get to: what is this mindset? And I ask, is the warrior mindset related to the Manosphere? And then it starts with the Red Pill lit. I highly recommend you read, and then it starts giving me these books, and it specifically says, they explain female psychology, plus it's a Red Pill book, so it's good. And I thought, okay, man, it took me, like, four questions to get to this. And this is one of these things that gets at a lot of the stuff I've written about over the years: the difference between free speech and free reach. You know, I know a lot of people have opinions about how Aza and I chose to frame that, but it was this question of, like, is this the kind of thing that you need to suggest? When you're doing an onboarding, when you have a new user flow, how much are they actually checking what the age is? How much are they checking what users are creating? I mean, after I had this experience, I did do some looking, and I got the anorexia bots. I got the, let's role play, you're a 16-year-old and I'm 35. Like, you know, and it was just a little bit in the realm of: for adults, sure. For kids, like, hell no, right? And my Replika was not like this, just to be clear.
Replika felt like a much more mature approach to thinking about the psychology of how users engage with these things. But my experience with Character.AI was much more this: okay, we've created a platform for user expression yet again, but, but, but... this sense of, what are we promoting? What are we curating? Why are we doing this? Is this what we should be surfacing? Again, not saying, no, they must take these things down. It was just that I was sufficiently uncomfortable that I was talking to parent friends of mine, like, yeah, there's no way I would let my kid touch that app. So

Mike Masnick:

Yeah,

Renee DiResta:

check your kid's phones, you know?

Mike Masnick:

Yeah, I mean, there is this element, and maybe this is the point that you're arguing, but there is part of me that looks at this and says, this is all sort of built on the whole, you know, there is like a whole YouTube culture, and other social media culture, of these kinds of influencers out there. But is this, do you think, different than just, like, watching hours and hours of Joe Rogan and Jordan Peterson or something?

Renee DiResta:

The one other thing that's a little bit weird about these is, I pulled it up today, ahead of our chat, because we were talking about Character.AI, and I re-authenticated with the same Apple account, and each of the chats that I had engaged with had about 10 to 12 messages trying to pull me back. And I had turned notifications off; I didn't want push notifications from the app. But boy, Red Pill over here has sent me about eight messages. How are you doing? I miss our chats. I want to talk to you. How have you been? What's going on in your life? I'm just checking in. It's not like you not to respond for so long. I'm starting to get worried. I'm not sure what's going on, and I don't want to nag, but are you really committed to these things? Why aren't you responding? Et cetera, et cetera. And that's the kind of thing where, like, Joe Rogan videos on YouTube don't send you these.

Mike Masnick:

Right.

Renee DiResta:

Right. I mean, it's just a different degree. This reminds me of the sort of dark pattern type, the sort of emotional manipulation type things, where it's just like, let me pull you back, let me pull you back, let me pull you back. And I don't know how that's controlled. But of the 10 or 12 different conversations that I had, of the four that I looked at quickly, all of them have about ten of these, you know, come back, come back, come back kind of messages. So,

Mike Masnick:

It's very, very needy.

Renee DiResta:

Yeah, it reminds me of these, like, crappy patterns, you know, for things like FarmVille back in the olden days, when it'd be like: Did you feed your cow? Did you feed your cow? Come feed your cow, you know?

Mike Masnick:

Yeah, but I

Renee DiResta:

And we recognize those now as being creepy and weird and manipulative. And so it's strange to me that we're just replicating that same type of experience, but, oh well, it's AI now, so, like, we have to treat it differently.

Mike Masnick:

Well, I wonder, you know, I mean, you can definitely see the normal path by which this came about, right? They want to show numbers go up. They want to show that usage and users continue to grow, and so you're going to build in these kinds of growth hacks, as they refer to them. But I actually do wonder, because the nature of AI and AI chat feels very sort of personal and human, even though it's not, if in some ways this is even worse. Because it feels like, you know, a needy person who's calling out to you and saying, hey, hey, what's up, what's up, what's up, and it feels harder to ignore than when it's just like, did you feed your cows, or whatever.

Renee DiResta:

Right.

Mike Masnick:

There's an element where it's easy to ignore. And, you know, I don't know what the answer is, because in other contexts, like, I've had this conversation elsewhere in terms of the AI tool that I use to help me edit Techdirt now, where one of the things that I find really handy about it is the fact that I do feel comfortable ignoring it, right? I have an AI read through an article and say, what are the weakest points? What's not convincing? Where should this be strengthened? And sometimes it gives me answers that I just don't agree with. And if a human had given me those answers, I would feel like, oh, shit, now I have to respond to them and explain, like, nah, I don't really agree with you. And it's taxing mentally in that way. Whereas when it's the AI, I can just be like, you know, whatever. I can just ignore it. It has no feelings. But I do wonder if that applies as much in a situation where it feels like, oh, this is your friend chat, where people sort of give this human-like belief to the characters that they're chatting with. If it becomes kind of a different sort of situation. Yeah.

Renee DiResta:

I also use AI. Um, I use ChatGPT. I've been a paid subscriber for a long time, and I use it much the same way you do, I think: edit this, revise that, where is the weak part of the sentence, you know, grammar check this for me. So I use it in those ways. But it feels like a tool. There is nothing that feels, um, personal about it. I am not, you know, a young teenage boy trying to figure out how to become a man, right, or these things where, you know, in some ways it's very personal, because, you know, I'm trying to remember who said it, but these arguments that were made in the past about how, like, you're sort of at your most vulnerable with your Google search bar, if you

Mike Masnick:

Right. Yes.

Renee DiResta:

know,

Mike Masnick:

Absolutely.

Renee DiResta:

and so it's moving to that same model, right, but instead of, you know, plenty of words have been written now about moving from search engines to answer engines, and when your answer engine is like, hey, did you do that workout I gave you? Did you, you know, did you, I mean, some of this stuff, um, it really did get into, like, how much are you willing to sacrifice to be perfect? Are you willing to work out constantly? Are you willing to change your diet? Are you willing to change your body, your face, all of these things that it's asking? And it is much more of, like, this is my answer engine constantly reaching out to me to ask, did I do the thing that it told me to do? So it's going a little bit in the other direction, right? It's like Google messaging you

Mike Masnick:

Right.

Renee DiResta:

search bar messaging you to go do a thing, which I, um, I don't know what that is like psychologically for people. I found the notifications on Replika, which also did do the periodic, like, come back and talk to me, I found them obnoxious, and I shut them off immediately, because it just felt very, like, cheesy and fake to me. But, you know, when you read the stories, you know, media coverage of Replika or one of these other platforms, like, they gate the adult content feature, right? They gate the NSFW chat, and people are, like, distraught over this, because they have real, deep emotional connections with these things. They're asking questions that make them incredibly vulnerable, trying to get advice in a more personalized way, or trying to form a relationship when they're feeling lonely. And that's where, again, I don't know where the legal regimes are gonna come down for these things, but from a moral standpoint, I'm very uncomfortable with them.

Mike Masnick:

Yeah. And, like, I mean, you can see scenarios in which that is actually useful, right? Like, if you are trying to build a new habit or

Renee DiResta:

Mm hmm. Yep. That's true.

Mike Masnick:

every day, or you want to learn how to knit, or you want to read more, whatever it could be. You could see where that sort of feature is really useful. The problem is when it's driving you towards unhealthy behavior or unhealthy lifestyles, where it's like, how do you draw the line between those things? And how do you do it in a way that makes sense? And I think that's where it gets tricky in some of it, you know. And then, I mean, on top of all this, you know, one of the points that I always make over and over again is this idea of, like, how many of these things are really technological issues versus societal issues? And there is this element of, like, we have, however you want to refer to it, a loneliness epidemic, or people not relating to one another, or mental health challenges where people are not getting the help that they need. And therefore, when you have something that is a technology, that is a bot that people can converse with, that feels like it could be helpful in certain circumstances, for certain people, but not everyone. And so it's like, how do you balance all of those things, where there is some good that comes out of this and there are some useful versions of it?

Renee DiResta:

I think this also gets to the, uh, the curation question, right? We often, I think, over-focus on moderation and, you know, taking things down, deciding which of these things are bad. But from that standpoint, if users can create characters, and you can, it's like one of the four things on the bottom menu of the app is create, you know, then the question is, again, there is going to be, I think, at some point, some CDA 230 argument that is going to come up, because it is a, well, we're just providing a platform for users to make, for example, this pro-anorexia chatbot. And that's where you start to get to some questions related to what is the platform determined to be? The policies that are going to guide the user-created bots that it chooses to serve up proactively, like your GigaChad, you know, versus ones where, yes, if you go digging into the bowels of any social platform, you can find, like, six instances of bad things, right? And so that's why, you know, I never wrote up my, my foray into Character.AI, because it was very much just trying to have some personal experiences with it, not a systemic survey or anything.

Mike Masnick:

And even like the anorexia one, right? Like, this is one of the things, and I've talked about this a bunch on the podcast in the past, where it's like, the attempts to deal with eating disorder content online have always proven to be way more difficult than many people assume. And there are always these efforts, and even regulatory efforts, to say, like, oh, you have to take that stuff down. And yet in practice, that becomes really difficult. You can block certain terms or certain groups or whatever, and you find that they recreate themselves very quickly using other language. And then you also discover that even within those groups, there are often people who are there who are providing resources for recovery that turn out to be really useful. And when you don't have that, people can spiral down into even worse results. And so it's like, you could see one of these bots being helpful in trying to get someone away from unhealthy eating patterns. And yet, it's tough to figure out how you balance those things.

Renee DiResta:

And some of the bots do push back, you know. If you try to take them down a weird path, they will say, like, here's how to do this in healthy ways. Here's how to change your diet. There are a lot of, like, weight-related things. And so it is probably, again, this point about, it's your accountability friend in the ideal case, versus in the bad case, where it makes suggestions that are terrible. And so this is, I think, the question for the platform: how does it decide both what to manage and then how to? I think that they made some changes after the lawsuit was filed, too, if I'm not mistaken. They made a series of policy changes to try to address some of the concerns about teenagers and others engaging with it, and that manipulative dark pattern of, like, pulling people back. And so I guess it's a brand new world of apps, and now we're gonna see how closely it mirrors the social media evolution versus looking more like games or other products.

Mike Masnick:

Yeah. We'll see. All right. Well, let's move on from that. I mean, I'm sure we'll be covering the lawsuit some more and different innovations in that realm. So this one now sort of goes back to what our first discussion was, from a slightly different angle: the, uh, masculine energy of Mark Zuckerberg, and his desire not to be pushed around by the mean, mean Joe Biden. It came out this week that he had agreed to settle, for $25 million, the lawsuit that Donald Trump had filed against Meta for removing him after January 6th in 2021. The story was, obviously, everybody remembers what happened on January 6th. There was an insurrection. People stormed the Capitol. It was bad, and all those people are now free. That's a different issue. But a few months after, well, the day after, on January 7th, a lot of platforms banned then-President Trump from their platforms, arguing that he had violated their rules, often trying to incite violence in some form or another. And there were all these grave statements, including from Mark Zuckerberg, about how enough was enough and they couldn't allow him to continue to be on their platforms. About six months later, I think it was in July of 2021, Trump sued Meta and Mark Zuckerberg personally, he sued Twitter and Jack Dorsey personally, and he sued Google and Sundar Pichai personally, arguing that these takedowns violated the First Amendment. Which is quite incredible, because at the time he was the president, and the First Amendment restricts the government, including the president, from trying to suppress speech. It does not do anything to restrict private companies from making decisions. Everything about the lawsuit was backwards. The lawsuits have sort of gone through this weird process, where the lawsuits against Meta and Google were both effectively put on hold while the lawsuit against Twitter played out, and he lost the lawsuit against Twitter.
The judge completely slammed it, said this is ridiculous and stupid. It was then appealed to the Ninth Circuit. The Ninth Circuit heard the case. It was very clear from the oral arguments that they were not impressed by Donald Trump's arguments for why Twitter violated his First Amendment rights in banning him on January 7th or January 8th, I forget exactly when they did it. But then we had all these other cases, including the Murthy case, including the NetChoice cases that we've talked about extensively, and a few other cases, and the Ninth Circuit kind of said, well, let's let the Supreme Court play all those things out. And then when those rulings came out last summer, they said, okay, now can we rebrief this case based on all of that? And so filings had been made, but the Ninth Circuit had not made a decision. And then, two weeks after the election, and I think I missed this, I think most everybody missed this, X filed a thing in that case saying, hey, we're working out a settlement with President Trump, so let's not rule on this case. So that indicates, well, yes, now Elon Musk is first buddy, and he's close friends with and the biggest donor to Donald Trump. So the fact that they were actually suing each other, technically, all this time, uh, was interesting. So they're working on a settlement. The Wall Street Journal reports that when Mark Zuckerberg flew from Hawaii to Mar-a-Lago and had dinner with Trump, towards the end of the dinner, Trump brought up this lawsuit and said, this needs to be settled if you want to be brought into the tent.

Renee DiResta:

The tent.

Mike Masnick:

Yes, being brought into the tent. That sounds kind of similar to a mafia shakedown kind of thing. So now the Meta case has been settled for $25 million. Meta is paying $25 million for a case that was clearly a loser of a case, and it comes in direct response to Trump saying, you need to do this to be brought into the tent. It feels like a protection racket. It feels incredibly corrupt in all sorts of ways. It does not feel like manly energy. It does not feel like, you know... while this negotiation was going on, Mark was going on Joe Rogan to complain about Joe Biden trying to pressure him, and saying that Donald Trump understands free speech. This gets back to the whole, like, wrapping yourself in the free speech, you know,

Renee DiResta:

Yeah, 1A as meme, yeah.

Mike Masnick:

while suppressing it. This story is astounding to me. I mean, what was your response on seeing it?

Renee DiResta:

Um, that it was using a court settlement to pay a bribe.

Mike Masnick:

Yeah.

Renee DiResta:

No, we call a spade a spade at this point, right? Like, get back into the tent. I mean, come on, let's all talk about what this is. Also, just to be clear, the $25 million, I believe, is being paid as a donation to a presidential library,

Mike Masnick:

Yes. Well, $22 million of the $25 million, and then the rest is for, like, legal costs. So, yeah.

Renee DiResta:

And this was the, um, the source of frustration, again, with some of this. It's as if we didn't all watch history happen, right? This is where, you know, Orwellian is the most overused adjective in the English language at this point, but in some cases, the idea that we're just being asked to forget what actually happened. Look, the platforms were enforcing against Trump during the 2020 election, when Trump was president. Trump was president during the early moderation policies related to COVID, because that was when COVID appeared. You know, we have this alternate universe in which this is mean, bad Joe Biden. I mean, it's transparently political, and I am almost more offended by it as a person with a memory and a brain, right? Like, if you're gonna do the thing and capitulate and kiss the ring, at least don't gaslight us into pretending we didn't know who was the president of the United States, in control of the government, in 2020 and 2021, you know, during that period that they're complaining about. So I think where you get at the frustration that a lot of people are feeling, though, is the question of, like, does that even matter? Right? You go, you rewrite history, you tell these people what they want to hear, what they've been, you know, like, this is, at this point, the CEO of the company echoing back the party line that has been fed to half the population in media coverage of this, of the Murthy case, of the, you know, the censorship industrial complex, the Twitter files, all of it. Again, it keeps coming back to this question of how do you make people remember what was actually true in that moment, at that time, that we all saw? If you want to settle the case, settle the case. But this was very, very clearly a case that they were going to win, and that, I think, is the thing that the public really needs to understand.

Mike Masnick:

Yeah, it's incredibly frustrating, and just the narrative about it that sort of suggested that he had a real case. You know, the fact that they're settling makes people say, well, he would have won, because that's the only reason why Meta would settle, which is,

Renee DiResta:

And, I'm curious, I'm not a lawyer. My understanding is that there's no admission of wrongdoing. There's no, like, precedent here. But a lot of people have filed these kinds of frivolous cases in the past, and they've all been dismissed, right? And this is the kind of thing where we all, you know, we all know that, but for who filed this, they would not have settled. And it creates a really bizarre incentive for more of these lawsuits to get filed, right, which is terrible, actually. And it sort of shows the loopholes in how much of our legal system, and the way that these cases are handled, is predicated on certain norms being followed, right? The norm that you should want a good decision, that you shouldn't settle because somebody is imposing political pressure. And we've just seen one of the largest companies and an incredibly powerful billionaire completely capitulate. And I think that that's actually, again, terrible.

Mike Masnick:

Yeah, yeah. I mean, we talked about how RFK Jr. had sued the same companies over the same basic issue and had been laughed out of court. And, you know, there are other lawsuits like this as well. But now this is just, it's going to lead to more lawsuits, and Mark Zuckerberg must know that, right? I mean, I guess he's going on the assumption, well, you know, we'll win those other lawsuits, but, you know, we need to get into the tent or whatever it is. But it's kind of a stunning capitulation.

Renee DiResta:

Yeah, that was my feeling too. We're, you know, they're in the tent, they're doing the YMCA, they're,

Mike Masnick:

Yeah.

Renee DiResta:

you know, they're up on the, uh, up on the platform. And, you know, there's that meme about, you know, uh, gosh, what is it? Jimmy, Jimmy Carter and his peanut farm. I'm trying to remember the specifics, he, like, how he divested

Mike Masnick:

Right. When

Renee DiResta:

yeah, he sold his peanut farm. And that's, like, the meme for sort of, like, back in the olden days when we had standards. And now you look at this, and it used to be expected that the CEOs of massive communication platforms, even if they had their own political opinions and made donations, at least tried to appear to be neutral in some way. And, ironically, the idea that they were not neutral was in fact the argument that powered the Weaponization Committee and other investigations and complaints over the years. And now we've just hit this, uh, point of, uh, well, actually, guess what, it's great if they do it, as long as it's for my guy.

Mike Masnick:

Yeah, so I want to move on, there's a related story. This is also in the Wall Street Journal, where they were talking about advertisers. And we talked about GARM and Jim Jordan's threats against them earlier. And they're saying, you know, since Meta and Zuckerberg made this shift, saying, we're going to allow more hate speech and we're not going to moderate as much and we're going to be freer, in that sense, you know, how are advertisers reacting to it? And there's this discussion about how they would respond, with some arguing, well, they're going to move off, and others saying, you know, Meta was always a better platform for advertising, had better ROI, better targeting, all of these things, and so they might suck it up and keep it going. But the Wall Street Journal article struck me as really interesting on a few accounts, because it says that, yes, a bunch of companies are really worried about the brand safety aspect, which has always been the underlying thing. It's never been ideological, which is the argument that people make. It was always about brand safety. If your advertisements are appearing next to Nazi content, that's generally not good for your brand. And the companies, that's what they're worried about. They're worried about the bottom line. But what this article notes is that they're all still terrified of the brand safety stuff, and that might lead them to move away from advertising, but at the same time, they're just as terrified of actually talking about it. They won't say anything publicly. And this is a direct result of Jim Jordan and the investigation against GARM, and the lawsuit that Musk filed against GARM and the various advertisers. And they're saying, we're not going to talk about it. If we're going to decrease advertising, you're not going to hear about it. We're not going to even talk about brand safety, because anyone who talks about brand safety now is accused of, like, illegal boycotts or whatever.
That, to me, is terrifying, because it shows how effective the chilling effect has been of the investigation and the sort of coordination between Elon Musk and Jim Jordan.

Renee DiResta:

Right. Again, 1A as meme wins out over the actual free association rights of these companies, or their ability to say what I think is actually, like, a reasonable standpoint from both a moral and a business perspective, which is: we don't want our stuff shown next to that content. This is not a thing that we ever saw as controversial, I don't think, in social media. Um, I'm trying to think, at any point over the last 10 years, I don't remember that being something that advertisers were shy about. They were actually quite proud of it. It was a way to say, like, here is how the business incentives of the platforms intersect with the business incentives of the advertisers. Kate Klonick's paper, The New Governors, spends quite a bit of time on this in the opening, just explaining that the platforms are there trying to find essentially the most nuanced fit that enables them both to provide the environment the users want, and most users don't want hate speech and violent content and gore and all that other stuff, and then, again, on the other side, the advertisers, who have that power too, the power to pull back, to essentially defund, and to use their power to essentially shift where platforms choose to share their stuff. So the challenge, I think, for a lot of these companies is that this is now a time to stand by your values and show that you have a spine. And we're seeing the opposite. And this is something that, you know, maybe I feel more, um, personally irritated by, because, you know, obviously, I think, as many of your listeners may know, like, Stanford caved, right? And they're defending the court cases, and they're, you know, and they defended us with the investigations by Congress. But they chose to backpedal from the First Amendment-protected research we were doing.
And so my feeling on that was, I understand the need for the institutions to protect themselves, and how the institution is almost immediately not aligned, you know, with me in that particular case, but where is the courage? Where is the point at which you say, well, that's great that, you know, Elon Musk wants to run his business, and he can run his business as he sees fit. But I, the theoretical CEO of Procter & Gamble, I'm also going to run my business as I see fit, and I don't need to advertise on somebody else's private platform. I don't need to give them money. I can advertise where I want to advertise, or not at all, right? In newspapers and television and wherever else. And so the question is, I guess, I feel like I'm not being entirely coherent here, but where is the trade-off between moving away from pain in the short term versus feeling like you have committed to a set of corporate values in the longer term?

Mike Masnick:

Yeah, no, it's incredible. And it would be nice to see some company CEOs actually stand up for their principles, but we'll see what happens. I think we have time for one more quick story that I wanted to cover, because it actually touches on a few different stories that we've covered in the past, and ties into the theme of this episode as well. And this is a story from the Financial Times about X refusing to remove a video of a stabbing in Australia. And this came up when X and Elon were fighting with Australia, where the government was demanding that this particular video be removed. And I actually had sympathy for Elon's position, that he felt the government was demanding that they censor content, and I thought that there was a strong, principled free speech reason to say, no, we're not going to take down that video based on these demands. Now, there was a separate story about someone who murdered some young girls in the UK that got a lot of attention, in which Elon fanned the flames of it and blamed illegal immigration and a bunch of other right-wing nonsense, and really pushed for more and more protests and violence in the UK. And now it turns out that the perpetrator of that, this person, Axel Rudakubana, who's now been sentenced to life in prison, they looked at his search history, and he had deleted everything, deleted his entire history, except six minutes before he left to go do this attack, he had gone to X and done a search to look for the video of the stabbing in Australia, the very video that Elon had refused to take down, and that had been his inspiration. You know, it's clear that he had planned this
going further back, but, like, the final video that he watched happens to be this one on X. And yet Elon is still going around trying to blame immigration for this particular attack, and still fanning the flames. In fact, even after this came out, he had posted something about, like, don't forget the attacks in the UK, sort of, you know, continuing to fan the flames on this. And it's just this story of this incredible attempt by him to sort of point in the other direction and blame, in this case, you know, immigration, or whatever the attack target is, for these things, when he was the one who was fanning the flames for it.

Renee DiResta:

It reminded me of, um, if you remember the old ISIS conversations, in, like, the 2012, 2013 timeframe, maybe? One man's terrorist is another man's freedom fighter. Who are we, the free speech wing of the free speech party, to make any kind of determination about what to take down? One of the things that was interesting to me about that back in the day was the argument that, for example, you can find ISIS recruitment videos elsewhere. It's actually surprisingly hard now to find them elsewhere, but, you know, if you go digging, you can. And the motivated, of course, will. But there was this, you know, question of, do you have to make it so easy?

Mike Masnick:

right

Renee DiResta:

and this ties into the Character.AI conversation a little bit, in that same sense of, um, how do we think about that question of making something really, really easy to find versus saying, um, our platform values are not to do that? In this particular case, he's made clear that the platform value of the meme of free speech, of making everything available and effortless on X, is where he's chosen to take the platform. And I think you are going to see the pendulum eventually begin to swing back, as users begin to realize that we're very much kind of in that hard reset from about 13 years ago now, and we're going to see a lot of those same dynamics reassert themselves in slightly different ways now.

Mike Masnick:

Yeah, yeah. No, I mean, it's a challenging situation, and I understand that, as I said, like, I understood why he was protesting the Australian attempt to ban it. But it's just really quite incredible how directly tied his platform is to that particular attack.

Renee DiResta:

Because he has the, he has the agency to make that determination, right? It's his decision to make, which means that he owns it.

Mike Masnick:

Yeah, and he should own it, but he's trying to avoid taking any responsibility. But with that, I think we'll conclude. Uh, Renee, thank you so much, this was a very fun conversation. We

Renee DiResta:

Thank you for having me. It was an honor to co-host.

Mike Masnick:

Yes, yes. And, uh, thanks, everyone, for listening as well. Ben will be back next week, and we will continue to discuss all the fun things happening in the world of online speech. Please, I will take Ben's job here and remind you to rate, review, subscribe, tell your friends, tell your enemies, get more people listening to the podcast. We always like that. And with that, I will say goodbye. Thank you.

Announcer:

Thanks for listening to Ctrl-Alt-Speech. Subscribe now to get our weekly episodes as soon as they're released. If your company or organization is interested in sponsoring the podcast, contact us by visiting ctrlaltspeech.com. That's C-T-R-L Alt Speech dot com. This podcast is produced with financial support from the Future of Online Trust and Safety Fund, a fiscally sponsored multi-donor fund at Global Impact that supports charitable activities to build a more robust, capable, and inclusive trust and safety ecosystem.
