
Ctrl-Alt-Speech
Ctrl-Alt-Speech is a weekly news podcast co-created by Techdirt’s Mike Masnick and Everything in Moderation’s Ben Whitelaw. Each episode looks at the latest news in online speech, covering issues regarding trust & safety, content moderation, regulation, court rulings, new services & technology, and more.
The podcast regularly features expert guests with experience in the trust & safety/online speech worlds, discussing the ins and outs of the news that week and what it may mean for the industry. Each episode takes a deep dive into one or two key stories, and includes a quicker roundup of other important news. It's a must-listen for trust & safety professionals, and anyone interested in issues surrounding online speech.
If your company or organization is interested in sponsoring Ctrl-Alt-Speech and joining us for a sponsored interview, visit ctrlaltspeech.com for more information.
Ctrl-Alt-Speech is produced with financial support from the Future of Online Trust & Safety Fund, a fiscally sponsored multi-donor fund at Global Impact that supports charitable activities to build a more robust, capable, and inclusive Trust and Safety ecosystem and field.
Ctrl-Alt-Speech
Chief Equivocation Officer
In this week's round-up of the latest news in online speech, content moderation and internet regulation, Mike and Ben cover:
- X takes Indian govt to court, alleges arbitrary censorship of content (Business Standard)
- India criticises X for calling compliance website a censorship tool (Reuters)
- Musk’s X suspends opposition accounts in Turkey amid civil unrest (Politico)
- Elon Musk pressured Reddit’s CEO on content moderation (The Verge)
- Snapchat CEO Talks Zuckerberg, Content Moderation, AR Glasses and More (SocialMediaToday)
- YouTube CEO on content moderation: ‘Where the world was five years ago is very different’ than today (Semafor)
- Porn on Spotify Is Infiltrating the Platform’s Top Podcast Charts (Bloomberg)
- Ofcom fines provider of OnlyFans £1.05 million (Ofcom)
- A New Social Media App Punishes Users for Rage-Baiting (Wired)
This episode is brought to you with financial support from the Future of Online Trust & Safety Fund.
Ctrl-Alt-Speech is a weekly podcast from Techdirt and Everything in Moderation. Send us your feedback at podcast@ctrlaltspeech.com and sponsorship enquiries to sponsorship@ctrlaltspeech.com. Thanks for listening.
Ben Whitelaw: As regular Ctrl-Alt-Speech listeners will know, Mike, we start every week of the podcast with a phrase or prompt that you might see if you open an app or a kind of web service. And we've been doing this because we think these little bits of text and taglines prompt us and make us interact and engage. This week we could not do Signal, right? Everyone has been talking about Signal. Even if we've done it before, I can't remember. But Signal doesn't have a prompt or tagline in the same way. So we're, we're stealing a bit of marketing material from one of their online pages. And so this week I'm prompting you to speak freely.
Mike Masnick:Well, I would just like to say that if any random CEOs of tech companies want to accidentally add me to their group chats and, uh, tell me what they're thinking about content moderation these days, I, I, I would not be opposed to that.
Ben Whitelaw:We're gonna talk about a few of their CEOs today
Mike Masnick:I, I think so. I think so. And what and what about you, Ben? Uh, can you, uh, speak freely for me please?
Ben Whitelaw: Well, I was speaking freely at the Marked As Urgent event that we had in London last night. I'm a little bit worse for wear this morning as a result. Uh, it was a super fun evening, but I managed to incorporate, you'll be unsurprised Mike, my renovation into my presentation about content moderation. So I spoke freely both about moderation and renovation and yeah, I got a few comments and a few jokes about it, but it was fun.
Mike Masnick: And you didn't, you didn't end up winning the, uh, Twitter sign from last week's discussion as part of your home renovation.
Ben Whitelaw: Not this time. Not this time, unfortunately. Hello and welcome to Ctrl-Alt-Speech, your weekly roundup of the major stories about online speech, content moderation, and internet regulation. It's March the 28th, 2025, and this week's episode is brought to you with financial support from the Future of Online Trust & Safety Fund. This week we're talking about Elon Musk's courting of strongmen, porn appearing on platforms you might not expect, and rage baiting. My name's Ben Whitelaw. I'm the founder and editor of Everything in Moderation, and I'm with a man who's in a lot of Signal chats, and maybe lurking in one that you're in too, Mike Masnick.
Mike Masnick: Please, please accidentally add me to your Signal chats.
Ben Whitelaw:Yeah, I'm presuming you've never had this done to you before. I was gonna ask you.
Mike Masnick: I have never had that happen. I am, as you note, I'm in quite a lot of Signal chats. It appears to be the group chat app of choice for lots of people. But I have never been accidentally added to a group chat,
Ben Whitelaw:What a wild story.
Mike Masnick: of the nature of, yes, the US government right now.
Ben Whitelaw: What a shame. Such a shame. So yeah, if anyone's listening and wants to add Mike, feel free to do so. I've got some congratulations to give to you, Mike. I saw on the internet that you have been awarded a special award this week. The, uh.
Mike Masnick:I think, I think technically I don't get the award until October.
Ben Whitelaw:Okay. Okay. But you, but it's been announced. You don't, you don't
Mike Masnick: It's been announced.
Ben Whitelaw: Talk us through it. The, the Protector of the Internet award.
Mike Masnick: Yeah, yeah, I've, I've won the Protector of the Internet award. I've asked them to give me a shield to go with, with the award. Uh, it's from the Internet Infrastructure Coalition, which is a really good group. I've done various projects with them in the past, I've spoken at their conferences in the past, and they, I think only recently they've started doing this. This is not the first year that they've done the Protector of the Internet award, but they've been doing it for a little while. They have a big thing where they fly a bunch of their members into DC and they meet with Hill staff and stuff, and then they have an award ceremony where they give out awards to a few people. And this year, very nicely, uh, they have given me the Protector of the Internet award. I forget who won it last year. They had told me, but I'm, I'm sort of forgetting now. But it's a very, it's a nice, nice, you know, little, uh, thing. Uh, and yeah. So that'll be in October.
Ben Whitelaw:So you'll get to dress up and receive your shield.
Mike Masnick:Yes. Yes,
Ben Whitelaw: Excellent. Well, you might remember last week that we had a call out for some podcast reviews, and I'm, I'm glad to say, Mike, we've got some very funny listeners, 'cause the bar for submitting a review was not hating the podcast.
Mike Masnick:That, that this is, let's be clear here. This is what you told them. You told people, if you don't hate the podcast, please review it. And our listeners took us up on that with that specific prompt.
Ben Whitelaw: Yeah. With glee. With actual glee. We had three reviews, which is three more than we've had in the last six months. So thank you to those three listeners, and all of them said in their review, in some form, "I don't hate this podcast."
Mike Masnick:Which is
Ben Whitelaw: Which is great, which is, and I'm very glad to hear that. So, thank you to those people who've left a review. For those of you who haven't left a review, or who left a review a long time ago and want to update how you feel about the podcast, please go and leave a few words and a star rating on your podcast platform of choice when you can. It really helps us reach our tentacles out into the wider world. And, uh, if you want to leave any other kind of coded messages within your reviews, please do, and we'll, we'll try and discern them and, and filter out what it is that the, the secret language is. Really, really, really appreciate it. And, uh, on that note, we'll crack on with today's stories. We're gonna start with a familiar figure, Mike, but in a, in a, in a context that is slightly unusual to him and to us. You found a few stories about Elon Musk and his, courting of a couple of strongmen.
Mike Masnick: Yeah, this, this has been interesting. I think, you know, last week we tried to avoid talking about Elon, but you can only avoid talking about Elon for so long before he must enter the conversation. I think it's part of the, you know, contract he has with the world at this point: he must be the main character. And so yeah, there were a couple of interesting stories, and it sort of started with the fact that X decided to file a lawsuit in India, basically complaining about the nature of the, content moderation demands from the government. This struck me as interesting on a, on a few levels. First being that we had spoken about the fact that old Twitter had also sued the Indian government over its demands. And in fact, that had been sort of an ongoing fight, and at one point the Indian government had raided Twitter's offices in India. Nobody was there 'cause I think it was in the middle of Covid. And there was all this sort of back and forth and pressure, and then Elon Musk had taken over Twitter and declared that the old regime was all into censorship and stuff. And then almost immediately he started obeying a bunch of the orders coming from the Modi government to take down speech of critics of that government. And we called out the fact that, hey, this looks bad if you're going around saying that you are against censorship and then you're obeying the censorship commands from the, from the government. It raises questions about how committed you are to that, especially when the, the former administration that you claimed was anti-speech was willing to fight them in court. And then there were a bunch of stories for the, the next two years, basically, of X being willing to take down critics of Modi over and over again. There was a documentary, there was a, a few other things along the way.
So just the fact that he is now stepping up and decided to sue is interesting and different, because that's a big shift. And, and as we've seen over the last few years, the thing that became clear was that mostly, Elon was willing to fight when he disagreed publicly with the government. So if it was a more left-leaning government, then he would go on Twitter or X and declare himself to be a free speech martyr and talk about how, oh, you know, the awful Brazilian government is trying to get me to take down stuff, the Australian government is trying to get me to take down stuff. But when it was a more right-leaning authoritarian government, Turkey and India being the sort of classic examples, he seemed to be willing to go along with it. So it was really interesting to see him shift in India, at this moment too, because this was also a moment where his other companies are trying to get into India.
Ben Whitelaw: Yeah. So before we get into that, just talk us through the kind of actual case that he's bringing against the Indian government and what it leads into.
Mike Masnick: So this is actually really kind of interesting. So, Indian internet law has been kind of back and forth over the last, like, decade or so, and there had been lawsuits on this, and I'd written about this years ago, where they have these different IT Acts that sort of lay out the intermediary liability questions around content moderation. And for a while it actually looked like the Indian law was actually going to kind of match Section 230. But then that upset some people in the government who wanted content to be easier to take down. And so it shifted in a pretty drastic way and made it so that the government had a lot more power to sort of order content to be taken down. And so what X is now doing is challenging the sort of latest version of the law. It's Section 69A of the IT Act. And they're saying that that violates free speech rights, because it basically creates a way for the government to send information to the platforms that they say have to be taken down. And the way the law is currently being interpreted, and again, this is after a few different court cases and a few different challenges and changes, the way it is being interpreted is that if the government sends you content that they believe should be taken down, you really will get in trouble if you don't. And we saw that again with the, previous regime and Twitter, who did try to fight this in court and eventually ended up losing. Now the government has responded to this, and I thought this was really interesting. And this response came out just before we started recording, to be honest. And they're presenting it as a very different thing. So in the lawsuit, Elon is referring to, or not Elon, but X and their lawyers are.
Ben Whitelaw: Love it. I'd love it if he wrote his own lawsuits.
Mike Masnick: That would be, that would be quite something. But, um, they're referring to it as a censorship portal, basically saying the system that is set up to have the government send requests is a censorship portal, which probably is actually a fairly accurate description of it. The government is pushing back and saying, no, no, this is not a censorship portal. This is just a website that allows us to notify you of harmful content. This struck me as really interesting, because this has been the debate that we've had in other countries, and in particular in the US and the whole thing with the Twitter Files a few years ago when Musk took over, which is that Twitter had set up various portals to allow governments or government officials to alert them to content that they believe might be violating the terms of service. And this is, there's like a very specific distinction in here that is important, which is that the US system, and, you know, the way it works, is that certain actors have the ability, they have access to a portal where they can submit stuff, but it still is up to Twitter, or the company, you know, to decide: does this actually violate our rules? The point is, it's like flagging content, and it is a, a more trusted flagger because it's coming from the government; it'll be reviewed in order and determined whether or not it actually violates. And as we saw with the actual details that came out later of the Twitter situation, Twitter often would reject those and say, this doesn't actually violate our terms of service, and we will reject them. And so the Indian government is sort of presenting this as the same thing. Like, this is just a way for us to alert you. And I think they're deliberately sort of mimicking the language that was used in the US to present it as not as threatening or problematic. Whereas what X is claiming, and what I actually think is probably more accurate, is that under Indian law,
unlike in the US, when you get a request from the government through this particular portal, under this particular law, you feel very strongly compelled to remove that
Ben Whitelaw:Hmm. Okay.
Mike Masnick: And so, I find it interesting that India is sort of now using the language that was used to describe the American situation, where it wasn't censorship even though some people claimed it was censorship, including Elon Musk, to now reflect the situation in India, where there is much more clear government coercion as part of this process. And now we'll see how it plays out in the courts. Though it is very strange that at a time when Elon is seen as fairly close with Modi and has been doing deals for his other businesses, including both Tesla and Starlink, for him to suddenly decide that this is a fight worth fighting.
Ben Whitelaw: Yeah, that's what's really interesting, is that Elon Musk and Modi met in Washington in February, and I remember the photos, there's a bunch of photos that were published of him with Modi and some of his kids as well. It was a very kind of odd photo op. Dunno what they were trying to achieve, but clearly a kind of photo moment. And then, as you say, he's trying to kind of expand his business interests there. Do you think that this court case is being used as leverage in the push to expand into India with Tesla and increase the market share of, of Starlink?
Mike Masnick: I have no idea. I mean, it, it sounded like he was getting those deals anyway, so I'm not, I'm not entirely sure that this is beneficial. So I'm, I'm honestly a little confused by it. You know, I'm wondering if more information will come out at some point that something else came up within this process. It strikes me as a weird way to have leverage, because I'm sure Modi is the stronger player in, in this situation. You know, Musk needs access to those markets for his other companies. So it strikes me as strange, just also the fact that, you know, for the last two years he's been willing to go along with Modi's demands. So I'm not sure what pushed him over the edge here. And in fact, that leads into, you know, one of the other stories that we did wanna mention here, which is that in Turkey, which is the other place where Musk has shown a, a willingness to roll over and do what was demanded, he's been taking down the accounts of various activists and opponents of the Erdogan government. And so again, we've seen it over and over again: when Musk is in agreement and aligned with the governments of these authoritarian countries, he seems to have no problem pulling down content that the government is criticizing, and he did so again this week in Turkey. But in India, suddenly he's challenging it. So it's, it's a very strange situation and I, I really have no idea why.
Ben Whitelaw: Yeah, it's funny in a way that our main story this week is the fact that Elon Musk doesn't have a consistent approach to speech. And that's, it's a, it's a story that isn't a story in a way, but it, it's, there's enough there, I think, to kind of bring it to listeners and explain that Musk continues to be very hard to predict when it comes to understanding why he says one thing and does something else. And not only that, but why that seems to change from almost month to month.
Mike Masnick: Yeah. I mean, there's clearly no consistency, and, and so I'm sure there's some other reason why X suddenly decided it was worth fighting this. But this one doesn't seem to fit the same pattern where you can sort of easily slot it into, well, he likes this government, he doesn't like that government, or he needs some other thing here. This one is just surprising, and I, I almost wouldn't be surprised if this lawsuit gets dropped very quickly if, because of other back-channel discussions, Musk is like, hey guys, knock it off. I mean, maybe is it that he was distracted because he is running the US government and wasn't the one to make this decision? And then once this gets back to him, he will change his mind. I don't know. But it's, it is a slightly surprising development.
Ben Whitelaw: Yeah, it's not the only story in which he appears this week, Mike, and we both spotted, as, as is always the case, we both spotted a story in The Verge this week that told of Musk back-channeling, again, the CEO of Reddit, Steve Huffman, about some issues that he had with moderators on Reddit taking down and blocking links to X. And so this is a kind of story that picks up from last year, but has been recently reported this week, about how, essentially, these two CEOs were texting each other, probably sat back on the couch chilling after a long day of thinking about how they were gonna take over the US government. And, and he was kind of complaining about this development, and subsequently Huffman banned some of the subreddits, deleted all of the comments. And so what did you make of this as an indicator of, Musk's again, inconsistent approach to speech?
Mike Masnick: Yeah, I mean, it's yet another example of his pure hypocrisy, right? And so, you know, Huffman and Musk have been friendly for a while, and Huffman in the past has clearly been inspired by some of Musk's actions and sort of, you know, freeing up other tech CEOs to be a little more aggressive in their viewpoints. This, it's just crazy, right? Because we know that Elon and X have blocked links to all sorts of competitors for various reasons, sometimes for a while, right? They were slowing down links to Substack for a while, they were blocking links to Mastodon, they were closing accounts of people. Overall, X has completely down-ranked links, because Musk wants to keep people on the platform. And so he's admitted, a few months ago he finally admitted it, people sort of recognized it, that posts with links don't get rated as highly in the algorithm. You know, so he's clearly done things to try and keep people within his platform. To then go and complain to Huffman that a few subreddits had decided, as a, as a kind of protest, that they were no longer going to include X links, or links to X would no longer be allowed in those subreddits, that this was some sort of major breach that needed Huffman to step in, it suggests a level of hypocrisy, which is not uncommon with Elon Musk, but it does seem notable. The other element of it was, it wasn't just the blocking of links to X that he was concerned about. He was also concerned about people calling out DOGE employees, the various kids that Elon has brought into the federal government who are wreaking havoc all over the place, and he was upset that some of those, the people were being named or talked about. There's a claim that it was like advocating violence against them, though I think that was somewhat exaggerated. People are saying, like, even naming them is advocating violence against them, which is not accurate. And so he, he seemed to be partly upset about that.
And then as part of that discussion, he was also apparently upset about some of the subreddits and the moderators within those subreddits deciding that they weren't going to allow links to the former Twitter,
Ben Whitelaw: Yeah, it does, it does make me think that I should be throwing my weight around via text a lot more. Do you know,
Mike Masnick:Apparently that's the way, uh.
Ben Whitelaw: I never think of it as a, as a weapon in my armory, but, um, it's making me think that there's maybe some, some people that I can get to do something for me, uh, via
Mike Masnick:Well see if you can get into the chat with the officials from the US government and start throwing your weight around there.
Ben Whitelaw:that would make for a good podcast next week. Um.
Mike Masnick: Yeah, yeah. Telling JD Vance to shut up would be, uh, that's a start.
Ben Whitelaw: Yeah, bring me a coffee, JD. Um, so, uh, we're, we're gonna talk a bit more about CEOs of platforms now, because I, I've kind of been listening to a couple of podcasts with some CEOs of platforms this week, and there's a really interesting difference in how they talk about content moderation. Let's start with Evan Spiegel, the CEO of Snapchat, who was on Diary of a CEO with Steven Bartlett this week and talks at length about a whole range of different issues, with a, a little segment about both Snapchat's approach to content moderation and also other kind of tangential issues, like how Meta is thinking about content moderation as well. And we're gonna play a bit of a clip for you. This is kind of technical wizardry that we haven't tried on Ctrl-Alt-Speech before, but you're now gonna be able to hear a little bit of that interview with Steven Bartlett, because I think it's a really good response, uh, and a really good interview around some of the issues we talk about here all the time on Ctrl-Alt-Speech. And there's a great example, I think, of a CEO who, who gets it, to a large degree. So have a listen.
CLIP: Evan Spiegel: You know, when you, when you think about self-expression, the importance of self-expression, the environment that you're in really matters, right? And that's why we have content guidelines, because we want people to feel like they're in an environment where they can express themselves. And I think some of the, the conversation about different content guidelines, or having content guidelines or not having them, has been really interesting, 'cause I think people are missing the broader point. If you have a, a platform with no content guidelines and it's full of people yelling at each other or saying really mean or offensive things or posting a lot of pornography, that's a really uncomfortable thing for most people, right? That's, that's uncomfortable. You say, oh, maybe this platform isn't for me. Maybe I don't feel comfortable expressing myself here, because all this stuff I'm seeing isn't really appropriate or, or aligned with my values. And so one of the things we discovered really early on is if you want to create a platform where people feel comfortable expressing themselves, feel comfortable communicating with their friends and family, having content guidelines is really helpful, because it means that the content experience is one that, that feels more comfortable,
CLIP: Interviewer: But isn't that, people would say, well, that's censorship? I'm thinking now of the video that Mark Zuckerberg released about Meta's changes to their moderation systems, moving to Texas, realizing that, I think he said that they'd over-indexed with their moderators in terms of left-leaning politics, so a lot of the right-leaning content had been censored. What do you make of that argument for content moderation, that we don't wanna censor people?
CLIP: Evan Spiegel: I think it's a misunderstanding of the, the First Amendment and, and how it applies. If we look at our country, uh, the way, you know, at least here in the United States, with the First Amendment, that really focuses on the way that the government interacts with content creators or content publishers, and it says, hey, it's not okay for the government to interfere with individuals' or publishers' self-expression, right? That's not allowed. Mm-hmm. But one of the things the First Amendment also does is say, you know, platforms or individuals can make choices about what sort of content they want to promote or want to have on their platform. That's part of the First Amendment. You can't force the Wall Street Journal to, you know, put this article or that article, or accept any article from any author all around the world. The Wall Street Journal as a paper can decide what, you know, what authors, you know, it wants to include on, on its pages. And that's part of the protected First Amendment expression we have here in this country. So this whole notion of, of censorship doesn't apply to, to companies that are private businesses that actually have a First Amendment right to decide what content is on their platform. And they may want to decide, we're open to literally anything, anything goes, no problem. And it seems like some platforms are making that choice, but other platforms like ours say, hey, in order to have a healthy set of discourse across our platform, in order to make sure people feel comfortable when they're viewing content on our platform, we don't want people to come across pornography, for example, or violent content or, you know, hateful content. That's not something that makes people feel good. And we actually want to make sure that, that that content isn't on our platform, because it doesn't comply with our, our guidelines.
And that may be one of the reasons why, in some of these studies, it shows that people feel better when they use Snapchat, 'cause they're not encountering, you know, really violent, uh, content when, when they're using Snapchat.
Ben Whitelaw: So listeners have heard Evan speak there about how Snapchat approaches content moderation. Mike, I mean, I think his response to Steven Bartlett around the First Amendment was particularly interesting, and I wanted to note the fact that Bartlett has said a, a bunch of things in recent memory around content moderation in relation to Meta. So dunno if you remember, he posted on his LinkedIn page, which has many, many millions of followers, about the fact that Meta's move away from fact-checking and its changes to its content moderation policy represented one of the most important videos that people will see this year, and a course correction to what he seemed to suggest, and it was slightly coded, but seemed to suggest was a kind of overreach around content moderation. And, and I've listened to a few kind of Steven Bartlett podcasts. I don't love the guy, but he has a slight tendency to kind of veer into the manosphere, I find. Um, it's interesting that he quizzes Spiegel quite openly about his content moderation, censorship. Spiegel has a really good response, I felt. Were you surprised by how well he handled that and, the way he seemed to understand it?
Mike Masnick: Yeah. I mean, partly and partly not. I, I, partly not because it was a great answer. I mean, it's an absolutely fantastic and very thoughtful and correct answer: understanding that this is not a First Amendment issue, that values determine what kind of community you want to build, and that is what users appreciate as well, and that there are reasons to do this that have nothing to do with censorship, but just what kind of community you're trying to build. I think it was a, a fantastic answer. Spiegel's been the founder and CEO of Snapchat for a while, he's gone through a bunch of these fights and, and arguments, and they've been involved in some of them, and I feel like he has a, a really deep grasp. So I'm not surprised in that he gets it right. The only thing I'm surprised in is that, like, it feels like every other CEO in the tech space no longer does. And if anything, like, Spiegel had the reputation historically, and this is probably unfair, that he was, you know, he, he was a little bit more of like a frat boy, not really deep in the policy weeds on these things. And yet this answer suggests someone who's really thought deeply about these things and actually has a deeper understanding of it and is willing to explain it clearly, and not, you know, do what a lot of CEOs do, which is kind of deflect and mislead and sort of dance around it. And he was just very direct. He's just like, this is not a First Amendment issue. It's a values thing. We wanna build a community. This is what our people expect, this is what our users expect. And this is the kind of thing that we've decided, that this is what our values are based on. And it was, you know, fantastic and clear, and I appreciated it. And you can hear, I really have, you know, I think I've maybe heard of Bartlett, but I've never seen any of
Ben Whitelaw:Oh, really?
Mike Masnick: videos before. I, I was not really familiar with him, and I only watched this one little section, you know, a little bit longer than the clip that we played, of him talking about the content moderation stuff. But I did get that sense even in the way that he frames the question, when he talks about Meta, and he was just like, well, you know, Mark Zuckerberg made this decision because they were taking down too much conservative speech and they had to move to Texas for it. Which is, like, we know is not true. But he, he seemed really bought into the narrative of what happened. And so it was really nice to see Spiegel just kind of push back on him.
Ben Whitelaw: Yeah. And Bartlett has a kind of Zuckerberg aesthetic, doesn't he? He's, he's got the kind of black T-shirt. He doesn't quite have the, the gold chain, but, um, you know,
Mike Masnick:needs the Latin phrase on the t-shirt.
Ben Whitelaw: Yeah, exactly. Exactly. And Bartlett has, kind of, had some focus on him for showcasing and highlighting kind of health misinformation on the podcast as well. So, in the UK at least, he's incredibly well known. He's incredibly well listened to. He's got various books out, and he does seem to kind of, I would say, spotlight some slightly odd "health experts", inverted commas. And so again, you might expect Spiegel or anybody on the podcast to somewhat side with him, and actually I thought Spiegel did a great job of kind of standing his ground. One guy who didn't do a very good job of that was our next CEO, who was on a podcast this week: Neal Mohan of YouTube. He was on the Semafor podcast Mixed Signals with Ben Smith and Max Tani. We're gonna play a bit of a clip now of how he was responding to some questions about YouTube's content moderation policy and some of the tensions around the US administration and the kind of clash in ideals. You'll notice, I think, in this clip, a very different sound, a very different tone, and he's not only defensive, I'd say, but also quite evasive in his answers.
CLIP: Interviewer:You've said that the number one priority for YouTube is the safety of YouTube's ecosystem. And we're in a moment when that's actually like a slightly unusual thing to say, and a lot of platforms are really backing off anything like content moderation, probably because of pressure from this, this White House and this administration. And I wonder if, if you feel like you are, there's tension between you and the, and the administration on, I guess, particularly issues around public health.
CLIP: Neal Mohan:Hmm. Um, I'll say a few things. Um, first and kind of probably most important and kind of really at the top is, um, everything that we talk about, everything we just talked about in terms of the business, um, how content works on our platform, et cetera, is back to our mission statement, which is to give everyone a voice and show them the world. And the first half of that mission statement is really about, um, free expression and freedom of speech. And I can say for myself, and I know for many of the colleagues that I work with every single day, that's why we come to work. Like that's the power of YouTube, right? Like that. If you have an idea, you have a thought and you wanna share it with the world, then YouTube is a place where you can go and share it without somebody telling you that you don't sound the right way or you don't look the right way, or you're saying the wrong thing or what have you. And that is core to our mission. And everything that we do is ultimately, frankly, in, in service of that. And so, um, it's, it's the reason why actually, I think we've had community guidelines from the very early days, um, and, um, and in order to allow creators to be able to share their ideas, have this free sort of voice, freedom of expression, and to earn a sustainable living from it, we also have rules of the road in terms of how our platform works, right? Like no porn or adult content or financial scams or what have you, right? Like back to the question around like when you turn on the TV, like that's not what consumers are looking for when they turn it on. Mm-hmm. And advertisers, right? The brands that support that content aren't looking for it. Uh, and so, um, our approach to responsibility is with all of that in mind, right? But ultimately towards this goal of, of a freedom of expression. That's how I've always looked at it. Um, you know, and even in years past when you and I've talked about it, hopefully I've been consistent in terms of, of that, that sort of core thesis.
Ben Whitelaw:So, yeah. Mike, what did you think about how Mohan responded to Ben Smith's kind of very pointed questions?
Mike Masnick:Yeah, I mean, this is the more typical, unfortunately, the more typical CEO response that we hear on these kinds of questions, where someone doesn't want to come out and say anything that then will be taken up by, you know, probably a bunch of idiots on various social media platforms, taken outta context and presented as something to rally around. And so he says a lot of nothing, and he does so in a very sort of defensive way, and doesn't directly address the issue. And he could have, right? I mean, it would've been great to have him come out and respond the same way that Spiegel does. And so, you know, I think the contrast between Mohan and Spiegel is really, really notable. And it's, you know, it's unfortunate for Mohan that they both came out around the same time, but it's just such a different answer to effectively the same question.
Ben Whitelaw:Yeah, and, and also I noted the fact that Mohan's responses have changed significantly in the last two to three years as well. It was only, I think, probably three years ago, and I know the world has changed a lot in that time, but he gave an interview to Casey Newton at Platformer in which he talked about working with other partners in the space, civil society organizations, nonprofits, kind of experts, to help remove borderline content, and that probably at that time made sense. You can tell from the clip that that idea is dead, you know; he's very much just doing the bare minimum and, and, and no more.
Mike Masnick:Well, I think in some sense it's even worse than that, right? Because this kind of answer is a not-trying-to-say-anything answer. It's an answer that's designed to try not to get anyone upset by not actually saying anything. And it's a lost opportunity. It's an opportunity where he could come out and say the same things that Spiegel said, which is that it's our place and we get to determine how things are. And like, yes, we're trying to enable free speech, but one of the best ways to do that is to have a setup that reflects values, and where people don't feel harassed or feel that there is misinformation flowing there. And he didn't say that. And so I think it's a lost opportunity.
Ben Whitelaw:But maybe an opportunity gained, in the sense that you can now send this to folks in the Republican party in the US and, and justify, and you're shaking your head now, but, you know, justify like, "I've said publicly something that you agree with and therefore I'm..."
Mike Masnick:Right. I mean, it's not even, it's not even saying that. Right. It's not even saying what the Republicans want him to say. It's, it's saying nothing. That's the problem. Right. You know, I mean, with Zuckerberg, at least... okay, so yes, you could say that this isn't the complete capitulation that Zuckerberg going on Rogan and saying a bunch of nonsense was. That was just obviously completely ridiculous, untrue, fantasy-land stuff that the Republicans got excited about. Nobody's gonna get excited about this. What he said was nothing. It was empty.
Ben Whitelaw:Yeah. But sometimes the bare minimum is all these guys wanna do, isn't it?
Mike Masnick:I mean, sure he can point to it, but I don't think, this would satisfy anyone. I don't think it satisfies anyone on any side of this debate, because it's not, the full throated endorsement of any particular position. It is clearly like trying to, tiptoe around landmines.
Ben Whitelaw:Do you think that the US administration will have the potential to go after Snapchat, or put pressure on YouTube, because they're not saying what they would like them to, they're not kind of toeing the line, or...
Mike Masnick:We will see. I mean, you never know who the next target is gonna be. You know, obviously there are investigations going on now with the FTC that we've talked about, where, you know, they want to go after tech, but we don't know who it is they're going after, because most of the tech companies have sort of toed the line and kissed the ring. And so it's unclear. So maybe Snapchat just becomes a target, but I don't know. They haven't really been... you could see where it's like, it's been really easy, because Snapchat is often considered one of the ones that kids use. And so whenever there are kids safety discussions, Snapchat will often come up. So I could totally see a kind of moral panic begin around Snapchat, and they'll say that they're not handling kids well, and that they'll do stuff around that. But I don't know, it's impossible to predict with this administration.
Ben Whitelaw:Yeah, nonetheless, I think it's interesting to see two CEOs of two major platforms talking about content moderation in the same week. And it does have the sense of these platforms trying to shape the narrative and be on the front foot with how they talk about this topic. Spiegel talks about proactively scanning for pornography in a way that Spotify, in our next story, might have something to, uh, learn from. You found this story about Spotify being unprepared, let's say, for some graphic content on the platform.
Mike Masnick:Yeah, this was, this was almost hilarious, right? So anyone who's been in this space for any length of time knows that if you have any kind of user generated content, like, at some point you're going to have to deal with pornographic content and have a clear policy and a way to enforce it. That is even true with text, but especially as soon as you get to video or imagery. You know, Spotify has always been audio for the most part, and music, and so that was less of an issue. They've gotten bigger into podcasts and there's been some controversy there, but now, sort of realizing how much stuff is video and how many podcasts are now video, they've sort of moved into the video space. And apparently, even though their policies are that they will not allow pornography, they were unprepared for, for, uh, sexually explicit material suddenly showing up and getting very, very popular. And so their listing of top business podcasts apparently included some fairly pornographic material, which is not normally what I associate with business content. Uh, and they were sort of taken by surprise and had to respond and said, oh, of course, we didn't intend for that, and that violated the rules, and they eventually took it down once it was called out. But it suggests that they may have moved into this sort of video market without preparing their trust and safety folks for the level of pornography and the ways that people are going to attack things like the trending lists and the top charts.
Ben Whitelaw:Yeah, indeed. And it's surprising, because it wasn't that long ago that they were embroiled in the kind of Joe Rogan scandal, and they faced a whole bunch of heat for their trust and safety approach, I would say. They then bought a company called Kinzen, which actually did some smart work to identify misinformation and kind of false narratives deep within podcasts, you know, kind of often hidden in far-reaching corners of the platform. Full disclosure, I did some work for Kinzen at some point before they were acquired by Spotify. That's how I know.
Mike Masnick:Ding, ding, ding. I finally get to do the ding, ding, ding.
Ben Whitelaw:Ring the bell. And, um, so it's surprising that they have kind of fallen into the same trap of, you know, not necessarily tooling up, or maybe skilling up, around what has been something that has been known for a long time
Mike Masnick:I mean, everybody knows, and you don't wanna make too big of a deal of it, right? Because, like, again, content moderation at scale is impossible, right? There's always gonna be something that slips through. People are gonna make mistakes, things are gonna get missed. So I don't wanna make too big of a deal of it, but it is noteworthy. Like, you would think any platform, as they're expanding into video and pushing video heavily, they have to realize... and in fact, I think the video that sort of made it onto the top business list was taken directly from Pornhub, and in fact had, like, Pornhub logos on it. And so somebody just ripped it, put it in there, and was able to get a bunch of downloads. But the fact that it made it into the top business podcasts list, you would think that there would be a little bit of extra review before something gets to that level. And it's, it's just sort of noteworthy that it appears that at least this particular attempt to get pornography onto Spotify made it past the, uh, the guards.
Ben Whitelaw:Did Techdirt ever have a porn problem?
Mike Masnick:Uh, in the comments? Um, uh, not a bad one. I mean, we, we certainly had the issue of spam, right? And a lot of spam is sort of like linking to pornographic content. Um, so we definitely have had that. We did have, at one point, this weird thing where somebody was showing up and basically writing a novel in the comments to a very old Techdirt post. The comments are probably still there, and it was like, I mean, just reams and reams of text
Ben Whitelaw:well, like an,
Mike Masnick:to do.
Ben Whitelaw:like an erotic novel.
Mike Masnick:Yeah. Yeah. It may still be somewhere in the Techdirt archives, uh, 'cause I don't think we pulled it down. It was on, like, a really old post, so, like, nobody was reading it, it wasn't interfering with anyone, and so I can't remember if we left it up or if we pulled it down. They were, like, coming back every so often and just, like, adding another chapter. It was massive. It was massive. We've had some weird things happen in the comments over time, especially, like, older comments that just go back ages, and people will put in all sorts of weird
Ben Whitelaw:Maybe we'll do a, a special recording of the podcast in which we read out the erotic novel written in the comments of a Techdirt post from ages ago.
Mike Masnick:I, I, I don't even know if I could find it again. Uh, we were looking recently, we're, somewhere at 82, 80 3000 articles on Teched at this point. And so yeah, even finding that and over 2 million comments. So it's, uh, we, we've got a fair bit of
Ben Whitelaw:Yeah, good luck finding that. Um, it's worth saying that Mike and I have been thinking about accompanying the audio version of Ctrl-Alt-Speech with a video version, and we have been umming and ahhing about it. We would love
Mike Masnick:need to get to the top of the business list on Spotify.
Ben Whitelaw:It seems like we have a way to do that. Um, but yeah, if, if, if listeners... I, we'll keep it clean, I promise. If listeners have, have a, a view on whether they would listen and watch a Ctrl-Alt-Speech podcast on the platform of their choice, drop us a note: podcast@ctrlaltspeech.com. That's C-T-R-L-A-L-T speech.com. And give us your thoughts; it might spur us on to do a version where we have both of our big heads on a screen together. Talking of adult sites, Mike, the kind of other story that I noted this week was one about OnlyFans, and it's in relation to Ofcom.
Mike Masnick:We're not starting an OnlyFans Ben.
Ben Whitelaw:Revenue diversification, Mike, we've gotta make this podcast pay somehow. You promised. Um, so Ofcom has handed a 1.05 million pound fine to OnlyFans for failing to supply accurate information about how it prevents underage users from accessing explicit content. And this is a longstanding investigation, which has kind of finally come around this week. It's a bit of a gaffe, really. OnlyFans had told Ofcom that its challenge age, so the age at which it prompts users to prove how old they are, was 23, only to find out from the tech provider that provided that service that it was actually set to 20. So there was a kind of gap of three years between what it told Ofcom and what was actually true, and that has led it to be given, and to accept, a 1.05 million pound fine. That in itself is interesting, but I think what's really fascinating to me, Mike, is the release of this story and the timing of it. Okay. So last week we talked about the Online Safety Act, the brand new but long-in-the-making legislation in the UK, finally being rolled out, and intermediaries in the UK being liable under the OSA. Only a week into that being true do we have this announcement. And the announcement is actually not related to the Online Safety Act at all. It's in relation to a regulation that predates the OSA; it was in existence before the OSA came to being. So basically Ofcom could have brought this, and did obviously bring it, against OnlyFans at any point, but it waited until the OSA had been rolled out, I think, to potentially give the impression of Ofcom kind of doing its job and being the kind of enforcement power that it wants to be seen to be. And I read around the reports about this story; the Guardian, the FT and others don't mention the OSA or the kind of more niche regulation this enforcement is brought under. So it kind of gives the impression to the unsuspecting eye that actually this is related to the OSA and that Ofcom is suddenly kind of doing its job.
So the kind of cynical journalist in me thinks that this has been timed to coincide very well with the OSA last week. It's actually nothing to do with that, but I have heard on the grapevine that there is some enforcement being prepared around the OSA, and naturally, as we've talked about on the podcast, there are some platforms that are being closely looked at. So what did you make of the kind of timing of this? Are you as cynical as I am?
Mike Masnick:Yeah, I mean, I don't know. The timing might be right just based on, like... apparently OnlyFans alerted Ofcom to the error in January of 2024, which is, you know, a little over a year ago. There was the investigation and the back and forth, and then sort of figuring out what it was gonna be. You know, the timing seems about right. I mean, having it come out now is not, like, totally out of the ordinary.
Ben Whitelaw:Look at you being all friendly to Ofcom. What, what have I made you do?
Mike Masnick:But yes, I mean, it is, it is entirely possible that the exact timing of the release may have been, let's say, pushed back a few weeks or a month or something, recognizing that the OSA was about to go into effect and that everybody would be looking to Ofcom to see how and when they actually started enforcing things under the OSA. And so, you know, it wouldn't surprise me if the timing was massaged in some way to make this work. But, you know, we'll see when, when the actual enforcements come out. Yeah, it'll be interesting to see, and to see whether or not this has any impact, and if people think, like, oh, okay, Ofcom is actually trying to enforce stuff. It's possible. I do wonder... I mean, you could argue too that Ofcom was... man, I'm gonna defend Ofcom again. They, they, they might reasonably have been concerned that if they had announced this, you know, three weeks ago, that it would confuse people, because, like, it was pre-OSA, and so it's like, well, wait, I thought this law isn't going into effect for two more weeks. So, you know, why are they enforcing it now? And so there may be some reasons where it actually did make sense, just to keep everybody else from being too confused by it.
Ben Whitelaw:Yeah, it would certainly have distracted from the narrative of the OSA, I guess, if this had come out. But the way that the coverage has transpired... and I know it's difficult for journalists to necessarily know every nut and bolt of every single piece of legislation and what it refers to. But it was something that pricked my ears, at least. Let's round up, Mike, on a slightly kind of more interesting, quirky story that you found, about a platform that we may end up referring to in a future episode of the podcast in the opening section.
Mike Masnick:We have to see what the prompt is.
Ben Whitelaw:Yeah. Tell us about SEZ Us and what it's doing.
Mike Masnick:Yeah, SEZ Us, S-E-Z-U-S, is this new platform. I had heard about it, I think, late last year; there was some talk about it. And it is a sort of, another Twitter-like platform. It was created by Joe Trippi, who is a sort of semi-famous political consultant figure. He sort of became famous in 2004 as the campaign manager for Howard Dean and his sort of upstart internet-fueled campaign, and then ever since then has been sort of in and around specifically Democratic politics in the US. And so late last year, there was some talk about how he wanted to set up his own social media platform, and he wanted to do something different, and it was going to be more respectful. And, you know, to some extent, like, we've heard all that before. You know, lots of people said that, especially after Elon Musk took over; there was this idea that, like, oh, you know... and usually presented in a way where you're just like, wow, this person is incredibly naive about the realities of, of human beings,
Ben Whitelaw:Yeah, yeah. Yeah.
Mike Masnick:and let alone getting a bunch of them together. But, now that the app is officially launched, they do seem to have created a few interesting elements to it that I think will be worth seeing how well they catch on. And so in particular, rather than just relying on, like, a team of trust and safety officials to determine who is violating the rules and who isn't, there is an element of sort of crowdsourcing stuff, where they've built in what they call a reputation engine, so users themselves get to rate other people's posts. And so it's sort of a mix of, like, Community Notes and Reddit upvotes and downvotes, and even a little bit of, like, Wikipedia elements to it, where it's sort of crowdsourcing reputation. And the idea being that users who have a high score, their content rises to the top; users that have a low score, their content will not be as visible. It'll still be there, but it won't work into algorithms or be as readily shared and viewable. And, you know, it's an interesting idea. I have questions about how well it'll work in practice. You know, as soon as you get into this kind of thing, you worry about things like brigading and deliberate attacks on certain kinds of speech and how it can be abused. And, I mean, there's always fears about things like echo chambers and stuff, which I think might be a little bit overblown, but it's an interesting and different approach. And right now, the one thing that I do believe very strongly is that the more experiments, the better. And so I'm happy to see an experiment. I'm happy to see how it works. I might be a little skeptical that this will work out as well as they sort of think it will. But again, what we need right now is experiments. We need differentiation. We need people to try different things. And so I'm excited to see them enter the space.
Ben Whitelaw:Yeah, I think it's an interesting feature and an interesting idea that might kind of inadvertently affect how platforms moderate, and act as a way of slowing down harmful or egregious content. It kind of made me think a bit about the challenges of breaking through on a platform like that, though. If somebody has built up a reputation on the platform, does that mean that other voices can emerge, and content can emerge, in a way that,
Mike Masnick:you, you definitely have a fear of, like, this becomes a sort of winner-take-all situation, and the people with the most clout and the most power sort of stay that way. You have a little bit of that no matter what on any social media platform. Obviously people with bigger audiences just have bigger audiences; that is the natural way things are. But yeah, there is a concern that this leads to there being a strata of users who are like the royalty and have all the power. And we have seen how that has failed on other platforms. Most notably Digg, which is now coming back, apparently, you know, which was like the early version of Reddit, where people would vote up and vote down stories. They had some sort of ranking system where particular users, if they were considered good signalers, their votes counted more. And that got to a point where it was actually kind of crazy, where, like, the leading users on Digg... we had this happen to us. In fact, someone came to us and said, hey, I have, like, a strong power signal on Digg. Do you want me to promote Techdirt
Ben Whitelaw:Right.
Mike Masnick:and I know that in some cases there were people who were, like, sort of selling their ability to do that. That wasn't the person who approached us; he was just like, I like Techdirt articles, I would promote them. And I actually told him no, 'cause I felt like that was cheating, um, to rely on someone like that. But, like, once you have that kind of power, then there's corruption potential
Ben Whitelaw:Yeah, I remember talking to a guy called Robert Allam, whose username on Reddit is GallowBoob, and he had a ton of karma. He's, like, one of the largest Reddit users for karma. And he also used to get loads of approaches from companies and brands in which he was kind of invited to basically shill on behalf of the company, and he was pretty principled about it. But yeah, there are all these kind of unintended consequences of focusing on reputation, and it relies on having mitigating systems in place to, I guess, avoid that. So, really interesting experiment. It will be interesting to see if that can scale and if people enjoy it, or see if those ideas proliferate onto other platforms as well.
Mike Masnick:And I'll note too that I do appreciate the fact that they're trying to build this in a decentralized way. They're using the Decentralized Social Networking Protocol, which is the, the Project Liberty project. And there are a few different social media apps that are using that. As someone who believes in protocols over platforms and decentralization, I'm excited that they're doing that rather than trying to build up a brand new thing from scratch.
Ben Whitelaw:Yeah, indeed. Great. Thanks, Mike. That brings us to the end of today's episode. Thanks to our listeners for tuning in. If you enjoyed today's episode, or if you didn't hate it, you know what to do,
Mike Masnick:No. Let's, let's raise the bar. Okay. Last week it was, it was "didn't hate it." This week, this week, let's get some "we really, really like the podcast."
Ben Whitelaw:Okay. If you really, really like the podcast... I think these three reviews have gone to your head. But yeah. Okay. If you really, really like the podcast, leave us a review in which you tell us that you really, really like it, and we will really, really like you. And if you like the podcast enough to sponsor an episode, we are in the market for sponsors. You get a mention at the start and the end of the podcast, and an excellent 10-minute interview with one of us. And our listeners are growing all the time, and we're getting lots of really great feedback. So get in touch: podcast@ctrlaltspeech.com. Thanks for your time as ever, Mike. It's been great to chat to you. Thanks to all the listeners tuning in, and we'll speak to you next week.
Announcer:Thanks for listening to Ctrl-Alt-Speech. Subscribe now to get our weekly episodes as soon as they're released. If your company or organization is interested in sponsoring the podcast, contact us by visiting ctrlaltspeech.Com. That's C T R L Alt Speech. com. This podcast is produced with financial support from the Future of Online Trust and Safety Fund, a fiscally sponsored multi donor fund at Global Impact that supports charitable activities to build a more robust, capable, and inclusive trust and safety ecosystem.