
Ctrl-Alt-Speech
Ctrl-Alt-Speech is a weekly news podcast co-created by Techdirt’s Mike Masnick and Everything in Moderation’s Ben Whitelaw. Each episode looks at the latest news in online speech, covering issues regarding trust & safety, content moderation, regulation, court rulings, new services & technology, and more.
The podcast regularly features expert guests with experience in the trust & safety/online speech worlds, discussing the ins and outs of the news that week and what it may mean for the industry. Each episode takes a deep dive into one or two key stories, and includes a quicker roundup of other important news. It's a must-listen for trust & safety professionals, and anyone interested in issues surrounding online speech.
If your company or organization is interested in sponsoring Ctrl-Alt-Speech and joining us for a sponsored interview, visit ctrlaltspeech.com for more information.
Ctrl-Alt-Speech is produced with financial support from the Future of Online Trust & Safety Fund, a fiscally-sponsored multi-donor fund at Global Impact that supports charitable activities to build a more robust, capable, and inclusive Trust and Safety ecosystem and field.
Ctrl-Alt-Speech
Outrage For The Machine
In this week's round-up of the latest news in online speech, content moderation and internet regulation, Mike and Ben cover:
- He’s a Master of Outrage on X. The Pay Isn’t Great. (NY Times)
- The vulnerable teen drawn into far-right extremism online (Financial Times)
- X, Bluesky and Reddit in France’s crosshairs amid porn clampdown (Politico)
- EU sidesteps Macron’s ultimatum to ban social media for kids under 15 (Euractiv)
- Commission seeks feedback on the guidelines on protection of minors online under the Digital Services Act (European Commission)
- OnlyFans releases stern statement as it DEACTIVATES Bonnie Blue’s account for defiant reason (The Tab)
This episode is brought to you with financial support from the Future of Online Trust & Safety Fund.
Ctrl-Alt-Speech is a weekly podcast from Techdirt and Everything in Moderation. Send us your feedback at podcast@ctrlaltspeech.com and sponsorship enquiries to sponsorship@ctrlaltspeech.com. Thanks for listening.
So Mike, I am now firmly in the world of baby apps. I've been launched headfirst into a plethora of apps for sleep-deprived parents, apps for, you know, doting family members who want more content. And I've stumbled across Tiny Beans,
Mike Masnick:Okay.
Ben Whitelaw:which is a kind of scrapbook app for doting parents of humans and, interestingly, of pets. And, uh, the Tiny Beans app prompts me on a daily basis to add a memory.
Mike Masnick:Oh
Ben Whitelaw:And so I wanted to, you know, start this podcast, in which I return to Ctrl-Alt-Speech, with that prompt to you: can you add a memory?
Mike Masnick:yeah. Well, you know, I, I would like to add the memory of the before times
Ben Whitelaw:The whole before times.
Mike Masnick:whole before times, the time before we had DOGE taking over the US government and, you know, police and military invading Los Angeles and potentially other cities. There was a time when the world was not this crazy, or at least the United States was not this crazy.
Ben Whitelaw:Mm-hmm.
Mike Masnick:And while I know that the "add a memory" prompt often means something from today that you want to remember in the future, I would like to think back to the time before all this, and remember that there is a world that is not quite so crazy all the time, and not so angering all the time, and not so fearful all the time. So that, that is what I would like to do. I would like to remember those times.
Ben Whitelaw:I'm so glad that nothing has changed in the last month.
Mike Masnick:Yeah. Yeah, yeah. What about
Ben Whitelaw:same old Mike Masnick.
Mike Masnick:What about you? What memory would you like to add?
Ben Whitelaw:I would like to have a memory at all. My, my mind has gone completely blank.
Mike Masnick:Welcome to parenthood.
Ben Whitelaw:I basically only know it's Thursday because we're recording this podcast. So thanks for, you know, helping me recognize what day it is. I'm sure some of these memories will return and I will have a functioning brain soon, but right now it has gone to pot.
Mike Masnick:You'll, you'll, you'll make it through. You'll make it through. Everyone does
Ben Whitelaw:Hello and welcome to Ctrl-Alt-Speech, your weekly roundup of the major stories about online speech, content moderation, and internet regulation. It's June the 12th, 2025, and this week's episode is brought to you with financial support from the Future of Online Trust & Safety Fund. This week we're talking about the complexities of being a teenager on the internet, the challenge of making money from inflammatory content on X, and the difficulty of being a porn site operator right now. I'm Ben Whitelaw. I'm the founder and editor of Everything in Moderation, and I'm very glad to be back in the chair, albeit as a slightly lesser version of myself, with Mike Masnick, founder of Techdirt and recipient of the Bluesky bell. We had some people asking for it, Mike. Uh, so, you know, it's important to let people know it's there and it's doing its job.
Mike Masnick:Yes, it is. Uh, yes. Never shall we be without the bell again.
Ben Whitelaw:Never. How are you?
Mike Masnick:I am, uh, a little crazy. Uh, you know, as per usual, I'm not having to deal with a newborn, uh, like you are, and not sleep deprived because of any of that, but generally, you know, very busy with all sorts of things happening in the world and trying to keep everything moving and, you know, figuring out all those kinds of things. So it is crazy, but, um, not quite as crazy as I imagine your life is. How, how is being a father?
Ben Whitelaw:Um, yeah, it's, it's something, it's really something. Uh, I was incredibly naive to think that this would be anything short of the most bonkers month of my life. And, you know, the little guy is okay, he's fine, and my wife is doing really well, and so I can't even imagine what it would be like if those two things weren't true. But the sleep deprivation, the trying to read certain cues, the constant nappy changing, um, has been a hell of a thing. Um, but it's been great. It's been great. I'm really, you know, psyched that he's here. And we've had a great couple of weeks, and I'm really grateful for you holding the fort and bringing on such brilliant guests in, you know, my stead. Kathy and Hank and Zeev were all fantastic. I took great pleasure in listening to those in the hours between 2:00 AM and 5:00 AM on various days. So, uh, thank you for that, amidst everything else you've been doing.
Mike Masnick:I was gonna say, did we help you get to sleep? Did we, did we bore you to sleep?
Ben Whitelaw:I, I'm not at the point of playing the podcast to my boy. But I actually have other family members that play the podcast as a means of soothing themselves to sleep.
Mike Masnick:There we go.
Ben Whitelaw:So maybe
Mike Masnick:a little scary.
Ben Whitelaw:I won't say who, but you know we're not at that stage yet, thankfully.
Mike Masnick:Good. Good.
Ben Whitelaw:How's things at Techdirt? You know, I've been trying to keep up with what you've been writing and producing. Anything that the listeners should be checking out, anything we should be adding to the show notes?
Mike Masnick:Uh, I mean, lots of stuff is going on. Obviously spending a lot of time this week talking about everything that's happening in LA, which, again, as I've said before, sort of feels somewhat far afield of what Techdirt normally covers. But I think it is, again, one of those things where, if we're allowing the federal government to send in the military to create violence and effectively allow the federal government to take over from what state and local law enforcement is supposed to do, we're losing what makes American freedom and civil liberties and civil rights possible. And so, at the moment, that is one of the most important things, and it's just yet another example, of many, of democratic backsliding and authoritarian takeover of the United States, which is terrifying in all sorts of ways. But, as I keep saying, if we don't have a free and open United States, then all of the other tech policy stuff and innovation policy and online speech stuff that I would like to be spending my time writing about is impossible and doesn't matter. So, you know, the priorities sort of shift when your country is under siege in the way that it is right now. I still have a few of those other stories, and I'm still covering other stuff, and there are a bunch of other things happening. There are a bunch of stories I wish I could cover. I have, uh, approximately 650 tabs open right now, many of which are stories that I would like to cover, but haven't been able to because every day something crazy is happening
Ben Whitelaw:Mm. It goes back to that post you wrote about Techdirt becoming a democracy blog now, and I'm glad, glad to see you still, you know, plowing a kind of difficult furrow in that sense. But, you know, great to see that happening, and, I guess from my perspective, as somebody who's read Techdirt a long time, very important that Techdirt and other sites like it are funded and sponsored and supported to do this kind of reporting. So I think that's a good reminder for me and for all our listeners as well. While I was away, there was the kind of, you know, sliver of silver lining that emerged between the dark clouds: our TrustCon headline, uh, announcement. I'm calling it a headline announcement. I don't think the organizers of TrustCon have billed it as such, but, um.
Mike Masnick:to TrustCon.
Ben Whitelaw:Yeah, come and find me. I, I will apologize personally. Um, TrustCon is the major trust and safety conference; it takes place every year in July in San Francisco. It's where many, many of our listeners go, and last year we did our first live recording of the Ctrl-Alt-Speech podcast. And not only have they let us back this year, but we've been given the final slot on the final day, which, in Glastonbury terms, you know, is a big deal. It's the kind of Pyramid Stage,
Mike Masnick:everything is leading up to us.
Ben Whitelaw:everything is leading up to us. And I think we've got some great, great panelists, some great guests. I actually won't be there, because I'll be parenting in the early few months of my son's life, but you'll be there, and you'll be there with an absolutely great panel.
Mike Masnick:Yeah, we have, we have a really fantastic panel. Uh, sitting in your seat will be Alice, who, you know, is a good fill-in for you, uh, and who was on the panel last year and has been a guest host in the past, and obviously does a lot of stuff with Everything in Moderation. And then we have two wonderful guests as well. And you know what, why don't we dribble out that information over the next
Ben Whitelaw:Oh, okay.
Mike Masnick:few weeks. I was like, we could go into it, but we have, you know, two additional panelists with really different perspectives, one from academia and sort of the think tank world, and one from industry. And it's gonna be great.
Ben Whitelaw:Yeah. You've done a great job of finding some really interesting voices in the trust and safety space that we probably haven't heard from as much as we should have done. So really, really exciting. We'll talk more about that in coming weeks. You know, the live edition of the podcast is gonna be very similar to what listeners hear every week. We go through a couple of big stories that Mike and I have found and talked about and feel are worth analyzing, and then we talk about a few others that you might want to go away and read yourself. We're gonna start today, as ever, with some of those big stories. And you picked out a New York Times profile piece, Mike, of a somewhat unsavory character who's been given a bit of a softball writeup, it's fair to say
Mike Masnick:Yeah. Um, yeah, this piece came out, at the end of last week. you know, in the New York Times, they do profiles on people. They, a few years ago did a profile on me, which was very nice.
Ben Whitelaw:That's pretty, that's pretty softball as well, I thought
Mike Masnick:Are you suggesting that there are deep, dark secrets that they passed over in, in their coverage of me?
Ben Whitelaw:maybe.
Mike Masnick:and so, you know, the New York Times does these profiles of people who are in the news for some reason or another. And in this case they did a profile of Dominic McGee, who's better known, uh, I forget his, it's Dom Lucre, I think, is what he goes by on Twitter. He has like 1.5 million followers on X and is a huge Trump-supporting, extremely MAGA, conspiracy-theory-spewing, you know, fool
Ben Whitelaw:Yeah.
Mike Masnick:is the, is the best way I would put it. And so there were a few things about this profile, one of which was that it got a ton of pushback, in general, in that it is sort of a soft-lens profile of someone who is pretty objectively terrible and has done some really terrible things. And in fact, one of the things, the reason why he is widely known on the internet, was that at one point, somewhat early on in Elon Musk's ownership of the platform, I don't know if this was while it was still called Twitter or after it had switched to X, I forget exactly the timeline, Dominic had been suspended by
Ben Whitelaw:by
Mike Masnick:Elon Musk's Twitter because he had posted child sexual abuse material, and not just any child sexual abuse material, but what some have considered to be the worst of the worst. In fact, the imagery that he posted was at times apparently not believed to be true. The description of it was so horrific and so terrible that the FBI at one point wasn't sure that it really existed, or if it was just sort of this thing that people talked about. But it is true, it is horrible. There is some context in which he posted it, which was he was trying to talk about the kind of content that people were sharing. He was trying to sort of explain about someone who had shared this kind of content, um, to try and create controversy around someone. That is not a reasonable reason to do this, that is not an excuse; it is horrifying in all sorts of ways. He claims, and the New York Times sort of takes this at face value, that he found that picture in a news story. That's extraordinarily unlikely to be true, because this is not an image that any news organization would ever post, I think, uh, or at least not any legitimate one. And he was suspended, and the whole story was that Elon Musk then stepped in to have him reinstated, which was a news story in itself, because he had claimed when he took over that dealing with CSAM was priority number one, and he would never back down on that, and it was always the most important thing, even as he was firing the teams that were doing that kind of work. And here he unsuspended someone who posted publicly some of the worst of the worst of that content, because he was a big, you know, MAGA, Musk-supporting, right-wing internet troll.
Ben Whitelaw:Mm-hmm.
Mike Masnick:And so there was some consternation about the fact that the New York Times would publish a sort of profile that mentions that incident, but really, really glosses over the seriousness of it and kind of what happened there.
Ben Whitelaw:Just before we go into the detail of the profile, which I think is the meat of what we wanna talk about, just give us a sense of the reaction and the reporter pushing back, because, I think again, here is a mainstream media site platforming a QAnon conspiracy theorist and publisher of CSAM, as you say. The journalist doesn't respond that well to the criticism, or pushback, does he?
Mike Masnick:Yeah. The reporter is Stuart Thompson, who's been at the New York Times for a while, and he covers sort of internet, digital stuff. He posted a link to the story, and then a lot of people started yelling at him about it and saying, like, this is bad, you're giving this profile that really plays down some of the awfulness of what this guy is, and you're sort of raising him up as someone who is interesting and worth knowing about, even as you talk about some of the outrage farming that he does. So as he got yelled at, Stuart posted on Bluesky: every time someone comments on this thread, which is the thread where he announced the article, I earn a little bit of money under Bluesky's revised revenue program. Which is a joke; that doesn't exist currently. And so he's sort of mocking the fact that he is farming outrage for clicks and likes, and sort of, you know, mocking the fact that, like, this is the way the game is played and I'm playing the game, which is: I drive outrage, you click, and I'm happy about it, even if you're mad at me.
Ben Whitelaw:Yeah, he's kind of taking the Dom Lucre approach to, you know, gaining engagement on his own article, in a way. Yeah, so there are problems with that, and I think that's worth mentioning, because we talk about the way that content moderation and internet regulation is covered by the media quite a lot on this podcast. So I think that's an interesting thing to note. What else is interesting about this profile from your perspective?
Mike Masnick:The thing that I did think was actually interesting in the profile is that Dom, like, really opened up his finances to the reporter and sort of revealed how much he's actually making from X in particular. And this is part of the headline of the story, which is: he's not making that much, even as he was ranked somewhere as like the third most influential person on the platform. And you know.
Ben Whitelaw:the, on the, on the right,
Mike Masnick:On the right.
Ben Whitelaw:the far right.
Mike Masnick:Yeah, as if there are influential people on the far left on X; I'm not sure I buy that. Um, so I think that might make him the third most influential no matter what, right? So, Elon Musk really, really spent a lot of time promoting the idea that he was gonna turn X into this platform where creators could make money. And, you know, the more you post and the more people respond to you, the more money you can make. And it was all sort of supposedly tied up in the X Premium, I don't even know what they're calling it anymore, program, where he wanted you to pay $8 a month or $15 a month or $20 a month, I don't know what the price is anymore. You know, but the more you got people to reply to you, if they were also paying and if there were ads, you were getting a cut of it, and it was supposed to be like this big sort of flywheel to help drive revenue. This guy has 1.5 million followers. The thing that he is good at is outrage farming. I mean, they talk about in the article how he views it somewhat clinically, like, I'm going to get people really mad by posting this set of things in this order, which he knows is absolute nonsense or just designed to create outrage. He doesn't care if they're true; in fact, you know, I think he prefers them not to be true, in the way you read, the way he looks at it. All he looks at is what is going to drive engagement, and the more outrageous, the better. And there's this belief out there that there's this sort of grifter economy of people who are doing outrage for the sake of outrage just to get paid, and what this article reveals is, as the headline says, like, the pay's not very good. Not nothing, he is making what is a living, but it's not a huge amount of money. And you know, he basically at one point admits that he has like $7 in his bank account. I mean, he's effectively broke. The headline numbers are that, in the two years since they started, X has paid him $157,000. But it varies month to month, it's not predictable numbers. So it was, like it said, there were $67,000 in the first year, but then $12,000 last year; he was removed from the advertising program, but then he complained and Elon Musk said he would get him back in, and then like another $16,000 came in. So it's somewhat variable, and it's somewhat, apparently, at the whims of Elon Musk. Um, there was another story that came out like a month or two ago, which I don't think we covered, that did show how traffic and revenue from people who criticized Elon Musk suddenly disappeared. There was another story that was mainly about all of Elon Musk's children, and the way that Elon Musk apparently approaches some women that he talks to on X and asks them to bear his children, which is horrifying in all sorts of ways. There was one female influencer who was making a lot of money, and Elon Musk propositioned her that she should bear his children, and when she turned him down, suddenly her revenue dropped by like 80% or something along those lines. So a lot of it does really feel like, as I think I said when this program was in its earliest days, basically Elon Musk points to certain people that he likes and tells the Twitter finance team, whoever, like, pay this person a bunch of money, you know? So he just hasn't made that much. There is also the, you can subscribe to people on X, which is a program that predated Elon and I actually thought was a really clever idea, like you can get private tweets if you subscribe to someone, and something like that. He makes some money from that. But again, it's not a huge amount, and in the end he is making some money, but it is not like this incredibly lucrative business that I think some people think it is.
Ben Whitelaw:Yeah, it's a really, really interesting kind of data point. Now, we don't really know how typical that is. You know, it might be that other people who are as influential, or less influential, are somehow monetizing what they're doing better than he is, via a more diverse set of revenue streams. But it goes back to this idea that we've talked about on the podcast, that these revenue sharing schemes change the nature of speech. You know, it changes the incentives at play for creating certain types of content. And as you say, he doesn't care if it's true. And, you know, he has systematically found ways to exploit X's algorithm in order to generate that smallish chunk of cash for himself. In some ways, I'm torn, Mike. I'm torn, right. In some ways I think, you know, 60-odd thousand dollars for making up a load of bullshit is way too much money to be paid, right. That seems incredibly unfair to people who do a much better day's work for that amount of money. And at the same time, I wish that he'd been paid more. In some ways I wish it was kind of financially sustainable to spend 14 hours a day clipping up rage videos and tweaking them in such a way that people can come along and reply to them and, you know, bait them to say something. I'm really torn. And again, this goes back to the fact that the revenue sharing programs are such an important and under-researched part of the online speech infrastructure. You know, for me it makes you think, okay, this is a strong case for regulating some of the algorithms that Dom Lucre has exploited. He has worked out what works to get him that 60-odd thousand dollars, however little, and he will try and increase that amount of money over time. Is that not something we should be looking more closely at, how those schemes work and, and
Mike Masnick:I mean, what role does regulation play here? Like, what is the outcome that you're looking for? I mean, because there's a few different arguments here. One is that you just think content creators should be paid more, or that content creator pay should be more transparent, or, you know, I mean, there's a number of different directions that you can go here, some of which might have regulatory solutions and some of which might not. And so I'm not clear what you think regulation steps in here to solve.
Ben Whitelaw:Well, I think understanding, I guess through assessments of the algorithm that Lucre and others are exploiting
Mike Masnick:Mm-hmm.
Ben Whitelaw:about what the impacts of those kind of algorithmic choices are on the targets that he goes after or, on a wider scale, the nature of democracy, as you talked about at the top of the episode. Like, I don't feel like there's enough understanding about, um, that,
Mike Masnick:Yeah, but I'm not sure that's a role for government to play. I think that's a role that academics should play. And of course, we've now attacked the academics who were trying to study that. And so, you know, there are arguments there around the transparency aspect of, you know, should these platforms be more open to working with academics and maybe sharing data so that these kinds of things could be studied. I think there's value in that. But I'm not sure that, you know, as soon as you get into the idea of the government coming in and saying, well, we need to see how the algorithm works, it's like, where does that stop? And again, I go back to the analogy of, how is that different from the government coming in and saying to the New York Times, we need to analyze which stories you put on the front page to see what impact they have on democracy or how people are feeling about things? Because it's pretty hard to distinguish within the law how that's any different. And so you definitely begin to raise serious speech concerns, and especially when you have a government like the one that we have currently in the US, how are they going to use those kinds of regulations? It's going to be to promote more of the Dom Lucres of the world and less of the people who are pointing out that this is crazy and that, no, LA is not having riots right now, or if there are riots, you know, it's the police that are doing the rioting, not the people. That stuff is gonna get suppressed when you put that into sort of the regulatory framework. I think there is an element here that is important around the transparency and understanding this stuff, understanding where these things go, how these things play out, and to that end, I appreciate that the New York Times was able to get him to open up and share the data on what he's earned so that people can think about this and talk about it. But, you know, I think there is this element, in the sort of Instagram, TikTok world, where people believe that being an influencer is a lucrative career, and there are certainly a few people at the tippy top who have made a ridiculous amount of money, but often that's through other things like sponsorship deals and other stuff. But the idea that just being totally outrageous is a path to riches, I think this at least throws some doubt on it; you know, it's one data point, as you noted, and X in particular may be the worst platform to try and become a rich influencer on, but it's interesting to note that it's maybe not quite as lucrative as people wanted. And maybe, for all your talk of how maybe he should be getting paid more, and I agree in that I would love for good content creators to be able to be paid more, but distinguishing good from crazy and outrageous is not something that you can easily do. I would like to think that maybe the fact that he's not making that much money suggests that the outrage farming grifter class might begin to fade out, where if it's not so lucrative in the long term, then maybe those people have to go find honest jobs.
Ben Whitelaw:Yeah. Particularly if, you know, Russian influence operations can't pay you via a secret shell company, as was the case with those YouTubers, uh, at the end of last year, um, that
Mike Masnick:made a lot more money than Dom did.
Ben Whitelaw:Yeah. Yeah. Uh, Dom, if you're listening, that's your way to a, uh, you know, financially free future. Um, okay. Well, I think there's a few interesting strands to that. Definitely worth going to have a read, and we will see who the New York Times profiles next, Mike. Um,
Mike Masnick:It's your turn Ben.
Ben Whitelaw:I am waiting for that email. Um, let's go on to the next story now. It's a story that I think is pretty well reported by comparison, and I'll talk a bit about why. This is an FT piece, and I should note at the top that I work for the FT, although not
Mike Masnick:ding. Ring the bell. We, we, we need the bell for you.
Ben Whitelaw:The, yeah, the FT bell doesn't have quite the same ring to it and probably won't be used as often, but I think it's important to note. So I don't work in the newsroom, but I do work for the FT. I don't know the journalist who wrote this piece, but it's a really detailed and devastating account of the final few years of the life of a 16-year-old called Rhianan Rudd. She is essentially the youngest person to be charged with terrorism offenses in the UK, and it's being reported now because her inquest concluded on Monday. Just to give you a bit of background about her: she was arrested in October 2020 for essentially kind of right-wing extremist views, both posted online and expressed in person. So she was found to be downloading bomb-making material, she was posting about attacking synagogues, she'd spoken to her family and friends about, um, wanting to kill Jews. She'd actually carved a swastika on her own forehead. This is a very troubled young woman. And over a year later the case was dropped, because she was found to have been groomed by an American neo-Nazi that she met on Discord. In the intervening period, she started to work with Prevent, which is a program run by the UK government to deradicalize people like her. But unfortunately she took her own life, um, as a result of the kind of treatment and the pressure, and partly because she ended up going back onto the platforms and speaking to some of, you know, her internet friends, essentially, that were sharing neo-Nazi views with her. So this is a story about the suicide and the kind of troubling last years of a very young woman. And it's really well reported. They pull on lots of sources; they sat in on the whole of the inquest. And I think it's worth talking about here on the podcast, Mike, 'cause there's a few key differences to some of the similar stories that we've talked about, but there are also a lot of similarities. Before I dive into those, I wanted to get your sense just about the general story and what your reactions were to it.
Mike Masnick:Yeah, I mean, as you said, it's long, it's very detailed, it's a gripping story, it's tragic, as you can imagine from the description that you gave. But I do think it is worth reading. And it certainly reminds me of some of the other detailed reporting we've covered on various tragedies, and I do think that there are some important similarities between those and this. But it is this kind of vision of a very, very troubled young person whom many different parts of the world and the system failed along the way. And I think that kind of reporting is really helpful in terms of thinking about how do we deal with troubled people of all ages going forward.
Ben Whitelaw:Exactly that, really. I mean, this is a different story in the sense that it's a young woman. A lot of the stories that we have brought to the podcast have been young men who've been sextorted, often via Instagram or other platforms. We talked about the young guy, Jamie, from the Netflix hit TV show Adolescence. You know, there's been a lot of talk in recent months about the kind of lost generation of boys online, and so it's really interesting that Rhianan is somewhat of an anomaly, or kind of counterpoint, to what has been a rich seam of discussion, certainly in the UK. It's also different in that, you know, the inquest didn't confirm that it was a suicide, which I was thinking about, and it made me realize that actually, if you are the family of this young woman and you don't get that kind of closure, in the way that the families of victims in other similar, tragic circumstances have, then that's really difficult as well. And so there's a couple of small differences there, which I think are worth noting. However, like you say, there's a lot of similar aspects to many of the other stories we talk about, and the main thing is that she's an incredibly vulnerable young woman who had a lot of difficult, real-life, offline problems, which culminated somewhat in her suicide. So the story lays out how she, you know, fled from home with her mom and her brother, age seven, and was in a women's refuge, presumably because of abuse from somebody, or danger from somebody. She had a history of self-harm and she had undiagnosed autism. Her mother brought home a partner, this is the craziest thing, really, her mother brought home a partner who was a bona fide neo-Nazi, who shared views with her and with other members of the family about killing Jews and other Nazi views. And she was also said to have gone to a close confidant and said that this man sexually assaulted her as well, something she later dropped. So this is a very troubled young woman in any circumstances. And, you know, the fact that she was groomed online felt like, at least from the telling of the story, the kind of end point of a number of those situations. Did you feel that as well?
Mike Masnick:Yeah, I mean, you go a little bit into the story before suddenly they mention that her mother had a partner who was an outright neo-Nazi, and in fact, I think they reached out to him for comment. He doesn't deny the fact that he's a neo-Nazi and shared those views. And again, it comes down to a point that I've raised a bunch and I think is also demonstrated with this story, which is: there are lots of people throughout history who have been troubled at all different points in their lives, and they react in different ways, and sometimes dangerous ways. It's only in recent times that we want to blame it on the internet, even though there are similar stories that have happened in the past. And the internet part of it is, you know, everybody uses the internet, and if you're in a troubled situation, it's going to happen that you're also going to use the internet. And so there's the causality question, like how do you track down the attribution of what's really at fault here? And so many people rushed to say, well, it must be the internet, that was the problem, oh, she met this guy in the US over Discord, through like a gaming community. But you have all these other things that suggest that she was in a really bad state just because of the world around her and the failings that society had in not helping her. Then it troubles me to say that this is an internet-caused
Ben Whitelaw:Totally. And, to be fair to the FT, the headline of the piece is The Vulnerable Teen Drawn into Far-Right Extremism, and the subheader is, could the British state have done more to help Rhianan Rudd? So there's actually no mention of an online platform, per se, which others might
Mike Masnick:It is. I,
Ben Whitelaw:the headline.
Mike Masnick:I, it doesn't? Mine, my headline says, extremism online
Ben Whitelaw:Yeah.
Mike Masnick:Okay. Okay. It does say online.
Ben Whitelaw:it says online, but it doesn't say like, you know, discord
Mike Masnick:It's not blaming a particular platform. Yeah.
Ben Whitelaw:Yeah. Which I think is, you know, we've seen that in a number of different cases. I mean, the whole story made me think about, and you'll have to bear with me for this, Mike, it made me think about last-click attribution marketing, which you will know about as, you know, one of the grand old men of the internet, uh, if I can say that. But just to explain for anybody else: if you buy something online and you click through from a particular platform, the last-click attribution model says that it was Facebook or an e-commerce site that drove that purchase. That is one way of measuring the return on investment of your marketing spend, and it basically credits the final touchpoint that that person goes through. It fails to understand that people's purchasing decisions happen in lots of different ways. You know, they happen because of real-life chats with people, because of billboards, because of conversations in the street, because of, you know, something you see on the tube. And so the last-click model of marketing is not thought to be very sophisticated and doesn't take into account the nuance of how marketing works. I think that's also a good parallel for the way that some people think about online safety: because this young woman, Rhianan Rudd, met somebody on Discord and was groomed by them, that must make this an internet problem. I think it's very clear from this story, and from a lot of what we talk about and think about, that that's not the case. And I wonder, you know, I'm interested if other people, other listeners, think the same, like, do you think that kind of equivalence or that metaphor is helpful in
Mike Masnick:Yeah, no, I think it's really valid, right? Because, yeah, there is this question of, everybody wants to figure out what the cause is, and it's easy to just go to like one step removed. And the reality, as I keep saying, and as, you know, we keep discussing, is that it's always a lot more complex, and all of the different variables and all the different factors that go into this are varied and not easy to pick apart and sort of parse out, and you can't say, but for this particular platform existing, the end result would've been particularly different. And so I think you're right, even though there is still some marketing that is done through last-click attribution. And in fact, there was the scandal recently with the add-on Honey that was sort of stealing last-click attribution, uh, which was like a whole thing. Uh, you know, that's where they want the last-click attribution. But here, I think you're right that it's too simplistic, and it is not a fair, accurate model of what is really happening in a complex world.
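A minimal sketch of the attribution idea Ben describes, for readers who want to see it concretely. This is not from the episode; the journey, channel names, and purchase value below are hypothetical, purely for illustration. Last-click attribution hands all credit for a purchase to the final touchpoint, while a simple even-split model spreads it across the whole journey, which is the nuance Ben argues last-click misses.

    # Hypothetical touchpoints on the way to a purchase, in order.
    journey = ["billboard", "word of mouth", "Instagram ad", "search click"]
    purchase_value = 100.0

    def last_click(touchpoints, value):
        # All credit goes to the final touchpoint before the purchase.
        return {touchpoints[-1]: value}

    def even_split(touchpoints, value):
        # Credit is shared equally across every touchpoint in the journey.
        share = value / len(touchpoints)
        return {t: share for t in touchpoints}

    print(last_click(journey, purchase_value))   # {'search click': 100.0}
    print(even_split(journey, purchase_value))   # each touchpoint credited 25.0

Under last-click, the final "search click" gets 100% of the credit even though three earlier touchpoints shaped the decision, which is the over-simplification Ben maps onto blaming the last platform involved.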
Ben Whitelaw:Yeah, I think that's maybe a helpful kind of equivalent that people can draw on to help understand why it's not always the last click, it's not always the platform where, in this case, a young 16-year-old was groomed, that was the cause of this issue. But yeah, great piece of reporting, definitely worth going to read. Um, it is long, so bear that in mind, listeners. Let's look at a few other little stories, Mike. We could call this kind of porn corner, uh, or, you know, I'm not sure that'll catch on. Um, but we both came up with stories in which pornography is very much in the crosshairs of regulators and platforms. Again, it's been something we've seen in the news a lot recently. Take us through your Politico story that you found.
Mike Masnick:Yeah, the Politico story, and we should ring the bell here because it does mention Bluesky. Um, there we go. I'm on the board of Bluesky, you can consider me as biased as you would like. Um, I have no insight about this with Bluesky, I haven't spoken to Bluesky about it, and the story is not really about Bluesky. This story is actually about France, and that they're really pushing to do a kind of social media ban for kids. But as part of that, they were talking about designating certain general purpose sites to be adult content porn sites, effectively, because they have some amount of adult content on them. And so the ones that are mentioned in the article are X, certainly Bluesky, Mastodon, and Reddit. Right? So sort of three that are, you know, microblogging sites, and then the community site, Reddit. These are all general sites, they have all kinds of content, some of them include some amount of adult content. But the idea of designating them as adult sites, which then have additional requirements, including age gating and age restriction rules, seems like a fairly major step. It reminds me a little bit of how, in the US, a few different states, including Texas and, I think, Arkansas, passed laws that said if a site is more than 30% adult content, then they have to have age verification, with no indication of what that 30% means. How do you calculate it? Is it time, is it a number of views? I mean, there's no, there's no
Ben Whitelaw:30% of genitals or something.
Mike Masnick:Yeah, it's like, run it through an AI and tell me what percentage of the site is adult content. I mean, it struck me as a little bit odd, too, in that I don't usually think of France as being particularly prudish, uh, and sort of reactionary in terms of, oh, adult content, we need to deal with that immediately. Maybe I haven't been in France in a few years and things have changed. Um, so I don't know. Um, it does strike me as part of this sort of general moral panic that's going on right now about kids online and the idea that they might be exposed to bad stuff, whether it is far-right extremism, terrorist content, or adult content. And so everybody wants to do something, and here France is stepping up, and Macron in particular is saying, oh, we have to ban kids. And again, there's still no evidence that says that is actually helpful. And so I find it a little bit troubling that they're doing this, and that they're extending it to this point where they're suggesting that general purpose sites should be deemed at the same level as sites that are dedicated to adult content. Um, and also there's the separate issue of, the EU spent all these years trying to, in theory, carefully calibrate things like the DSA and how that works, and then you have sort of France barging in and being like, forget all that, we've decided that this is how the internet should work. And I find that to raise some questions about the credibility of the larger EU process in terms of how they regulate the internet.
Ben Whitelaw:I don't have the full details. Um, I would be interested to know from somebody who is very in the weeds of the DSA what they thought. But I have noted over the past few weeks, as I've been coming back online, that there is this slight beef emerging between France and the European Commission around who's driving the online safety conversation. So you might have seen last week that, on TikTok, the SkinnyTok hashtag was banned so that users could not access eating disorder content, and this has been something that's been in the news for years now. French politicians claimed a victory there by saying that it was them going over to Dublin a couple of months back and discussing it with the TikTok team that led to this change, whilst other media reports suggested that it was actually the European Commission's Michael McGrath, who held an online meeting with TikTok CEO Shou Chew, at which SkinnyTok was on the agenda. So there's this kind of parallel regulation going on, and I'm not entirely sure how that will pan out, but it's something we definitely need to keep tabs on.
Mike Masnick:And I'll note really quickly that the EU is currently, and they actually just extended it, though it ends this week, looking for feedback on extending the Digital Services Act regarding things like age verification and age assurance. They're sort of putting it under the frame of, you know, child safety requirements under the DSA. And I think originally they wanted comments last week, but they gave it another week, so I think it ends this weekend.
Ben Whitelaw:And not to forget that under the DSA a formal investigation was launched into Pornhub, Stripchat and two other sites for not setting up any kind of age restriction measures. So that's already happening. Access to Pornhub has already been suspended in France, which is its second largest market. And then Ofcom, the UK regulator, announced it has opened an investigation into two lesser-known adult sites for, again, their lack of age assurance. So there is this swell of, you know, regulatory interest in adult and pornography sites, and it's not necessarily a surprise that there's a lot of public support for going after these platforms, but I think it's fascinating that it's happening all at the same time. Just before we go on to the next question, do you think, Mike, that there's an element of going after some of the more compliant platforms here? Because you look at the ones you mentioned, maybe X aside, but Bluesky, Reddit, Mastodon, you know, those are some of the more compliant platforms, you would say. Do you think this is regulators going after maybe the wrong targets? We know that there's, you know, loads of other sites who have a very
Mike Masnick:Yeah.
Ben Whitelaw:laissez-faire approach to.
Mike Masnick:Well, there is, I mean, right, this is a regulatory strategy that a lot of people take, which is you go after a site that you know will sort of comply with what you demand and use that as a sort of de facto precedent: well, these other companies are doing what we've asked, therefore you must as well. And from a sort of regulatory PR standpoint, sometimes people feel that that is a better approach than going after the site that is going to thumb their nose at them and say, like, haha, come and get me. I don't know that that is a realistically good strategy. I think, obviously, there are lots of problems and challenges and risks with that strategy as well. But I do think that goes into some of the thinking that people have, where it's like, well, if we put enough pressure on these sites, we can use that as sort of, look, everybody else is complying, why aren't you?
Ben Whitelaw:Okay. Yeah, I mean, that definitely seems to be a strategy that they're taking; whether it'll be successful, we'll see. We started the episode, Mike, by talking about an online influencer who was struggling to make money. We'll kind of wrap things up today by talking about, uh, a woman who has no issues with making money. Are you familiar with Bonnie Blue?
Mike Masnick:I, I will admit, I had no idea who she was until I saw this link that you submitted last night. Um, I have since read a few articles about her and have a sense of who she is, but I had no idea who she was, uh, prior to,
Ben Whitelaw:Well, I won't go into the ins and outs, um, so to speak, but she is an OnlyFans creator. She's a social media influencer at large. She, again, is an adult creator, I would say, and she's famous for her rivalry with another adult creator, and they are both essentially, over time, one-upping each other with, you know, more, uh, how do you
Mike Masnick:How do you explain this one, Ben?
Ben Whitelaw:uh,
Mike Masnick:More outrageous, outrageous stunts involving adult
Ben Whitelaw:Yeah. Yeah, I think that's all we need to say. If you want to find out more about that, Google her. She's become interesting this week to us specifically because her latest stunt has led to her being taken off of Instagram, TikTok and OnlyFans, where she makes a significant amount of money, reportedly 600,000 pounds a year, I believe. Basically, TikTok and Instagram have not, I don't think, made a statement, but OnlyFans said that the latest stunt is, quote, extreme and crossed the line, which means that they now feel she's breached their acceptable use policy and terms of service. Which I think is really interesting, because we know that these policies are somewhat malleable, and I went into both of those documents and tried to see where the latest stunt might have been crossing a line. There's a line in there about a lack of express consent, which this latest stunt, based upon my very simple understanding of what she's planning to do, might cross. But otherwise there's not much of a justification, I would say. She is obviously very upset about it. She makes all of her, you know, money from these platforms, and suddenly she has no online presence compared to like a week ago. And I just think it's a really interesting example of, again, the way that platforms can react to media pressure and the kind of PR, essentially, that comes with some of these creators becoming much bigger than maybe these platforms even intended in the first place.
Mike Masnick:Yeah, and there are a few different things here. There's also a question of, like, the planned stunt, I would say, it wasn't clear to me that that was going to be via OnlyFans or even associated with OnlyFans. So this is another example, sort of, of potentially off-platform behavior influencing on-platform policy decision making, which is always a challenge. And you have some people who insist that you should never use off-platform behavior to determine whether or not you violate policies for the platform, but that becomes really, really tricky very, very quickly. The other thing I'll add is that, in my research last night trying to read up on who the hell this person was, she has claimed that that stunt is off, but she's planning to announce some other stunt that she claims will be even better, which, you know, who the hell knows. There is an element of, is she really upset about this, or is she sort of leaning into the controversy again? I had never heard of her until this happened, and now I'm aware of her. Unfortunately, it's something that I
Ben Whitelaw:welcome. You're welcome.
Mike Masnick:cannot forget. Um, but,
Ben Whitelaw:Yeah. But
Mike Masnick:this is where, you know, we talked about this with Dom at the top of the podcast: outrage gets interest, right? And so there is this incentive to do stuff that is outrageous, and you can see the stunt that she was planning was deliberately outrageous just to get attention. And it feels like she has been, to some extent, you talked about sort of rivalry with someone else, but also she's just sort of one-upping herself. She has done other ridiculous stunts, she like faked an arrest at some point. There are a whole bunch of things where it's all about, what stunts can you do to get attention? It's attention for attention's sake. And, you know, platforms have to have rules to determine, like, what do we find okay. And in some sense, I'm sure they're going through the process of thinking, are we encouraging this kind of behavior ourselves? And so everybody has a line, and OnlyFans' line is pretty far out there, but apparently this crossed it.
Ben Whitelaw:It's very generous of you to say that these platforms are thinking, are we encouraging this behavior and should we stop encouraging this behavior? Um, I'm not sure that's necessarily true, but you're right. You know, this is the outrage internet, and the question is kind of who pays for it, financially or otherwise. So yeah, you know, it links, to an extent, to the France story, and it certainly, uh, links to the profile that we talked about at the top of today's episode. I'm glad to have been able to bring Bonnie Blue to your attention. Um, sorry about your subsequent internet history, and, uh, that is probably a good point to wrap up today's podcast. It's really great to be back in the chair with you, Mike. I'm really, really glad to be talking about, you know, the mess that is moderation and internet regulation. Really excited to continue to do that. I should be back on next week's podcast. You'll be taking a break in a couple of weeks, um, much earned. And yeah, we'll get back into the rhythm of things, right?
Mike Masnick:Yeah, absolutely. It's good to have you back, and, uh, I think, for someone who is sleep deprived and dealing with changing nappies, as you say, uh, I think you were pretty on point.
Ben Whitelaw:Thanks. I appreciate that. I, I'll try and have two hours sleep for next week's episode.
Mike Masnick:There we go. Yeah.
Ben Whitelaw:Um, if you enjoyed today's episode, everyone, please do rate and review us wherever you can. If you are interested in sponsoring the podcast, get in touch with us at podcast@ctrlaltspeech.com or go to the website, ctrlaltspeech.com, you can find details there. You know, that helps keep us going. And also, if you want to send us some, I don't know, drugs to keep us awake or, you know, Red Bull, that would also help me right now. So, um, it's been great to be back and, uh, good to see you again, Mike. Uh, thanks for listening.
Announcer:Thanks for listening to Ctrl-Alt-Speech. Subscribe now to get our weekly episodes as soon as they're released. If your company or organization is interested in sponsoring the podcast, contact us by visiting ctrlaltspeech.Com. That's C T R L Alt Speech. com. This podcast is produced with financial support from the Future of Online Trust and Safety Fund, a fiscally sponsored multi donor fund at Global Impact that supports charitable activities to build a more robust, capable, and inclusive trust and safety ecosystem.