Ctrl-Alt-Speech

How The Online Regulators Stole Christmas

Mike Masnick & Ben Whitelaw Season 1 Episode 41

In this week's round-up of the latest news in online speech, content moderation and internet regulation, Mike and Ben cover:

This episode is brought to you with financial support from the Future of Online Trust & Safety Fund. While Online Regulators may have stolen Christmas, Ctrl-Alt-Speech is going to try to take a short holiday break and will return in early January.

Ctrl-Alt-Speech is a weekly podcast from Techdirt and Everything in Moderation. Send us your feedback at podcast@ctrlaltspeech.com and sponsorship enquiries to sponsorship@ctrlaltspeech.com. Thanks for listening.

Ben Whitelaw:

So Mike, 'tis the season to be jolly and one of good tidings, and that only means one thing for me: Secret Santa. Uh, I don't know if you're a fan of it in your house, but it's very much a staple of the family Christmases I partake in. And the tool we use is called Elfster. Uh,

Mike Masnick:

Of course it is.

Ben Whitelaw:

Of course it is. And, uh, so I consulted Elfster as I was, you know, working out what I was going to buy for my Secret Santa. And the prompt Elfster gives you is: what are you wishing for? So, Mike, what are you wishing for?

Mike Masnick:

Well, I have to say my wish came early this year, because yesterday, just last night as we record this, we actually hit our threshold, our goal, for our Kickstarter for One Billion Users, the social media card game, and we are going to go into production. We did not think we were going to make it, and yet we did. So it was a, uh, early holiday miracle

Ben Whitelaw:

amazing. I love that.

Mike Masnick:

That's one way of putting it. What are you wishing for, Ben?

Ben Whitelaw:

Well, as you know, as usual, I'm wishing for some reviews of Ctrl-Alt-Speech, um, from our listeners. But also, uh, I want to wish for some time off, not just for me or for you, but for our trust and safety professional listeners, who have been hit with a barrage of regulatory codes and standards this week and who, it looks like, might not have such a, uh, an enjoyable Christmas period after all. Hello and welcome to Ctrl-Alt-Speech, your weekly roundup of the major stories about online speech, content moderation, and internet regulation. It's December the 20th, 2024, and this week's episode is brought to you with financial support from the Future of Online Trust and Safety Fund. My name is Ben Whitelaw. I'm the founder and editor of Everything in Moderation

Mike Masnick:

had a check.

Ben Whitelaw:

Seemingly, yeah. And, uh, I've got Mike Masnick in the chair, who is the newly crowned King of Kickstarter. I was very surprised, I've got to say, Mike, about the success of your Kickstarter. This week you were miles away from the $50,000 total you wanted to hit. A couple of days ago, something happened.

Mike Masnick:

Yeah. I mean, it is a social media miracle story. Um, basically the Bluesky folks went crazy about it. You know, on Wednesday, in the lead-up, we were still at basically about 50%: a little over $25,000 of the $50,000, with two days to go. And I sort of realized this is probably not going to make it, which, you know, happens. This is why we used Kickstarter. It was slightly disappointing, but not hugely disappointing, because this is the point of Kickstarter: you test out whether there's interest in the market. And the original answer we got was, well, maybe not. Um, and then, you know, on Thursday, I posted on Bluesky about it a little bit. I posted elsewhere as well, LinkedIn, Threads, uh, Mastodon. But people on Bluesky were like, hey, wait, we need to make this happen. And then yesterday they just went crazy. A whole bunch of people were backing and posting and talking it up and saying, let's do this, let's do this. And it was fun and exciting. Everybody was watching it tick up together. And we hit the goal and zoomed past it. And, uh, you know, it's open for late pledges, which is a new feature that Kickstarter allowed. So we went right past $50,000; we're right about $60,000 as we record this. Um, and it's a stunning success. And I would say it's almost entirely due to the community folks on Bluesky really rallying behind it. It's been kind of amazing.

Ben Whitelaw:

Yeah. And obligatory, you know, note of the fact that you are on the board of Bluesky

Mike Masnick:

Yes. Yeah. Yeah.

Ben Whitelaw:

There you go, you almost forgot,

Mike Masnick:

I almost forgot. Yeah. I mean, it had nothing to do with my position on the board, but

Ben Whitelaw:

no, no, no.

Mike Masnick:

There is a, yes, we should note that and should disclaim that fact. But, uh, you know, it was very nice. It was the best present I'm probably going to get this year.

Ben Whitelaw:

Right. And just to be clear, you can see from Kickstarter where folks come from, like you can track referrals.

Mike Masnick:

There's a tracking thing, and a lot of our backers had come from Bluesky. Before, I think yesterday morning, we were at about 33%, so about one third of our backers had come via a link on Bluesky. And by the end of yesterday, when we passed the goal, it was at 43%. So a massive increase just in that day. A little less than half of our backers came via Bluesky, which, you know, is kind of a statement on how Bluesky is doing these days, and that there's an active community there. So it was really, really fun for me to see. And obviously I'm really happy that we now get to produce this game. Um, we have some work ahead of us on our side in terms of doing the actual production, but we had planned for that, and now we just have to go and do it.

Ben Whitelaw:

Okay. So the game is about building a social network with a billion users, right? And there are content moderation and online speech considerations around it. It's very similar to the online games you've produced in the past, where there are trade-offs. So just talk a little bit about how the card play brings some of that home. I'm really excited about this. You know, I've got five packs coming, I've got five games coming next year. I was a backer, so I'm really interested to see how this will play out.

Mike Masnick:

Yeah, yeah. It's fun. I will note, and we tried to be clear about this, that the digital games we've done, Moderator Mayhem and Trust & Safety Tycoon, were designed very deliberately to be highly educational, really trying to put people into these roles. With the card game, because it's a card game, you don't have the same affordances as you do with a digital game. So it is a fun card game. It was based off of a 1906 card game about car racing, but we changed everything around, and then we added in all of these really cool elements related to social media. I don't know if you would consider toxicity to be a cool element, but there is this concept of

Ben Whitelaw:

Core core rather than cool.

Mike Masnick:

Yeah. Yeah. Um, so toxicity is something you have to learn how to manage on your platform. You're dealing with influencers who may or may not be good, who may be problematic. There's a whole bunch of other things you have to deal with in the game, including online regulations. There are all of these things that we talk about. So, at its heart, it's a fun game. Some of the deeper concepts we talk about are maybe a little more abstracted in the game, because it's not fun to go deep in the weeds on, like, regulation in a card game. But there are some elements of it in there, to make people aware of some of the issues we talk about and maybe interest them in finding out more. At its heart, though, it's just a really fun game. You can play it without knowing anything about trust and safety, but you might begin to pick up at least some of the terminology and understanding of it.

Ben Whitelaw:

When does it come out, and when can people start playing it?

Mike Masnick:

Yeah. So we are aiming to deliver it by June of 2025. We've got to finish up a couple of things on our end. It'll go to the printers early in the year, and then we'll have to handle all the shipping and logistics and all that kind of stuff, which is a fun story in its own right. But we're targeting June of next year.

Ben Whitelaw:

And will people be able to buy it if they haven't backed it?

Mike Masnick:

So the Kickstarter has late backing, and it is available for a little bit longer. We're not entirely sure when we're going to close that off, but we'll keep it open for a little while longer. And then, depending on how many copies we print, we're not going to print too many more copies than were ordered, but we probably will have some left over that will be available for sale. We're only planning on doing one big print run, and we're still determining exactly how many that'll be. And then we'll figure out a way to sell off the remainders, I

Ben Whitelaw:

So essentially, you know, listening to Ctrl-Alt-Speech is basically just practice

Mike Masnick:

guess.

Ben Whitelaw:

for the game when it comes out, right? This is like you genning up on how to potentially win against your family and friends when playing One Billion Users

Mike Masnick:

Absolutely. This podcast is like the secret instruction book for the game.

Ben Whitelaw:

Great. Okay. Um, well, we should get into today's stories then, and talk a bit about some of those issues that come up in the game. But it is worth noting before we do so that we're going to be off for the next two weeks at least. We may, uh, run into January a bit more, depending on how we feel, uh, how much turkey we eat. But yeah, Mike and I will be taking a bit of a break. I think this is our 42nd episode this year, Mike, which is not bad for a first year. And, uh, we were really pleased with

Mike Masnick:

We didn't start till March. I mean, it's a little crazy.

Ben Whitelaw:

Yeah. We've really been, um, proving out this concept, and we've had some great feedback from listeners about how it's gone. So I'm excited to get back and do it all again next year. For now, we should jump into today's stories and see how we get on. I'm going to kick off this week, Mike, in a bit of a change to proceedings, by talking about what I think is the big story of this week, which is the Online Safety Act. You know, you're a fan of all regulation. I don't know where this fits in your, you know, top five,

Mike Masnick:

Oh boy.

Ben Whitelaw:

but the, uh, Online Safety Act has well and truly come to fruition this week, after Ofcom, which is the UK regulator, announced that its first code of practice, for illegal harms, was being released. What that means is that, essentially, platforms under the regulation will, as of March next year, have to abide by the Online Safety Act in all of its forms. And the code of practice sets out what that looks like, what the regulator wants of the platforms and intermediaries that fall under its remit. The estimate is that that's around a hundred thousand different platforms of different sizes, some big tech platforms, some much smaller ones, which we're going to talk a bit about. And at the risk of kind of aging myself here, Mike, which is not usual on the podcast, you know, normally it's the other way around, but I've been, I've

Mike Masnick:

'Cause I'm the old guy.

Ben Whitelaw:

Hey, um, I've been following this since the white paper was put out. There was an online harms white paper in 2019. This is really a significant milestone, because the online safety regime in the UK has been building since then; it's been six years in the making. That was the paper that really set out a regulatory framework, and Ofcom has basically, finally, put this out into the world as the first of a series of codes that will govern that. So it's a big deal in lots of senses. There's a ticking clock to it, because it has this three-month lead time that platforms need to adhere to. Before we talk about the ramifications of that, I just want to talk about, okay, what is this code then? What is it in real terms? Essentially, it's a lot of documentation. It's a lot of things to read. And I do empathize with any of the platform trust and safety folks who are going to have to get up to speed over the next couple of weeks and go back in the new year with a clear view of what they need to do and what they already do. There are basically three volumes when it comes to the illegal harms code. One of them is 116 pages as a PDF. The second one is a whopping 540 pages, which is, you know, enough to make you fall asleep, I imagine. Uh, and the third one is a hundred-plus pages as well. Together they set out, from a governance and risk perspective, a service design and user choice perspective, and a transparency and trust perspective, what platforms have to do. In each of those chapters there is a series of recommendations that platforms of different sizes have to abide by. A lot of those are things we've talked about platforms doing, Mike, you know, having specific labeling or particular design features. But there are also things in there which are somewhat new, which Ofcom is clarifying for the first time, such as having a senior person named as the responsible person, what you'd like to call the kind of hostage law, I think.

Mike Masnick:

Yeah. A little different than that, but yes, the who-to-blame aspect of it. Yeah.

Ben Whitelaw:

Yeah. So there's a lot of detail in there, and a lot for platforms of all sizes to get across in the next three months before Ofcom starts coming down on them if they fail to adhere to it. And there are fines, both in absolute terms and as a percentage of revenue, whichever is larger, that intermediaries will face. So it's a big deal as far as the UK's journey to becoming the safest place in the world to be online. Um, I think four or maybe five prime ministers have talked about this, Mike, in the last five or six years.

Mike Masnick:

You go through prime ministers pretty quickly, Ben.

Ben Whitelaw:

Disposable in the UK, I'd say. Um, but yeah, it's a big deal, and I wondered what you thought across the pond, whether it had traveled to you and what you made of it.

Mike Masnick:

Yeah. I mean, I think it's interesting. We sort of all saw this coming, right? We knew that this kind of thing was going to happen, and Ofcom has been talking about it forever, so seeing the details finally come out is interesting and certainly a milestone. But I think we're going to begin to learn what the actual impact of these rules is. The larger statement, though, is that this goes back to the conversation we had a month or two ago around Daphne Keller's piece about compliance. The story is that we're turning into a world of compliance, with an incredible amount of paperwork and an incredible amount of, what do we have to do to satisfy these regulations? And it becomes a global thing, because the global internet is a thing. Borders don't exist in the same sense as they do offline, though they increasingly do exist in some form on the internet. And now there are different compliance regimes in different places. Folks who are operating services in the EU are dealing with the DSA. Now in the UK they're going to have this. We're going to talk about Australia in a moment as well. The US is still doing whatever it's doing, which may turn up laws and may not, depending on whether or not we have a functioning government at all, which is looking increasingly unlikely, but that's a separate story. So I think it's just another compliance issue that's going to be tossed onto the pile. And it'll be curious to me, I want to talk about some of the reactions that especially smaller sites have had to these rules, but I'm beginning to wonder whether some sites may decide that it's just not worth it to do compliance for local areas if they don't have enough users there. So, you know, blocking out the UK, I would not be surprised to see some sites do that. The initial reactions we have seen are some sites saying that they're literally going to shut down, and I wanted to discuss that a little bit. The one that got the most attention was this London-based cycling forum, the, uh, London Fixed-gear and Single-speed community, which is apparently a semi-active community. But it's actually bigger than that, because the guy who runs it built his own forum software called Microcosm, and he currently hosts about 300 forums. He said he's shutting them all down the night before the Online Safety Act goes into effect, because he's basically doing it as a hobby. There is some payment involved, so it is a commercial activity, but he said he's effectively losing money on it; he's just trying to break even. And there's no way that he can comply with these things. This is the kind of thing that we've warned about, that I've warned about for years: for all the best intentions that some of the regulators have, regulating the internet as if the internet is Meta and Google and just these large players means that smaller players are unlikely to be able to survive, because the compliance costs are so high. You can try to do things like create thresholds, as the DSA does, but even there you still have some element of compliance that falls on smaller people. And the promise and the great wonder of the internet in the early days was the idea that anyone could set up anything, right? I mean, Techdirt exists because I could just set up this blog without having to worry about any sort of compliance costs and designating a head of trust and safety and all this kind of

Ben Whitelaw:

And imagine if you hadn't, Mike, we wouldn't be

Mike Masnick:

Yeah. Yeah. What would I be doing? But, um, you know, I fear a little bit, like, I understand the reasons why, and I understand this goal of being the safest place on the internet, and I understand the concerns about online harms and all of these things. But I still feel like this is a turning point, and I fear that it's a turning point that kills off some of that early promise of the internet, that anyone could build a site on their own, and locks in that only the biggest players, who can handle the compliance costs, are going to be able to build websites. It reminds me of other eras. If you look back at other mass media tools, you know, radio, when it launched, was originally this kind of thing where anyone could broadcast. Radio was this amazing thing. And then, because there were more practical concerns with radio in terms of spectrum and interference and everything, it eventually narrowed down to only these professional licensed broadcasters being able to do anything. And we lost a lot that way. The internet had resisted that for the last two and a half to three decades, and that might be going away. I'm a little sad about that, even as I understand the intentions here and I understand the purpose. But seeing the Microcosm forums saying they were shutting down seems like a sign. There was also the, I think, was it a Linux gaming forum, saying that they were going to still post articles, but they were going to shut down their forums. We're beginning to see this reaction, where the smaller guys are the ones who are effectively shutting these things down.

Ben Whitelaw:

Yeah, there's definitely an element, from reading those two announcements. And those sites, it should be noted, have been going for 18 years and 15 years respectively. Um, and they both have pretty well-developed moderation systems and platforms. They're somewhat volunteer-led, and they don't make a bunch of money, as you say. But there's an element to it of, the big platforms have ruined it for the rest of us. You know,

Mike Masnick:

sure.

Ben Whitelaw:

Right? Like, we aren't the issue here. We police our gardens, that's a kind of mixed metaphor there, we manage our gardens pretty well. We have systems in place. We've built our own software. We've invested. We've created relationships with people online to be able to manage this. And yet, because of the hundreds and hundreds of pages of documentation, the risk for me as an individual is too high for this to continue. That really comes through in those announcements. And those are just the first two, in the first few days since the codes were announced. I'm really expecting many more of these.

Mike Masnick:

Yeah, and I think that is what is likely going to happen. And then the question is, if you want to point fingers, do you point fingers at the failures of the larger companies to deal with this stuff? I'm not even sure that that is entirely fair either, because look at how much the large companies have invested. You can say it hasn't always been as good as it should be, but there have been really large investments from the large companies in doing trust and safety, because they recognize that you have to just to run a site. Not for regulatory reasons, not for compliance, but because you don't want the harms to be happening on your platform. It's not good for business. It's not good for your soul. There are all sorts of reasons to do this. And some of the underlying factors, like, humanity is not always great, and this is another theme that we've come across a bunch of times: bad things happen. There seems to be this underlying wish that we can just regulate away the bad things by pinning them on these intermediaries, these generally larger intermediaries. And this theme that I've talked about forever is, how much of this stuff is really larger societal problems? We've talked about mental health issues. We've talked about other things that have been around forever, effectively, and governments have never really been able to solve some of these larger social problems, or not solve them for a while, or not solve them well. A lot of this really feels like an abdication, the government saying, well, we can't solve these problems, so we're just going to hold websites and platforms responsible for solving the problems that government was unable to solve. That doesn't feel like a satisfying solution either. It really feels like passing the buck in a lot of ways. And then we get to this world where it's just this compliance-heavy function. The other thing I think we're going to start to see, as we deal with compliance around preventing online harms, is that non-harmful but actually often important information is taken offline. The best examples of that are resources around eating disorders or suicide. Tools like that become a lot harder to run when someone could turn around and blame you for causing those harms, as opposed to recognizing that you're trying to create resources to help with those harms. I think we're going to find a lot of unintended consequences. Well, there's a question of whether they're unintended or not, but we're going to see consequences of these laws where you're trying to prevent harms but in some ways may actually be exacerbating them, by removing useful and helpful resources, because people don't want to be accused of something that they're not doing.

Ben Whitelaw:

Yeah. And the radio analogy is an interesting one, isn't it? Because we think about some of these forums and sites like we think about old pirate radio stations

Mike Masnick:

Sure.

Ben Whitelaw:

that, you know, no longer exist, but that we have this unique picture of in our minds. They no longer exist because they didn't have the structure, or couldn't bear the costs, to keep going. So it's hard to know what it will look like and when this will all happen, but we do know that we're already seeing the backlash to these codes, with some of these sites saying, actually, this is too much for us.

Mike Masnick:

Yeah.

Ben Whitelaw:

And this story coincides neatly, Mike, with another regulatory story this week, happening right over the other side of the world in Australia, which I know we've been talking about in various forms over the last few weeks. Australia has also launched a couple of new standards, which are essentially versions of codes that they have basically enforced upon platforms that are in Australia or have a user base in Australia. A little bit of a history lesson: back in 2022, Australia adopted its own Online Safety Act. It asked the industry to create codes to deal with egregious content. You'll probably remember that basically six of the eight codes the industry put together were accepted, and they'd gone into place. Two of them were not accepted. And so the eSafety Commissioner, the equivalent of Ofcom, the regulator in Australia, has said, okay, we're going to write these for you, and here are the rules you're going to have to abide by. These two standards, they're called standards to differentiate them from codes, I wish they'd all use the same language, um, are known as the Designated Internet Services standard and the Relevant Electronic Services standard, and they're targeted at cloud-based storage services that typically hold and distribute harmful material such as CSAM. So again, this is a significant moment in Australia. It comes hot on the heels of the famous under-16 social media ban that we've talked about on the podcast. And again, there's a whole narrative taking shape in Australia around becoming the safest place in the world to be online. You said you saw some parallels between the two, that maybe Australia is just a bit ahead of the UK.

Mike Masnick:

There are different approaches, but both of these countries are sort of declaring, we're going to try to make the internet safe, and we want the internet here to be the safest, with this semi-recognition, semi-non-recognition that, again, the internet is somewhat borderless. Not entirely, but somewhat borderless. Even in the announcement in Australia, they sort of admit, most of the companies we're trying to regulate here are not based in Australia, but we're demanding that they do these things. And that creates all of these other questions about how you build a safe internet for a particular country when, again, a lot of the harms and the risks and the regulatory compliance is somewhat amorphous and unclear. And the requirements here have consequences that I feel like the regulators either don't care about or aren't thinking about so much. This particular regulation in Australia is very focused on the online storage space, and it's effectively saying that companies have to scan more proactively for harmful material. That raises a whole bunch of questions. In the US context, and I think we've discussed this a few times, that's a lot harder, because you create Fourth Amendment issues: if you're forcing private actors to search, then you limit the ability to use any of that material as evidence in a court case.

Ben Whitelaw:

Mm

Mike Masnick:

That obviously doesn't apply directly in the same way outside of the US, but any sort of mandatory scanning creates all sorts of problems, especially for online storage. First of all, there's the determination of what is harmful content. There's some content that is absolutely, without question, harmful; the CSAM space is that. And already most of these companies are using something along the lines of PhotoDNA from Microsoft, or other similar tools, to do some of that matching. But there are concerns about the quality of that as a proactive enforcement tool. PhotoDNA seems to be very good at matching known problematic photos, but there have been reports about how PhotoDNA can be gamed, or there are problems with it, and some of the other tools are even less sophisticated. Especially as you're trying to deal with more AI-generated new content, there is nothing to match against. And, you know, the regulations here talk about that as an important part of what they're trying to stop, but it is much harder to detect without errors. There are all sorts of stories that have gone back forever, of things like a parent taking a picture of their baby in the bath, which is non-sexual material but can be interpreted that way. There was a story, I can't remember when it came out, in the New York Times, about someone who lost their entire Google account because their toddler, I think, was having some health issues, and the doctor said, send me a photo, to do some telemedicine, basically. So the parent took a photo of their child naked, and it was stored to Google, and they lost their entire Google account because Google detected it and said, this is a naked child. Which it was, but it was not done for sexual purposes; it was for medical purposes. There was a legitimate reason, and systems are not good at determining that, you know? These kinds of cases are not really taken into account in this idea that, oh, we're going to force these companies to proactively do this stuff. We're going to hear more and more of these kinds of situations, because everyone just assumes that everything is easily bucketed into bad content and good content. And over and over again, the one thing I keep trying to stress is that you can't easily define most content as good or bad, and yet most of these laws are requiring you to do so. And then the only way to do it at scale is automated scanning, which takes away all of the nuance, and you create all sorts of downstream problematic impacts from it. So I'm concerned. Again, I think this is my role on the podcast, to raise concerns about this rush to regulate and how it's actually going to play out in effect. And the idea that the UK and Australia now feel like, to some extent, they're competing to be the safest country for the internet, I sort of fear where that ends up and what it leads to. And again, like I said earlier, maybe some companies decide to just bail out on the UK. I could also see that happening with Australia. Australia and the UK are decently large countries, but they're not enormous markets. If compliance is going to cost huge sums of money, I could definitely see companies saying, you know what, forget it, we're just going to block access entirely.

Ben Whitelaw:

Yeah. We've seen that in other areas, with things like GDPR, where some sites still don't cater to EU users because of regulations like that. So yeah, I can also see that. And I wonder, you know, I'm generally pro sensible, smart regulation, but it's so hard to get right that I wonder if it's going to impact users negatively when, actually, keeping them safe was what the regulation was designed to do in the first place. There's a double-edged sword there, I

Mike Masnick:

Yeah, I do think that there is a real fear there. And I should note, and I think the UK Ofcom people especially make this clear, and the Australia folks to some extent as well, they try to present this as different from US-style regulation, which is what a lot of people think of, where the focus is really on the enforcement side: if you do something wrong, we're going to come down on you and cause you lots of pain. Both the UK and Australia talk a game about, oh, here are the potential punishments and fines we can issue, but they set it up more along the lines of, we will work with you to get into compliance, as opposed to, we're just going to bash you over the head if you make a single mistake. And I appreciate that that is a different approach, but, you know, the fear of having to deal with it may still be too much. We've already seen some clashes. Certainly Musk and Australia had a clash earlier this year, and the UK also, right, where both of those countries have demanded stuff of him, perhaps more reasonably, based on how he's running X. But you could totally see both of those countries going after X over these sorts of safety issues, and he may just get fed up and be like, we don't need you anymore.

Ben Whitelaw:

We should move on to some of our other stories, Mike, but the thing to point out there is this deadline of March next year. I think we're going to see Ofcom, like the European Commission, try to hold certain platforms' feet to the fire. And if they don't do what they're being asked, there will probably be some interesting investigations and cases brought against them. And I think we'll probably see some interesting stuff happening around

Mike Masnick:

Yeah, really quickly though, I did want to point out one other thing, just to go back to it. I know we've done the Australia under-16 thing, but there was this interview on NPR in the US with Julie Inman Grant about this, talking about the age restrictions, since we were talking about Australia stuff, which I thought was quite an interesting read. She sort of admits that the age of 16 was chosen somewhat arbitrarily, basically saying, well, we could have done other ages, there were proposals for all different ages, and, hey, the prime minister chose 16. And then, when there were questions about how you do age verification while protecting privacy, she talks up this service that uses hand motions to determine your age, which

Ben Whitelaw:

Sounds, sounds like a Nintendo Wii.

Mike Masnick:

It feels like snake oil. I then went looking, and I found presentations from a few different providers who claim this technology. And as far as I can tell, what they actually say is that they can determine if somebody is, like, over 30 or a kid, but they can't distinguish 15 from 16, which is the thing you need to determine to make this worthwhile. I saw a demo by a guy who founded one of these companies, who looks like he's probably in his fifties or sixties, and he does this test with his hand, and he makes these hand motions, and it says, you're over 18. I was like, okay, I can understand that a hand motion can determine that a senior citizen is over 18, but that's not the important part. The important part is, can you determine the difference between a 15-year-old and a 17-year-old? And the studies I've seen on this kind of technology say it has a margin of error of around four to five years. That's not going to work for this kind of thing.

Ben Whitelaw:

So your reading of that interview was that maybe Australia is slightly making things up as they go along.

Mike Masnick:

Yeah. That is definitely the sense.

Ben Whitelaw:

Okay, I didn't read that. I'll have a look. Um, Julie Inman Grant is an interesting link to our next story. She's actually featured in this next piece that you pulled out as an interesting read for us this week, Mike. Similar to last week, it's a big-read personal story about somebody who's been affected by online harms, this time in Bloomberg.

Mike Masnick:

Yeah, it's a story mainly about a kid who ordered some pills, which he didn't realize had fentanyl in them, I think, through Snap, and went basically into a coma. He survived, but has lingering issues, and his family is now suing Snap. The article is another one of these heart-wrenching, tragic stories of a young person who had some issues, and it talks about other kids who ended up dying after taking pills laced with fentanyl. And pills laced with fentanyl are a major, major problem, a known problem around the world. But the framing of the article I thought was somewhat misleading and very problematic, in that, one, it blames Snap quite a bit, and then on top of that it blames Section 230. We could go into the deeper weeds on this, but it's probably not necessary to go that deep. Throughout the story there are all these things where you ask, where are the parents in this? There were stories of kids sneaking out of the home at night to get the pills, and that has happened throughout history. People buying illicit drugs happened pre-internet, all of these things. The problem here really appears to be fentanyl and the widespread use of fentanyl. The idea of pinning it all on 230 is extremely misguided, and we've gone through this in other ways too, but if you change 230 or take away 230, it doesn't stop kids trying to access drugs; it doesn't stop illicit drugs. The companies don't want this stuff on their platforms, and they do make efforts to get it off. There are, like, one or two paragraphs given to the fact that Snap has this whole program to try to block these kinds of sales, but it is a constant game of whack-a-mole, like any trust and safety effort. The idea that changing 230 would somehow deal with this, it just felt like there are people out there who want to blame the internet for all sorts of other problems and don't want to look at the larger societal-level issues that lead to these situations. So it is an interesting article for learning how these things happened. And there are other things in the article that state, over and over again, that law enforcement, when they get involved, choose not to prosecute, even though they have the information about who sold the drugs. And then the article keeps going, but this is all Section 230's fault. But it's not. You're pointing to the problem that law enforcement doesn't have the resources to prosecute these people. Why are we blaming 230 for this?

Ben Whitelaw:

This is a really, really interesting profile for the fact that this is a young man who actually survived, right? In the case being brought by a bunch of families affected by Snapchat, I think it's 50-something people bringing this case against Snap, he and one other kid are the only ones who survived. He's in a wheelchair, he's going for physiotherapy, and he's pretty bullish about the extent to which he's going to try to address the platform dynamics that led to him buying these drugs. And I guess the counterpoint, Mike, would be: we've seen a whole suite of platforms recently change their design to make it harder for people who don't know kids to get in touch with them, but wouldn't the point be that, at the time this incident happened, it was too easy for people to get in touch with this kid and sell him drugs? I take the point about Section 230, but aren't there other elements, like the codes of practice that Ofcom and the eSafety Commissioner have introduced, that try to mitigate partly what happened to this young kid?

Mike Masnick:

Yeah. But again, this is the kind of thing that everybody's figuring out in real time. It's easy to look at it in retrospect and say, well, obviously they shouldn't have allowed these kinds of things to happen. But this is actually where 230 is valuable, because 230 allows the companies to make these experiments and realize, oh, wait, this is not working, this is creating a problem, and we need to adjust. If you don't have 230, each of those changes basically has to be reviewed carefully by a lawyer. Whereas with 230 in place, a trust and safety team that sees a problem can proactively say, we need to do something to deal with it. And in fact, later in the article, they do talk about another family that also lost a child to fentanyl-laced drugs, who set up a nonprofit that is designed to meet kids where they're at. It uses Snapchat to try to help kids and prevent these kinds of things. Snapchat is meeting them where they're at, rather than trying to, you know, snap a finger and solve the problem, which doesn't make the problem go away; it just brushes it under the rug. Whereas if we can go in and use Snapchat as a tool, where the kids are, and talk to them that way, maybe we can do more. And the article kind of dismisses that, because Snap actually gave some money to the nonprofit. As if that wipes out all of the good this organization is doing by actually trying to figure out what is going to work, rather than pointing fingers and filing ridiculously mistargeted lawsuits; they're saying, let's build systems that actually do things that help. And this article just kind of dismisses them. There was a related point: the article blames 230 and argues about how terrible it is and all this kind of stuff, and then at one point the mother who brings the lawsuit, and keeps getting told, you can't sue Snap because of 230, says, I can't sue Snapchat and win because of 230, but I'm happy to sue Snapchat and cause a scene. Basically admitting, I'm filing a vexatious lawsuit in order to make noise. And that's not creating a solution. I understand the anger and hurt that the family feels, but it's mistargeted, as opposed to the other family that said, no, let's figure out a way to meet kids where they're at and actually develop solutions.

Ben Whitelaw:

So, I mean, it begs the question, and I don't know if there's a short answer to this, but what happens with 230? What is the answer going forward? Because parents like her don't feel like they have any means of addressing the harm caused to them or their children.

Mike Masnick:

Well, I think some of it is just this misunderstanding of 230 and how it functions. I had just submitted a question for another podcast, a very popular podcast, which asked me to submit a question for another family that had lost a child, in the AI lawsuit we talked about previously. They wanted me to ask about 230, and I was like, well, I don't want to. This was directed at the family, and that feels like a very insensitive thing. I don't want to target them with specific, in-the-legal-weeds questions about 230, because that's not what they want to hear. They're reasonably and totally understandably upset, trying to figure out what to do. But the underlying thing is, do we solve the actual problem? There are underlying societal issues around mental health and drug use and all of this stuff. And if you take away the liability shield that allows companies to try to figure out what is best, what you are guaranteeing is that all this stuff gets swept under the rug. And we have a long history showing that that doesn't work. We're taking away the tool that allows companies to figure out what works best, by saying, you just have to stop it, and if you don't, we're going to punish you. And nobody knows how to solve these problems. At a larger scale, there are questions about prescription drugs and abuse, how the drugs get into the country, law enforcement, education, and, again, mental health, all of these things. And yet we're expecting private companies to magically solve all of these problems. The whole thing is mistargeted. And I understand why, because the family is upset and they want to make a scene, and they feel that making a scene will lead to something happening. Effectively they're saying, we're not experts in this stuff, because they're not, and they're upset, reasonably and understandably. I'm not trying to minimize the absolutely terrible situation that all of these families have gone through, but they're targeting the wrong thing. And I think in the long run they make it worse when they do.

Ben Whitelaw:

Yeah. I suppose it's easier to campaign or bring lawsuits against one specific issue than it is to try to address the issue with prescription drugs, and the issue with, you know, the medical system in the US, and all these other things, right? It's the fact that you can bundle all of those frustrations up into one single law, one case, and run at it.

Mike Masnick:

It's totally understandable, but it's the old saying: for every complex problem, there's a solution that is simple, easy, and wrong. Right? And that's what this is. It's a simple, easy, and totally wrong solution that will actually cause more harm. It's horrible to say, but these are complex situations that don't have easy solutions, and anytime people get focused on easy solutions, you're probably making the problem worse.

Ben Whitelaw:

Yeah, I mean, I totally agree. And, uh, we started the year talking about how these things are never easy, we'll finish the year talking about how these things are never easy, and that's proven to be the case. Let's go on to a few of the other, smaller stories we noted this week, Mike. We've talked largely there about regulation, whether it's the way forward or not, and what the implications are. One company that we both noted was responding to some pretty heavy government pressure to be more regulated is Telegram, which has got a new, quite slick, quite jazzy moderation page on its website. If you go to telegram.org/moderation, it has a whole bunch of data that Telegram has never, ever put out before, which is super interesting. So this is a new thing. I'll describe a little bit of what you can see as a user. It basically shows a few different charts with the total number of groups and channels blocked on the platform, the CSAM groups and channels, and the terrorist communities it's banned. And it has these kind of quite, you know, sexy sliders, so you can show a month or a whole year. Whoever's done this has spent some real time making it as unnecessarily interactive as possible. And there's a big number at the top of the page which says they've blocked 15 million, almost 16 million, groups and channels in 2024. Listeners will remember that CEO Pavel Durov was arrested back in August by the French authorities, when he landed his plane just outside Paris, and the charges brought against him were related to the way that Telegram responds to government requests for information about users on the platform, and other things related to content moderation. And so this is essentially the reaction to that, isn't it? It's the, we're doing some stuff, leave us alone, look at all the people and the channels that we've taken down.

Mike Masnick:

Yeah. It feels like this is a direct response to the arrest, basically saying, here's our sort-of transparency report. It's not quite the transparency report other companies put out, with more detail, but it is slick and very professional and well designed. The one piece that really stood out to me: they talk about terrorist content, but they also talk about CSAM. And I'm pretty sure we noted in the past on this podcast that NCMEC has received zero reports from Telegram. I went back and looked: the last NCMEC report is for 2023, and Telegram is not listed as a reporting company. But here, when they're talking about CSAM, they talk about the reports they've received from different NGOs, including NCMEC. It looks like in 2024 they've received basically 5,000 reports from NCMEC, and they talk about how they take action on them. But it's interesting that they don't seem to report back any content that they find. They also mention some other NGOs that report stuff to them, and how they're taking action on it. So it is a look-we-are-doing-stuff kind of report, but it feels like a lot of it is for show.

Ben Whitelaw:

Yeah. NCMEC being, for people who don't know, the National Center for Missing & Exploited Children, which runs the CyberTipline that platforms should be sending reports to. So, yeah, Mike makes a really good point there: whatever this page says, NCMEC's data suggests otherwise, which is interesting. And that Durov case, that incident, is still happening in the background; he's still on bail. So this is clearly an attempt to demonstrate that Telegram is fulfilling its obligations under the DSA, which is really interesting. We probably have time for one more story, Mike, I think, before we wrap up today, and for the year,

Mike Masnick:

Let's just do a real quick one on TikTok, noting the fact that the Supreme Court has agreed to hear the ban case. It's obviously a really big story. What happened was the DC Circuit had, uh, rejected TikTok's attempt to claim that the law was unconstitutional. TikTok then asked the same DC Circuit to put the law on hold while they appealed to the Supreme Court. The DC Circuit said no. So they rushed to the emergency docket, which is often called the shadow docket, and said to the Supreme Court, hey, can you put the law on hold, and then we'll have this discussion, and the Supreme Court can hear this case. And they threw this line into the filing saying, if you'd like to turn this document, where we're asking you for an injunction just to put the law on hold, into a cert petition, which is to hear the actual case, we can do that too. And the Supreme Court came back earlier this week and said, okay, we'll turn it into that, and we're going to hear the case on January 10th. Which is incredibly quick: filings are due December 27th, which is, you know, next week, two days after Christmas, and responses are due the following week. Basically, every lawyer who is involved in this at all, your holidays are ruined, because you need to put in a whole bunch of work. And that's true of a whole bunch of people who are probably going to be filing amicus briefs as well, who suddenly have to rush through the holidays to work on filings on a very quick turnaround. They're obviously trying to do this before the law technically goes into effect on January 19th. So they have the hearing on the 10th, and I assume that means they're planning to rule on it before the 19th, but, um, you know, like

Ben Whitelaw:

Hell of a short turnaround.

Mike Masnick:

It's a really, really quick turnaround for something that is a really big issue, because the DC Circuit case really upended our understanding of the First Amendment. It basically says you can get past all sorts of First Amendment issues. They admit there is a First Amendment issue, but they say, we can ignore the First Amendment because the government has said "national security" loudly enough, without actually proving that this is a national security threat. And that is really scary to me, because you could see all sorts of cases. In fact, one of the most famous First Amendment cases ever is the Pentagon Papers case, where the government screamed national security: if the New York Times and the Washington Post are able to publish these documents about the Vietnam War, that will harm national security. And the Supreme Court said, no, you can't just stomp on speech by screaming national security. And yet the DC Circuit here is saying otherwise. Now the Supreme Court is going to rush through this case, with briefs filed in a rush over the holidays when people don't have their full attention on it, and could make a very consequential decision about how we handle the First Amendment in cases where the government claims national security. So I understand why it's happening, but I think it's a very, very worrying setup.

Ben Whitelaw:

And why didn't the other option happen, where the case is put on hold until it can be heard in slightly less rushed circumstances?

Mike Masnick:

I have no idea. As far as I can tell, there's no good reason, other than that they were just looking at the date of January 19th, when the law goes into effect, and saying, well, we can take care of this by then. And, you know, the Supreme Court is not supposed to worry about the lives of the lawyers who practice before it, so maybe that's what they're doing. But to me it is kind of odd that they couldn't have just said, we're going to put this on hold until we can hear it. You could still do it in an accelerated manner, have the hearing in March or something. For whatever reason, they said, we're going to rush forward and we're going to ruin a whole bunch of people's holidays, uh, yeah,

Ben Whitelaw:

Yeah, it's the theme of today's podcast, Mike: speech lawyers having their Christmas canceled, trust and safety and policy folks at platforms having their Christmas canceled because they're dealing with regulatory stuff given to them a week before Christmas. I mean, this episode might be the best thing that happens to them over Christmas. They might be listening to this whilst they're doing their work, thinking, things aren't that bad.

Mike Masnick:

Yeah, maybe time for a career change.

Ben Whitelaw:

I hope not. I hope not. Um, okay. Well, that brings us to the end of our last episode of Ctrl-Alt-Speech of 2024. How do you feel having got here?

Mike Masnick:

Hey, you know, there have been some tragic stories along the way, including in today's podcast, but, um, I think we had a really good 2024 in terms of the podcast. Thanks especially to all of our listeners, who have been incredible. The amount of overwhelmingly positive, encouraging feedback we've gotten from people has been amazing. You know, we kicked this off as an experiment this year and really had no idea where it was going to go, and I think we have been absolutely thrilled with the response and the support and the constant excitement that people have had about the podcast, and just people telling us wonderful things. Someone recently told me that every new person they hire, they tell them they have to listen to our podcast, which is just amazing, uh, perhaps cruel for their employees. But it's amazing to hear stories like that, and it makes us feel really, really good. So I think we've had a really tremendous 2024 with the podcast. How about you?

Ben Whitelaw:

I mean, a big shout-out as well to the sponsors who've come on board and supported us in this first year. We've got big plans for 2025: we're aiming to grow the podcast, to get more listeners, and to get more of you involved as well, via getting people to co-host and getting people to come on and share their expertise. So we've got a lot planned. Rest assured that we'll be back in 2025. And, uh, yeah, just to thank you, Mike, for all your wit and wisdom over the last 12 months. Long may it continue.

Mike Masnick:

No, this has been a really, really great partnership, and it's been lovely to work with you as well. And, uh, I'm looking forward to more, and better, podcasting in 2025.

Ben Whitelaw:

Amazing. Well, on that note, we'll bid everyone farewell, have a good holiday period wherever you are, listeners, and we'll see you in 2025. Thanks for listening.

Announcer:

Thanks for listening to Ctrl-Alt-Speech. Subscribe now to get our weekly episodes as soon as they're released. If your company or organization is interested in sponsoring the podcast, contact us by visiting ctrlaltspeech.com. That's C-T-R-L Alt Speech dot com. This podcast is produced with financial support from the Future of Online Trust and Safety Fund, a fiscally sponsored multi-donor fund at Global Impact that supports charitable activities to build a more robust, capable, and inclusive trust and safety ecosystem.
