Ctrl-Alt-Speech

It's a Banned, Banned, Banned, Banned World

Mike Masnick & Ben Whitelaw Season 1 Episode 71
Ben Whitelaw:

So Mike, this week I delved back into, my SoundCloud account. I dunno if you're a SoundCloud user or have ever been. Um,

Mike Masnick:

definitely used SoundCloud, and we actually host the Techdirt podcast on SoundCloud.

Ben Whitelaw:

uh, okay. Okay. So it is where basically I find all of my cool music. I haven't used it for a bunch of years,

Mike Masnick:

It is, it is the place for cool music and DJs, so,

Ben Whitelaw:

right, exactly. So I delved back in, and it was nice to see all the kind of saved songs that I'd curated years and years ago. But anyway, I kind of thought, we haven't used SoundCloud on Ctrl-Alt-Speech

Mike Masnick:

Oh yeah.

Ben Whitelaw:

as the opening of, the podcast. And so today, as I'm asking you,

Mike Masnick:

Not to sing.

Ben Whitelaw:

not to sing or to share your SoundCloud songs and recommendations, but I want you to get back to it.

Mike Masnick:

Oh.

Ben Whitelaw:

Which I think in a SoundCloud sense means, you know, start playing your favorite music. But yeah, interpret that as you will.

Mike Masnick:

Uh, yeah, I think I will get back to it by... I think we're going to have some interesting discussions this week about things that go back to the earliest days of Techdirt, where I was always writing about how jurisdiction works on the internet. When you have a global internet, figuring out which rules apply where becomes a really, really interesting question that we have not solved in 30 years.

Ben Whitelaw:

hmm. Going back in

Mike Masnick:

that's, that's, that's my get back to it. But how about you? How, what, what are you gonna get back to?

Ben Whitelaw:

Well, you know, I don't think we ever really stop on Ctrl-Alt-Speech, you know, there's no getting back to it, 'cause we never take a pause. So we're back at it this week for another episode. We've got a bunch of great stories and, uh, looking forward to talking about them with you. Hello and welcome to Ctrl-Alt-Speech, your weekly roundup of the major stories about online speech, content moderation, and internet regulation. It's September the 11th, 2025, and this week we're talking about Nepalese shutdowns, asking whether the real Mark Zuckerberg can make himself known, and debating whether 4chan is right about the Online Safety Act. I'm Ben Whitelaw. I'm the founder and editor of Everything in Moderation, and I'm with Mike Masnick, who actually is famous for something 4chan related.

Mike Masnick:

oh.

Ben Whitelaw:

Are you not, did we not talk about this on the podcast

Mike Masnick:

I, I don't, I'm not sure exactly which specific thing you're talking about. So.

Ben Whitelaw:

Well, that's a great way to start the pod. I, I'm sure there was something where you were cited in court.

Mike Masnick:

Yeah. Oh yeah. Yeah. So, well, I was an expert witness. I was an expert witness twice, related to 4chan. One was just an internet troll who was arrested, and I explained sort of internet trolling. And then that led me to being an expert witness in a case that was directly about 4chan, where I was basically... I didn't have to testify, but I was teaching public defenders about how 4chan worked, because they were defending someone who was arrested, who had shot up a protest. But the discussion about it all took place on 4chan, and they had no idea how it worked. So I don't know if that makes me famous for 4chan related stuff, but I, I am an official expert.

Ben Whitelaw:

Yeah.

Mike Masnick:

A court recognized expert.

Ben Whitelaw:

I'm glad we clarified that that was what we were talking about, because listeners might've thought you were just like a heavy 4chan user.

Mike Masnick:

Yeah, that's right.

Ben Whitelaw:

which, you know, no, no judgment if you are. No judgment.

Mike Masnick:

not. I'm not.

Ben Whitelaw:

I wanted to say thank you for my parcel this week.

Mike Masnick:

Oh, okay.

Ben Whitelaw:

Yeah,

Mike Masnick:

You got, you got something in the mail?

Ben Whitelaw:

I got something. I've got five card games you made that I funded, or helped to fund, via Kickstarter. 1 Billion Users is the fun to play, excellently priced, I would say, uh, card game that Techdirt did a huge amount of work on early in the year and managed to get a whole bunch of people to fund. And I got 'em delivered this week, so I haven't played yet, but, um, I'm excited to play. I'm excited to challenge my mates about how we run this hypothetical social media. What's the reaction been from people who've got the game in the mail?

Mike Masnick:

Uh, we've had great reactions. I mean, we've sent out all the ones for anyone who has filled out a survey, which means if you backed it and you haven't gotten it yet, that might mean you haven't filled out a survey. And you probably got an email from me yesterday personally saying, please fill out the surveys. Only about a hundred people haven't filled out the survey; we've shipped out over 2,000 copies of the game. Um, and we're trying to get the rest out the door so that they're not just sitting on a shelf collecting dust. The reaction's been great. We've been getting pictures and posts on social media. Somebody did a live stream on Twitch with the unboxing. That was really fantastic. People are really excited. We've already had people asking about how they can order more copies, and we're going to have that very soon. We don't have that many copies left once we ship out all the remaining ones, but we're gonna put the remaining ones on sale probably next week. So if you didn't get it, pay attention, because we'll probably announce something next week with the ability to grab some more.

Ben Whitelaw:

or make me an offer for one of my copies.

Mike Masnick:

There you go. That's, that's... these are now going to be rare, and so you can sell it at whatever price you want. So, whatever the market will bear, Ben.

Ben Whitelaw:

Like a fine wine, it will, uh, increase in value. Now, I'm really excited to play. And just kind of explain the concept for people briefly: what is it that the game is trying to do? How does it play into the kind of questions and topics we talk about each week?

Mike Masnick:

Yeah. And so first, and most importantly, it is a fun game. You know, we've done other games that were like, education is the more important part. This is not true of this game. There are some educational elements of it, but the idea is, you know, you're running a social media network, in the form of a card game, and you're trying to build the biggest and best social media network, and you have to deal with moderators and toxicity and trolls and servers getting overloaded and the press being mean to you. And there's all sorts of other things that happen along the way. Which I think actually does give you, you know, it may not be perfectly realistic, but it gives you a sense of the trade-offs and challenges of running a social media network and trying to get more users and build up the service and succeed while facing all different challenges and trade-offs and stuff. But it is designed at its heart to just be a very fun game. So it's based on... some people know this game, there's a game, Mille Bornes, which was very popular in the fifties and sixties, which itself was based on another game from the early 1900s. And we took all that and sort of remixed it into something that we think is really fun.

Ben Whitelaw:

Nice. Cannot wait to play. Maybe we'll do a live game on the podcast.

Mike Masnick:

A couple people have asked about that, like, you know, doing a sort of live version of it. There, there may be a way to do that. I would have to think through the best way to do it, but yeah, there's probably a way to do it.

Ben Whitelaw:

I'm gonna get practicing so I don't embarrass myself. Um.

Mike Masnick:

Look, I, I made the game and I play it, a fair bit and I lose pretty constantly. So.

Ben Whitelaw:

Must be tricky. Well, we'll come on to more of that in future episodes. Before we jump into today's stories: if you enjoy the podcast, listeners who tune in every week, please take a moment to rate and review us wherever you get your podcasts. Give us five stars, leave us a few words of praise and adulation, if you will. Do mention me and Mike. Not advocating for you being left out, Mike.

Mike Masnick:

Look, look, I don't know if you noticed, but earlier this week there was a post on Bluesky talking about Ben Whitelaw's Ctrl-Alt-Speech. It did not mention me at all. And, and I think that was done deliberately as payback for the review that didn't mention you. So...

Ben Whitelaw:

Right. More of that, more of that, please. Um, and for those of you who want to reach our growing audience of esteemed listeners, you can sponsor an episode of the podcast. Get in touch with us to talk about how that would work: podcast@ctrlaltspeech.com. Um, we can talk you through the options. It's all part and parcel of how we make the podcast sustainable and how we help grow what we're doing. Question for you, Mike, about 1 Billion Users. Is there a card or part of the gameplay that involves government censorship?

Mike Masnick:

There, there is.

Ben Whitelaw:

Is there? Is there? Well, that's a handy segue onto our first story of this week, which many listeners will have heard about. We are starting in Nepal, of all places, this week, not somewhere we talk a lot about on Ctrl-Alt-Speech. A country of 30 million people nestled between India and China, famous for Mount Everest, naturally, being the birthplace of Buddha, and now, as of this week, being known for a government shutdown of social media platforms as a result of some internet regulation. So, really, really interesting story that has kind of evolved as the week has gone on. I think it's really important that we talk about this as a podcast that is concerned about online speech, that tries to kind of include countries outside of the US and the UK, where we are based. And so we kind of wanted to unpack what happened this week, and maybe why it matters more broadly, 'cause I think it's a fairly small nation in the grand scheme of things, but could have potentially quite large implications for how regulation rolls out, particularly in the region. So for people who maybe haven't been paying attention: the Nepalese government, on the 28th of August, so a couple of weeks ago now, sent a request to a bunch of social media platforms to register with the government, under the Ministry of Communications and Information Technology. This is all part of a kind of longstanding effort to try and bring big tech platforms under governmental regulation in the country. There was a piece of legislation that was passed in 2023, and the 28th of August was kind of the last attempt of the Nepalese government to try and get the platforms to register. And registration, as we've seen in many other countries, means providing information about how many staff you have in the country, providing information about the office, and actually providing a named person as well. And this was sent out to 26 different platforms that hadn't registered in the intervening period.
That included all the big ones that we talk about every week on Ctrl-Alt-Speech: Facebook, X, YouTube, WhatsApp, a whole range of them. A week later, on the 4th of September, nothing had happened. So this is this time last week. None of the platforms had responded to that request for registration, and the Nepalese government banned 26 platforms. So it meant that citizens of the country were unable to access those platforms, and it kind of brought about this snowball effect, Mike, in the country, that you would've seen, that we'll talk a bit about. A number of platforms did comply, worth noting: Viber, the messaging platform, and TikTok as well, which was banned in 2023 by the government for supposedly sowing social disharmony. There was a kind of an issue where the current government in Nepal, which has been longstanding in the country, didn't like the way that TikTok was kind of sharing pornography and unsuitable content. And so TikTok was banned for nine months. That wasn't the case this time round. So you had two fairly big platforms who were still available, but so many of them were banned, and obviously citizens were unhappy at this point. We saw huge protests in Kathmandu, the capital. People came out on the streets, they stormed parliament. They were going to ministers' houses and setting them alight. The airport was closed. And actually, that led to the police and the army coming out onto the streets to try and quell the protest. People were killed, around 23 people at the latest count, and hundreds were injured. And it led subsequently to the resignation of the Prime Minister, KP Sharma Oli, who resigned a couple of days later, so two days ago now, and then rolled back on the ban that kind of encompassed the 26 platforms.
So in the course of really just a few days, you had a situation where the government is trying to kind of enforce regulation on big tech platforms, only to then find that the Prime Minister has resigned, and there's a series of violent protests on the streets as a result of that. I'll go into more about the reaction, Mike, 'cause I think it's an amazing, tragic story in many ways, because of the deaths and the violence, but also it brings together the kind of real-world implications of the stuff we talk about every week. How have you been following it from where you are?

Mike Masnick:

Yeah, I mean, it's sort of fascinating. You know, I've sort of followed the story as it's rolled out, and thought, oh wow, you know, these bans, it's a lot. You know, a lot of services to ban at once. But then I was sort of surprised that that led to these massive protests, and they're sort of referring to them as the Generation Z protests. And, you know, my first reaction was like, I'm a little surprised, like, people were protesting over social media specifically. Obviously, I think there's more to it, but the social media bans were sort of the catalyzing effect, where young people in Nepal were basically talking about how the government is corrupt and doing all these things. There's a sort of side story about how there's huge unemployment right now among young people, but children of government officials are finding jobs very easily. And so there's this larger concern about corruption, and the social media regulation and ban is just kind of seen as a sign of that corruption and an attempt to sort of control the flow of information. And the youth of Nepal basically said, we're not gonna take this, and went on a huge protest. And we've seen laws kind of like this in the past. You mentioned that they've become popular. I sort of refer to them as hostage laws, where a lot of these governments are concerned about social media or various internet platforms and know that they have no power and jurisdiction over them. And so they pass a law where the registration sounds benign. Like, oh, you just have to register to operate here, what's the big deal? But no, the important part is designating who you have in the country, and you need to have somebody in the country. And the reason they're there is so that the government has somebody that they can arrest or threaten in order to pressure a company to remove content. That is really how this works in practice.
And, you know, like, again, you can paint all this in a totally benign light and say, like, well look, if you're operating in a country, you need to have a representative there, and that makes sense. But the way these have worked out in practice in lots of countries, especially authoritarian countries or less free countries, is that the whole purpose of it is to find a pressure point in order to push these companies to moderate in a certain particular way. And, you know, I sort of wonder... TikTok agreed to do it, Viber did. These other companies that didn't, you know, I almost wonder how much this was even on their radar. Like, how many of the companies made a conscious choice? How many of them just didn't even notice? You know, Nepal's... there are a lot of people there, there are millions of people there, but it may not be a priority for some of these companies, which is another topic that comes up a lot: what happens in these countries. You know, usually we talk about how these companies ignore things that are important that are happening in these countries because they just don't have the resources for it. Here it might be that they're willing to sort of risk it because they're just like, there are not enough users in this particular country for them to care about. And they might not have realized that it was gonna lead to blocking and then social unrest.

Ben Whitelaw:

Yeah, there's obviously kind of two stories here, isn't there? There's the government reaction to the fact that the companies didn't register, which I think is pretty cut and dried, right? Governments that withdraw the ability for citizens to communicate, particularly in a country like Nepal where social media is so ubiquitous, and where, from commentary that we've both read this week, Facebook and Twitter and Instagram are the way that kind of people communicate, um, right? So doing that is gonna have, I would say, that kind of natural backlash. And I think Nepal has poor and declining press freedom. You know, the Press Freedom Index, which is run by Reporters Without Borders, puts it at 90 out of 180 countries in the 2025 index, down from 74 last year. So it's declining at a fair rate. There's been a lot of criticism about the incumbent government and its approach to press freedom. The story that's maybe more relevant to us, that I wanna kind of talk about, is the registration part for the platforms, 'cause a number of ministers, in a piece and comments given to the New York Times, said that they'd asked the platforms five times to register. And, you know, as we'll talk about in your Techdirt piece very shortly, those requests probably came in the form of emails and requests to provide information that sometimes are legally binding, sometimes are not, and they clearly were ignored. And so from the government officials' perspective, they're kind of saying, well, we gave the platforms the opportunity; clearly the platforms didn't recognize what was gonna happen and that the people were gonna come out onto the streets. Do you think the platforms should have done more, in hindsight, now that we've seen what the protests have done?

Mike Masnick:

I mean, I don't know. I don't know if you've seen anything; I haven't seen any of the platforms who were blocked actually giving any commentary on this. I think they've been pretty silent. So I don't know, again, if it was a deliberate decision or it was just one of these things that fell through the cracks because Nepal was not a particularly prioritized market for them. But I do think for at least some of the companies, probably the thinking was the recognition that if we do register, we're basically putting someone at risk. Right? Which is what I was talking about with the sort of hostage-taking law aspect of it. When you register, you have to designate someone, and that person just is naturally at risk. And so, when these kinds of laws come into place, there will often be, like, law firms that sort of step up to be the designated person. But it's always a very risky situation for whoever is designated, and for the company that then employs them, which then has to deal with the trade-offs there, where it's like, are we willing to put one of our employees at risk, or a representative of our company at risk? And if so, how are we going to deal with the demands to take down or remove certain content, or, you know, to twist our algorithm in some way, whatever it might be? And so there are big trade-offs, and I think for a lot of companies, probably the decision here was, you know, that trade-off is not worth it. We don't wanna put somebody at risk, and we don't wanna be in a position where our arm can be twisted to do things that are probably against the interests of the people using the platform, without necessarily recognizing that the response was then going to be to shut down the platform, at least for a few days, before the protests happened.
And so there is a certain logic to the regulation side, where different countries want to have the ability to regulate internet companies that are operating within their borders in some sense, or, you know, that are widely used by their people, certainly. But, you know, I think a lot of these laws... and a lot of the companies recognize that the way these laws are written, they're dangerous to agree to. And so a lot of times you just sort of take your chances as a company and say, look, we're gonna ignore this and see what happens. And now we've seen what happened: the public rose up, and that led to violence and stuff.

Ben Whitelaw:

Yeah. I mean, we've somewhat seen this before, right? Because of the TikTok ban that happened in 2023. So they were banned for nine months, they were allowed back into the country, so users could go on TikTok and consume content there. And then a couple of months later, TikTok registered in the country, presumably with an office and a named individual, and with a process, as you mentioned. Do you expect other platforms to do the same now? Like, what do you think the kind of thinking would be in the companies?

Mike Masnick:

Yeah, I don't know. I mean, I think, again, there are really difficult trade-offs here, and, you know, I think right now, because of the reaction, because the government backed down and the Prime Minister resigned, I think the companies are probably in a pretty strong leverage position, where they're saying, we don't need to, because if you block us, you know, especially the more widely used ones, certainly, if you block us, then the public will revolt. So we're not gonna do that. We're not gonna be subject to the whims of the government demands in the same way that we would be if we had registered. So I don't know if anyone will, because of that. TikTok is interesting too, just because TikTok, given its background and its connection to China, already has shown that it is willing to work with governments, potentially oppressive, authoritarian governments, to moderate things on the platform. So perhaps for them it was sort of an easier call of, this is the kind of thing that we do. And also during that time, TikTok was also banned in India. And so I think they had already been sort of dealing with the back and forth of, how do we stay in the good graces of the Chinese government and the Indian government, and now the Nepalese government as well. And also, like, the kind of communication that happens on TikTok is a little bit different than some of the other platforms as well, where it's easier for TikTok to agree to these kinds of regulations where it might not be, you know, for some of the other companies. And then, I mean, included in the list of companies that was blocked was, like, Signal, and Signal's not gonna agree to any of this, right? Signal would never agree to that. And so there's all sorts of different things, and I think the different companies involved will have different choices to make. I mean, you know, at least one of the companies is, like, a Nepalese company itself.
Um, and obviously for them it would be more challenging. So I don't know exactly how that will play out, but I think the public reaction actually makes it less likely for the companies to agree to register now.

Ben Whitelaw:

Yeah, no, I would agree with that. There's obviously something quite interesting you touched on as well, which is the kind of youth unemployment in the country. So I didn't know any of this; I've only really been reading about it this week. But there's a kind of big skills gap, apparently, in Nepal. A lot of people leave Nepal and go elsewhere for work. I don't think the economy is very strong. A lot of people send money back to families in Nepal. And this has kind of created, I guess, probably a bit of a breeding ground for discontent among the Gen Z group, who, you know, will use social media more. Do you think this is a kind of a series of factors colliding in a way that created this event? Or do you kind of expect this to happen in more countries as regulation rolls out?

Mike Masnick:

Yeah. I mean, I think it is a combination of factors, right? I mean, I don't think that it will become a commonplace thing that we'll see mass protests because of internet regulations. But I think the combination of the job situation and unemployment for young people, and their reliance on social media, and the extent of the ban, it wasn't like one app at a time where people could easily move to another. I think all of that combined, and the concerns about corruption, just created this sort of perfect storm that resulted in this kind of response. I think it will depend on the country, you know. Unemployment causing protests is not an uncommon thing; that certainly happens. You know, big protests not that long ago in France over unemployment issues too, right? So, like, these things happen. But I think it was sort of this combination of all of that, and the social media bans being a sort of symbol of, oh, not only do we not have jobs, but also the government's trying to take away our forms of communication to sort of cover up their corruption. That made for just an easy story that easily led to a protest movement.

Ben Whitelaw:

Yeah, no, it's been, I think, a fascinating story to see unfold. It's got widespread coverage, which is very much why we're talking about it here. And also it links to a piece, I think, that we're gonna talk about, that you wrote towards the end of last week about how the UK is trying to kind of enforce its own regulation. It's a hell of a title: When Trolls Take On Tyrants: 4chan and Kiwi Farms Sue the UK Over Extraterritorial Censorship. Talk us through what I would perceive as an argument in favor of 4chan and Kiwi Farms.

Mike Masnick:

Look, so I was very clear in the article that 4chan and Kiwi Farms are two of the worst sites on the internet, and they certainly have been responsible as breeding grounds for all sorts of trolling and harassment. And I should be clear about that, where it's like, trolling can be more harmless, but the actual harassment and abuse that has stemmed from both of these platforms, I think, is noteworthy, and they are problematic platforms. At the same time, I have certainly spent plenty of time on Techdirt and on this podcast calling out my concerns about the Online Safety Act. So what I thought was interesting here was that Ofcom apparently targeted four particular sites as sort of early targets of not fulfilling the rules of the Online Safety Act. This is not having to do with the age verification stuff that just went into effect, but the earlier...

Ben Whitelaw:

Child safety?

Mike Masnick:

The child safety stuff that first started to go into effect. So they sent notices to four different platforms, and it's, you know, conceivably four of the worst platforms online. So 4chan, Kiwi Farms, what is referred to in the complaint as Sasu, but is a forum specifically for discussing, and some people would argue for encouraging, suicide, and then Gab, which is sort of famously run by someone who is proudly kind of a neo-Nazi, and, you know, it's a social media site of that nature. But 4chan and Kiwi Farms teamed up to sue in US federal court to try and block Ofcom. And I don't think the lawsuit is going to succeed, but I do think it raises some interesting points. And so it's like, these are awful websites that lead to awful things, and I think the lawsuit is not going to succeed, and also there is a bunch of stuff in the lawsuit that I think is just, like, political blustering to try and rile up a sort of Trumpian, MAGA world. But it raises some interesting points about the jurisdictional reach of the UK and Ofcom for companies that are entirely based outside of the UK. And this is a thing that, you know, as I said in the opening, the internet really, since its earliest days, has been a challenge to the idea of local sovereignty and laws that apply to a certain country, when you have an internet that is reachable across international borders.

Ben Whitelaw:

Yep.

Mike Masnick:

And so we're seeing that in this case. Just to give a few of the specifics on this case: the companies are arguing... first of all, I think this is crazy. They're saying that they can sue. Normally you have a thing called sovereign immunity, which is, you can't sue a foreign government because they're a sovereign entity and they get to make their own rules. So they start out by claiming that Ofcom is not a part of the UK government. This is nonsense. Like, I don't, I don't...

Ben Whitelaw:

That's, that's one of the bits that I

Mike Masnick:

yeah. Like,

Ben Whitelaw:

took the most umbrage with.

Mike Masnick:

I, I don't see how they get away with that. It makes no sense. Even if you could make some sort of argument that it is, like, a separate independent entity, it's still acting as an agent of the state, which still makes it the state for this purpose. And therefore I think the case is just gonna lose on that. They're going to say there's sovereign immunity, and the judge is gonna toss it out. But there are larger questions about how much a company that has no presence, has no assets, has no connection directly, other than users, to the UK... how much can the UK regulate them?

Ben Whitelaw:

Right. This is what I wanted to kind of talk about. So there is precedent here, right, for laws which focus purely on the use of a tool or a platform. I mean, GDPR is, I think, a good example of that. Not necessarily a good law, but it's a law that companies in countries like the US have taken up and abided by, even when it's about the usage and the kind of collection of data in the country. And so isn't the kind of user base enough here? Like, I know that the legal case focuses on operations and infrastructure, but isn't the kind of user base the thing? Like, that's where the money is made, it's where data is collected. That feels like, again, one of the points where this might get struck down, I dunno.

Mike Masnick:

I mean, so, like, I understand the argument, but then the response to that is: so what? Right? So, like, if a company has no presence whatsoever, what can be done, right? If they have no presence in the country whatsoever, even if the country determines, like, oh, you're violating these laws and we're gonna fine you... well, good luck. I mean, right? Like, if they're not operating there, good luck. Now, there are ways in which courts can try to go to the foreign courts and try and effectuate an order. You know, in the US we have some protections against that. There is, like, the SPEECH Act in the US, which is specific just to defamation law, so I don't think it would necessarily apply here, but conceptually it's very similar, where it says: if someone is found liable for defamation in another country for speech that would be protected under the First Amendment, and you then try to domesticate that ruling, bring it to the US and say, this person owes me, you know, a million pounds because the UK court said so, the US courts will block that, because it doesn't match with the First Amendment. So we have some precedent of saying, like, you know, if there are things that would not break the law here, we wouldn't then enforce them here. And part of my argument was just that. And, like, I don't think the UK, and I don't think Ofcom, is doing this in bad faith, but at the same time, there are countries that might try and do the same sort of thing in bad faith. And so, you know, the obvious example is, like, well, if you have Russia, which has passed internet laws, and it then starts sending letters to US companies saying, you need to do these things, you need to leave up Russian troll farms, for example... like, I think you would also admit that American companies should say, screw that, we're going to ignore that law, we're not going to obey. And so my take on it is: as a country, the UK has the right and jurisdiction to rule over
The companies that are actually operating in that, country. And so, even when you say, but it has a whole bunch of UK users, I said, that's fine. But then the focus should be on the ISPs and service providers in the UK that you can then say, well, you could say four chan is violating the laws here. You have to block it. Because to me also, like, and you know, I have problems with laws that require websites to be wholly blocked, but at least then you're, you're being explicit about what is going on. Because I sort of feel like a lot of this is kind of like trying to go behind the scenes and not admit that you are really trying to get four chan to pull down certain content. And so going direct to the ISPs in the UK and say, we need you to block this site entirely, is, you know, like, let's, be straight about what's actually happening here.

Ben Whitelaw:

Yeah. Which, as we've seen from the Nepalese example, is when things start to get potentially dicey. I mean, there's this tension always, I guess, within regulation, and Ofcom will have this, where deciding that a platform is liable and then trying to enforce that liability are kind of two different things. And they have enforcement teams who are, I imagine, I dunno for sure, doing a lot of work behind the scenes to try and enforce some of these issues, like in the example of 4chan and other sites. Does it not kind of undermine all forms of regulation in that sense? Because so much of where the platforms are based is the US, you know? And so actually

Mike Masnick:

yeah. But I mean,

Ben Whitelaw:

it's always gonna be about the kind of

Mike Masnick:

the, there are,

Ben Whitelaw:

and pull.

Mike Masnick:

Yeah. There are a whole bunch of nuances and issues here, right? So, all of the larger, more serious internet companies are willing to engage. A lot of them certainly have a presence in the UK and in Europe, and so they're absolutely willing to engage with, and respond to, the regulatory requirements of the Online Safety Act. And, you know, we've seen that, right? Most of the larger, trustworthy companies are doing that. And partly it's because they're fine complying with those laws. They may be a little bit of a pain, but they, you know, make sense within the context, and they have people in those countries. So it's the sort of smaller sites that are left over that have to deal with it. And I don't think there's a real concern that, oh well, if 4chan gets away with it, then Meta is gonna say, okay, screw it, we can ignore the Online Safety Act. I don't think that is a real concern. But I do think, obviously, 4chan and Kiwi Farms are just terrible, terrible sites, so it's always tough to be like, yeah, let's root for them or whatever. But there are lots of other smaller sites and services. Like, I'm thinking about people running small Mastodon instances, right? Which are required to obey these laws as well. But, you know, if it's just somebody doing it as a hobby, they're not going to be able to comply with these laws. And, you know, going back to the Nepalese story, one of the blocked social media sites was Mastodon. And I was like, how does that work? Because Mastodon is thousands of different instances. And I'm imagining, you know, most of 'em had no idea. And I'm assuming the Nepalese government didn't even realize that either. I'm sure they probably communicated with the one Mastodon entity, which doesn't run most of Mastodon, right?
Like, you know, they run a few large instances of Mastodon, but Mastodon itself is decentralized in a big way. And so I wouldn't put too much into this, right? The fact that these sites are complaining about it, I don't think that means the major companies that most people rely on are suddenly going to ignore all of these laws. But I do think that there has to be some ability for, especially, smaller companies to say there are certain laws that we don't agree with. Like a Russian law saying you have to leave Russian troll stuff up. Or, you know, a law in China that says you can't make fun of the leadership in China, right? And so it's always a kind of balance on these things. And I think that even as we see these sort of skirmishes on the edges and the margins as to how this plays out, it tends to work out overall in the end. Like, yeah, you're still gonna have some bad sites that are gonna ignore these laws, but the bigger sites are gonna comply with the more reasonable laws.

Ben Whitelaw:

Yeah. No, that does make sense. I mean, I think, you know, I agree that the Online Safety Act, and we've talked about it on the podcast, is gonna be overwhelming to small and medium sites. And we talked about forums that are closing their doors as a result, who don't understand or can't comply with the Ofcom guidelines. I mean, there's also an element of, and I wouldn't be surprised if the Nepalese government has borrowed from the Online Safety Act, but the Online Safety

Mike Masnick:

And, and,

Ben Whitelaw:

there's a lot of research that says that, you know, the DSA, the Online Safety Act, NetzDG in Germany, which was one of the first, kind of set the tone for other countries to follow suit. So there are kind of concerns that these ideas will proliferate.

Mike Masnick:

I mean, there was a whole study that, uh, Jacob Mchangama did years ago about NetzDG, and how after that came out, a whole bunch of authoritarian countries passed laws that on their face looked like NetzDG, which, you know, sort of then became the DSA to some extent. So they looked like that law, but were clearly done for authoritarian censorship purposes, you know, giving more power to the state. And we're going to see exactly the same thing with the Online Safety Act, where you are definitely seeing countries basically saying, I can use the exact same language as the Online Safety Act to control speech, to stop dissenting political views, to stop protest movements. And that's where we should be concerned, and that's what we should be following. And in those cases, you want websites and services that are going to be willing to stand up and say, no, we're not going to comply with this law, because we think the laws are in pursuit of authoritarian interests. And so it's a difficult balancing act. And, you know, I've spoken to trust and safety lawyers before, the people on the legal teams who work in trust and safety, who have talked about how the demand from a democratic liberal country and the demand from an authoritarian dictatorship look the same. They say the same things and they use the same words, and distinguishing them, like, at some point you kind of have to make a call over whether you are going to comply with this state request or not. And it's not as simple as, there's not a list of, these are the good guys and these are the bad guys. And they don't make it clear, we're doing this for good reasons or we're doing it for bad reasons. And so there's a lot of judgment calls that need to be made. And yes, sometimes that's going to lead to bad sites saying, we're not gonna comply with a good law,
and sort of, you know, sticking their tongues out at it, whatever. But this is the sort of push and pull of international regulation and a global internet.

Ben Whitelaw:

Yeah. No, for sure. I think the case is a really interesting one. We'll definitely be tracking this on Ctrl-Alt-Speech as and when it gets ruled on. And actually, this kind of links to our next story. We'll go into our kind of small stories, our quick-fire stories, now. But Wikipedia is, uh, I think, a platform that has pushed back in a number of cases and has tried to,

Mike Masnick:

the uk.

Ben Whitelaw:

including the UK, which we talked about recently. But it's, you know, somewhat tyrant-resistant. And this piece that The Verge has written about why it's so resilient goes into some of the kinda machinations of Wikipedia and what makes it the platform that it is. It's a really good one, a long read that you've really enjoyed this week.

Mike Masnick:

Yeah, this is a fantastic piece. It's really worth reading. It is a long read, but there are so many nuggets of fascinating information in there. It's called Wikipedia is resilient because it's boring, and it makes a whole bunch of points about how Wikipedia has been able to remain the way it is. And it uses this line, which is not original to the article, it's been used a lot, but it is a good line, which is that Wikipedia works in practice, not in theory. Which is the reverse of how most things are: things work in theory, but not in practice. Wikipedia is the reverse because, even when it came about, everyone was like, this can't work. It's a totally open thing that anyone can edit, how could that possibly not turn into a complete garbage dump? And it's really interesting at a time when we see all these professionally moderated, professional trust and safety sites all turning into garbage dumps and feeling uncontrollable. And then you have Wikipedia, which is open to anyone, and which has been incredibly resilient to turning into a garbage dump. And so the piece goes deep into how that came about. You know, it was certainly never inevitable, but Wikipedia somehow figured out a way to do it. The title says it's because it's boring, but the more interesting element of it, to me at least, and I think there is some element of this that is true for any site, is that whether or not sites are safe places in general has to do with the social norms that are accepted among the community, and how they deal with interlopers who are trying to wreck those social norms. That is one of the things that trust and safety is very important for, and that most sites struggle with. But Wikipedia is interesting, and they describe this in the article, in that because it set in place these principles of neutrality and "citation needed" and, you know, no original research,
you need to actually be able to back up your claim, they talk about how so many of the fights don't get political at all. It is entirely focused on the process, sort of who follows the Wikipedia process best. And one of the downstream effects of that is that every fight on Wikipedia is about the process, not so much about the content. And that just continually reinforces the social norms of: we follow this process. If you want to participate here, this is the process we follow. It's not about your viewpoint, it's not about how you feel emotionally, it's entirely about which thing best follows the process. Which has this self-reinforcing element that is kind of fascinating and isn't really recreated anywhere else, I think.

Ben Whitelaw:

Totally. And there's an amazing stat in there, which I think is in reference to the Elon Musk maybe-Nazi-salute, maybe-not-Nazi-salute, that said there were around 7,000 words of discussion on Wikipedia to agree on the three sentences for that event in his biography. A huge amount of work and thought going into something that a user would see as very minor. And I came away from this piece thinking about the power of citation, and really how it's maybe one of the most powerful forms of online speech in a way. You know, the way that you can attribute an idea, which is something we've kind of lost in many ways. The fact that it provides evidence to back up an argument, which is so hard to do on many platforms. And it creates this, as you say, kind of rabbit hole that you can follow, which again is not prevalent on many other platforms. In many ways, its power is that you're sent down to the bottom of the page, should you wish, and then onwards. And so it's a really nice piece, actually.

Mike Masnick:

Yeah. And the one other thing I would add, which, you know, I hadn't even been aware of the full extent of, is that the article really delves into how many people, powerful people and governments and powerful entities, are just desperate to change Wikipedia to suit their own interests, and how the site has been resilient. Not totally immune to these attacks, but the ways in which it's being attacked also give you, at least within this particular context, a sense of the nature of malicious actors who try to distort messaging. And this is also happening on every other platform. It's happening on Facebook, it's happening on Twitter, it's happening everywhere, but we don't have as much view into it as we do on Wikipedia. And you begin to get a sense that there are really powerful but very malicious interests. The one example that stood out to me, there are multiple examples in the article, some of which are really fascinating, but there was one of an editor who was in good standing, you know, a successful editor on Wikipedia, who lived in an unnamed Middle Eastern country that had authoritarian leadership, and he was invited to go meet with a government official who basically told this person, the editor: we want you to keep editing Wikipedia, but every once in a while we might ask you to make an edit in our favor. And

Ben Whitelaw:

Right.

Mike Masnick:

that person, like, left the country because they realized they were at risk. But just the fact that governments are going to that length to try and influence how things are presented on Wikipedia is really kind of enlightening, and gives you a sense that, with every platform, there are malicious actors who are trying to do bad things, and how you counteract that is a real challenge.

Ben Whitelaw:

Yeah. And I agree. And one of the editors says something really nice about Wikipedia, which I'm wondering if we should adopt for Ctrl-Alt-Speech. Right? So it's reported that one of the editors pushes back against the idea of being more forthcoming about the viewpoints of contributors. And the reason they say that is because sometimes boring is good. Sometimes boring is credible. And I think that's really the motto of Ctrl-Alt-Speech.

Mike Masnick:

I don't know about that. I would like to think that we're not that boring, Ben.

Ben Whitelaw:

Listeners, I'll leave that up to you to decide. Um, but, uh, really great piece. Definitely go and read it. It's long, but it's worth it. A couple more stories before we wrap up, both about Meta, that are worth knowing about. One of the stories that I was very drawn to this week was a couple of VR researchers from Meta who appeared this week at a Senate Judiciary Committee hearing to talk about the fact that research they tried to do at Meta, about child safety on Meta's VR platforms, was squashed. It was kind of held up, it was made difficult, and, actually, employees at Meta were asked to delete evidence of child safety issues. This is part of a longstanding narrative of researchers and staff at Meta, working on the safety side or the information integrity side, who have blown the whistle, released documents, or spoken in Congress about the issues relating to child safety, claiming that the company is not being responsible in terms of its users and the way its users are treated. There's some really kind of horrid stuff that these two researchers talk about. I won't go into those details, but essentially these people are speaking up against this trend. And we've seen it before, we spoke about it not long ago, where a woman called Sarah Wynn-Williams, who published a book about the internal issues on the public policy side, claimed that the culture at Meta was such that she wasn't allowed to kind of dissent, really, as to the problems she saw there. So, again, part of a longstanding story. In some ways it's not a story 'cause we've seen it so many times, but

Mike Masnick:

and I'm

Ben Whitelaw:

further fuel to the fire, isn't it?

Mike Masnick:

Yeah, I mean, I think lots of people can agree that Meta has problems, that Meta is not always the best company and makes some pretty poor decisions. I'm always a little skeptical about these stories, because I think there are multiple ways of viewing them, and they're always presented in the press in one specific way. It all becomes a he-said, she-said kind of situation that's hard to untangle. So, like, Meta responded to this one and said, you know, the content we deleted was because, under the law, we're not allowed to hold onto data about people we know are under 13, and we discovered this data was of people who are under 13, and we are required by law to delete it. So, you know, that is true. There's also, like, you look at how things played out with earlier whistleblowers like Frances Haugen, where some of the media took internal research very much outta context, and I've talked about this at great length in the past. They took internal research, which was Meta trying to figure out how to make Instagram safer, and presented it as: look at this evil company, they knew that things were unsafe. Even though, like, the point of that research was to try and make things safer. And I could see why that would then make the company very careful and deliberate in what kind of research they do, even internal research, because as soon as it leaks, it's going to be presented outta context and in a harmful way. So for some of the things where they said, oh, we weren't allowed to do this research, I was looking at it and I was like, I bet you there's somebody who was like, this research is important, but if we do it in this way and it ever leaks, it will be presented in this wrong light, this misleading light. And so I'm sort of like, I'm glad that there are whistleblowers. I'm glad that there are people calling out the safety issues, but I can also see some of the other side of this story.

Ben Whitelaw:

Yeah. Or it might be because Meta changed its whole company name to a word that was playing up its metaverse credentials, and is spending billions of pounds per quarter

Mike Masnick:

yeah,

Ben Whitelaw:

to try and crack the metaverse. Maybe,

Mike Masnick:

I mean, there are competing interests. Let's just say that within the company there are competing interests, and there are different people who have different priorities and different requirements that they're going for. And some of those things are going to be in conflict, and that certainly plays into it as well. So, yeah, I'm not denying any of that.

Ben Whitelaw:

No, no. It's, um, again, something where I'm sure we'll add these researchers to the long list of people who've spoken out in the past. Uh, the final story is a bit of a lighthearted one, Mike. It's a Meta story, but it really concerns a man who has the unfortunate name of the Meta CEO. This is the real Mark Zuckerberg. We found him.

Mike Masnick:

Yeah, this is, uh, someone else named Mark Zuckerberg, who is a lawyer and is older than the CEO of Meta. He has sued Mark Zuckerberg and Meta, claiming that he has bought all these ads on Meta, and his account and the ads keep getting taken down because Meta trust and safety people assume that he is impersonating their CEO. So it's Mark Zuckerberg versus Mark Zuckerberg. He talks about all the times in his life that are ruined by people thinking he's the famous one: he checks into hotels, or he's waiting for a car at the airport, and people are expecting the Mark Zuckerberg, and then this random, you know, tax lawyer or bankruptcy lawyer shows up and doesn't get the treatment. But he's really mad at Meta because they've removed his account, his page, and his advertisements. I don't think the lawsuit has much of a chance. Yes, it sucks for him, and yes, he's talked to people at Meta and they've said they've put flags on his account to try and block that, but I don't see anything that suggests there was any sort of direct contractual relationship that wouldn't allow Meta to moderate him, even if it's wrong and it's by accident. And yes, they should do better, but I don't think there's been any sort of contractual violation. He claims promissory estoppel, and I'm not gonna get into the legal weeds of that, but there is a key case in California against Yahoo, where an employee did promise to do a certain moderation thing, and the court basically said, well, that overrules everything else. It overrules Section 230, because this employee promised. But I was looking through the lawsuit, and there was nothing in there that particularly suggested that. There were people who said, we'll try and help, we'll put a flag on your account. That doesn't mean they promised, we will never make this mistake again. You know, it's a kind of lighthearted, funny story. Like, poor Mark Zuckerberg, the other Mark Zuckerberg, having to go through life dealing with, uh, being confused, but

Ben Whitelaw:

honestly, his personal website, I am mark

Mike Masnick:

yeah.

Ben Whitelaw:

was so funny. You know, he lists the things that have happened to him because his name is Mark Zuckerberg, and it genuinely sounds terrible to have his name. He says that he turns his phone off at night to avoid the multitude of notifications that he gets. He's constantly getting password resets, because people want to hack the real Mark Zuckerberg and keep mistaking him for him. He gets loads of friend requests, and he can't sift through to find his real friends and connect with them online. Like, this is truly the

Mike Masnick:

I feel for him, I just don't think he has legal recourse. I mean, you know, it's funny, because I have this experience, which I've talked about publicly in the past, where the real, the Meta CEO, Mark Zuckerberg reached out to talk to me, and I at first was like, this has to be an imposter. So, like, I get it. I'm sure that there are a ton of imposters, and the trust and safety team at Meta, I'm sure, is trying to stop all those imposters. And I can see why this mistake happens, and it sucks for him, but I don't think he has any legal basis to sue.

Ben Whitelaw:

No, but Mark, the real, the younger Mark Zuckerberg, should settle, should give this guy money for all of the unfortunate flurry of notifications that he's caused. That's what I would say. Settle and maybe get some good PR for once. Um,

Mike Masnick:

We'll see, we'll see.

Ben Whitelaw:

Um, that brings us to the end of today's episode. Mike, thanks for everything. Thanks for bringing all those great stories. We talked about lots of, you know, media this week: MediaNama, the New York Times, The Verge. They're all fantastic. Go and read them, go and subscribe to them. We are very, very grateful for your listening. Rate and review us wherever you get your podcasts. We'll see you next week. Thanks for listening.

Announcer:

Thanks for listening to Ctrl-Alt-Speech. Subscribe now to get our weekly episodes as soon as they're released. If your company or organization is interested in sponsoring the podcast, contact us by visiting ctrlaltspeech.com. That's C-T-R-L alt speech dot com.
