Ctrl-Alt-Speech

The TickTock on TikTok

Mike Masnick & Ben Whitelaw Season 1 Episode 44

In this week's round-up of the latest news in online speech, content moderation and internet regulation, Mike and Ben cover:

If you’re in London on Thursday 30th January, join Ben, Mark Scott (Digital Politics) and Georgia Iacovou (Horrific/Terrific) for an evening of tech policy, discussion and drinks. Register your interest.

This episode is brought to you with financial support from the Future of Online Trust & Safety Fund.

Ctrl-Alt-Speech is a weekly podcast from Techdirt and Everything in Moderation. Send us your feedback at podcast@ctrlaltspeech.com and sponsorship enquiries to sponsorship@ctrlaltspeech.com. Thanks for listening.

Ben Whitelaw:

Mike, I've mentioned before that I'm doing a bit of house renovation, and once I've done said renovation, I like to support my local tradesmen by giving reviews on a website called Trustpilot, which is kind of UK-specific. I don't think you have it in the US. Is that fair?

Mike Masnick:

I'm not that familiar with it.

Ben Whitelaw:

Okay. Okay. Well, anyway, the homepage of Trustpilot has a prompt, which is "Find a company you can trust." Okay. And so to start today's episode of Ctrl-Alt-Speech, I want you to respond with a company that you can trust.

Mike Masnick:

Well, I'm going to argue, outside of our two companies, Ben, my company and your company, I'm going to argue that maybe you shouldn't be trusting companies that much, and the world would be a better place if we were in a situation where you didn't have to trust companies so much to be so good. And I actually wrote something sort of related to that this week that maybe we'll get to touch on a little bit later. But, uh, what about you? What companies can you trust?

Ben Whitelaw:

Well, I would say, Mike, actually, it's not in our interest to trust companies, you know, because actually Ctrl-Alt-Speech wouldn't exist if we, certainly from a digital platform perspective, could trust companies. So, uh, you know, I'm all for companies continuing to be untrustworthy, at least so we can come together every week and talk about the big stories about content moderation and online speech. And that's a good place to start today. Hello and welcome to Ctrl-Alt-Speech, your weekly roundup of the major stories about online speech, content moderation, and internet regulation. It's January the 23rd, 2025. And this week's episode is brought to you with financial support from the Future of Online Trust & Safety Fund. This week, we're talking about a flurry of executive orders related to internet speech, EU regulatory pushback, and much, much more. My name is Ben Whitelaw. I'm the founder and editor of Everything in Moderation, and I'm with the estimable Mike Masnick, uh, of Techdirt. Welcome back again, Mike. How's things?

Mike Masnick:

It's good. It's good. That's a fun adjective you used there. Uh, as you know, as we were discussing before we started recording, I am completely overwhelmed and busy and swamped, and not just because, uh, the entire United States is being overwhelmed by nonsense, but just, there's so much going on. There's too much. There's too much. I've sort of joked in the past about this concept from the computer world, this idea of the denial of service attack, you know, where you hit a website with just so much stuff that they can't handle it, and it just ends up shutting down the entire web server. And I sort of feel like I'm under a denial of service attack right now. There's just so much stuff coming in at once, and I'm trying to deal with it all.

Ben Whitelaw:

From all angles, and from various different adversaries, as...

Mike Masnick:

Yeah. You know, I sort of need, like, a human version of Cloudflare to block the denial of service attack, and...

Ben Whitelaw:

Yeah, and it sounds like that's a sponsored link. It's not. Other DDoS protection services are available.

Mike Masnick:

That's true.

Ben Whitelaw:

Although Cloudflare, if you're listening and you want to sponsor, get in touch,

Mike Masnick:

Yes. Yeah.

Ben Whitelaw:

Yeah. I mean, I feel a bit the same, Mike. I feel like every week now we're kind of coming to the podcast trying to create some sort of sense from the chaos, and this week is really no different. I was actually kind of reflecting on just how difficult that increasingly is. Um, it's...

Mike Masnick:

You know the concept of Calvinball, do you know this at all? So, there was a comic strip that was very popular in sort of the eighties, I don't remember exactly when it ended, called Calvin and Hobbes. Are you familiar with it?

Ben Whitelaw:

Oh, yeah, yeah, yeah, yeah,

Mike Masnick:

Okay. So Calvin and Hobbes was this cartoon where, you know, there's this little kid and his stuffed tiger. And in some of the strips, they would play a game called Calvinball, where they would hit a ball around, sort of, you know, loosely based on baseball. But part of the point of Calvinball was that the rules can always change. And so one of them would say, you know, I caught the ball, you're out. And the other one would say, rule change, you know, outs don't count on Tuesday. And then, you know, you'd get to a base and you'd be like, well, you know, now I got you out, and it's like, no, you know, when it's past 4 PM, rules are reversed, or whatever. It's just the fact that you can just constantly change the rules. And I think we'll get into this a little bit: we're living in this Calvinball time, where the rules that we know, the rules of the game that were sort of well established and understood, it feels like people can just say, well, no, it's Calvinball, it's Tuesday. And so the rules are totally different now. And, uh, that's exhausting.

Ben Whitelaw:

Yeah, yeah, yeah. We don't even know what the rules are anymore. I wish somebody would tell us what the rules are,

Mike Masnick:

Yeah. Yeah.

Ben Whitelaw:

But yeah, I mean, I thought it was interesting, you know, this seems to be something that's happening across the board. Right. And this is borne out really in something I saw this week, where a journalist called Judd Legum, who writes Popular Information, is actually writing a newsletter solely on Elon Musk. I don't know if you saw this: he started a newsletter on Substack called Musk Watch, which is a good name. To be honest, I thought we were doing part of that job for a long time on this podcast. I thought we were doing Musk Watch, the podcast. And although Musk has obviously been less involved in some of the online speech stuff at Twitter and has gone on to bigger and more nefarious things, you know, again, there's this idea of this space being something somebody needs to come along and put some structure around, package it up in the way that we're trying to do, I guess.

Mike Masnick:

Yeah. Yeah. There is also, Bloomberg has a whole podcast called Musk Inc, I think it is, and they did a separate spinoff, which was Citizen Musk, which was all about his political efforts. Um, and so there are efforts that are just entirely focused on Elon Musk. And I thought we were heading into possibly a Musk-free episode this week, Ben, and you just ruined it.

Ben Whitelaw:

Sorry about that. Um, sorry to all the listeners. We were so close.

Mike Masnick:

I mean, there's plenty to comment on, on him and things that he has done and said this week, but I think we have some other stories to cover.

Ben Whitelaw:

There is. Before we jump into those, Mike, um, we started off talking about a rating and review website. We should ask our listeners to rate and review the podcast if they've enjoyed this

Mike Masnick:

Oh, yeah.

Ben Whitelaw:

episode, or if they have enjoyed any recent episodes. I was noticing, uh, going back through our recent reviews, that the latest review, and I don't want to kind of guilt trip the listeners, was on the 28th of September, 2024. So that's a...

Mike Masnick:

Ancient history.

Ben Whitelaw:

Hell of a long time ago. Right. You know, I think we've done some half-decent podcasts since then that merit a rating or review.

Mike Masnick:

Yes. And look, I've been watching our numbers. Our numbers have been going up in terms of listeners. So there are definitely more of you than were listening on September 28th. So therefore, I think a few of you are falling down on your required mandatory rating and reviewing of Ctrl-Alt-Speech.

Ben Whitelaw:

Yeah, well, we would really, really appreciate it if you had a few minutes to spend on sharing a few words. We read all of them. Get in touch with us at podcast@ctrlaltspeech.com as well if you want to share some more candid feedback about the episodes. We really appreciate it. It really helps us be discovered on those podcast platforms. Great stuff. Let's get started then, Mike. And, talking of trying to enforce a set of speech ideals on our listeners, it's a perfect, uh, perfect segue into the big news of this week. Last Friday, we'd recorded just before the TikTok Supreme Court judgment was handed down. It feels like several light years ago that that happened, you know. We could spend three podcasts talking about what's happened since then. You have the unenviable task of taking us through what just happened, in a kind of, yeah, WTF

Mike Masnick:

yeah,

Ben Whitelaw:

give us a run through if you can.

Mike Masnick:

Yeah, it's, uh, it's the tick-tock on TikTok, you know, what has happened in the past seven days? So yeah, we recorded last Thursday; on Friday, the Supreme Court ruling came out. It was more or less what I think we expected based on the oral arguments. They did say that the law was okay. I thought there were many, many problems with how it was written. They tried to acknowledge the First Amendment concerns that people raised, but said that this was a narrow ruling that only applied to these particular circumstances. And yet, as you read through it, there is a sort of roadmap in the way that the Supreme Court ruled that will allow future governments to ban speech, and that is really scary. And the main part of it that was scary is that they effectively said, we're really not going to deal with the First Amendment concerns here, because the government has also expressed a compelling interest on the data protection side of things, and therefore we're sort of going to ignore the speech part. Which is strange and problematic, because it sort of suggests that you can have an unconstitutional, speech-suppressing reason for passing a law, so long as you also add to it a non-speech-suppressing aspect, which seems bad, right? And seems like a real wide-open opportunity for mischief and government attacks on speech. But that is now ancient history. Okay. So...

Ben Whitelaw:

When was that? That was Friday.

Mike Masnick:

Yeah. I mean, people have totally forgotten that the Supreme Court ruled on this already. So they made their ruling. Soon after it came out, as had been hinted at earlier in the week, the Biden administration said, TikTok doesn't need to shut down, we promise we're not going to enforce it. TikTok responded by saying, not good enough. Like, we need actual legal certainty, and you just saying that under the law is not really enough for us to feel comfortable about it. Then, and I'm now suddenly blanking on which day of the weekend this happened, at some point over the weekend, TikTok shut itself down. And this was actually fascinating in a lot of ways. They gave a little warning, and they said, like, we're talking to President Trump, who was not president at the time, and we're trying to sort this out, but for now we need to shut down. And then the servers went down. A lot of people were upset and angry. It was interesting that they didn't just shut down TikTok, but they shut down a number of other entities associated with ByteDance, which probably was what they had to do under the law, though some of them were more concerning than others. So there was Lemon8, which is another ByteDance-owned social media app, sort of Pinterest-like; that wasn't too surprising, I think people expected that to go down. The one that people were surprised by was CapCut, which is a video editing tool, apparently like the best video editing tool, which is owned by ByteDance. That went down, and people were really freaked out about it. And then the really surprising one was Marvel Snap, which is this incredibly popular game. It's like an astoundingly popular game, and it is created by an American studio, but the publisher on it is ByteDance. So all of a sudden this game shut down and everyone's like, wait, what? Like, what does that have to do with TikTok? And so this is sort of how broad and how far-reaching the law was, that they felt that all these things had to be shut down. Then the next morning, it came back. I've seen different reports on whether it was down for 12 hours or 14 hours, but it was, you know, somewhere in that range. And then it suddenly came back, because they said that President Trump has promised us that he won't enforce the law. Except President Trump was still not president at that time. So it went down Saturday night, it came back Sunday, now I'm remembering the timing on this, and he doesn't become president until Monday.

Ben Whitelaw:

How would he have communicated that to them? Like, how does that work? Where he knows...

Mike Masnick:

Your guess is as good as mine. There may have been some direct communication between them. There may have been, for all we know, some public statements made. I mean, Trump had posted something on Truth Social. Maybe that was the extent of it: that, like, when Biden says something publicly, that doesn't count, but as long as Trump says something publicly, they feel that counts. TikTok turned back on its servers. And then they also got Oracle, which hosts their US data, which was all of the, like, Project Texas stuff; they were willing to turn it back on. And Akamai, which is, as you said, there are other denial of service protection companies like Cloudflare, Akamai being one of them; they turned back on their service. And so they said, okay, it's fine to come back. And then Monday came: inauguration. Trump is president. One of the first executive orders he signs is one on TikTok, which is nonsense. It's hard to explain how disconnected from reality this executive order is, and we already live in a world where disconnections from reality happen. But basically, there is one provision under the law that allows enforcement to be paused, and that is if the president makes a declaration that he has seen evidence that ByteDance has gotten, uh, close enough to a divestiture, that they're in the process of doing a divestiture, then he can delay the enforcement of the bill by 90 days. They didn't do that. That's the one provision that he had. So instead he just puts out this executive order directing the Attorney General not to enforce the law, and basically to send letters to all of the companies that are implicated by this law saying, we won't enforce it. Here's, effectively, a get-out-of-jail-free card saying we're not going to enforce this law. Because, and this is really important, and I can't remember if we discussed this specifically before or not, the law is really strange in how it is enforced, which is, it is telling TikTok that it needs to be sold by ByteDance, but the enforcement is entirely on third-party companies. So the fines would go to whoever's hosting or helping them. So that is the app stores, Google and Apple, but also Oracle for hosting them, or Akamai for providing these other services to them. And so this executive order says, like, feel free to ignore this law. And you can't do that. Again, Calvinball, rules no longer matter, but, like, in theory, you can't write an executive order that overrules a law. That's kind of the point of the separation of powers and the way Congress works and everything. And in fact, the executive order itself has this boilerplate stuff that is included at the end of, like, all of these executive orders, which says this order is not intended to, and does not, create any right or benefit, substantive or procedural, enforceable at law. It's basically saying, whatever we say above can't overrule a law, even though everything in the actual executive order does effectively say we're putting this on hold for 75 days, which has no relationship to the law at all. And so then you're in this weird situation where you have that, and then some, even, like, Senator Tom Cotton, who is a very Trumpy individual but is also, like, a huge national security kind of guy. He was the guy who sort of famously wanted to deploy the US military to take down Black Lives Matter protesters.

Ben Whitelaw:

Oh yeah. Good. Good guy.

Mike Masnick:

Yeah, he is. Yeah. And so he put out this tweet saying, like, no, the president can't do this. And he's like, I am warning any company that relies on this that you are still liable, and it's a $5,000 fine per user. So 170 million users on TikTok times $5,000, that is a lot of money. And so he warns that, and so Apple and Google have not returned TikTok to the app stores.
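For a rough sense of the exposure Mike is describing, here's a quick back-of-the-envelope sketch using only the figures cited on air, the $5,000-per-user penalty and the roughly 170 million US TikTok users; treat both as the approximations they are:

```python
# Back-of-the-envelope exposure estimate from the figures cited in the episode.
penalty_per_user = 5_000        # dollars per user, as Cotton warned
us_users = 170_000_000          # approximate US TikTok user base

total_exposure = penalty_per_user * us_users
print(f"${total_exposure:,}")   # prints $850,000,000,000 (about $850 billion)
```

In other words, "a lot of money" works out to something on the order of $850 billion for a company that keeps servicing the app.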

Ben Whitelaw:

Right.

Mike Masnick:

Because they were not willing to rely on this. You know, I think Oracle and Akamai are really sort of betting on, like, okay, well, I hope we're safe. But, like, I would imagine that there are lawyers at all four of those companies, and probably some other companies, who are just like, whoa, I would not do this. And there's been some reporting from law professors and lawyers basically saying, like, this executive order is a very risky thing for companies to rely on. And there are cases in the past showing you can't rely on the executive branch telling you, we're not going to enforce this law, to then break that law. Like, you might be able to get away with it in court, but you might not. And the statute of limitations on the law is like five years, which also means the next administration could come back and look at this and say, you violated this law, and we're going to go after you for this.

Ben Whitelaw:

Right. Right. Okay. So just to be clear, the fact that Oracle and Akamai have kind of, I guess, glossed over the fact that this executive order might get them into shit down the line is basically because they wanted TikTok to return. Like, there's no way for TikTok to return without those guys essentially playing it the way they have. Apple and Google and the app stores, they're different, because actually that doesn't affect existing users of TikTok and their access. It just means that new people can't download it.

Mike Masnick:

Exactly. So new people can't download it. If you had it on the phone, you can still use it, but you can't get updates. So that raises a whole other set of issues, which is, like, if there's a security vulnerability, which shows up all the time, that can't be patched right now, which is really risky for users of these apps. If, in theory, as some people claim, though it has never been proven, there are sort of security back doors within TikTok, now might be a good time for bad people to exploit those, um, because Google and Apple cannot patch them currently. This is a bad thing for a whole bunch of reasons, just on basic cybersecurity issues alone. This is bad. And those companies, I think, are reasonably saying, like, we're not going to do this until we have more information and feel really safe, because the liability risk is massive. And Oracle and Akamai are betting that they can trust this executive order.

Ben Whitelaw:

Silly question perhaps, but what's behind this period of 75 days?

Mike Masnick:

Who the hell knows, right? I mean, it...

Ben Whitelaw:

So it was a silly question.

Mike Masnick:

It's not a silly question, because there is no good answer, right? The law had a provision for a 90-day thing. And what's funny is, even during his press conference, Trump said, I gave them 90 days. And he didn't, because the executive order is 75 days. And then that leads into what happened after this, because, again, like, every day something new and crazy has been going on. So after this, he gave this press conference where President Trump talked about how the company must be worth a trillion dollars, or, if he doesn't sign off on something, it's worthless. Which is, like, mafia shakedown stuff. It's like, you know, I can make your company worth a trillion dollars. And then he says explicitly, half of that should go to us. It's a little unclear as to who the "us" is in this. So he's saying, I can create a trillion dollars of value with my signature, signing off on whatever deal, and, questionable who the "we" is, we should get half of that. And he sort of discussed this in different ways. So at times he sort of said the US should own half of that. And he said ByteDance can own half of it and the US should own half of it. And that raises questions. Is he talking about the US government? Is the US government supposed to own half of TikTok? Because that raises a whole bunch of really, really problematic questions.

Ben Whitelaw:

United States Limited

Mike Masnick:

Yeah. I mean, for all of the talk of the Murthy case and government control of internet censorship, and we're going to get to something else on that later in this discussion: like, if the US government owns a social media network, then it's bound by the First Amendment, which means it really can't moderate except for, like, CSAM and blatantly illegal content. Otherwise it can't do anything.

Ben Whitelaw:

Right. He's not actually thinking about buying it on behalf of the state, is he?

Mike Masnick:

That's... it's unclear. And I don't think he knows. Like, this is part of the problem. He was talking about it as if, like, yeah, you know, the US government is going to get half a trillion dollars out of this deal. But then, when asked about it, he also said, well, maybe Musk could buy it, or Larry Ellison, who's the founder and CTO of Oracle. Which was the original plan four years ago, when Trump did the original TikTok ban: the idea was that he was going to get Larry Ellison to buy it, because Larry Ellison is a big Trump supporter, has funded his campaigns for a while. But Oracle didn't want it; this discussion happened four and a half years ago, and they worked out this whole Project Texas deal as an alternative, where they didn't have to buy it. They could just get this sweetheart contract. And it was reported, and I don't think anyone ever denied it, that Trump was really clear, where it's like, TikTok needs to go to, you know, someone who is a loyal Trump supporter. The two choices were Larry Ellison or Walmart, if you can imagine Walmart owning TikTok. And this time around, it's like, well, it's no longer Walmart, but it's Elon Musk. So it's basically, like, whichever billionaire supports Trump enough, it's going to go to one of them. And so then that wouldn't be the US government. So I still don't see where he thinks this 500 billion is coming from, or who it's going to. None of this makes sense. I don't think he understands it. I don't think he has a good grasp on any of this. But he's just sort of talking in general about, like, well, since I have to approve this deal, then I can create money, and therefore I should get money. But I don't know where any of this...

Ben Whitelaw:

Right. And, like, there's a really funny line in the executive order where he says this timing also interferes with my ability to negotiate a resolution to avoid an abrupt shutdown of the platform while addressing national security concerns. Basically, like, heralding himself as the king of deals, you know, like, chief deal maker. And he's really backing himself to be able to resolve this by kind of negotiating a sale, I guess. And that's what we're about to see happen.

Mike Masnick:

This is also confusing, too, because, for all of his talk of being such a great deal maker, and obviously people can have opinions on that, he blunders into this by saying, well, you know, I'm okay if ByteDance still owns 50% of the company when this is done. Which is, like, you've just given away... the law is very clear that, no, it has to be less than 20%. And so he's now announcing, like, I'm willing to give them more than that. And already ByteDance owns less than 50 percent of the company; outside investors own 60 percent of the company. So, like, all of this is, I mean, it's so stupid. He's basically giving away all of the leverage by saying, like, well, we can do this 50-50 joint venture thing. But again, it's not clear what he means. All of this is so, so stupid and confusing. And then there was the final thing, the most recent thing, and this is probably changing by the time you're listening to it. Who knows, maybe TikTok doesn't exist anymore. Maybe it's owned by MrBeast, right? So, like, the YouTuber MrBeast is trying to buy it. There's a whole bunch of idiots coming out of the woodwork saying, we're going to buy TikTok. But one of the board members of ByteDance has said that they are working on a resolution. Who the hell knows what that is, but it sort of suggests that there may be a deal that is coming relatively soon. The question is, like, can that deal actually match with the law, and then how will the companies react? Right? So the president kind of has to sign off on something, which Trump realizes, which is why he's basically saying his signature on this is worth a trillion dollars. And he does it in such a way that sounds like a shakedown and, clearly, corrupt. Like, the signature is worth a lot, I need a cut of the money. Which you wouldn't normally want a president to say, but we don't live in normal times. But there might be a deal soon. And then, you know, what happens after that? Well, we'll have to see. And then what Congress does: like, already Tom Cotton seems pretty upset by this. Some of the other senators: Rand Paul doesn't seem particularly happy about this; on the Democratic side, Ed Markey has complained about some of this. So we'll see what actually comes out of this, but it's still very much all up in the air.

Ben Whitelaw:

Okay. So the tick-tock on TikTok will continue. There's been a broader thread of commentary around what this means for the US's approach to the internet more broadly, and I think that's really interesting to touch on. You know, the US is the place of the kind of free internet; the open internet was kind of pivotal to the early days of the web and its creation. And there's an interesting piece written by Ethan Zuckerman in The Atlantic around how, actually, this is no longer the case, that America is no longer the home of the free web. I know you've written pieces this week as well on that point. But, like, to what extent does this signal a change in the way that the US thinks about the open web, do you think?

Mike Masnick:

Yeah. I think it's massive. And I think it's really unfortunate. For better or for worse, and whether or not you believe it was honest or hypocritical or what, for most of the history of the internet, the US has been this sort of strong force pushing a global open internet around the world. The State Department has actually done amazing work trying to spread the gospel of an open internet and internet freedom around the world, and really effectively. And I think that the US has completely thrown away, or set on fire, whatever moral high ground it had. And you can say it was hypocritical, it was never honest, all this kind of stuff. But, you know, other countries do ban apps and countries do ban services, and they tend to be more authoritarian, and the US was always in a position where, before this, they could say, like, no, we believe in freedom. And they can't say that anymore.

Ben Whitelaw:

Right. And, you know, if you take an example from not that long ago of India banning TikTok. India is not known for its approach to the open internet. TikTok had 200 million users in India at the time, very similar to the number of users it has in the US right now, and the comparisons are eerie, actually. And, you know, nobody would say that Narendra Modi is a giant free speech advocate or a giant open web advocate. And, you know, similar things can be said, I guess, about Trump. I had a question, Mike, about the missteps along the way. Obviously, apart from Trump coming in and deciding somewhat unilaterally that he wanted to ban TikTok, what other missteps along the way do you think the US made to get to this point? I'm sure it's more complex than that. We only have one hour...

Mike Masnick:

There's so much. I mean, so much went wrong. I really think, like, there was this weird, potentially xenophobic fear of China that overrode everything, and any sort of caution. It was a totally bipartisan effort of a bunch of scaredy cats in Congress who appeared terrified. You know, it's a question of whether they were actually terrified of China, or whether they were terrified of looking weak on China in an election year. And I think a lot of it was really that latter point, which was that you had a bunch of politicians who were using threats from China, and China as an adversary, as a sort of wedge point, and anyone who was looking soft on China was attacked as, like, not protecting American interests and not protecting American national security. And so a whole bunch of politicians felt that they had to really step up and prove that they were tough on China, and taking on TikTok was part of that. And I think they underestimated in all sorts of ways. I mean, users on TikTok, and we discussed this somewhat last week, just found all of this ridiculous, and the sort of youth vote, I think... kids these days are smart enough to know that they're being lied to and that they're being misled with some aspects of the concerns. I mean, there are some concerns that are legitimate, but this felt like a total overreaction. That's why you had the kids, as we discussed last week, moving to even more Chinese apps, sort of mocking the US. And now they're talking about, like, oh, the wool has been pulled from my eyes, and I see how the US is full of propaganda and the Chinese way of life is so much better. And you're like, well, wait a second, like, maybe don't go that far. But I think this was a bunch of old, scared people who had no idea culturally what they were doing, no idea technically what they were doing, and legally, like, how bad this looks and what the longer-term impacts are. And if you look, now we've also opened up, not just this idea that we no longer have the moral high ground on the open internet, but we've made it clear to any other country: you should probably look at banning American apps. For all the talk of how Elon Musk and Mark Zuckerberg and Jeff Bezos are sucking up to the Trump administration and hoping that they'll help fight these fights, under the rationale of the TikTok ban, I think lots of other countries have a very clear justification to block all of Meta's apps, all of X, all of the American internet companies, and basically say, well, it's a national security risk. You know, the concern was that the Chinese government could get access to TikTok's data. Well, under the laws in the US, the NSA and the FBI have tremendous access to data at these companies. And so I don't see how that's any different. And they now have the roadmap: you did it with TikTok, you had concerns about national security, we have concerns. And they're going to leverage that, especially as Trump is going around the world trying to issue tariffs and saying, like, oh, you want to send steel or cars or agricultural products to the US, we're going to tariff you. And they're going to say, well, all right, we're going to block Facebook, or we're going to make Meta have to divest from Instagram. Who knows what we're going to see, but there's going to be pushback from foreign countries.

Ben Whitelaw:

Yeah. And there's also the thing of platforms in the US that, you know, have foreign owners or foreign investment. Right? Because it's not just TikTok. You know, Discord has a large Tencent investment, Tencent being the other major technology platforms company alongside ByteDance in China. And, you know, if you think the Swedes are a pesky bunch: I don't think Spotify has any national security issues, but, you know, that's the kind of relationship you're engendering when you make these decisions.

Mike Masnick:

There's also, X has a huge investment from the Saudis, right? And so, I mean, we raised this, you know, and we've talked about this before: we filed an amicus brief with the Supreme Court, and that was one of the things that we raised, which was, you know, where's the line? And we pointed out that X has a huge investment from the Saudis, and there's no clear limiting factor, nothing saying that the US government couldn't demand a divestiture from X to allow it to operate in the country, or that the government couldn't block X.

Ben Whitelaw:

Yeah. Who knows where it will stop? What I do know, and what didn't stop this week, was the raft of executive orders, though. And actually, the TikTok one that we've just talked through is only one of the many, many executive orders that people were discussing this week. You picked out another one, Mike, about government involvement in platform speech. Just give us a kind of run-through of what that means, and how it builds on some of the storylines that we've been talking about on Ctrl-Alt-Speech for a while.

Mike Masnick:

I mean, this one, like a lot of this, is hugely performative and not really substantive, but it is the "Restoring Freedom of Speech and Ending Federal Censorship" order. And this is just a play on the false belief that we've talked about all last year, which was key to the Murthy case and a bunch of other stuff, about these claims of the censorship industrial complex. Where, in theory, we were told that the Biden administration was working with academics and social media companies to censor conservative thoughts and anti-vax concepts, or whatever it was that they were...

Ben Whitelaw:

Everything that Zuck was talking about on the Rogan podcast, right? Yeah.

Mike Masnick:

Exactly. Like, all of these claims, which the Supreme Court, again, found no actual evidence to support. And as Justice Kagan and Justice Kavanaugh made very, very clear, the idea that government officials might reach out to private media companies and say, like, no, you got this wrong, don't write this way: again, that happens. I think Kagan said thousands of times every day, federal government employees reach out to media properties and sort of say, like, no, no, don't write that, you're getting this wrong, or whatever. As long as it doesn't pass into coercion, it is legal under the First Amendment. But there's this false belief that the Biden administration was deeply embedded in these companies and telling them what to take down, and the companies were like, yes, master, I will take that down. So this executive order is basically: we're putting an end to that. No longer will any federal government officer, employee, or agent engage in or facilitate any conduct that would unconstitutionally abridge the free speech of any American citizen. So they're saying that the government will never get involved in any of this stuff. Which, again, would be really interesting if the US government somehow ends up owning a piece of TikTok, because it would suggest that they can do nothing to do any kind of content moderation on the internet. And already we know it's nonsense, because, you know, within a day of this executive order coming out, directing the entire apparatus of the federal government never to communicate with media companies to try and abridge free speech rights, the new FCC chair, Brendan Carr, reopened investigations into certain television stations that he felt were covering Kamala Harris too favorably or Trump too critically. Which is the exact thing that, in theory, this executive order has said the federal government cannot do, and yet Brendan Carr, as an employee of the federal government, is absolutely doing. So I question how legitimate this is, in terms of, will the federal government actually follow this executive order, or does it only apply to speech that conservatives like, where we won't take that down, but speech from critics of the president, like, that doesn't apply to this executive order.

Ben Whitelaw:

Also, we were talking before the podcast started recording about the potential for this executive order to shoot Trump and the platforms in the foot. For example, foreign interference: if there is an example of a coordinated, you know, Russian attack online, there's, in theory, as a result of this executive order, no way for government officials to be talking to platforms about that. Is that fair?

Mike Masnick:

Pretty much. This executive order effectively bans the federal government from talking to social media platforms about any content that they have, and that includes actual foreign interference. And there are other efforts underway related to this that are trying to destroy CISA, the Cybersecurity and Infrastructure Security Agency, which was set up by Trump, but everybody forgets that. And they're trying to dismantle that. And all the good work that they do, sort of informing people about cybersecurity concerns and, you know, alerting them to breaches or hacks or all that kind of stuff, I think all of that is going to go away, which makes the internet a much more dangerous place, because you really need that coordination to deal with big challenges. And now the federal government is effectively barred from alerting companies to any kind of foreign threats, because it might be seen as trying to suppress information, which would violate this particular executive order.

Ben Whitelaw:

Cool. That

Mike Masnick:

Yeah, not, not

Ben Whitelaw:

fun.

Mike Masnick:

Not really.

Ben Whitelaw:

Oh, great. We've got a lot to look forward to. Okay. So I think there's something interesting there around the push and pull of two world views about online speech, China and the US. There's also a similar push and pull happening, Mike, with Europe and the US, and it leads neatly onto our next story, which I'm going to talk through a little bit. This is an interesting story, I think, because we've talked over the last few weeks about how there's, from within the European Commission, some sense that they are taking a pause in their investigations into some of the big tech platforms, some of the big social platforms, because they're kind of waiting to see what Donald Trump will do. And we reported on that last week, some stories from the European Commission around that. And actually, this week has seen a bit of a change. So I think it was late last week, we saw the EU launch a brand new code of conduct on hate speech, what it called the Code of Conduct+. Okay. So the Code of Conduct+ makes it sound like the kind of addition of a channel that you don't want to the cable package that you already pay a lot of money for.

Mike Masnick:

I can't wait to see what is streaming on the code of conduct plus then.

Ben Whitelaw:

And, like, if you're a platform, yeah, if you work in trust and safety at a platform and you've signed up to the original code of conduct on hate speech back in 2016, this would have probably been a surprise to you. The backstory, for people who weren't around in 2016, was that essentially the European Commission, after a series of terror attacks that rocked Europe, both in Belgium and in France, created this code of conduct on hate speech, because it felt that the kind of terrorism threat was emanating from these platforms. And it got 13 platforms to agree to combating racism and xenophobia, and protecting freedom of expression, by signing up to this voluntary code. And it was the kind of bigger platforms at the start, and a bunch of other smaller platforms joined. And in practice, that meant essentially having some processes within the platforms to oversee hate speech, and to listen to what were termed at the time trusted reporters of hate speech. So these are essentially kind of civil society organizations and nonprofits who were specialists in these hate speech areas, who would let the platforms know when there was hate speech on the platforms that they perhaps didn't know about. And these partnerships were key to taking down a lot of hate speech, as termed by the EU, around that time. Some good stuff happened as a result of it. Okay, so there was a report in 2019 that the European Commission did around the hate speech code of conduct, and it said that within 24 hours, most of the platforms were addressing and reviewing these notifications by the trusted reporters, and content that was egregious, that was related to terrorism and other things, was being pulled down. So it did some good. Obviously, we're in a completely different place now, and over the last few weeks, Meta in particular has changed its hate speech policies, as we've talked about at length. This is essentially the commission's reaction to that, or that's how I'm seeing it. And Henna Virkkunen, who we've talked about, is the new Thierry Breton within the commission. She has celebrated this new code of conduct and its inclusion into the Digital Services Act; it's now going to become part of platforms' compliance around the DSA. And she celebrated the fact that this is strengthening both the DSA and platforms' commitment against hate speech. It's fascinating. Okay, you know, if we think about it as a big game of chess: the last few weeks, Facebook, Zuckerberg, saying, actually, we're not going to really care about hate speech as most of the world sees it, we're only going to focus on terrorism and the most egregious harm. And this is the commission playing their chess piece and saying, actually, under the DSA, now you're going to have to abide by some of these rules that we've had in place since 2016, but that have been voluntary. I think this is going to go on and on. We'll really, really see the reaction to it from platforms, but essentially they're going to start having to show what they're doing internally to combat hate speech in the way that the commission seemingly wants. How's your chess, Mike?

Mike Masnick:

Yeah,

Ben Whitelaw:

I... the chess analogy is one that I'm, you know, not that strong on, but you

Mike Masnick:

I, I,

Ben Whitelaw:

do know about platforms and speech. So where do you see this fitting?

Mike Masnick:

Yeah, I gave up on chess when my son got better at it. He's very good. That was when he was in kindergarten, when he suddenly could consistently beat me. And I was like, yeah, chess, I don't like it so much anymore.

Ben Whitelaw:

Good call. Hmm.

Mike Masnick:

But, um, yeah, this is part of the chess match, I think. And, you know, one of the things that did become very clear after Meta made those announcements, and Zuckerberg made those announcements, was that they said, oh, but this is only for the US; like, we're not going to do this in the EU as of yet. Some of that, I think, is that they're waiting to see how some of the initial investigations, especially of X, play out under the DSA in the EU, and whether or not Community Notes actually suffices, which is kind of a big open question that is currently being explored in the EU. And so I think they're sort of holding off on that. But this statement is sort of the... it's not as obnoxious as the Thierry Breton style, but it is the, like, hey, look at us, we're not going to let you get away with this kind of nonsense. It'll be interesting to see how it actually plays out. I mean, I think that all of the platforms that are designated here, most of them already have processes in place that try to deal with hate speech. The question is, hate speech is a broad category, and there are disagreements over what does count as hate speech, and you do have issues that come up, and we've called these out, where labeling has been abused in all sorts of ways. And we saw this in Germany originally, even pre-DSA, with the NetzDG law, which had similar sorts of provisions and requirements to take down hate speech quickly, where, like, satire publications were being taken down. And so it'll be interesting, as this becomes part of the compliance effort under the DSA, if we're going to see other kinds of overaggressive removals to avoid any kind of risk of liability. And so that's always my concern when hate speech related things go into the laws: sort of how they're actually going to be enforced in practice, and what may come of them. But, realistically, I think all the major companies listed already have policies trying to deal with hate speech, and they want to remove it for business reasons anyway. And so I don't know that this has an actual impact or makes any real change in how the companies are acting. But it is a pretty loud statement of, like, hey, we see what you're doing in the US; don't try that nonsense here, because we're not going to allow it.

Ben Whitelaw:

Right. And it's a drawing of a line, I would say, as to what the commission is willing to accept. There are some interesting points further down in the press release which I just want to call out as well. So as well as all of the existing code of conduct stipulations from 2016, as part of the Code of Conduct+, you know, for just a couple of dollars extra a month, the commission has outlined how it wants the signatories to report on hate speech. And I think, again, this is very directly related to some of the Meta announcement. So the one that stood out for me was the fact that it wanted country-level data broken down by internal classification of hate speech, such as race, ethnicity, religion, gender, or sexual orientation. And so here it's trying to pull the levers within, I guess, the reporting process, to find out exactly what impact the changes that Meta is going to make will have, and what that's going to do to particular groups and identities of people. And that's fascinating. It's kind of trying to suck out, I guess, more data, and make the platforms be more transparent in this whole process. Again, it's not necessarily clear that platforms are going to have to do that; it's encouraged. So again, that's the gray area, but...

Mike Masnick:

yeah, I mean,

Ben Whitelaw:

directly responding to what happened with Meta.

Mike Masnick:

Yes, but it's also, sort of, I think, missing the point a little bit. You know, this was one of the points that I actually did think was very interesting that Zuckerberg made on the Rogan podcast, discussing how these things work: which is that Meta, and Meta really sort of originated this process, uses these automated classifiers to determine what is hate speech. Within that, you have different levers to pull in terms of how the automation works. And they're all sort of AI-driven these days, but there are confidence levels. And when you do that, you have what are referred to as type one and type two errors, which are false positives and false negatives. Where it's like, if you tune it and you say it needs to be 99 percent sure that this is hate speech, you're going to catch some hate speech, but you're going to miss a bunch of it, even though almost everything you do catch really is hate speech. But then if you turn those knobs and you say, well, now you only need to be 90 percent sure, then you're going to catch more of the hate speech, but you're also going to catch a lot more innocent stuff as well. There are, like, all of these trade-offs with each of these things. So, like, just presenting the breakdown of the internal classification isn't that useful if you don't know what the confidence settings are, right? Because you can twist that knob.

Ben Whitelaw:

You can make it go up and down however you want. Um...

Mike Masnick:

And there are impacts of each of those choices, in terms of how many false negatives and how many false positives you have. So, how much non-hate speech are you taking down by accident, and how much hate speech are you leaving up because your system isn't confident enough that it really is hate speech? Those are choices, but each choice has impacts, and there's no right answer. I mean, some people might say you should twist the knob until it absolutely catches everything that might be hate speech, but then you're going to take down a whole bunch of stuff that clearly is not hate speech, certainly satire, parody, and those kinds of things, and then just the accidental things, like ones we've talked about in the past, where saying "Hitler is bad" still got you taken down off of Instagram for a period of time. Um, and so you don't want those kinds of situations either. So I understand what they're getting at here. I actually think this is another demonstration that they don't fully understand how these systems work, to then go and say, well, you have to break down this information. Where it's like, they can twist the knobs and that information is going to look totally different, which doesn't really help.
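To make the knob-twisting concrete, here's a minimal sketch of the trade-off Mike describes, using invented confidence scores rather than any platform's real classifier output (the numbers and the `errors_at` helper are purely illustrative):

```python
# Toy illustration of confidence thresholds and type I / type II errors.
# Each tuple is (classifier confidence that a post is hate speech,
# whether the post actually is hate speech). All values are made up.
posts = [
    (0.99, True), (0.97, True), (0.93, False),
    (0.91, True), (0.88, False), (0.85, True),
    (0.60, False), (0.40, False), (0.30, True),
]

def errors_at(threshold):
    flagged = [y for s, y in posts if s >= threshold]   # posts acted on
    missed = [y for s, y in posts if s < threshold]     # posts left up
    false_positives = sum(1 for y in flagged if not y)  # innocent posts removed
    false_negatives = sum(1 for y in missed if y)       # hate speech left up
    return false_positives, false_negatives

for t in (0.99, 0.90):
    fp, fn = errors_at(t)
    print(f"threshold {t:.2f}: {fp} false positive(s), {fn} false negative(s)")
# threshold 0.99: 0 false positive(s), 4 false negative(s)
# threshold 0.90: 1 false positive(s), 2 false negative(s)
```

Same classifier, same posts: the raw counts in any reported breakdown shift dramatically with the threshold, which is Mike's point about why the numbers alone don't tell you much.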

Ben Whitelaw:

Yeah, no, I think that's fair. So the chess game goes on, basically. Um, you know, we will see who plays their piece next, and we'll be sure to talk about it in future episodes of Ctrl-Alt-Speech. One platform, Mike, that, you know, has been doing some reporting of its own, onto our next story, is Bluesky. And get the klaxon: the Mike

Mike Masnick:

ding, ding, ding.

Ben Whitelaw:

Masnick-is-on-the-board-of-Bluesky klaxon. I do need to buy one. We need to have something, actually.

Mike Masnick:

We need like the air horn.

Ben Whitelaw:

Yeah, exactly. Yeah. Okay. We could piss off a lot of people, including our listeners, with that. Um, but this, again, came out last Friday, just after we finished recording: their 2024 moderation report. This is only the second moderation report that I believe Bluesky has done, so we had some data for 2023 to go on, and it gives us a kind of broad picture, I would say, of the changes that the moderation services on Bluesky have had to go through as it has surged in users over the past 12 months. And we've talked a lot about why that is. A couple of headlines that I thought were interesting, which I'll just share as a starting point. Bluesky grew a lot this year: it grew to eight times the number of users it had at the start of the year. It was interesting to me that the number of user reports increased significantly as well, to 17 times what it was the year before, almost six and a half million reports in 2024, which is interesting in itself. That's probably a result of the large surges in users from Brazil and other places, as and when X slash Twitter did something strange that led people to get rid of the platform. But yeah, a huge increase in the number of reports, which is obviously going to make a trust and safety team very busy. Also interesting to note that even though the number of users increased significantly, the percentage of users that submitted a report was consistent year on year, or kind of slightly fell, but not significantly. And we can talk a bit about why we think that is, but essentially a lot of people reporting content, again, probably because people are trying to get a sense of what the platform is and does, and some clashes in norms and standards on the platform. And then there was also data about labels for the first time. Obviously, Bluesky has an extensive moderation service, where it labels content according to the settings in your profile, and there's also some additional human labeling that goes on as well. It talks about how there were 5.5 million labels applied by the moderation service, which, again, we don't have any context or a benchmark for, but obviously it feels like a big number. I don't think, you know, you weren't involved in any of the reporting; as a board member, you're not aware of that reporting. But as a user of Bluesky, as somebody who kind of knows the service well, what did you make of the moderation report?

Mike Masnick:

Yeah. And again, just to reemphasize that, like, as a board member, I have no inside knowledge on this. This is all based on what I've seen publicly. I didn't even know this report was coming out, but I'm happy to see it. I like transparency. I like the company being super transparent. I think it's a really pretty interesting report to look through. There wasn't anything in here that struck me as particularly surprising, but it is interesting. Like, I didn't think the fact that the number of reports went up more than the number of users went up was all that surprising. It's one of these things: as you get more users, there's more interaction, so it's not linear growth, and so you sort of expect that sort of thing. And the percentage of users reporting stuff staying relatively constant, it goes down somewhat, but it's still relatively stable, also doesn't surprise me. I mean, you have people who report stuff, and you have people who are just never going to report stuff. But I do think it's really interesting to see. And I liked the way that the company is breaking down this information, especially this stuff on labels, and seeing, like, well, which kind of content is actually being reported, and which things are the things that are most concerning. Because I think people have a sense from the outside, like, oh, all of the trust and safety work is around X, Y, or Z, and this is actually putting some numbers on that. So you get a better sense of what the things are that people are reporting, and where the labels are actually going. And so I found that to be really valuable. I mean, for example, antisocial behavior was the biggest reporting category, where it's like, okay, I can get that; that's where a lot of people are concerned. But you see that other things are pretty big up there. Spam is obviously a big issue that almost everybody agrees has to be dealt with in some way or another. So I think it's an interesting report, but there was nothing, you know, amazing that came out of it that was super enlightening.
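As a quick sanity check on the superlinear point, here's a sketch using only the multiples cited above; the 2023 report count is back-derived from the stated 17x figure, so treat it as approximate:

```python
# Reports grew ~17x while users grew ~8x, per the figures discussed above,
# so reports per user roughly doubled even though the share of users who
# ever file a report stayed about flat.
reports_2024 = 6_500_000                       # ~6.5M user reports in 2024
report_growth, user_growth = 17, 8             # year-on-year multiples

reports_2023 = reports_2024 / report_growth    # ~382,000, back-derived
per_user_multiple = report_growth / user_growth
print(f"~{reports_2023:,.0f} reports in 2023; "
      f"reports per user up ~{per_user_multiple:.1f}x")
```

In other words, roughly the same share of people reporting, each reporting about twice as often, is consistent with interactions growing faster than the user count.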

Ben Whitelaw:

Yeah. Just another thing for us to read, uh, in preparation for the podcast. Um, one thing I thought, Mike, was that maybe what the percentage of users reporting might hint at is the fact that users don't necessarily know about the range of features and functionality that are available to them to customize their experience: you know, the lists, the feeds, the labeling services. Maybe a year is too short a period of time to see that go down, but I suppose over time, I imagine the kind of trust and safety team will hopefully see people take measures that avoid them actually having to report content.

Mike Masnick:

Yeah. One of the things that Bluesky is really trying to do, and this is, again, all public information and things that they've talked about, is that they're really trying to build up the affordances and abilities for people to control their own experience, and for third parties to come in and offer services as well: labeling services, alternative moderation services, and things like that. But the team realizes that is something that is different. And so I think a lot of people are just getting acclimated to, like, the Twitter-like experience, and then they'll begin to learn some of the other affordances that it gives them, and the powers that it gives them to really have more control over their own experience, and what it is that they see, and also what it is that they don't see. And so, yeah, I mean, I think it's an ongoing challenge. I think the team is aware of it, looking at that and trying to figure out ways to make more people understand that. And for more third-party services and the wider ecosystem to get built, I think, is an important next step. But, you know, I think it's pretty early to say, one way or another, how that's actually going or where the knowledge gaps are.

Ben Whitelaw:

Yeah. I actually went into my account settings this week and started to tweak my settings, according to some experiences on the platform this week. So yeah, I think you're right. That will happen over time, and people need to be given the space to do that, right?

Mike Masnick:

Yeah. And, you know, people start to figure it out. If somebody has a bad experience, you begin to say, like, oh wait, I can make these tweaks that will make that less likely to happen. And people will begin to figure that out. And again, the company definitely knows that they want to make that easier, and they want to make it clear how to do that, and that those affordances are available. And they'll get there.

Ben Whitelaw:

So we'll link to that in the show notes, listeners, and you can go and have a look for yourself and parse through the data. We were going to talk a bit about Free Our Feeds, which is an interesting development, but I think we should park that for another time; we're definitely going to end up...

Mike Masnick:

Yeah, I will say it is a really interesting thing, and it sort of does get to what we were just talking about with the affordances and the ability of third parties to come in and do a lot more. And Free Our Feeds is a really, really exciting new initiative. I've written about it on Techdirt, so you can see that. But I think it's a really exciting initiative of people who are not Bluesky coming in and really trying to build out a bunch of really cool additional things that can be done using the underlying protocol, without having to rely on Bluesky at all.

Ben Whitelaw:

Yeah, go and read that. Take that piece as a primer. We'll certainly come back to it in future weeks as that initiative gets off the ground. And that pretty much brings us to the end of this week, Mike. We've gone long on US executive orders, we've touched on the EU, and, uh, everyone's angry with each other. That's my summary.

Mike Masnick:

I think that's going to stay the case for a long time.

Ben Whitelaw:

All right, well, thanks for counseling us through it this week and for giving us your wisdom. We'll see you next week. Listeners, enjoy yourselves and, uh, thanks for listening to Ctrl-Alt-Speech.

Announcer:

Thanks for listening to Ctrl-Alt-Speech. Subscribe now to get our weekly episodes as soon as they're released. If your company or organization is interested in sponsoring the podcast, contact us by visiting ctrlaltspeech.com. That's C T R L Alt Speech dot com. This podcast is produced with financial support from the Future of Online Trust and Safety Fund, a fiscally sponsored multi-donor fund at Global Impact that supports charitable activities to build a more robust, capable, and inclusive trust and safety ecosystem.
