Ctrl-Alt-Speech
Ctrl-Alt-Speech is a weekly news podcast co-created by Techdirt’s Mike Masnick and Everything in Moderation’s Ben Whitelaw. Each episode looks at the latest news in online speech, covering issues regarding trust & safety, content moderation, regulation, court rulings, new services & technology, and more.
The podcast regularly features expert guests with experience in the trust & safety/online speech worlds, discussing the ins and outs of the news that week and what it may mean for the industry. Each episode takes a deep dive into one or two key stories, and includes a quicker roundup of other important news. It's a must-listen for trust & safety professionals, and anyone interested in issues surrounding online speech.
If your company or organization is interested in sponsoring Ctrl-Alt-Speech and joining us for a sponsored interview, visit ctrlaltspeech.com for more information.
Ctrl-Alt-Speech is produced with financial support from the Future of Online Trust & Safety Fund, a fiscally-sponsored multi-donor fund at Global Impact that supports charitable activities to build a more robust, capable, and inclusive Trust and Safety ecosystem and field.
Should Speech Cost a GARM and a Leg?
In this week's round-up of the latest news in online speech, content moderation and internet regulation, Mike and Ben cover:
- Bluesky adds Techdirt founder Mike Masnick to its board (TechCrunch)
- X, Owned by Elon Musk, Brings Antitrust Suit Accusing Advertisers of a Boycott (New York Times)
- WFA Shutters GARM, X Antitrust Suit Cited (MediaPost)
- UK faces resistance from X over taking down disinformation during riots (Financial Times)
- Open letter to UK online service providers (Ofcom)
- Google and Meta struck secret ads deal to target teenagers (Financial Times)
- Turkey blocks access to Instagram for failure to comply with laws (Reuters)
- Harry and Meghan discuss 'protecting' their children (BBC)
- The Many Reasons Why NCMEC’s Board Is Failing Its Mission, From A NCMEC Insider (Techdirt)
This episode is brought to you with financial support from the Future of Online Trust & Safety Fund.
Ctrl-Alt-Speech is a weekly podcast from Techdirt and Everything in Moderation. Send us your feedback at podcast@ctrlaltspeech.com and sponsorship enquiries to sponsorship@ctrlaltspeech.com. Thanks for listening.
Ben Whitelaw: So I've used this opener on the podcast before, Mike, but we have a special reason to go back to it and use it again. It's a special Bluesky prompt that you and I both know: what's up?
Mike Masnick: Well, it's somewhat fitting for that. I was going to say, it's going to sound like this episode is, uh, summer repeats and reruns, but I swear everything that we're talking about today is new, just maybe follow-ups to older stories. What's up with you, Ben?
Ben Whitelaw: Well, you know what? I don't like my prime minister being goaded by some nobody CEO of, you know, the supposed world's largest town square. I'm not into that. And we're going to talk more about that. Hello and welcome to Ctrl-Alt-Speech, your weekly roundup of the major stories about online speech, content moderation, and internet regulation. This week's episode is brought to you with financial support from the Future of Online Trust & Safety Fund. My name is Ben Whitelaw. I am the founder and editor of Everything in Moderation, and I have the newly added to the Bluesky board Mike Masnick with me today. Mike, we used the "what's up" prompt as a lead-in. Tell us about the news you've got this week.
Mike Masnick: Yeah, I mean, it's something of news, I guess, that I have joined the Bluesky board. There was an opening. Jack Dorsey somewhat famously departed the board a few months ago, and so they had an opening, and after some discussion, decided that I would replace him on the board. And you know the history here, I assume. A lot of people know this; if they don't, Bluesky itself sort of came out of a paper that I wrote. So in some ways I've been connected to it from the beginning. I wrote this paper, Protocols, Not Platforms, in 2019. Jack Dorsey read it and really liked it and decided that he was going to make it happen, and so funded it and started it, hired Jay Graber, who then, you know, raised some money and started up the project. And I have been following it, obviously, all along, and I've been impressed by it. And my name is often associated with it, since the paper was a part of it. And I had known Jay from before Bluesky even existed, and really found her very thoughtful and insightful and actually really visionary in terms of how she thinks about these things. And so I've just been really impressed. And here and there I've spoken to her, maybe given her a little bit of advice or answered questions on some things that have come up. So it just seemed like a natural fit for me to now hopefully help guide the project further along. And I sort of, you know, hope that my position is to make sure that the vision stays the primary focus: empowering users, and making sure that it is really focused on being a really good, useful social media system for people to use.
Ben Whitelaw: I mean, it's super interesting. How do you feel about jumping into Jack Dorsey's shoes?
Mike Masnick: Yeah. Yeah.
Ben Whitelaw: My main concern is like, what, how does that feel?
Mike Masnick: It doesn't come with the same bank account, so I don't quite have that ability. But yeah, I mean, somebody was asking if it means I have to now take a two-week silent retreat to Southeast Asia every year, as he does, and I don't think that was in the contract, but I'll have to read the fine print a little more carefully.
Ben Whitelaw: It'll certainly make for an odd Ctrl-Alt-Speech podcast that week, if you do. And yeah, I mean, that's an important thing for us to flag up top. We will continue to cover Bluesky stories on the podcast. It's a really interesting platform for many reasons. And we'll preface it obviously when we talk about it in future that you're on the board, so people will hopefully know that. And I think it really adds, hopefully, to why listeners listen to the podcast and
Mike Masnick: Yeah, and,
Ben Whitelaw: future.
Mike Masnick: Yeah, and obviously we want to be as transparent as possible about that. And if there are any stories, even related to some of the other decentralized social media platforms, we want to be as clear as possible. None of this should change my opinions on anything, but, you know, we'll be upfront about it.
Ben Whitelaw: Yeah, great. And you also mentioned up top, Mike, how this week's stories feel like last week's stories. There's a strong sense of deja vu about the stories we've selected today. It is a different week, there have been seven days since the last episode, and I hope people don't feel shortchanged as they listen today. But a lot of the stories that we talked about in last week's podcast have moved on, quite significantly. We went around the houses ahead of recording, and we feel like these are the big stories to talk about. So without further ado, I think we should jump in, get right into it. One of the stories that listeners would have heard last week, that you kind of unpacked, was Jim Jordan (he that shall not be named), you know, taking aim at GARM, and the whole backstory with Elon Musk and Twitter rejoining GARM and then kind of deciding they were going to sue. Well, that actually happened this week, right?
Mike Masnick: Yes.
Ben Whitelaw: The suit happened. It was delivered. Talk us through kind of what happened, and then the ramifications.
Mike Masnick: Yeah, so, as we had said, he had promised to sue, but that hadn't happened. He'd also called for criminal investigations and everything. And so finally, this week, they did sue. They sued the World Federation of Advertisers, who was the one who set up GARM. GARM was a nonprofit, but it was run by the World Federation of Advertisers, and then some of their biggest advertisers, I forget exactly which ones, but there were just a few of the biggest advertisers that were part of it. They filed the suit in Texas, in a particular district within Texas where they are guaranteed to get one particular judge, Reed O'Connor, who is somewhat infamous for, let's say, a partisan leaning, and for being willing to take positions that are perhaps driven more by partisan feelings than what the law actually says. He's the judge who has tried to overturn basically every major policy decision put forth by a Democratic administration. If you want something overturned or determined to be unconstitutional, you go to
Ben Whitelaw: Man,
Mike Masnick: Reed O'Connor's courtroom. There's a few others, there's a short list of about three or four judges, but he is one of them. He happens to be in a court where he is the only judge, and therefore, if you file in that court, it will get to him. Very quickly, there's a backstory here: people have been complaining about this sort of judicial shopping that has been going on. And the administrator of the courts, which is John Roberts, who is the chief justice of the Supreme Court, basically had put out a recommendation earlier this year saying we should change the system, so that if you are filing in these sort of one-judge divisions, it should actually be pushed out. So there would be a random assigning to any judge in the district, the district being much larger; there'll be, you know, probably a dozen or so judges, depending on where you are. And in particular, the courts in Texas notably said, nah, we're not going to do that. They just came out and said, nope, we're not going to do it. And they started to make it into a big partisan thing, which it wasn't, because it came from John Roberts. And then Roberts backed down and was like, it was just a recommendation.
Ben Whitelaw: Just an idea.
Mike Masnick: Yeah.
Ben Whitelaw: You'll leave it. Oh,
Mike Masnick: where it is. It is also notable that Elon's suit against Media Matters, which is another one of his suing-the-critics cases, was also filed in the same court in front of Judge O'Connor. Media Matters recently tried to get him excluded from the case because he owns a bunch of Tesla shares, and so they're claiming there's a conflict of interest. X is claiming that X is not Tesla, which is a whole other thing. So we'll see sort of where that comes out, but it is interesting that they filed there. The claims are effectively that there's an antitrust violation, and there is case law that does say that sort of organized group corporate boycotts can be an antitrust violation. Whenever you have multiple companies working together on something, there's a potential for antitrust. But the cases that have said that that is a kind of antitrust violation are when you are doing a kind of boycott for anticompetitive reasons. So the sort of quintessential case, I forget the name of the case, but it was around shipping, where companies were basically trying to drive shipping away from certain trucking companies to, like, rail. And so they organized this effective boycott on the trucking companies to sort of screw over the trucking companies and drive them out of business. That is understandable; that appears like an antitrust kind of violation. On the flip side, more importantly, there are cases saying that if a boycott is for speech purposes, or you're trying to, you know, make a statement, if there's any sort of expressive element to it, that is protected by the First Amendment. So we have cases around boycotts that the NAACP had organized of businesses that were discriminating. People had tried to argue that that was collusive behavior, and yet it was found that, no, there's First Amendment-protected speech in it. So in this case, it's pretty clear that this was not done for anticompetitive reasons. Also, the structure of GARM itself is that they're coming up with frameworks and best practices for platforms on how to deal with advertising against potentially problematic content. As we said last week, it came out of the Christchurch situation, where the mosque shooting was streamed live online and advertisers were getting slammed because, like, oh, you're advertising next to this sort of violence and terrorism. And so they just put together a framework. Then it's entirely up to everybody involved how they deal with the framework. The platforms get to make their own decisions. The advertisers then totally get to make their call as to whether or not they continue advertising. The thing here is that Elon insists that the only reason that a whole bunch of advertisers bailed on Twitter and then X was because GARM told them to, which makes no sense. It's fairly obvious that there were good reasons to stop advertising.
Ben Whitelaw: And you know, the funniest thing about this as well is that GARM is a two-person
Mike Masnick: Yes,
Ben Whitelaw: nonprofit. It's a very, very small nonprofit, right? And it's kind of ludicrous that they would have this power and this responsibility. It adds to this kind of ludicrous Musk-is-being-conspired-against theory that this all plays into. But the way that it was announced was very loud, and we can kind of talk about it now. You, and probably a lot of listeners, would have seen the video of Linda Yaccarino in which she announces that this suit has been filed. She's pointing at the camera, she's wearing these necklaces that say "free speech", and she's kind of beckoning users of Twitter to say, we will not be silenced, so this is for you guys that we're bringing this case to bear. And, you know, she talks about how no small group of people should be able to monopolize what gets monetized. There was a lot of laughing about it, obviously, but it was kind of a moment, right, this week, as this case is happening.
Mike Masnick: Yeah. I mean, lots of people were talking about it. I saw people saying, like, is this a deepfake? It doesn't seem real. It seems really, like, not quite human.
Ben Whitelaw: Yeah.
Mike Masnick: It was a very strange thing. And yeah, she really tried to position it as, you know, this is about protecting the users. Like, take a step back and think about that. Forcing advertisers to put advertisements, which everybody hates anyway, on the platform is protecting the users? Who actually thinks that?
Ben Whitelaw: Yeah, it's a stretch, isn't it? But it was very clear from that video, and also from her subsequent tweets and from Musk's subsequent tweets, that there was a kind of coordinated announcement. There was a bunch of retweeting of kind of notable accounts in that space. And, you know, we mentioned last week Musk did the popcorn emoji. There was a kind of bunch of similar, um,
Mike Masnick: Yeah.
Ben Whitelaw: attention to this.
Mike Masnick: And then just a couple other points on that. Elon also tweeted out that he believed that what GARM was doing was probably violating the RICO Act, RICO being the racketeering law. It is basically one of the most misunderstood laws in the US. Lots of people, including apparently Elon Musk, seem to think that RICO just means something really bad, but no. It's a conspiracy charge: you have to have multiple parties who are planning a crime and are taking steps towards actually doing the crime. There's a very famous article, unfortunately no longer online, though you can find the archive, that Ken White, who's a pretty well known lawyer, wrote many years ago: his law explainer called "It's Not RICO, Dammit". He just goes through this thing where everyone's like, well, is it RICO? And he's like, no, it's not. But maybe if...? No, it's not. He's basically saying it is never RICO. RICO is very specific. It involves all sorts of specific things that you have to have. None of the things that people are talking about are ever RICO. And this is definitely not RICO.
Ben Whitelaw: That needs a Musk and Ken meme, I think.
Mike Masnick: Yeah. Oh gosh. So, I mean, anytime anyone claims something is RICO, everyone has to alert Ken, and I will admit that I alerted Ken to Elon doing it this week. But then the other aspect of it was that it wasn't just Twitter or X that filed the lawsuit. It was also Rumble, which is another one of these platforms. I've been sort of jokingly describing it this week as "what if YouTube, but for assholes" being the sort of Rumble mantra. And so they sued also, and their suit was perhaps even more ridiculous, because as far as I could tell, there was no commentary from GARM about Rumble at all. The only evidence that the Rumble CEO presented was that he had emailed a few advertisers, including Dunkin' Donuts, and was like, well, you guys should advertise. And Dunkin' Donuts' response was, hey, it's great talking to you, but honestly, because of the sort of political bent of everything that's on your site, which is, they just platform the most extreme right-wing video channels, we just don't think it's right for us. And he presented that as, like, this is proof of GARM silencing us. And then, even more bizarrely, the Rumble CEO admits that because of this, because Dunkin' Donuts wouldn't sponsor them, they started their own coffee brand.
Ben Whitelaw: No.
Mike Masnick: Yes. And he says it's been a huge success. So people pointed out that it does not appear that it is a huge success. That coffee brand has an account on X that has fewer than a thousand followers. It doesn't appear to be particularly successful. But also, saying that out loud sort of proves that there are no damages here. If you were able to start your own coffee company that you are publicly claiming is a huge success, then the claim that there were damages is not true. But there's the follow-up to this story, which
Ben Whitelaw: this is the kicker, right? This is what happened yesterday, essentially, the kind of final closing of the circle of this week's story.
Mike Masnick: is that the World Federation of Advertisers announced that GARM is shutting down because of this. And as you said, it's a two-person nonprofit. It was a small thing. All they were doing was sort of putting out these best practices and discussing these ideas and allowing the advertisers to make their own decisions. In fact, some of the evidence in the X case has advertisers asking, what should we do? Should we advertise on Twitter or not? And it has GARM's director saying, you have to make your own decision. Here's what we can tell you, but you make your own decision. But they announced they're shutting down. So in other words, they folded, and they said, basically, we can't deal with this. This lawsuit is going to be huge, we're tiny, and so we're just shutting down. And so, to me, it's a classic example of the suppression of speech, because all GARM was doing was speaking, right? They were putting out these frameworks and advocating for their position, trying to make sites places for advertisers to feel safe advertising. And they got shut down. And you have Jim Jordan and the House Judiciary
Ben Whitelaw: Don't mention his name.
Mike Masnick: Sorry. Sorry. You know, announcing a big victory for the First Amendment, and you have Elon, and you have Linda Yaccarino, and you have a whole bunch of other people cheering this on as a victory for free speech. And it is exactly 100 percent the opposite. It was literally one extraordinarily wealthy person and the government, in the form of he who shall not be named and the House Judiciary Committee, publicly attacking this organization for advocating for their speech. And they have been shut down because of legal and government activity targeted at them. That is, like, the quintessential violation of the First Amendment. And yet it's being paraded by a lot of people as if this is supportive of the First Amendment. And that bothers me.
Ben Whitelaw: It doesn't feel like a great day for your fair country in that sense, right? I mean, I don't know if it will ever be couched like this, but it's clearly not in the spirit of the wording of the First Amendment, and yet it's being twisted to make it seem like it is. And all the while, the committee is still investigating GARM, I believe, on whether they colluded to suppress, or not monetize, conservative voices. It's a bit of a mess, isn't it?
Mike Masnick: Yeah, it is a mess. It is an unfortunate situation. And I hope that as years go by, people will look back on this and recognize what it is, which is just a sort of modern version of McCarthyism, which the US went through in the fifties: the silencing of speech through completely bogus accusations. And so I think it's a really dark week for speech in the US because of all this.
Ben Whitelaw: I can't talk, Mike, you know, cause my country isn't much better, and it's partly to do with that man, Elon Musk, again. So my, my
Mike Masnick: That's a great segue. That's a great segue.
Ben Whitelaw: An unfortunate segue. The story that I've really been reading this week is a continuation of what we touched on last week. We used the TikTok report, and its use by far-right and Nazi users, last week as a way into the Southport riots, which have happened over the best part of two weeks now. Those have continued since last week, and there's been a whole tranche of stories that have come out around the ongoing violence and the role of social media in it. The story that I was particularly interested in amidst all of that was an FT piece that came out a couple of days ago about how British government officials have tried to communicate with platforms to take down some of the most egregious speech, the stuff that is a threat to national security, and have not had a great experience with X slash Twitter, to put it lightly. So let me just unpack it a little bit. There's a unit within the UK government called the National Security Online Information Team. Somewhat controversial in itself, but it is essentially the conduit between a government department, which is called DSIT, and the platforms themselves. This report goes on sources close to this unit and says that Google, Meta and TikTok have all been fairly receptive when certain posts have been flagged by this unit, and have done something about them, whereas X slash Twitter have not done anything at all and have been really slow, often keeping up content that has been flagged to them. So there's a few parts of this I think are really interesting. This comes amidst, honestly, a crazy week in terms of heat being put on platforms over mis- and disinformation. You know, we had Yvette Cooper, who's the home secretary of the new Labour government, calling social media a kind of rocket booster for mis- and disinformation. We've had reports of death threats to her and other figures on the left of the political spectrum doing the rounds on Telegram and other platforms. You've got Ofcom putting out an open letter asking platforms to act now, a slightly desperate plea to the platforms.
Mike Masnick: I think it was, it was a pleading letter.
Ben Whitelaw: Um, and then, you know, the fact that this is the backdrop of what's happened. You've got these platforms kind of working with the government in order to take down some of the most egregious stuff, and basically getting nothing back from Twitter. So we're in a situation again where we're looking at another weekend of violence and riots in parts of the UK, and this sense of social media being often the cause of it, which is something we talked about on last week's podcast. It's something that we seemingly can't shift. The media is naturally inclined to say that social media is to blame, for obvious reasons. And at the same time, we've got these court cases coming to light of rioters inciting racial hatred and violence from their cities, from their homes, from their armchairs. And again, this idea that content moderation on platforms is not sufficient. And, you know, where does regulation fit in all of this? What's your view of this from the other side of the Atlantic? I know you've probably seen more of it this week than last week when we discussed it, and I know you've written a couple of posts on Techdirt.
Mike Masnick: Yeah, these are the tough situations, right? And so there are a bunch of different threads that are worth exploring. One, I'll just mention: even in the US, even with the First Amendment, incitement to imminent violence is not protected by the First Amendment. And so I think that some of the posts that are being discussed would certainly fit under that banner and probably are not protected by the First Amendment. The issue, and we did discuss this somewhat last week, is that a lot of this is still being driven by Telegram. And it's funny to me how people seem to recognize that Telegram is effectively unreachable, or at least everyone is acting that way. And so the sort of agitators behind a lot of this are just using Telegram to organize and to plan stuff out, and then brigading from there to other platforms to try and spread their message. And so it's weird to me how little Telegram has come up in these discussions. The other element is how much of the focus has been on Elon in particular. Here's where there's something that I think hasn't been separated out in a lot of these discussions: everyone's talking about, oh, well, you know, X is not being responsive. There's a difference between X, and Meta and Google and TikTok. And that is that the person who owns X and runs X believes in the nonsense and the conspiracy theories and is actively spreading them himself. And you don't have that with any of the other ones.
Ben Whitelaw: To his 189 million followers.
Mike Masnick: That is creating some other issues. I mean, the big one was last weekend, where he said civil war is inevitable, based on a completely nonsense tweet about migrants and asylum seekers. And then he tweeted at one point this week a totally false thing about detainment centers, which he then deleted once people called him out on it.
Ben Whitelaw: Yeah.
Mike Masnick: It's funny how so many of the stories, and even some of the comments from politicians and government officials who are talking about this, are still trying to treat it as if X is a normal company in the same sense that Google or YouTube or TikTok or Meta are normal companies. And they're not. This is not about the policies, right? Meta and Google have in place teams that try and minimize harmful things. They are not always successful, there are lots of problems, and I think later on we'll be discussing some other problems with those companies. X is different. X is Elon, and it is tied up in his identity. And he believes in this nonsense. And when he is then attacked, and I'm using that word specifically because that's how he's viewing it, by the British political class and others for not dealing with this, he sees it as another opportunity for him to appear as a free speech martyr. So he spins this as "my speech is being suppressed", which is utter nonsense, absolutely ridiculous. But it allows him to play to his 189 million, mostly fans.
Ben Whitelaw: 193, I just checked.
Mike Masnick: Gosh, it keeps going up. I don't understand how it keeps going up. But yes, it allows him to present himself as: I am the martyr for free speech. I am the one taking the arrows and the slingshots. I am defending your free speech rights by fighting for this. And it is nonsense, but it plays into exactly what he wants.
Ben Whitelaw: And there's something in that, you're right. And I think the interesting thing is that this unit that I mentioned, the National Security Online Information Team, was formerly known as the Counter Disinformation Unit. It's literally designed to work against hostile actors that have nefarious plans for, you know, the British populace. We're talking foreign states, we're talking election risks, we're talking, increasingly, AI and deepfakes. And they are turning their attention to combating the disinformation of the CEO of a major platform. That's how wild it is. You're right, and there have been some quotes from government officials saying we are treating Elon Musk in the same way we would treat a head of state that's gone rogue.
Mike Masnick: Yeah.
Ben Whitelaw: I'm paraphrasing, but you kind of get my gist.
Mike Masnick: And that was, I think it was Peter Kyle, is it, the
Ben Whitelaw: Yeah, yeah, the new minister, yeah.
Mike Masnick: who said, you know, effectively, we have to treat Elon as if he's a nation state. And I think that's wrong. I think it's wrong for a variety of reasons, and I know it's convenient. I had written a piece for the Daily Beast about this this week. I know it's become really common to compare social media platforms to nation states, usually based on user size: oh, you know, Meta is bigger than every country in the world, or whatever. But they're not countries, and the affordances that you have as a country to deal with other countries are totally different from the ones that you have to deal with private entities. There are things around trade agreements and taxes and visas and stuff like that, but that's different from dealing with companies.
Ben Whitelaw: Yeah. I mean, Musk called out the British Prime Minister as "two-tier Keir". He's goading an elected official in the way that he's done with, you know, the Venezuelan president and with other elected officials, even before he was made CEO and more so since. What do you do in this situation, then? If the idea is not to treat Twitter or any platform like a country, what options are there to engage with somebody who takes that approach?
Mike Masnick: The answer is, honestly, there's not much you can do directly, because almost everything that is seen as an attack plays to his advantage, and he gets to spin it into this "look at me, they're attacking me for my free speech", and that only draws more attention to all of it. My take on it is basically: ignore him. Start ignoring him and teach more people to ignore him. Lessen the power that he has in terms of the way that he is speaking about these things. And so some of that is more longer-term stuff, and I know some people get really annoyed when you bring this up, but it's like, just educate people to be better consumers of news. And I know that feels really unsatisfying, and I can see the look on your face, Ben, that it's just like, oh, shut up, don't bring that up again. But we have to start thinking about these things in terms of, how do we make the world more resistant to this kind of nonsense, rather than assuming that scolding Elon Musk is going to lead to better behavior? I don't know what in the last decade has shown that scolding people like Elon Musk leads to better behavior, because it doesn't. And I don't know that there's some magic bullet regulation that is suddenly going to make him a better person, because he's not. He's a terrible person, and he's made that clear. And he thrives off of this stuff, especially when he's made to look like a martyr. So the only other thing that I would suggest is: switch from scolding him and jump straight to mocking him, right? This is sort of what we're learning in the US political context in the last few weeks, where the Democrats have learned that mocking Donald Trump and JD Vance seems like a much more effective strategy to minimize their impact. We'll see how it actually works out politically. Get to that mocking stage, rather than empowering these things by making it seem like, oh my gosh, Elon is speaking, and this is so important that governments around the world have to yell at him for his speech. Why not just point out how stupid it is? Like, how is it that this guy, who has more access to anything than anyone else in the world, is falling for absolute bullshit?
Ben Whitelaw: Yeah, I get what you're saying. I'd hope that in the future, when riots like this happen or any kind of unrest happens, we'll be in a better place than we are now. But it feels like there needs to be a kind of action now, right? There needs to be stuff in place. And in terms of ignoring Musk: the incentive structure of the platforms, the way that the media works, and the kind of wider context in which he's doing this posting all, at the end of the day, feel like they're set up for him to win. And I hate that, because there's so many trust and safety professionals and people trying from within to ensure that that isn't the case.
Mike Masnick: He is a unique case, right? There is no trust and safety person who can stop him. We don't live in that world. And so I totally agree. It's not good, it's not comfortable, and there's no situation here where you're like, this is the perfect solution. But it just feels like everything that everyone is trying... with other companies, you can try and influence the CEOs to invest in trust and safety, or ramp up the trust and safety, or change the trust and safety policies, or understand best practices, or implement new technology. All of those things are possible, but they're not possible with Elon Musk. He's not going to be convinced that way. All he's going to do is use this to make himself a bigger martyr. And so it just doesn't work.
Ben Whitelaw: Yeah. And you talked about how there is no trust and safety professional that can kind of convince him otherwise. There's a really interesting part in the FT story about Nick Pickles, who has now been appointed the VP of global affairs. He was a long-time kind of global affairs director, I think, up until
Mike Masnick: and was at Twitter in the, the old regime.
Ben Whitelaw: Was in the, yeah, exactly, he was there in the old regime. And actually, according to these reports, and other folks who I know have worked with him, he advocated for a lot of the right things: for greater transparency about decision making and for clarity of policies. And the piece points out that he has maybe compromised on some of those ideals, for whatever reason. So it's, again, the system that you're working in as a trust and safety professional. Even if you have the right ideas and your thinking is good, you are always going to, I guess, be bound by people like Musk at the top of the tree, whose ideas usurp yours.
Mike Masnick: Yeah. I don't know. I mean, yes, this is a really tough situation. I don't know that there's an easy answer. But, you know, I just think that what people are doing right now just plays to his advantage, and doesn't help.
Ben Whitelaw: Okay. Well, let's make a pact now that we don't talk about Elon Musk next week. We try not to talk about these riots or, ideally, GARM as well, cause we can't have a third
Mike Masnick: We, we need other news.
Ben Whitelaw: We need other news. Talking of which, there have been other stories, and you kind of hinted at it a few minutes ago, Mike, about Google and Meta and this, uh, another FT story published this week, about some not-ideal practice when it came to targeting under-18s with ads.
Mike Masnick: Yeah, that's an understatement. I think it's a really damning article, that basically said that Google and Meta got together and struck a secret deal to target teenagers, violating the policies that Google itself had. It was basically that Meta wanted to get more teenagers to be using Instagram. And there's a lot of background in the story. It is presented in the Financial Times in a fairly conspiratorial fashion, and there is one way to read this, which is sort of the way the headline explains it, of Google and Meta getting together to secretly target teens. But reading through it, there's a bunch of pieces in here, and I will state upfront: there's nothing in this story that is good. I think everything about how this came about is bad. But I think there are some structural issues here that led to this, and it's not the leadership of Meta and Google getting together and saying, ah, how can we secretly target kids?
Ben Whitelaw: This is like a kind of relatively low-level, um,
Mike Masnick: it appears
Ben Whitelaw: advertising executives going to an ad agency and asking to be able to target a particular demographic that they may or may not know that company shouldn't be allowing them to target, essentially, right? It's a company called Spark Foundry, which is a subsidiary of a big advertising giant called Publicis, and they launched a bit of a pilot program in Canada earlier this year. And your sense is actually that the story is slightly overblown.
Mike Masnick: It's not that it's overblown, because this is still bad, and I'm glad that it was reported on. But the framing of it, as Google and Meta getting together to do this, as if it was coming from the top, that's not what I get from reading the details. So here's the way it looks to me, and maybe I'm wrong, but reading from the details: as you said, it's this advertising firm that Meta hired. So it sounds like a product manager or someone at Instagram has a KPI, which is, we need to get more teenagers on the platform, teenagers being 13-to-17-year-olds, which is legal. They are allowed to target those users, and Instagram says they don't allow people under 13. There's a whole bunch of other issues there, but if you're over 13, that's okay. So someone has a KPI within Meta that says we need to get more teenagers using Instagram, basically because we're losing them to TikTok, I'm sure, is what the story was. So they go hire this advertising agency. The agency then goes to Google and says, we're trying to target these people. And it sounds like what probably happened was that there was a salesperson at Google, who also had numbers to meet and a year-end goal in terms of how much he was selling, who looks at this and says: yes, our policies say that we will not do advertisements targeted to anyone under 18. However, wink wink, nudge nudge, we know that if you target "unknown" as your demographic, most of the people in the unknown category happen to be under 18. And there's probably another sort of systemic reason for why that happens, which is that there's been all this stuff about kids and teens on social media, and it has become imperative for some of these services not to know if kids are using their platforms, because there are different legal consequences. And so there is the potential that the incentive structure has created a system where certain platforms maybe look the other way and do their best not to know if there are kids using their platform, and therefore the unknown category is dominated by people who are teenagers. And so the salesperson at Google recognized that, did a wink wink, nudge nudge to the advertising firm, who was trying to accomplish goals for the Meta product manager, who had their own goals. And so you have all these stupid incentives that line up to this deal, where they were basically like, okay, you want to target kids? Just target the unknown.
Ben Whitelaw: Yeah, there's a kind of bit of insider info going on here, right, which is, I think, the kind of crux of the story. I mean, there is this other part, which is that the campaign was, you know, disguised, supposedly to make it look like it wasn't targeting 13-to-17-year-olds, under-18s, which again makes you wonder how many people knew that it was not the right thing to do. But again, it's not top-tier people. It's not even necessarily the platforms talking to one another. It's talking via a kind of conduit, and them both establishing a kind of relationship that benefits
Mike Masnick: And so there's a part of this that is just capitalism in action, right? These companies have to make their numbers, and when they're big enough, you have different people who get assigned different numbers that they have to make. And almost everybody in this story is probably just trying to meet their numbers. That is not a defense of it. And again, this is bad, this shouldn't happen. I'm glad the article is out there. I'm glad people are talking about this, because safeguards should be put in place. But the way the article is headlined, and some of the reporting I've seen about it, has been as if the two companies, as if Zuckerberg called up someone at the top of Google's advertising setup and was like, hey, let's figure out how to target kids. And that does not appear to be the case.
Ben Whitelaw: No. I wonder to what extent the DSA and the European Commission will be interested in this, because they are interested in the protection of minors and are very specific about how data of minors is used. It's not talked about in the story, but I wonder how much Thierry Breton's ears will prick up at
Mike Masnick: It's,
Ben Whitelaw: this.
Mike Masnick: entirely possible. I mean, the fact that it was targeting Canada and then the US might safeguard them, and I would imagine that these companies probably have a little bit more in terms of compliance for EU-targeted stuff. Who knows? But yeah, it'd be interesting to see.
Ben Whitelaw: Yeah. Okay, cool. So let's move on to our next, kind of smaller, story. This is something that is in Turkey, a country we don't really cover that often, but this is definitely notable. Turkey, right after we recorded the podcast last week, actually last Friday, decided to block access to Instagram. And the reason they did this is because Instagram failed to comply with the country's laws and rules, according to a government minister. Essentially, the Turkish government was warning Instagram about the fact that certain posts offering condolences for the Palestinian militant group Hamas's leader, who was killed the week before, were being censored. Turkey basically warned the company that this shouldn't be the case, to kind of reinstate the posts. Instagram didn't do that, and so, relatively quickly, they opted to ban the platform in the country. Now, that is not something we see very often in any part of the world. There's probably only a handful of times we've seen that happen in the past. So this is a big deal. I don't know how many users Instagram has in Turkey, but the ban continues to today. The platform is still inaccessible in Turkey a week on, and a number of different human rights groups have called for the government to reinstate Instagram, saying that basically this is affecting people's speech and ability to communicate with one another. So this is a really big thing. I'm guessing this is about the kind of dangerous organizations and individuals policy that Instagram and Facebook have, right, Mike, which is pretty strongly aligned with the US's equivalent list of terrorist organizations, which includes Hamas. So is that your sense of the story as well? Do you see anything more to it?
Mike Masnick: Well, there's a few things, and definitely that's a part of it. There's this element of different interests of different countries in terms of what they want to promote and what they don't want to promote, and that comes into conflict as nations come into conflict. You know, I would say it's been a few more than a handful of times that certain countries have blocked entire sites. But it is most often and most frequently authoritarian-run countries that are blocking sites that are moderating content in ways that the folks running those countries do not like. And so, certainly, China has blocked apps, Russia has blocked apps, you know, Iran. The one exception to that rule is now the US trying to ban TikTok, but that's a whole other road to go down.
Ben Whitelaw: We've already covered that. Come on.
Mike Masnick: Yes. Yes.
Ben Whitelaw: We'll do, we'll do it again.
Mike Masnick: But it also gets to the point that we were talking about in terms of the battle between X and the UK right now. That is an approach, right? If the UK wanted to, they could follow down this route and perhaps put in place policies that would allow them to turn off X in the UK. I think that that would be following the path of authoritarian countries who don't deal well with free speech at all, and I think it's problematic when we see it. It is worth noting that a few days after they banned Instagram, Turkey also banned Roblox, the sort of kids' game. It's not entirely clear why, but there were claims that they were promoting pedophilia and other things, and I don't know what the full interpretation was. There wasn't necessarily a clear explanation for why. But it is interesting that Turkey is trying to ban different platforms. I wonder if there was some sort of connection, where people who had been using Instagram were suddenly moving elsewhere to do stuff. I don't know if that's the case, but the timing is certainly interesting that it comes right after. But, you know, this is one way that countries are dealing with content online that they don't like, which is just to block it entirely.
Ben Whitelaw: Yeah. And obviously Turkey's president, Recep Tayyip Erdogan, has been under pressure since elections in April as well. So, again, I'm not a Turkish political whiz by any means, but it may be related to pressure he's come under there, and to opposition parties who've gained a foothold in the last five, six months. But yeah, interesting. Okay, so that covers off a couple of our stories. We've got a third one, which you're not going to like, Mike. I'm sorry about this, and I think you know what's coming. But again, right after last week's podcast, over the weekend, there was a big interview on CBS with maybe, definitely, not my favorite people, probably not yours either: the Duke and Duchess of Sussex, who have launched,
Mike Masnick: You're supposed to support them. They're your royalty, aren't they?
Ben Whitelaw: I don't know. I think they've distanced themselves from the Royal Family, so I'll leave my views on the Royal Family out of this podcast. But they've launched a big new initiative called the Parents' Network, out of their Archewell Foundation, really designed to support parents who've gone through trauma and issues related to the death of children, directly or indirectly as a result of social media. And so they did this big interview in which they talked about how they're increasingly worried about how kids can seem fine, but then be next door on a tablet or a phone, and potentially be being bullied or be at risk of suicide, all of those stories that we've talked about in recent episodes of the podcast. They've set up this network for parents, with a big splashy website. There's a really slick video in which they talk to some of these parents, and there's a couple of guides on the website about how parents should think about keeping users safe on TikTok or Facebook or Instagram. They're not super detailed, I'll be honest. I think there's better resources out there. So it will be interesting to see what the plan is for this: whether they're going to be a resource that parents can use, whether they're going to kind of agitate for change politically, whether they're going to be just a kind of online community. It's not quite clear. But, uh, yeah, the Duke and Duchess of Sussex are now within the Ctrl-Alt-Speech remit. Well, maybe they'll be listeners, Mike, you know.
Mike Masnick: Yeah, that'd be great. Sign them up. Um, so yeah. There's some elements of this that maybe are good, right? I mean, getting more attention to certain issues, I think, is important. Giving more resources to parents and kids, I think, is really important. It's not clear that they're going to do a particularly good job of it. As you said, there are a number of organizations that are focused on that, that do really good work. And I feel, especially with the way a lot of this was framed, that this is people parachuting into an issue that they think they understand, but that they really don't. And the craziest part to me was in the interview, where Prince Harry says that in the olden days, which is interesting, parents always knew what their children were up to, which, no, and at least they were safe, right? And he says, and now, as you said, they could be next door, or in the next room, on a tablet or a phone, and could be going down these rabbit holes. I don't know what the olden days were like for Prince Harry, but for most people, kids nowadays have a lot less freedom than in the olden days. And I'm older than you, but, you know, I would get home from school and I would wander the neighborhood freely and hang out with friends and be in friends' houses and basements and rooms. And we'd walk into all sorts of areas that were far away from home,
Ben Whitelaw: You know,
Mike Masnick: ride our bikes places.
Ben Whitelaw: you didn't live in Buckingham Palace, though.
Mike Masnick: I did not, I did not grow up in Buckingham Palace. Um, and so I think in the olden days, parents had a lot less of an idea of what their kids were doing and where they were and what was going on. And I will tell you that depression and suicide, very problematic things, existed pre-internet. There are all sorts of questions and all sorts of debates that we're not going to go into in this short little bit here, in terms of what is causing what and what is going on. But this idea, the whole framing of it, as if parents have less of an idea nowadays of what their kids are doing, is just disconnected from reality. And so maybe it's that he grew up in Buckingham Palace and he has no idea, but it was just a very strange framing. And so that worries me in terms of where this is actually heading.
Ben Whitelaw: Yeah. I mean, I would never deny parents the chance to get together and meet other parents that have been affected by this trauma. The extent to which all of the reasons why their children have died can be directly linked to social media, I think, is obviously a big topic of conversation in Congress and on this podcast, and will continue to be. But yeah, I think you're right. It only works if they really know what they're talking about, and they gather people around them that really know what they're talking about as well. Which leads kind of neatly onto talking to people who know quite a lot about child safety. You had a very good podcast this week, Mike, with somebody who knows a fair bit and who's kind of been on the inside of one of the organizations that's very responsible for this. Talk us through the podcast. We're going to play a few clips. I'm excited about this, cause I haven't got around to listening to it yet.
Mike Masnick: Yeah, so, earlier this week on the Techdirt podcast, which is the other podcast that I do, if you don't listen to it, I had Don McGowan on. Don was, until recently, the general counsel at Bungie, which is a video game studio. Previous to that, he was the chief legal officer for the Pokemon Company, and helped them create Pokemon Go. He's also the executive producer of the Pokemon film. So he's had a really interesting career, and, until last year, he was also on the board of NCMEC, the National Center for Missing and Exploited Children. We have talked about NCMEC on this podcast before. It's a very important organization. They run the CyberTipline, and we talked earlier this year about the Stanford report, which I think was really important, that looked into the challenges that NCMEC faces in terms of being a helpful resource. Companies that find child sexual abuse material, CSAM, on their platforms are required by law in the US to report it to the CyberTipline, which is run by NCMEC, which is a private nonprofit, and they then coordinate and get that information out to law enforcement in order to take care of things. There are a lot of challenges there, and there were a lot of problems detailed in that report. One of the things it explained was the lack of technical expertise within NCMEC, which limited their ability to do certain things. Don McGowan was on the board of NCMEC for seven years. He quit last year, and he wanted to talk about it. So he came on the podcast, and we're going to play a couple of clips. In the first one, he actually talks quite a bit about that lack of technical expertise, makes a big deal of it, and points out how really problematic it was. He makes very clear, and I want to emphasize this as well, that NCMEC the organization is a very good and important organization that does good work. His complaint is entirely with the board that runs NCMEC, and he has many complaints. And I say, just listen to the entire podcast, because he goes into great detail. But the first clip I wanted to play was about that issue of the lack of technical expertise, and it was particularly in the context of my asking him why NCMEC was supporting certain laws that I thought were problematic and would actually make things worse. In particular, FOSTA, which was a US law that amended Section 230 in 2018, officially to deal with sex trafficking, but in reality it caused a whole bunch of problems. There have been reports and studies that have said that it has probably made trafficking worse, put lives in danger, and really hurt people. And NCMEC was one of the major backers of it. And so I asked him why that happened, and part of the explanation, as we're about to hear, was around the lack of technical expertise that the NCMEC board had.
Don McGowan: How this stuff came about is, go back to what FOSTA was supposed to be, and pretend you don't understand what it turned into. Okay. Right. Now, remember, at the time it was a bill to cut down on human trafficking for the sex trade. Yes. Right. And if it was that, I mean, one, protecting children is never a vice in American politics. Yes. And two, if it was that, that would have been a great thing to support. And so you had people speaking up in support of it who were speaking from the perspective of what they thought it would be. Now you, and a lot of the civil society groups that spoke to it, understood the actual mechanics of the law and what it would do; you had a little bit of seeing into the future that you could do. You know, you're a little Nostradamus sometimes, Mike. A lot of us try to be, some of us succeed, a lot of us don't. I remember, you know, that was one of the things with this bill: especially at that time, and to a certain degree even today, NCMEC has no technical expertise in the building. They have a relationship with Thorn, which is Ashton Kutcher's child protection charity, and Thorn does a lot of their technical work and carries the technical load in that space in a way that NCMEC is just not set up to. I mean, I chaired their tech committee for a few years. Right. And so I actually co-chaired it with a guy who was a marketing manager at Charter, who considers himself the tech brains of NCMEC, and he's a marketing manager for an ISP. So there was a guy on the board, who ended up no longer being on the board, who was advocating for this geolocation app to help, you know, like, you're walking down the street and it'll ping your phone and say, oh, a child was abducted here. And he thought this was such a fantastic idea, cause it'd be great for awareness. I'm like, why would anybody put this on their phone? Right. This guy styles himself as the technical expert. Right. So think about that. A guy who thinks that that app is the greatest idea ever and should be the technical focus of the organization is out there trying to set the guidelines. This was a guy who, there were a few of us who actually had a bingo game during board meetings of, at what point is Lou going to bring up porn? And then we would work the word "bingo" into the next thing we said, and that would be how you would win. If Lou mentioned porn at a time that you were ready to talk, you'd work it in and you'd win. If you'd picked that slot in the squares game, you had first right to claim it for your victory. Right. So we're dealing with that level of, you know, sort of, you could write a script by this guy's focus on this issue fairly tangential to NCMEC's mission. Huh. And so you had people setting that as a priority, and so obviously FOSTA was red meat to them.
Mike Masnick:All right. So we had that clip there, and then I wanted to get into the second clip, which is about his larger complaint: that there are a lot of issues that really impact children that NCMEC would not take a stand on, and there was a theme behind all of them. They were issues where the government had a responsibility, and the board would not stand up to the government, perhaps for political reasons, where maybe members of the board were supportive of the party in charge and had a partisan take, or other things. So he talks about how he would advocate for issues that would actually protect exploited children, which is in the name of the organization, and yet he would get accused of going on crusades by other members of the NCMEC board. That included issues like child labor, how asylum seekers are treated, and also how NCMEC avoids dealing with any issues related to trans kids. So I'll let him describe here some of the things that he found with the board.
Don McGowan:I mentioned a few minutes ago how people made a reference to me going on one of my crusades. I had a separate issue that I was fairly vocal on, which is, as I would describe it: I think it's terrible that we have a political party in this country that has decided it's politically expedient to set aside a group of children, namely trans kids, for state-sponsored political persecution. We should care about this and we should be speaking out about it. I'm sorry some of you don't like this, I'm sorry trans kids make you feel icky, but this is the kind of thing on which we should be providing moral leadership to the legislators of the country, and we should be saying wrong is wrong. And there was a fair amount of disagreement at that last board meeting, that April board meeting. NCMEC got a grant from the state of Texas, and the grant was subject to return if the money was unused or misused. Oh, wow. And I was like, we've got to find out what misused means. Yeah. The state of Texas has decided there is an entire population of children that should not receive any support. Yeah. And if we use this money and some of it goes to their benefit, they may want their money back. Yeah. And at one of the board meetings, one of the board members looks over at me and goes, thunk, you know what that is? That's the sound of you beating a dead horse. Wow. And I was like, okay. You know what? In my mind, I said, I just quit. Right. Didn't speak for the rest of the meeting, let the meeting end, said goodbye to everybody, walked out, never went back. Wow. Flew back to Seattle, got home, walked in, wrote up a note of resignation to the general counsel of NCMEC, went to the board portal software, posted a note saying, you know, this was my last board meeting. There's going to be a lot of you that are going to be happy to see the back of me. What you may not realize is I'm just as happy to see the back of you. Send resignation. Peace out. Wow. That's, yeah. I mean, NCMEC has data in its stat banks that say some of the kids most at risk in the world are trans kids. Yeah. And, you know, they ignore that data. Yeah.
Mike Masnick:All right. So there's a lot more in that podcast and interview, and I recommend people go listen to it. But I think it's really important for people to understand that making NCMEC a better, more effective organization starts with understanding what, at least as Don found, are serious problems with the board.
Ben Whitelaw:And Mike, just for people's benefit, how do people end up on the board of an organization like NCMEC? You know, as somebody who's just been appointed to a board yourself. The NCMEC website has all of its directors listed, and there are some odd examples, people from companies that you might not expect to be associated with child safety. Did he give a sense of why that was the case, and of the broader makeup of the board?
Mike Masnick:Yeah, so there is a discussion of that. Basically, what he said about NCMEC, and he notes, you know, in the nonprofit world you have different kinds of boards: there tend to be some boards that are focused on governance and some boards that are focused on fundraising. And the NCMEC board is almost entirely focused on fundraising. So what ends up happening, as he described it, and he used very profane language which I'll skip, even though I'm known to use some of that language as well at times, is that effectively a lot of people position themselves to be on the board because they want to be seen as being associated with NCMEC, and they want to present themselves, whether true or not, as protecting the children. That attracts a certain type of person who pushes to get on the board but may not actually have the best interests of children in mind. And it really speaks to this idea that maybe for certain nonprofits, and especially one like NCMEC that is so intertwined with the U.S. government and the very important fight against CSAM, you need a different kind of board structure: maybe some group that is focused on fundraising and some group that is focused on actual governance. Because it really comes through very strongly in the interview, and again, I urge you to listen to the rest of it, that there are real problems with the way the board is structured right now.
Ben Whitelaw:Yeah, interesting. Looking forward to giving that a listen over the weekend, Mike. That wraps us up for the week. We've covered a fair bit of ground, and we've gone a bit longer than usual, but I think that's in good spirit; we wanted to dig into those stories in depth. Mike, thanks as ever for giving us your wit and your wisdom. Thanks, listeners, for tuning in and following along again this week. If you like the podcast and you listen each week, don't forget to rate and review us wherever you get your podcasts. Drop us an email if you want to talk, go to our website, ctrlaltspeech.com, that's C-T-R-L alt speech dot com, and if you want to sponsor the podcast and keep it going, then drop us an email via the website as well. We look forward to hearing from you. Thanks as ever, Mike, and I look forward to seeing you next week.
Announcer:Thanks for listening to Ctrl-Alt-Speech. Subscribe now to get our weekly episodes as soon as they're released. If your company or organization is interested in sponsoring the podcast, contact us by visiting ctrlaltspeech.com. That's C-T-R-L alt speech dot com. This podcast is produced with financial support from the Future of Online Trust and Safety Fund, a fiscally-sponsored multi-donor fund at Global Impact that supports charitable activities to build a more robust, capable, and inclusive trust and safety ecosystem.