Ctrl-Alt-Speech
Ctrl-Alt-Speech is a weekly news podcast co-created by Techdirt’s Mike Masnick and Everything in Moderation’s Ben Whitelaw. Each episode looks at the latest news in online speech, covering issues regarding trust & safety, content moderation, regulation, court rulings, new services & technology, and more.
The podcast regularly features expert guests with experience in the trust & safety/online speech worlds, discussing the ins and outs of the news that week and what it may mean for the industry. Each episode takes a deep dive into one or two key stories, and includes a quicker roundup of other important news. It's a must-listen for trust & safety professionals, and anyone interested in issues surrounding online speech.
If your company or organization is interested in sponsoring Ctrl-Alt-Speech and joining us for a sponsored interview, visit ctrlaltspeech.com for more information.
Ctrl-Alt-Speech is produced with financial support from the Future of Online Trust & Safety Fund, a fiscally-sponsored multi-donor fund at Global Impact that supports charitable activities to build a more robust, capable, and inclusive Trust and Safety ecosystem and field.
Ctrl-Alt-Speech
Presidents & Precedents
In this week's round-up of the latest news in online speech, content moderation and internet regulation, Mike and Ben cover:
- Pennsylvania Becomes Hot Spot for Election Disinformation (NY Times)
- After Trump Took the Lead, Election Deniers Went Suddenly Silent (NY Times)
- X Is a White-Supremacist Site (The Atlantic)
- Papers, Please? The Republican Plan to Wall Off the Internet (Tech Policy Press)
- What Trump's Victory Means for Internet Policy (CNET)
- The government plans to ban under-16s from social media platforms. Here's what we know so far (ABC Australia)
- Canada orders shutdown of TikTok's Canadian business, app access to continue (Reuters)
This episode is brought to you with financial support from the Future of Online Trust & Safety Fund.
Ctrl-Alt-Speech is a weekly podcast from Techdirt and Everything in Moderation. Send us your feedback at podcast@ctrlaltspeech.com and sponsorship enquiries to sponsorship@ctrlaltspeech.com. Thanks for listening.
So Mike, I feel like Ctrl-Alt-Speech is a kind of counseling. It certainly is for me, but it's been one of those weeks. And so I want to kind of check in on you and make sure you're okay. And for that, I'm going to use the BetterHelp prompts that you get when you use its app. And I'm going to ask you: Mike, you deserve to be happy. What kind of therapy are you looking for today?
Mike Masnick:Oh,
Ben Whitelaw:Do you feel, do you feel, are you in a therapy space?
Mike Masnick:Oh, am I ever, uh, yeah, I mean, I'm, I'm sort of tempted to say, is there, is there a kind of therapy that sort of puts me out for four years and lets me check in again later? But, um, yeah, I mean, more seriously, I think, anything that gets me ready to, to fight. Because I think there's going to be two to four or more years of having to fight for basic internet rights. And so, that's, that's where I'm at. Uh, what about you, since you also deserve to be happy, you are not in the US, you are in a different country entirely, but what kind of therapy are you looking for, Ben?
Ben Whitelaw:Well, I'm, I do need therapy, but for other reasons. I want to, I want to give my therapy sessions to some of the people that we're going to talk about today, Mike. I'm going to delegate the counseling to a certain Elon Musk who, uh, I think, I think, you know, for all that he's done this week, probably does need to talk through some things. Um, so yeah, let's get on with it and talk through today's stories. Hello and welcome to Ctrl-Alt-Speech, your weekly roundup of the major stories about online speech, content moderation, and internet regulation. It's November the 8th, 2024. And this week's episode is brought to you with financial support from the Future of Online Trust and Safety Fund. My name is Ben Whitelaw. I'm the founder and editor of Everything in Moderation, a weekly newsletter that hopefully some of you read. And I'm joined by the brilliant Mike Masnick from Techdirt. Mike, talk us through your emotions if you can.
Mike Masnick:Well, yeah, it's, um, the election did not go quite as I had hoped it would go is the simplest version of that. And so I think a lot of people are kind of in a state of, you know, shock and certainly concern and surprise about what the country told us. And so it's, you know, it's been, uh, it's been a rough week, I think.
Ben Whitelaw:Yeah, where did you watch the election out of interest? Where were you, taking it all in?
Mike Masnick:I mean, I was just at home and, um, actually wasn't even paying that much attention as the first results came in, because everybody had sort of warned us that, like, oh, you know, it'll probably take days like last time, and there will probably be false claims of election interference and that the election was rigged and stolen, and so really don't pay too much attention until later. And so, you know, I started checking in, you know, I sort of would peek at things. And some of the early exit polls that were coming out, where people were saying, oh, you know, hey, this county, even if Harris is winning, not, not winning by nearly as much as Biden did
Ben Whitelaw:Right. Yep.
Mike Masnick:And that started to keep repeating itself over and over again. So, by the time I went to sleep on Tuesday night, I was pretty much convinced that it was, it was definitely not gonna go well. Um,
Ben Whitelaw:It was, it was, you could kind of see it coming, couldn't you? Um, I went to bed at about 2 AM, uh, UK time. And, uh, by that point you could kind of see which way it was going to end up. I mean, whichever way you voted politically, you could probably say that trust and safety was not the winner either way. I think, I think that's fair to say. We're fully expecting the Trump administration to take some pretty kind of strong measures around topics that we cover every week, don't we? So we're going to, I think we're going to, what we'll try and do today, we did talk a lot, a lot about it last week, but it's important to cover. So what we're going to try and do is talk a bit about why we maybe got to where we were, and a little bit of what happened this week when it came to trust and safety issues that we preempted in last week's episode. And then we're going to spend a bit more time to think about where we're going from here, and then we're going to sprinkle in a few other stories from around the world because, hey, there are trust and safety issues that exist outside of the United States. So it's really important for us on the podcast to, to make sure that we cover that breadth. Right.
Mike Masnick:Yeah, absolutely. And, and we will continue to do that. And obviously, you know, what happens in the US impacts the rest of the world in many ways as well. And I am in the US and I am directly dealing with this. But it is important to us, and it's always been important to us on the podcast, to explore what is happening around the world, and a recognition that the US is not the only place in the world.
Ben Whitelaw:Yeah, indeed. And we know we've got stories from Canada and Australia, but we're going to start in the US because that, that's where everyone's eyes have been this week. Um, we touched last week, Mike, on the fact that moderation and trust and safety issues often kind of interweave with politics and power. And you know, you can't really get more of an indication of that than the US election results this week. And we both read a kind of a wide range of stories, looking at the issues that played out and how the election played out. From everything you've read this week, what has been your sense of how we got to a Donald Trump administration?
Mike Masnick:Yeah, so this is kind of interesting because obviously, as you mentioned, we spent a lot of time last week talking about misinformation and lies and how they were spreading on social media. And I've seen some people talking this week about the information ecosystem and this idea that Elon Musk taking over Twitter kind of led to this result. And I don't think the evidence actually supports that in any meaningful way. And I know that some people are going to disagree with me, and I am not a political scientist, but from what I've seen, I actually think the story of this election has so little to do with the internet and trust and safety and disinformation compared to what everyone was expecting. The story that I've seen that makes the most sense, and I may be biased and this may be just cognitive dissonance at work or whatever, trying to explain what, what has happened, but the strongest story that seems to have come across is just: inflation sucked. And it, it was bad around the world and there was a rejection of incumbency around the world, no matter which left or right or, or wherever on the political spectrum different parties lie within different countries. It was just rejection of the status quo, and almost entirely, when you get down to what it was, it was because of the price of groceries. That's not to say that there wasn't other stuff mixed in, and that there is like something about this election that told us about America and racism and bigotry and hatred, which I think is all there and is a portion of it, but we're seeing around the world this rejection of incumbency. Obviously in the UK, you guys had a big rejection of the Conservatives that were in power for a long time. Uh, in France, there's been a pushback and, even in India, it wasn't a complete, uh, rejection of Modi, but it was a pretty big, clear signal that the Indian populace is not happy with Modi. So just almost everywhere where we're seeing elections occur, we're seeing people say, we don't really like the status quo. And when you dig into the data, a lot of it seems to just come down to inflation and, you know, the price that they see at the grocery store. And that is the interpretation that they have of the economy. And that's, you know, there's a part of me that actually wonders if really what it is is that most people, the average voter, tunes out of all of the specifics and nitty gritty. And because there's so much information flowing, it's not that they're swayed by particular disinformation, it's that there's just so much going on, most of them are not paying attention to any of it. And it just comes down to the thing of, you know, gas prices are much higher and the price of what I'm paying for at the grocery store is much higher, and so we don't, we don't like these guys.
Ben Whitelaw:So that does seem to be what the general consensus is among analysts and media, Mike, that it just didn't have the kind of role that we thought it would, particularly when we were discussing it last week. So what are the kind of major, how do we diagnose it then, from a trust and safety perspective? If all of the things you thought would happen actually didn't come to pass at all, there wasn't this big AI surge, there wasn't the disinformation that we thought would happen, we didn't see, I guess, many really egregious, maybe our definition of egregious has, has changed, um, over the past few years, we didn't see any, like, significant stories in their own right of platform manipulation in any way, were we just completely looking in the wrong direction? Uh, you and I, should you and I even be doing this job?
Mike Masnick:Uh, no, I mean, I, I don't think, you know... so there were definitely attempts, right? There were certainly, and we'd spoken a little bit about, you know, Russian attempts at disinformation. And, you know, there was a New York Times article about how, you know, Russia was, was even more brazen and sort of admitted that they were engaged in all these disinformation attempts. I think, though, there's sort of two lessons that I, I would have as takeaways. One is that the specific idea, and I've argued this in the past, so this, this might be just me confirming my prior beliefs and then I'll get to a, a second belief, but the idea that people are sort of easily misled or led along by disinformation online is something I've always been a little bit skeptical about. And there is some element of that, that definitely happens on the margins. People are definitely influenced by their information environment. And there are people who, sort of, you know, wallow in that and are completely convinced by it, but a lot of it still comes back to just sort of confirmation bias, and people sort of go into the, the worlds that they want to inhabit. And the internet has built up this ecosystem that allows them to do that. And so it's not even so much disinformation leading people astray, but it's disinformation just reinforcing people's prior beliefs. And I'm saying this with an admission that this is partly me reinforcing my own prior beliefs. Um, so that leads me to the second sort of larger takeaway, which is: we need to rethink the information ecosystem. And I think lots of people are thinking about this, but I think they're thinking about it in the wrong way. And that scares me a little bit, because I think a lot of people are looking at this as, Elon Musk and all the disinformation and lies that were spread on X had an impact here, which I don't think they did, and therefore people are going to say, well, we have to do something to stop that.
Ben Whitelaw:Hmm.
Mike Masnick:and I don't think that that makes a difference, if the people who are doing that were really just looking for the confirmation bias. And so what it's saying to me is, we need to just rethink the way that people educate themselves and get information in general. I go back to it all the time, and I know it's a frustrating response, but it's something that I do think is important: it's just sort of general media literacy. You know, the other part of this is that, like, when it comes to this idea that, oh, people are just unhappy because of inflation or, you know, they just don't feel like the country's going in the right direction, part of that is there's just, like, an education gap where people don't understand, one, how the president impacts the economy, what different policies are doing. Right? I mean, if you spent the time and you looked at it, in the US... the whole world faced inflation as sort of a reaction to the COVID-19 pandemic and sort of how that was handled. The US handled the inflation way better than most of the rest of the world. And this is like the stunning thing, whereas, like, inflation was less bad in the US than elsewhere. We came out of it without actually going into a technical recession. But, like, the average person is not going to wade into all of that and doesn't understand, like, less bad than elsewhere. That doesn't matter to them. It's still bad,
Ben Whitelaw:Yeah.
Mike Masnick:but also people just don't understand, like, why did that occur? And then the sort of ridiculous part is, like, there are all these Democratic programs, like the, uh, Inflation Reduction Act and the other sort of, like, CHIPS Act and all this investment, which, because the Biden administration was sort of very careful in how they laid things out and didn't want to deal with, like, waste, fraud, and abuse, all of, like, the benefits of these programs really start to roll out next year, just in time for the Trump administration to take credit for them, even as, like, all the Republicans voted against them and talked about how awful they were going to be. And so they're definitely going to take credit for all the benefits as they come into effect, which, you know, sucks for a whole bunch of reasons. And so, you know, there is this element of, like, people need to be better educated on this stuff, and thinking through, like, how do we do that? And there is some sense of a clear rejection of the mainstream media, that it is not informing the public well anymore. And there is a sense that certainly a lot of people feel that.
Ben Whitelaw:Yeah.
Mike Masnick:so my feeling is that it looks like a lot of people just tuned out entirely and didn't want to get at the details. And we need to think about how do we, how do we better educate the public? And people are, unfortunately, moving towards sort of, like, simplistic sloganeering for complex problems. And so you have someone like a Joe Rogan, right, who reaches millions of people through his podcast, who doesn't understand complexity, like, even remotely, and is very quick to jump to very simplistic solutions. And so, I don't know how we do it. I mean, obviously, like, this is something that I spend a lot of time on and, and, you know, you and I talk about this, where it's like, we're trying to talk through the nuances and complexities of really, really complex things where there aren't simple solutions to them. But I know that that's not, like... there's no, like, mainstream interest in, gee, there's no easy answer.
Ben Whitelaw:No, no, no. Exactly. It's, it's the thing we always come back to. I mean, it reminded me of one thing, Mike, which is somewhat of a segue, but it reminded me of witches. I'm reading, I'm reading Nexus, the new book from Yuval Noah Harari, which is about a kind of history of information. And it's the third in his trilogy. And he talks about witches and how, after the advent of the Gutenberg press, there was this period, which actually was way longer than we all thought, you know, a significant period of time, almost 200 years, I think he says, if my memory serves right, where, because of the Gutenberg press, ideas perpetuated that had no scientific or factual grounding. And one of them was the idea that there were a lot of witches around, and, and this very popular tract got disseminated at scale and led to the death of thousands and thousands of women because they were believed to be witches. And this week made me think that we were in a similar kind of trough in our understanding of how information gets disseminated. That we, you know, we'd created this thing 20, 30 years ago, much like the Gutenberg press, thought it was going to lead to wonderful kind of ideas and the spread of knowledge and people being kind of connected in ways that they've never been connected before. And actually, what it led to was the death of lots of women because they thought they were witches. And I think you're right. I think we're kind of... it doesn't feel like it on the day-to-day basis, but it feels like we're in that period of unknowing, to some extent. And, and I don't know how we get to the place where we know more than we do, but I think it's going to take a long time. And, you know, I'm not sure how much a Trump administration is going to help towards that, but we're just, we're just there and we're going to have to deal with it.
Mike Masnick:Yeah. Yeah. And, and, and we'll get into kind of what this means for, you know, internet policy and internet speech in a moment. But, you know, I, I think that is, that is a really interesting take. Uh, Clay Shirky, who's a, a sort of well-known media commentator, had written something similar, but I think almost 20 years ago, where he talked about the advent of the Gutenberg press. And his, his argument was that it took a century of pure upheaval. And, in the modern day, we sort of think of it as, like, okay, there was before the Gutenberg press, you know, where people didn't have access to information, it was closely hoarded. Then we had the Gutenberg press, and yeah, there was, like, some upheaval and you had the Reformation and Martin Luther and all this kind of stuff, but, like, people sort of figured it out and then suddenly people were spreading information. He's like, no, there was, he, he said a hundred years of upheaval, maybe it's 200 years, before society sort of came to terms with that. And, you know, his argument, and this was like 20 years ago when Shirky wrote about it, was that, you know, this is to be expected with the internet. Like, there's going to be this period of upheaval before society wraps its head around it. And I think that's true. And that's always sort of resonated with me, though I hoped that it would be more of a soft landing, and maybe we're sort of discovering that maybe, maybe it's not. And so, there is this part of me, and this is me just sort of thinking out loud, where it's like, is there a way to make complexity cool, right? Like, is there a way to get across this idea that complexity and nuance and trade-offs are important for people to understand? And that people should want, people should want to listen to our podcast where we point out that there aren't easy answers, and that, you know, we should be as popular as Joe Rogan.
Ben Whitelaw:I know. I know. And, you know, moderation, actually, you know, moderation in the kind of political sense, in, in the sense of balance and evenness. You're right that there's nothing, there's nothing that necessarily attracts people to that as a concept. Um, and it's certainly not something that the platforms prioritize and, you know, value within the algorithms that, you know, users use to consume content. So I think you're right. I think there's, there's something in that. I mean, we should look ahead, Mike, seeing as we're, we're starting to think about that, and talk a bit about what a Trump administration could or might mean for internet safety and internet policy at large. From what you've read this week, what are the kind of standout themes or potential narratives that we're going to see and end up talking about in the podcast over the coming months and years?
Mike Masnick:not too much good.
Ben Whitelaw:Damn it. I was really hoping for something nice to look forward to.
Mike Masnick:I mean, we'll see, because it's a little tough to tell what is reality and what was, what was sort of bluster. Obviously there's Project 2025, which Trump has denied was going to be his platform, though we already see, I mean, it's, it's really Trump trolls online saying, like, okay, can we now admit that, yes, of course, Project 2025 was our plan all along. You know, we'll see what actually comes into play. There's a
Ben Whitelaw:Just, just on that, Mike, what, what is in there related to kind of internet policy specifically? Cause I'm familiar with some parts of it, but you probably know more than I do. Yeah.
Mike Masnick:Internet policy. You know, a whole section of it was written by Brendan Carr, which I, I think broke the law. Uh, he's not supposed to write that. Well, he is an FCC commissioner, and he is... most people expect that he will become the chair, that Trump will appoint him as the chair. He's been kissing up to Trump for years. So he wrote a whole policy. He is very much against Section 230. He thinks, incorrectly, though, you know, what is incorrect and what is correct now no longer matters. Um, he thinks that the FCC gets to say what Section 230 means, which is exactly the opposite of the intent of the folks who wrote the law. And so I think we can look to that in terms of, you know, he's going to use the power of the government to do a few things, including reinterpreting Section 230 or helping to get rid of Section 230 in order to try to force platforms to allow for pro-MAGA, pro-Trump speech across any platform, and suggest that there's a sort of common-carrier-esque concept on social media platforms, which I think is very problematic. So that is one element that will come out. The other, which might sound like the reverse, but it's not quite the reverse, is that there's a lot of focus in Project 2025 on, um, pornographic content and obscene content and an idea that that needs to be banned across the board. And so there will be a sort of very puritanical attack on adult content in all forms, and with their definition of adult content, which might also be expanded to include any kind of LGBTQ content. We've already seen Republican states, Republican-led states, their attempts to sort of ban books; I think that goes into overdrive at a federal level and focused on the internet. So things around age verification, I think that will be a big push. I think things like KOSA will be a big push. What stopped KOSA this year was Republicans in the House being worried that if a Harris administration came into effect next year, Lina Khan and the FTC would be in charge of enforcing KOSA and that she would use it to censor conservatives. But now that they're not going to have that restriction, suddenly I get the feeling they will likely flip and say KOSA is wonderful, and let's go forward with it, because since we have the power to censor LGBTQ content, which has always been the concern that many of us have raised about KOSA, it is now fine and dandy. And so I think what we will see is this combination of: they will try and stop any attempt to remove or moderate disinformation, they will see that as an attack, but they will push for anything that allows them to control what kind of speech is online. And that will include the removal of all sorts of important and valuable content. And, it's, it's going to be messy.
Ben Whitelaw:Yeah, no, I agree. And just on the point around ID check laws designed to kind of cure the pornography epidemic and protect children, there is a really good piece actually in Tech Policy Press, by a woman called Anna Bonestell, who writes for Fight for the Future, who makes this really interesting case that we're going to see loads more bills like this. We're going to see many more laws passed. We're going to see states really go hard on this, because it has been a feature of the Biden administration and, with what we know about Trump and his preferences, it's kind of going to be increased as well. So we'll include that in the show notes, but there are a couple of other things as well that we can expect, Mike, the TikTok ban being one of them. Trump seems to have kind of flipped from where he, he used to be in terms of wanting a TikTok ban to really not caring anymore. It seems to be something to do with, uh, who's been funding his election campaign, right?
Mike Masnick:Yeah. One of the biggest, uh, I think it was the fourth largest donor to the Trump campaign holds a huge, uh, portion of TikTok equity. So Trump had come out against the ban. The ban is supposed to go into effect basically at the same time as the inauguration is supposed to happen early next year in January. I imagine that, no matter what happens in that case, Trump will probably instruct the DOJ to drop that case, and it'll go away. The other thing that'll be interesting to watch for is sort of what happens in the antitrust world. And this is kind of, like, a big kind of up-in-the-air thing, where there are disagreements within, within both parties, to be honest, about antitrust and how it should work. And there is a belief that the efforts to curb the power of big companies to merge and to buy up competitors and all that kind of stuff, a lot of that is going to go away. But at the same time, I think that the Trump administration and, and parts of the Republican Senate and Congress sort of still see antitrust as a really cool weapon to punish companies that don't bend a knee and don't obey Trump, you know, that don't suck up to him. And so, you know, we saw this in the first Trump administration with, uh, AT&T and Time Warner, where he tried to use the Justice Department and antitrust laws to block the merger, which was sort of seen as a favor to Rupert Murdoch. Like, it's all transactional stuff, right? There was no, no policy meaning behind it. And so there are all these antitrust efforts against the big tech companies right now, and a lot of that was led by Lina Khan, and so there's the belief she'll get fired immediately. And will the Trump administration then back down from these antitrust efforts? And I'm not sure they will, because he still hates Google, hates Meta, even though both of those companies are now totally trying to suck up to him and be like, oh, you're wonderful, we love you. Um, and so it'll be interesting to see how he leverages antitrust in this world. I think there will be, for companies that he favors, we're going to see all sorts of consolidation, and there'll be all sorts of opportunities for mischief there. But for Google and Meta, it will be very interesting to see how those things work out. You know, the first Google antitrust case, which, you know, has gone against Google, but it's still a long way: they're still in the remedy stage now, and then we still have to work through all the appeals. It'll be years. That one was actually the one that was brought under the original Trump administration, where his Justice Department rushed out an antitrust case in the waning days of 2020 because, like, he decided that was going to be one of his big moves in trying to get re-elected in 2020. So they brought this really, like, weak case, which the Justice Department actually improved under the Biden administration and made stronger. I still thought it was a weak case. I was a little surprised with how that one came out. There are other cases that are ongoing, but they're still very early, that I think are actually much stronger. But I don't know, it's not about the strength or weaknesses of the cases anymore. It's about whether or not Trump wants to punish a company or not. And so I think that will have a pretty big impact on a bunch of things. So, a lot of this is really kind of up in the air.
Ben Whitelaw:Yeah. Okay. So in summary, not sure where the antitrust winds will blow. Maybe there'll be a TikTok ban. Almost certainly there'll be some online ID laws. Who knows about the rest?
Mike Masnick:Yeah, and you know, it is important to note that, like, on the online ID laws and everything, there is a case at the Supreme Court right now, and that will matter, and there will be oral arguments for that in January. And there's a decent possibility that, even with the Supreme Court, they would actually reject the online ID laws. It could go either way, but it wouldn't surprise me if they reject it. And that will be interesting, because there's this kind of... I think something to watch, if we're looking for things to watch in terms of how all this plays out, is what happens in the courts. Because the way a lot of this has gone down is that people who have been supporting the sort of Trump view on free speech, which is, you know, laughable, he has no view on free speech other than, like, my speech is good and my supporters' speech is good and everybody else's speech is bad... judges that were appointed by Trump and were sort of trying to play the role of being very Trumpist were willing to completely ignore a century's worth of First Amendment precedent. You know, the key case that I'm thinking of is in the Fifth Circuit, about Texas's social media law, the NetChoice case, where Andy Oldham, who wrote the ruling in that case, he just came out with another ruling this week, we don't need to talk about it, that case is continuing, he basically rejected a hundred years' worth of First Amendment precedent, which historically, like, Republican conservative judges were very supportive of that sort of very broad First Amendment protection. And so the real question is how much of that lives through, like, how much of the people who were trained on the First Amendment in that, uh, tradition remember that and say, these principles about the First Amendment are more important than the principles of helping whoever Trump wants to help. And that's where the war is going to be played out. And the one sort of interesting twist on this is Elon Musk again, like, it comes back to him in some way or another, which is that a lot of these cases started when the belief was, incorrectly, that all of these platforms were, you know, left, woke, you know, Trump-hating platforms and therefore we have to bring them to heel. Now that Elon Musk owns a platform and President Trump himself owns a platform, which, you know, for whatever that's worth, will they begin to realize that these laws would be really problematic for themselves as well? And does an Elon Musk sort of pop up in these cases and say, like, hey, this is a violation of my First Amendment rights, in a way that convinces the judges, like, oh, wait... you know, it's stupid, it's ridiculous that it would come to that, but I'm sort of curious to see how that, that aspect plays out. And I'm sort of going to watch the courts on that side of things.
Ben Whitelaw:Yeah. Well, we'd be saved by their own ego, essentially. Um, I mean, there's been a whole, whole bunch of, there's been a whole bunch of stories written this week about Elon Musk's involvement in the election. Charlie Warzel in The Atlantic writes another good one about how the questions around anti-conservative bias have kind of slipped away now that he's basically part of the campaign team. So that's not surprising, but there's also talk of him actually taking a, obviously a stronger role in the Trump administration and having his own political career in the future. So I think, again, this idea of politics and trust and safety and content moderation being intertwined is something we're going to see time and time again. But the online ID question is a nice segue, I think, Mike, into the, to our next story, as we go to the opposite side of the world and talk about Australia's
Mike Masnick:Yes. Let's leave the US, please. Please.
Ben Whitelaw:Sooner the better. This is a big story. If you've been following it, it's kind of rumbled on over the last couple of months. We, we touched on it briefly in the podcast. Essentially, Australia is going to be banning under-16s from social media platforms. The age is the new thing this week. We knew it was going to happen. We knew that some children, some teens, were not going to have access to social media, but this week Prime Minister Anthony Albanese has announced that it will be 16, under which kids will not have access to social media. The guise of this is to kind of protect children and their mental health and protect them from harm, which obviously we know is a big concern for many people. And what's basically going to occur is that a number of platforms, under a very broad definition, which I'm sure we'll get into, Mike, are going to have to put in protections so that basically kids under 16 will not have access. And those who currently have access will be essentially kicked off the platforms. So I wanna start with that definition, Mike, because it's something we talked a bit about before we came on air and started recording. I'm gonna read it out 'cause it's incredibly broad and is, is almost kind of laughable in its catch-all nature. So the definition of a social media service as per Australia's Online Safety Act is as follows: the sole or primary purpose of the service is to enable online social interaction between two or more end users; the service allows end users to link to or interact with some or all of the other end users; the service allows end users to post material on the service. That's... everything.
Mike Masnick:That's the internet.
Ben Whitelaw:That's... it's email. It's the weird forums I go on to, to, you know, rant about my football team. It's everything. Right.
Mike Masnick:Yeah. It's the entire internet. That basically defines the internet.
Ben Whitelaw:And, and so what does this mean in practice, as the Australian government tries to roll this out, do you think? Cause one of the criticisms has been there's no thought really about the kind of implementation of this. It's been rushed out, for political reasons again, and there hasn't been much thought or guidance as to how platforms do this. How are they meant to deal with something like this?
Mike Masnick:It's, I mean, like so much internet policy, it's just disconnected from reality. It's like, I mean, yes, there are reasons to be concerned about the way that kids use the internet and all this stuff, but, but kids use the internet to communicate with grandparents, to communicate with friends. There's all of these things. And all of that is effectively banned
Ben Whitelaw:Yeah.
Mike Masnick:You know, uh, text messaging is banned. Like, you can't use FaceTime. Under this
Ben Whitelaw:Right.
Mike Masnick:fall under the, under the law. You know, any kind of WhatsApp, Signal, any, any kind of, like, two-way messaging would appear to be banned for those under 16. You know, YouTube would be banned. And there are concerns about, like, you know, there's, there's bad content on YouTube, but you know, for most kids today, YouTube is TV. I mean, it's, it's how they consume content. They're not using it mostly as, like, a two-way medium, but they're using it to consume content that is coming to them. And there's lots of really good content on there, frankly, for kids, and educational content, and that would be banned. There's all of these things in here, and I've talked before about, like, the failure of... like, this is not the way that you handle these things, for a variety of reasons. You know, kids should be taught how to use technology properly. And instead, this is saying, like, you can't touch this at all, and then suddenly at age 16, we're suddenly going to let kids loose on this without any real training or practice or understanding of how it works, and not expect that to be a complete and total disaster. You know, I understand that, like, you don't want to let young kids alone onto these services without understanding how they work, but this bans them entirely. Like, teach them how to use it properly, teach them what the risks are. It's the same thing as the harms versus risks story that we had a few weeks ago, about, you know, you take people out, you teach them how to cross the street, and this is the opposite. This is saying, you know, kids can't cross the street until they're 16.
Ben Whitelaw:Yeah. Yeah. You're stuck on one side of the road forever.
Mike Masnick:Yeah, it's not, it's not how the world works. And, and so, I feel like Australia does this unfortunately frequently, where they pass these laws without any real recognition of reality, and they sort of build this fantasy world where, oh, well, you know, there are some challenges and therefore we're just going to completely wipe it out. It's, again, like, to go back to a theme that is coming through a little bit on this podcast, it's very, it's a simplistic solution to a complex problem with lots of trade-offs, without a willingness from the government to actually look at the complexity and how do we balance these things.
Ben Whitelaw:Yeah, for sure. And you know, I guess there's a philosophical question about how much harm do you allow children to be confronted with on the internet, but there's also the kind of details of how they're trying to implement this. And there really is very little detail about it. So the government aren't specifying a technical method for, um, platforms to check users' ages. You know, there are, there's lots of age assurance technology and age verification technology out there, on which there are kind of, like, differing views, and we don't need to get into that now, but if you're a government essentially with a kind of policy that's presumably trying to win a lot of votes, trying to keep kids safe and appease parents, then it's odd that they haven't gone to the length of specifying that,
Mike Masnick:Yeah. And in fact, the Australian government last year did a report on age verification and age assurance stuff that said none of these technologies work, and yet they then come forth with this proposal, and it's just, like, mind-blowing, where they know, because they, they had, like, an actually good report that looked at all the different technologies and said they're, like, dangerous for privacy, there's no good technology that actually does age verification well. And yet then they come out with this policy, and it's just like, how can those two things happen in the same government? And,
Ben Whitelaw:Yeah, no, it's true. I mean, one thing before we move on is this question of, okay, what are the better ways of verifying age or ensuring that people of a certain age don't use certain services? And I come back to this idea of, you know, verifying at the app store level, which a few people have talked about in the last year or so. Yoel Roth, who's appeared on the podcast before, has talked about this and written about this. Do you feel that's a kind of better way forward in terms of achieving some of these goals, which obviously Australia are trying to do?
Mike Masnick:Yeah, I mean, as a tool, it's, it's not a, uh, it's not the worst one. There are limitations
Ben Whitelaw:that's, that's praise for me, Mike. I've got to say,
Mike Masnick:Yeah, that's true. Uh, there are worse ways of doing it, but it's not, it's not great either, for a variety of reasons. You know, one, not everybody uses these apps through their phones, like, people do have computers and the app stores don't apply in those scenarios. Also, and this is an admission, like, literally this week I helped one of my kids lie about their age in order to get access to, to certain services. And, you know, but part of that was actually having a conversation with the kid saying, like, I understand you want access to this feature, and at your age, you are not allowed to have access to that feature. Let's talk through what else this is giving you access to and how we're going to handle it. And that's the point that I actually do think is important, is, like, learning how to talk to your, talk to kids and train them in how they handle things. And not just how to handle problematic things, but to recognize, like, you might come across stuff that is problematic, and in those cases, like, come talk to us, and, like, feel free to, to recognize that there may be problematic things and we are here to help you if that comes up. Never be afraid to come talk to us about that kind of stuff. But that is, like, that's part of the education process, rather than, you know, oh, we just have to completely ban kids. And so that same thing is going to still happen. Kids are going to want access to something, this or that, and parents are going to say, well, yeah, of course, like, that feature they should be able to have access to, therefore I'm going to lie or change the data that I gave this service already. And so just this idea of focusing it on, like, the age-specific part of it is just not, it's not a realistic solution.
Ben Whitelaw:No, there's a lot of detail to be fleshed out, for sure, and I'm going to be really interested to see how this plays out. There's, there's a lot to be desired, I think, about the way that this is, has been conceived. I guess the only plus side for the platforms is it's not due to happen for at least a year, even if the, kind of, law gets passed in the next couple of weeks. So we'll definitely update listeners on this as we go in the next couple of weeks.
Mike Masnick:Yeah.
Ben Whitelaw:Then. So, our other kind of major story we wanted to talk about this week, Mike, was from a country not too far away from the United States, um, Canada, which has taken its own kind of strange approach to the question of foreign interference and the question of TikTok.
Mike Masnick:Yes.
Ben Whitelaw:more.
Mike Masnick:Yeah. So this is a weird one where Canada is effectively kicking ByteDance and TikTok out of the country, but not in the way that you might think
Ben Whitelaw:A halfway house.
Mike Masnick:Yeah, they have just said that the company cannot operate as a company in Canada in terms of having offices. They cannot have employees in Canada because they see that as a national security threat, but the app can still function.
Ben Whitelaw:Weird.
Mike Masnick:So everyone can still use TikTok in Canada, but there can't be TikTok employees in Canada.
Ben Whitelaw:So the theory is, is that having kind of, is it, is it, it's not Chinese nationals working for TikTok. It's anybody.
Mike Masnick:No idea, because this makes no sense. Like, unless the concern is that the employees of the company are, like, there for some sort of espionage, national security thing, I don't understand how the concern would be more about the employees than the app itself. Like, everywhere else that we hear concerns, it's always about the app itself. And in fact, like, if you want to have more say in how the app operates, you would think you would probably want a local version of the company. And in fact, in other cases and in other countries, we have all these laws that force you to have local representation for that very reason. You know, I often make fun of those as sort of the hostage laws, right.
Ben Whitelaw:Yeah.
Mike Masnick:And here, Canada is going in the other direction entirely, but allowing the app to operate. It makes no sense. I mean, I think the theme of this podcast this week, this episode, is like, none of this makes sense. The world has gone crazy. I, like, I can't, I don't, I can't fathom the rationale for this. I, I, it's like, the app can still go, but people can't work here.
Ben Whitelaw:Odd.
Mike Masnick:Then you have less control over it. And if there are national security concerns about how the app works, like, you have less say in that, there's nothing else that you can do. So I, I don't, I don't get it. It makes, it makes no sense to me.
Ben Whitelaw:It's very rare that you're lost for words, Mike, and that I also have no, no questions to come back to you on. It's one of those. It doesn't make any sense. And the Reuters piece that kind of documents this also has no further information. So, I mean, you know, it kind of sums up how I think we're both feeling after this week. It sums up how, how this episode has gone. Um, yeah, I mean, we'll have to come back to listeners with a few more thoughts in the coming weeks. That's, that's all I can say to,
Mike Masnick:Yeah, it's, it's, it's just a strange one. It's like, you know, I read the story and I kept waiting for, like, the explanation of how this makes sense. And, and again, unless you think that they're using the fact that there's a local subsidiary in Canada to engage in, like, actual espionage, which I don't... it doesn't make any sense. Like, why would you do that? Like, that's the only rationale, like, maybe they found out that, yeah, like, China was using TikTok as, like, a tool to sneak spies into Canada. I don't know. It, it's like
Ben Whitelaw:Start sounding like a spy novel.
Mike Masnick:Yeah, yeah, yeah. Maybe, maybe I'll spend the next four years writing novels. That's what I should do.
Ben Whitelaw:Yeah. Um, well, make time for Ctrl-Alt-Speech on a Friday. You can do whatever you want otherwise, um,
Mike Masnick:go.
Ben Whitelaw:That takes us to the end of this week, Mike. We have done our best to process, to try to counsel you through what has been a tricky week. And I think hopefully you feel like you've got something off your chest. Um, and for
Mike Masnick:I will say, I will say that, uh, I've been having very nice conversations with people all week, and there has been this sense of... I met a new neighbor this week. I, I had my physical this week and I ended up sitting and just chatting with my doctor about America and the election for longer than we probably should have, just as a sort of therapy moment for both of us. So there is this element of, you know, when things go crazy, people do need to sort of come together and talk and find community. And that's not saying, like, find community with the people who are trying to do horrible things and destroy lives and everything, but find the people who, who have compassion and who have empathy, and try and build community. I mean, I think that's, that's, like, the most important thing at this point. And that is where, for all the harms and problems of the internet, the internet does help enable that, and that should be encouraged.
Ben Whitelaw:Agreed. You know, at Everything in Moderation, I'm going to be starting to do kind of weekly calls with readers and other folks interested in safety, as part of that effort to kind of bring people together and help them understand each other. And hopefully that's also what Ctrl-Alt-Speech is a bit about. Um, it's about, you know, trying to bring some of these issues to light, to listen to a wide range of folks who, who think about these things on a regular basis. And if you do like the podcast and you want to support us in what we do, the best way of doing so is by giving us a review, give us a rating on the podcast platform that you use. And, uh, if nothing else, that'll, that'll keep Mike and I going in the, uh, the dark depths of the next couple of days and weeks, at least. So thanks for listening as ever, uh, appreciate you tuning in, and we'll speak to you next week.
Announcer:Thanks for listening to Ctrl-Alt-Speech. Subscribe now to get our weekly episodes as soon as they're released. If your company or organization is interested in sponsoring the podcast, contact us by visiting ctrlaltspeech.com. That's C T R L Alt Speech dot com. This podcast is produced with financial support from the Future of Online Trust and Safety Fund, a fiscally sponsored multi-donor fund at Global Impact that supports charitable activities to build a more robust, capable, and inclusive trust and safety ecosystem.