Ctrl-Alt-Speech

Locate Your Nearest X-it

Mike Masnick & Ben Whitelaw Season 1 Episode 37
Ben Whitelaw:

So in last week's episode, we were kind of processing the US election, it's fair to say. And you know, one of the ways you can process your emotions, you will know this, I'm sure, is journaling. So I've done it a bit myself, but I'm pulling out the daily prompt from the journaling app Day One in today's episode and asking you: what is the one fact about the world that amazes you?

Mike Masnick:

Oh, gosh. Well, let's say we're still processing — this is in the order of still processing what has happened in the US, and I guess around the world, lately. I am going to say I am amazed at how many people are so incurious about how things actually work and yet insist they know how things work. The simplification of the world and the lack of curiosity about reality really amazes me lately.

Ben Whitelaw:

Ah, in respect to what? Everything?

Mike Masnick:

Yeah, everything. There's so much discussion, and it goes to the stuff we talk about too, where people think all of these things are really easy: get rid of the bad stuff online, keep the good stuff; oh, the government is a mess because of stupidity and graft, and therefore we can hire Elon Musk to go into the government and get rid of $2 trillion worth of spending, no problem. It's a lack of intellectual curiosity that is astounding to me.

Ben Whitelaw:

Yeah.

Mike Masnick:

But what is the fact in the world that amazes you, Ben?

Ben Whitelaw:

Well, for all the departures from Musk's platform over the last week, and for all of the people who are showing up on other platforms, including Bluesky, I was amazed to see this week that it was actually Threads that grew by a Bluesky amount: it added 15 million users in November, so its growth alone has already surpassed the number of users Bluesky has. And that to me is astounding and shocking. And we need to get into that, Mike. We've got to talk about this.

Mike Masnick:

All right, let's do it.

Ben Whitelaw:

Hello and welcome to Ctrl-Alt-Speech, your weekly roundup of the major stories about online speech, content moderation, and internet regulation. It's November the 15th, 2024. And this week's episode is brought to you with the financial support of the Future of Online Trust and Safety Fund. My name is Ben Whitelaw. I do know what date it is, just about. And I'm with Mike Masnick, who I'm less sure knows what day it is.

Mike Masnick:

Oh, this has been many years in a couple of weeks.

Ben Whitelaw:

How has the week been? Have you got any further in processing what's happened?

Mike Masnick:

Yeah. I mean, it's just like the whole world is crazy right now. And, as we'll discuss, there's a bunch of stuff going crazy with Bluesky, and so, partly because of my association with Bluesky, I've been spending some time thinking about that and talking to folks there. And then there's a whole bunch of other crazy stuff happening in the world. It's just day after day of: what crazy thing is going to happen today? It's a little bit disconcerting.

Ben Whitelaw:

Yeah. And we should say for new listeners that Mike is on the board of Bluesky, so that's what he means when he talks about his affiliation. So Mike, let's get straight into the stories then, because there's no time like the present. And we're going to start with Bluesky. We're going to start with a story that many people have seen this week, where the Guardian newspaper has decided that it's going to come off of X slash Twitter because of concerns about safety. There are a few stories interconnected here that you're really interested in, and some of that involves the departure of users to other platforms. Talk us through them.

Mike Masnick:

Yeah. I mean, it's been really interesting. So first, the Guardian announced that they were quitting X and basically just said they were upset about the content on the platform and how it was being moderated and handled. And so in the wake of the election, they decided they were no longer posting to it. That's not a Bluesky story, because the Guardian, as far as I've seen, hasn't set up on Bluesky at all, but they decided they weren't going to do it. And then some other news organizations followed as well. There's the Spanish newspaper — I'm not going to be able to pronounce it — La Vanguardia. Is that it? Yeah. All right. So they also announced that they were quitting. But then just a whole bunch of users have been quitting, and there have been some reports — obviously the internal data from X is not available, but I believe it was Similarweb, which tracks where people go on the web (they have a bunch of ways, some sneakier than others, to see where people are on the web) — that noticed that traffic to the delete-your-account page on X went through the roof, massively. It's actually kind of impressive if you see the chart that they put out. And so a whole bunch of people have been leaving and going elsewhere. Almost all of it is in response to the election, though there are a few other things as well: as we record today, X's new terms of service have gone into effect. They were announced about a month ago, and they include a few things, such as that anyone X chooses can train AI off of the data that you have given to the platform. Previously X itself could, but now basically they can sell the data that you have given to X. A lot of people were upset about that, and a whole bunch of people started deleting their accounts. A lot of it is just sort of a reaction to the election and saying, do we really want to be on this platform at a time when Elon has been, it appears, basically running part of the transition process and giving recommendations on who should be in the Trump administration? There was some talk that he's sort of camping out at the Trump residence at Mar-a-Lago. In fact, there were some reports that people within the transition team were getting sick of Elon Musk, which will be an interesting soap opera to watch. But it does seem that there has been a mass exodus of people from the platform going to other places. Some of that is Threads, absolutely — as you mentioned in your amazing fact of the week, Threads claims that they've gotten 15 million new users. But Bluesky — and again, disclaimers, disclaimers, disclaimers, I am on the board, I am associated with the company — has seen massive growth as well in the last few days as users have come in. You said 15 million users is more than Bluesky has. That is no longer true.

Ben Whitelaw:

Is it, is it going up even further?

Mike Masnick:

It is now 17 million users. Actually, in the time that we are recording this, it just passed 17 million users about 30 seconds ago, as I look at the numbers. So Bluesky now has 17 million users, and that is growing very rapidly. Yesterday, Thursday, was the first day that over a million users posted during the day. So the numbers are looking really interesting. And the thing that I had seen in the past, when there have been events that are sort of referred to as Elon Musk events, EMEs, that caused people to run to Bluesky, is that it had usually been one big shot. So when X got banned in Brazil, or when he announced a new policy, there would be one day where there would be a huge rush of people to Bluesky, and then it would start to trail off. People sort of check it out and go away. That hasn't been happening this time. It has been this gradual growth, each day bigger and bigger, for the past week. I don't know if that'll keep going, but it's been interesting to see. And a lot of people are saying that they feel like it's reached critical mass. I don't know if that's really true or not, but it's interesting to see that a lot of people are deciding that they want to go elsewhere. And the fact that the Guardian and other media publications are saying X just might not be worth it anymore seems worth noting. We have had some other media properties that have left in the past. NPR famously got into a spat with Elon where he called them state media, which is kind of interesting now that Elon has apparently decided to accept a role in the government. Does that make X now state media? Does it make Truth Social state media? There are all of these interesting questions that seem to open up issues of potential hypocrisy on the part of Elon Musk and Donald Trump. But I do think it's interesting that for a very long time — really for two years, since Elon took over the company, and it's been just over two years since he took over — there'd been this sense that, oh, people are going to go somewhere else, but then it didn't happen. The power of inertia was very, very strong in keeping people on Twitter slash X, and it's totally understandable: it's not easy to start over somewhere else. And it does feel like in the last week, with the election and the changing of the terms of service and everything else, there is at least a more concerted effort to say, you know what, it's really time to leave. And we'll see what that means for X in the future. It still has a lot of users, but if it's dwindling, that becomes more difficult to sustain. There has been lots of talk about the valuation having gone down and the advertising having dropped by like 80 percent, though there was a report in the FT this week expecting that advertisers would suddenly return to X as they, as it says, seek favor with Elon Musk and Donald Trump. That feels really speculative to me, and I'm not sure I buy it. I'm not sure that's necessarily the best way to seek favor in this situation, and as users are leaving, it becomes an even less desirable place to put advertising. But we're seeing a reshaping of the world in a lot of ways, both in the government and in what the social media landscape looks like.
And I think that's interesting at a time when we had all these years where people were saying it was never going to change — that we had Facebook and we had Twitter and we had YouTube, and there was no way anyone else could break in; the world was set, that was the way things were going to go. The only disruption we had seen to that for years was TikTok, which sort of came out of nowhere and built up a really, really big presence. But everything else, there was this suggestion that you could never take on these companies. And we're seeing things change, and we're in the midst of that disruption and change right now. We don't know exactly where it's going to end up, but it's kind of exciting to see what comes of this.

Ben Whitelaw:

Yeah, I agree. And I think you're right to point out the kind of seismic shift in user behavior that we're seeing. You can dispute the numbers and say that 15 million users, or 9 million on Mastodon, isn't anywhere near the size of these other platforms, but it's again the biggest shift we've seen in a while. I want to come onto the brand exodus in a second, but around the exodus of users — we were talking about this ahead of the recording — we haven't seen a lot of this happen previously. There are only a couple of examples in the past where users have basically decided, based upon a platform's policy or some changes on the platform, to depart directly as a result of that. I was thinking about how Tumblr changed its adult content guidelines, I think in 2018, and that led a lot of Tumblr users to leave. And obviously Tumblr is really a shell of the platform it was prior to that; it has changed hands a few times since then and really isn't the kind of interesting platform that it used to be. You were also talking about Digg as an example of when this happened.

Mike Masnick:

That was the one that sprung to mind as the biggest example. There are others, obviously: there was a grand migration from Friendster, which was sort of one of the first social networks, to MySpace, and then from MySpace to Facebook. But those all seem more gradual. The one that struck me as the most closely related to this was Digg to Reddit. For folks who don't know, in the sort of 2008, 2009 timeframe, Digg was huge and was sort of the — it's funny, the tagline of Reddit now is "the front page of the internet" — people went there for news and would vote news up and down, very much like Reddit. They really created that kind of model. And the founder of Digg was on the cover of BusinessWeek as this super successful kid. I think they called him a kid on the cover of BusinessWeek magazine. And it

Ben Whitelaw:

with this funny thing called the internet.

Mike Masnick:

Yeah, yeah. I think they had him in a funny outfit too, and he was wearing, like, Walkman headphones or something. It was something so dumb. But Digg made massive changes. They basically changed their entire system, and there were reasons for that: a lot of it had to do with the fact that the way Digg was originally set up, it was totally open to gaming, and people could push themselves up to the top of the ratings with a little bit of trickery. And as the people in the trust and safety world know, when you leave it open for trickery to occur, it will, and that creates real problems. So Digg tried to deal with that, but in doing so, they really changed their entire system and the way it works, and it just completely set people off. And Reddit had actually been out there in existence for a couple of years, had come out of Y Combinator, and had really sort of puttered along. For a long period of time, people looked at Digg as the big successful company, and Reddit was this, eh, kind of also-ran that wasn't going to be very successful. In fact, at one point Reddit had sold out fairly cheaply to Condé Nast, and Condé Nast had put them in the corner of Wired's offices. I once went to a meeting with an editor at Wired magazine for something, and he was like, oh, let me show you around the office. And he's like, there's Reddit. It was like in the corner, it

Ben Whitelaw:

Well, like a handful of people.

Mike Masnick:

It was like four guys sitting in a corner of Wired's offices. But when Digg made this change — changed their policies and changed their setup and the way it worked — people were like, this sucks, and we're going to go somewhere else. And that was what made Reddit. Reddit took off, and Digg completely collapsed and ended up, you know, it's changed hands a bunch of times. I don't even know if there's still a version of it out there. So it's not a perfect analogy — obviously X was way bigger than Digg ever was — but I'm seeing some of that sense of people saying, there's a point at which it crosses the line and we're not going to take it anymore, and if there are reasonable alternatives out there, we're going to go seek them out. It feels like that's what's happening, and it'll be really, really fascinating to watch what the end result of that is.

Ben Whitelaw:

Yeah. I mean, it's interesting that in almost 20 years of social networks — you can argue whether it's longer than that, but in a significant period of time — there are not that many examples of this kind of shift as a result of policy. And I think that is interesting. Again, you could debate whether there are other examples, but I do think this is notable for that reason. And the sheer fact that the Guardian is coming out so strongly — and you can disagree with its political views — but the fact that it came out so strongly, with such strong wording around the toxicity of content on the platform and the fact that Musk had used it as a political weapon, and the fact that La Vanguardia said specifically this was about its content moderation policies, is very striking to me. These are major brands in their countries — in the case of the Guardian, a big news outlet in the US and Australia too — deciding, at a time when journalism is struggling for audience and struggling for sustainability, to say: actually, we don't agree with these policies and we're going to exit from a kind of principled perspective. I think that's notable in and of itself, irrespective of what happens to the user base and where it goes.

Mike Masnick:

Yeah. It is really interesting. And there's a whole separate angle, which we probably don't want to go too deep on, but I attended an event earlier this week about all of these attempts to do what I refer to as link taxes, or what some people refer to as bargaining codes, around getting tech companies to pay media properties. The way that Meta in particular has responded to these, and Google to a lesser extent, has been to threaten to block news, and in fact they are doing that in Canada. I heard a report from people in Canada this week, which was very interesting, saying that the impact of Meta blocking any sharing of news in Canada has been huge, especially on local news organizations. And there was a question asked of whether local newspapers would prefer to have traffic from Meta or payments from Meta, and universally people were saying the traffic is more important. And yet here's kind of a flip side story to it, which is saying: this traffic is not worth it to us. Even if it is a way for us to get our news out there, it is not a platform that is worth it to us, because of these other ancillary factors. And that's really interesting. There is this concept — and I think Nilay Patel at The Verge was the first one to put it in these words — that for social media, content moderation is the product. And I think a lot of people don't really understand that. To get back to my amazing fact at the beginning, people don't seem to really think this through. There is this belief, especially among the Elon Musks of the world, that content moderation is somehow evil and a problem, yet it is the product that makes the experience on your website what it is. Some people are realizing that, and there are consequences. And so when the Guardian, which is obviously a huge, huge media operation, decides that it's just not worth it anymore, that's saying something about the need for good trust and safety and content moderation — or there are consequences.

Ben Whitelaw:

I mean, you talked there about the kind of interplay between platforms and publishers. Do you think this is going to make any difference to the metrics of the platform itself? Because the Guardian are obviously making a bet that they're not going to lose anything by not being on the platform. Twitter has historically been a low driver of traffic for publishers; Facebook is very different to Twitter in that respect. Do you think the platforms are likely to see any change in habits — if publishers decide to go down a route where they take their content off the platform, will there be behavior changes among users as a result, do you think?

Mike Masnick:

I mean, there's a whole bunch of different factors here, because yes, historically Twitter has never been as big a driver of traffic to publishers, but at the same time it has been a major source of information where journalists and folks in the media participate and have these discussions. They often find stories, they find sources, they find out about things. And as that moves around, I think it'll be really interesting to see how that impacts the media. I also think it'll be interesting to see how — because these alternative platforms, Mastodon, Threads, and Bluesky, are built in a different way than X slash Twitter, specifically in the decentralization elements of them — I'm sort of curious to see if there will be some unique experiments that reshape the way that people interact with the news. This is very, very speculative, and it's not like I'm thinking of a perfect example right now, but these systems are, in some ways — and they're very different ways — more open to third parties building on them and doing experimentation. So, like, Flipboard, which has been around for a while but is kind of a tool for finding and interacting with the news, has really strongly embraced ActivityPub and Mastodon, and they're building a whole bunch of stuff related to it. And I think that could begin to change things: there are these experiments, and someone's going to crack the code that changes the way that people interact with news. That could be good or bad, depending on how it comes out. But for the first time in really a decade, it feels like we have this opening to change the way we interact with news. And that's both exciting and scary, right? Because we don't know how it's going to end up. Hopefully it leads to something better, because there is a very strong argument that the way we've interacted with news for the last decade has maybe not been too good.

Ben Whitelaw:

Yeah. No, yeah, fair.

Mike Masnick:

If we can get to a point where there is a better way to interact with news, and this sort of shakeup allows that, that could be really exciting.

Ben Whitelaw:

Yeah. And we're starting to see some journalists go onto these other platforms, onto Bluesky, onto Mastodon. It is worth noting that back in the day, social platforms, particularly Twitter, did a big push to get journalists and media organizations onto their platforms. I was working in a couple of newsrooms at the time, and we'd have a kind of partnerships manager, and they would come and call all the journalists and say: we can give you verification, and if you just make sure that your work email address is attached to your account, then we can guarantee that it's you and give you the blue tick. And that was because they knew that the journalists were producing content — which, as you say, is the product — and they knew they had information that was valuable. It was a habit driver for a lot of social platforms. So as Threads and Bluesky and others see some of those content providers arrive — and there's a bigger group of content providers than there was 10 or 15 years ago, right? It's not just journalists anymore — it will be interesting to see how those behaviors shake out. I wanted to talk a bit about the brands piece as well, Mike, because there have been other brands, as you say, that have left social media platforms in the past. We know brands can remove advertising and take away what they spend on platforms as a way of having leverage over a platform's policies, but there are only a few examples I remember of brands stopping publishing organic content that they're not paying for. One of them was Lush, which actually has still never posted since. If you go to Lush — a cosmetics company in the UK, and maybe elsewhere — they have a video that basically says: go outside and enjoy yourself. Which is a nice message, but it was in response to concerns at the time about the mental health of social media users, about body image, and about TikTok not necessarily doing much to address those. Balenciaga, the fashion brand, took itself off of Twitter when Musk joined the platform in 2022 and has not returned; it doesn't have an account at all now, as far as I can see. And do you know what the third example I found was, of a brand taking itself off a platform?

Mike Masnick:

Tell me.

Ben Whitelaw:

Tesla and SpaceX.

Mike Masnick:

Wait a second.

Ben Whitelaw:

I'd forgotten all about this, but in 2018, after the Cambridge Analytica scandal and the big pushback around Facebook's use of data, there was a big campaign to delete Facebook — the hashtag was trending — and Elon Musk decided that he was going to take Tesla and SpaceX's accounts off of Facebook, and they have never returned. There's no official Tesla account on Facebook — they had something like 2.5 million followers — and to this day, never, never. Yeah, so

Mike Masnick:

Can, can I just note, I'm going to interrupt you for a second. Can I just note, this is the same Elon Musk who is currently suing companies for refusing to advertise on X.

Ben Whitelaw:

that very same

Mike Masnick:

Amazing. Amazing.

Ben Whitelaw:

Yeah, talk about having your cake and eating it.

Mike Masnick:

Amazing.

Ben Whitelaw:

So, you know, there are very few examples, and this is what makes this story interesting and notable. And as you see from how long we've talked about it, I think there are a lot of different elements to it which make it fascinating.

Mike Masnick:

Yeah, just the fact that there is this shakeup I think is really exciting and worth talking about. And I think it's going to lead to a lot of things to talk about in the next year, just because everybody had sort of assumed that we had this set of companies to talk about, and the entire ecosystem is changing in really interesting ways, and nobody knows exactly how it's going to shake out. Everybody's making their individual decisions, and we'll see what happens.

Ben Whitelaw:

Yeah, indeed. We've done a fair chunk on that, so let's move now onto our next story. This is a continuation in some ways of a story we talked about last week on the podcast. For listeners who didn't tune into that, we talked about Australia's proposed ban on social media for teenagers, and the update to that story is that the age at which teenagers won't be able to have an account on social media has been set at 16. We touched on it very briefly, and a whole bunch of experts and academics have actually come out this week and written a letter about how that legislation is going to work in practical terms. The law is going to parliament next week, so we'll be able to talk about it in the coming weeks. But what I want to concentrate on, Mike, is the fact that in the UK this week there have been a number of stories about how the UK is actually in close pursuit of Australia — following close behind in terms of developing laws around not only smartphone bans but also social media bans as well. So just to give you an overview: there's essentially a Labour MP who's put forward a bill calling for the banning of smartphones. Initially, Labour and the prime minister, Keir Starmer, weren't that interested in this, and said that schools already had the tools and the ability to ban smartphones — said it was a kind of unnecessary move. However, since then, we've seen this campaign spring up called the Smartphone Free Childhood group, and it's attracted a lot of parents who have signed up to its manifesto. The manifesto talks about how childhood is fleeting, that technology should be a force for good, that companies that profit from children must respect childhood, et cetera, et cetera. And it's got a very clever sign-up process where you can not only pledge to the manifesto but also then email your MP directly from the campaign page. So all of a sudden MPs have got this swell of parents in their inboxes calling for a smartphone ban, and all of a sudden this gap has opened. And according to reports I've read this week, the Labour government is more open-minded and amenable to some form of a ban for teens. Now, it seems from the reports that a social media ban is actually more likely than a ban on smartphones as a whole — you know, schools have the ability to ban smartphones and take them away, and we can talk a bit about that, because there's been some interesting reporting on it. But it is interesting to note how what's happening in Australia, as we always see with legislation and internet regulation, is already having a kind of impact, and we know that ministers in the UK are closely following what's happening there with a view to seeing whether they should follow suit. And you're also seeing a bit of this in the States as well, right? We talked a bit about California being a place for this too. What do you make of all of this as a bit of a movement?

Mike Masnick:

Yeah, you know, I think we've talked about some of this before, where a lot of it just really feels driven not by the research or by reality, but by a general moral panic. And even the framing of that campaign in the UK feels like the older generation going, ah, the kids these days — in the same way that they always do "the kids these days," whether it was rock and roll or pinball or Dungeons and Dragons or whatever the kids were doing. I may have talked about this on the podcast before, I don't remember exactly, but if you go back, there are examples of it: when the waltz first came out, the older generation complained that it was too sensual; when novels became popular in the 19th century, people said all of this fiction stuff is rotting people's brains; when chess became popular in the 19th century, people said it was a waste of brain power to be moving little pieces around a board rather than being out in the fields working. This kind of thing happens all of the time. And this is not to dismiss the concerns that people have about social media and smartphones and how they're used — I think those things are all worth studying, and it's worth thinking about how we do these things smartly — but the idea of an outright ban just consistently smacks of, we don't like this newfangled stuff, and so we're going to ban it. I just think it's problematic, and I don't think it works well, as all sorts of attempts at prohibition don't tend to work out well. If parents are so concerned, don't buy your kids a smartphone, right? But the fact is, lots of people have found smartphones to be really useful, and that includes for kids, for lots of reasons. There were studies in Australia released earlier this year saying that bans of smartphones in school didn't work. In New York City, they banned smartphones in schools a little less than a decade ago, and they rescinded the ban less than a year later because parents were complaining that it made it really difficult to reach their kids when they needed to, or to find kids for pickup, and all this kind of stuff. There are all of these consequences to this stuff. There's also the fact that phones have been useful in schools in documenting abuse by teachers — or, in the US, we have a lot of police in schools, what are called school resource officers, SROs, and phones have been used to document abuse by those individuals in schools. So if you're banning them, are you avoiding capturing those kinds of problems? There are all sorts of consequences to these things that it feels like the ban-and-block campaigns don't take into account. And that's not even getting into the whole question of age verification: if you're banning social media for those under 16, or whatever, how do you determine who's under 16? That raises a whole bunch of questions that nobody really wants to address. So I understand why these kinds of movements happen, but I think historically, in retrospect, they always look kind of silly, and there are better ways to deal with this, which tend to be around education and properly training people how to use these things in a way that is safer.

Ben Whitelaw:

I mean, I'm not a parent, and we should talk a little bit about how your kids' schools deal with it, because I think there are probably some interesting examples there. And for listeners who are tuning in, do get in touch with us to share examples of how your kids' schools are thinking about this, because it seems like there's a lot of potential thought going into how to address some of these concerns, and we should engage with them. Get in touch with us: podcast@ctrlaltspeech.com. I'm not a parent, Mike, but this is something that keeps coming up, and parents seem to say that if they don't give their child a smartphone, then they're worried about their kids falling behind, being bullied, being left out. Is that a kind of argument, do you think, for having legislation that bans it blanketly? Is that a strong enough argument?

Mike Masnick:

Yeah, I'm not convinced of that entirely. I mean, not to get into too many details about my situation in my family, but I have a child who is in high school who doesn't have a phone, though that might change soon. And it wasn't us saying, no, you don't get a phone — it was that they didn't want a phone. They have not expressed to us a desire for a phone, and therefore we didn't feel the need to force a phone on an unwilling child. But they're reaching the age where it probably would be helpful for them to have a phone, and we're talking to them about it, and as part of that, preparing them for how to use it properly and how to use it safely, what the risks are, and things like that. But they are in a school where all of their friends do have phones. And one of the interesting things that I discovered in the high school is that for ninth and tenth grade students, the first two years of the high school, every classroom was given a board with pockets on it. So as you walk into the classroom, students are supposed to drop their phones into the pockets. It's easy to get access to the phone back if you need it — it's not like some schools I know of, which have lockers where your phone is locked away for the entire day; this is just that for each class, you drop it in the thing. And some of the classes apparently take attendance that way: they look at which pockets have phones in them, which presented an immediate challenge for my kid who didn't have a phone, but who was able to explain that to the teachers, and it worked out okay. But there are ways to deal with it that are not as extreme as these complete bans and blocks, and that I think are more effective. And then there are also stories — I'd heard, I forget exactly where it was, of a student at a high school who wrote for the school newspaper and needed a phone because they were taking pictures for the paper. So they had to get a special exception that allowed them to carry a phone, and if they were stopped by someone at the school, they could show their "I'm allowed to have a phone in school" pass. There are a lot of situations where there are ways to handle things that aren't as extreme as the government coming in and banning them. And I fear that when you have the government say, oh, we have to ban these things entirely, the end result is you don't have room for those kinds of reasonable exceptions, or it just leads to everyone ignoring the rule. One of the things that came up when New York banned phones in schools a decade ago was that the ban tended to be mostly enforced against the less well-off schools and the less well-off students. And that is something that is also really important to think about when you have a law: how is the enforcement carried out, and is that enforcement done in a discriminatory manner? That's what we saw in New York with the school ban. So I think it's worth thinking about these things, rather than assuming that, oh, if we ban phones in schools, all the kids will run out to the fields and enjoy the outdoors again. It's not really the way it works.

Ben Whitelaw:

No, it's interesting. There's an FT piece essentially asking the question of whether smartphones should be banned in schools, and it's kind of ironic: the case-study school that it looks at claims to have had great success taking kids' phones away — they're locked away in a locker for the whole day — and the headmaster takes great pride in saying, actually, we only confiscated four phones over two months for a thousand kids, and normally we'd be confiscating phones daily. And I was thinking, great, well, let's just leave it like that then — no need for any kind of stricter regulation than that if it's working. And again, this is a school in a kind of nice part of the UK, so to your point, probably a more well-off school than others: different behaviors, different kinds of kids, different examples needed, I think. So it's really interesting to see this emerge as a theme in the UK, particularly as we go into next year with the Online Safety Act — to see it emerge alongside questions around, as you say, age assurance and age verification. If we're going to have social media bans as a subset of bans for teenagers, how are we going to ensure that that is done in a way that is okay — that allows people who need to access technology to be able to access it, and doesn't lock people out of opportunities to do so?

Mike Masnick:

Yeah.

Ben Whitelaw:

Interesting. Okay, thanks Mike. So we managed to get through that segment without mentioning Jonathan Haidt. So, uh, well done. Well done you for

Mike Masnick:

You just ruined it. He is mentioned in a few of the articles.

Ben Whitelaw:

I know, I know, you're being very restrained. Okay, so let's round up the best of the rest. We didn't have a huge amount to go on — this has been a relatively quiet week apart from the big stories that we've discussed — but you spotted a couple of interesting effects of regulation, Mike, for some of the big platforms.

Mike Masnick:

Yeah, there were some interesting things going on in the EU, and responses from the big tech companies this week, that I thought were worth paying attention to, specifically in terms of how ads work in the EU. Under the regulations in the EU, Facebook and Instagram, I think last year, started releasing a subscription product that was an ad-free subscription product, which was sort of mandated by the law. And as soon as they did that — and they had warned EU regulators that this would be the case — EU regulators started complaining that the prices for the subscription were too high. And what Meta said was basically, well, if we're removing our ability to make money from ads and we have to make that up through subscription fees, this is what we need to do. So they were getting hit both ways: you have to offer an ad-free product, but then you can't price it as high as you want. And so this week, sort of in response to that, Meta responded with cheaper subscription plans. The announcement is a little bit petulant in terms of how they're responding to this. It's like, okay, fine, EU — we think this goes beyond what the EU requires, but okay, here, we'll release cheaper plans. They're also changing some things so that you can get a less personalized ads option, and even this is a little bit snotty in the way it's announced, where it's like, look, people say that they actually prefer the more targeted, more personalized ads — they're more relevant, they're more useful — but if you're going to force us to, okay, we're going to give you a less good experience. Which is very self-serving and a little bit, you know — the way that they said it is, "despite our concerted efforts to comply with EU regulation, we have continued to receive additional demands from regulators that go beyond what is written in the law." That's basically like, all right, you're going to keep pushing us, so we're going to do this. So I thought that was kind of interesting — it's the nature of how the companies are working. And there's a similar story, on a different law but still related to advertisements: Google this week announced that they were going to ban political advertising across their products in the EU, which includes YouTube — which, I think, is actually a pretty big area that is used for political advertisements. They're banning political advertisements across the EU based on the regulation on transparency and targeting of political advertising, the TTPA. And basically what Google is saying is: we warned you that these regulations were terrible and impossible to comply with, but you went ahead with them anyway, and we have no way to comply with them that isn't going to get us rung up on an investigation and then massive fines, and therefore we are just banning political ads altogether. And again, the response feels a little bit bratty, because it's kind of like, ha ha, you politicians who rely on political advertising to get elected and to get the word out on what your policies are — you're not going to be able to do that on our platform. We're just pulling out of that market altogether.

Ben Whitelaw:

There's a little bit of "be careful what you wish for" in the response to both of these stories, which I think is interesting. It also ties back a little bit to what you were saying about the news legislation in Canada, right? It's like, again: we were trying to help you get your journalism to your audiences in ways that you weren't able to do before, and you made money off the back of us, and now you've taken that away, so we're going to have to punish you. And there's a kind of thread that runs through there of, I guess, politicians and media being so reliant on these platforms now that when regulation comes to pass, the impacts aren't necessarily known. And I think this is an example where we don't necessarily know what this is going to mean in the EU for politicians or for voters.

Mike Masnick:

Yeah, and it's kind of interesting. Some people respond to this and say, well, this is more evidence that the companies have too much power, or whatever, and maybe that's true. But the reality is, if you create regulatory incentives, then companies are going to respond to those incentives. And that's one of my frustrations with a lot of the regulatory proposals: they assume we put this regulation in place and nothing changes — like the market will just act the same way, and we're going to be able to extract something from that or cause something to change. But the regulation creates incentives, and companies are going to respond to those incentives, because that's what they need to do. That's what businesses do.

Ben Whitelaw:

Yeah. Do you think that there is a downside to this? Obviously you kind of alluded to the idea that politicians won't be able to campaign on these Google platforms when they have to try and drive people to vote for them. Do you think people will miss political advertising on Google products?

Mike Masnick:

Does anyone ever miss political advertising? I don't know. I mean, that is a big question, but it will be interesting to see what it means long term. The impact of political advertising is different in different kinds of campaigns for different kinds of offices, and it's different in different countries, because different countries also have different regulations on how political advertising works. But it is something to watch and something to pay attention to. If they're banning political ads entirely, I think it's got to have some sort of impact.

Ben Whitelaw:

Yeah. I feel like part of this podcast, Mike, is just us coming up with research questions that we can't answer

Mike Masnick:

Yes.

Ben Whitelaw:

that we're hoping somebody else, or a listener, is able to answer at some point down the line. I think this is one of them — I would love to be able to track the knock-on effect of this. Let's do a couple more stories before we wrap up. This is a very lighthearted one — maybe as far away from the US election of the last couple of weeks as you can get — but I was really taken by this Gizmodo story, Mike, about the latest English town to be affected by Facebook's moderation algorithms. There's a place called Coulsdon, just outside of London, a kind of unspectacular place. You wouldn't necessarily go there. It's not cool. I've never been, and I don't think I plan to, but there's

Mike Masnick:

You can go do research for this story.

Ben Whitelaw:

I could, I could. Maybe I'll drop by. Maybe I'll go and find the owner of this Facebook group, whose case was reported in the local press after a number of different posts got moderated over a period of weeks — I think dozens at a time — and who was very miffed about why this was happening. And it turns out that Coulsdon, which is spelled C-O-U-L-S-D-O-N — the moderation algorithms thought that they were peddling drugs, specifically LSD. So this

Mike Masnick:

right in the middle.

Ben Whitelaw:

Because right in the middle of Coulsdon is LSD. And this is the latest in a number of stories like this. I remember one about Plymouth Hoe, a place in the south of the UK. Your favorite is right up in the north of the UK.

Mike Masnick:

Yeah, I mean, people have a name for this sort of situation: it's called the Scunthorpe problem, because of Scunthorpe, which has a word in the middle of it that is potentially problematic if you build a filter that is just based on words alone. So it's generally referred to as the Scunthorpe problem because of that. And the amazing thing to me is that this is still a problem for a company as large as Facebook. We've talked recently about Instagram and Threads and some of their moderation tools and just how surprised we were at how simplistic the filters appeared to be, and this is just another example. I'm kind of shocked that a Scunthorpe-type problem would still pop up in a Meta product.
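[A minimal, hypothetical Python sketch to make the Scunthorpe problem concrete — the blocklist entry and example posts are invented for illustration, and this is not a description of how Meta's actual filters work. A naive substring match flags "Coulsdon" because the letters "lsd" sit inside it; even a simple word-boundary check does not.]

```python
import re

# Hypothetical one-entry blocklist, purely for illustration.
BLOCKLIST = ["lsd"]

def naive_filter(text):
    """Flag text if any blocked term appears anywhere as a raw substring."""
    lowered = text.lower()
    return any(term in lowered for term in BLOCKLIST)

def word_boundary_filter(text):
    """Flag text only when a blocked term appears as a whole word."""
    lowered = text.lower()
    return any(re.search(r"\b" + re.escape(term) + r"\b", lowered) for term in BLOCKLIST)

for post in ["Meet at the Coulsdon community centre", "selling lsd, dm me"]:
    # The first post trips the naive filter but not the word-boundary one;
    # the second trips both.
    print(post, "| naive:", naive_filter(post), "| word-boundary:", word_boundary_filter(post))
```

[Real moderation systems need far more than word boundaries — misspellings, spacing tricks, and context all matter — but the sketch shows why matching on raw substrings alone keeps tripping over place names.]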

Ben Whitelaw:

Yeah, totally. And the response is always the same from the companies: this is a technical glitch, this is an error, we've now fixed this. What it means is somebody has gone into the tooling and taken LSD out of the filtering for whatever period of time, until I guess they have an issue down the line and they have to re-add it or do it in a different way. So yeah, it's funny. We are in this never-ending cycle, I guess, of poor UK towns being caught up in moderation filters. And you know, I will not stand for it, Mike.

Mike Masnick:

I was going to say, I mean, the UK has a way with naming its towns. So,

Ben Whitelaw:

Yeah, indeed. We should, uh, come up with a list of other problematic ones

Mike Masnick:

There's a podcast called Hollywood Babble-On, put on by Kevin Smith, the filmmaker who made Clerks and other stuff. They have this segment that they do semi-regularly, which is called Your Town's Got a Fucked Up Name. People submit all of these random towns, and I would say half of them are probably in the UK.

Ben Whitelaw:

Oh really?

Mike Masnick:

They are examples of just problematic names, in whatever sense. And so maybe we need the trust and safety version of Your Town's Got a Fucked Up Name.

Ben Whitelaw:

Yeah. Yeah. It makes me proud to be British. Great. And then finally, Mike, we round up on the final story. We started today's podcast talking about the X users who are leaving the platform, who are departing, but there are an awful lot of users still there — by some counts, 150 million daily users. And you've spotted a couple of stories that relate to those users who are still on the platform.

Mike Masnick:

Yeah, the one that really caught my eye was this one from Rest of World, covering how phony X accounts are, as they say, meddling in the election in Ghana. And this is the kind of story that isn't that surprising, because we've certainly seen other accounts in the past of efforts to influence elections using social media. But this struck me as interesting timing — sort of a sense of what is happening with X now. There is a clear idea that, outside of maybe the US and Europe, or specific countries that are of interest to the US and Europe, X doesn't care at all. You get the sense that there are these fairly obvious bot accounts, some of them obviously operated using ChatGPT or other generative AI tools, to create political propaganda presented as if there's widespread support for the existing leading party. And the sense here is that X has sort of given up on moderating these kinds of things. If it's not something that matters to Elon Musk, trust and safety at X doesn't exist, effectively. In some sense it takes you back to the way a lot of these concerns played out a decade ago, when trust and safety was first coming up as being so important: you would hear these stories of places that are not generally paid attention to by folks in the US or Europe that were getting ignored, and terrible things were happening — massacres and genocides were popping up, and people were using these platforms to do stuff, and they were totally ignored. And so this just caught my eye. It's not as extreme as that, but it's an example that these kinds of election meddling are apparently now going on on X almost entirely unimpeded, from the sense of it. It struck me as really interesting. There's a related story — it's not quite the same thing, but DFRLab put out a report recently about a network of sock puppets impersonating Americans and Canadians and pushing particularly pro-Israel content on X. And there's an interesting question of, again, how X is handling these kinds of trust and safety issues where you have these faked accounts. I didn't look as closely there, but I'd seen a report a couple of months ago, presented at the trust and safety research conference, where one of the ways they were identifying these faked accounts — and these look like the same kinds of ones — was that they were using an AI face generation app that would always put the eyeballs in the exact same spot in the profile photos. And so you could determine which ones were faked by just lining up their eyes

Ben Whitelaw:

Wow.

Mike Masnick:

And just looking at some of these faked accounts, I see all of their eyes are in exactly the same position. So I'm thinking it's the same sort of thing. But it's kind of interesting to see, basically, how X in particular seems to have given up on dealing with these kinds of problems right now.
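[A rough sketch of the eye-alignment check Mike describes, assuming the third-party face_recognition Python library and hypothetical image files — not the researchers' or X's actual tooling. GAN-generated headshots tend to place the eyes at nearly identical pixel coordinates, so a batch of same-sized profile photos whose eye centres cluster very tightly is suspicious.]

```python
# Flag a set of same-sized profile photos whose eye positions line up almost
# exactly, a telltale of GAN-generated faces. File names below are hypothetical.
import face_recognition
import numpy as np

def eye_centre(image_path):
    """Return the mean (x, y) of the left- and right-eye landmarks, or None if no face is found."""
    image = face_recognition.load_image_file(image_path)
    faces = face_recognition.face_landmarks(image)
    if not faces:
        return None
    points = faces[0]["left_eye"] + faces[0]["right_eye"]
    return np.mean(points, axis=0)

def suspiciously_aligned(image_paths, tolerance_px=3.0):
    """True if every detected eye centre sits within a few pixels of the group mean."""
    centres = [c for c in (eye_centre(p) for p in image_paths) if c is not None]
    if len(centres) < 2:
        return False
    centres = np.array(centres)
    spread = np.linalg.norm(centres - centres.mean(axis=0), axis=1)
    return bool(spread.max() < tolerance_px)

# Hypothetical usage:
# print(suspiciously_aligned(["account1.jpg", "account2.jpg", "account3.jpg"]))
```

[On its own this is only a weak signal — real headshots cropped to a common template can also line up — so in practice it would be one feature among many, not a verdict.]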

Ben Whitelaw:

Yeah, no, indeed. And I'm just going through each of the accounts, and they've all been suspended or taken down as a result of this report, which I guess shows the importance of ensuring that this research happens and that people stay on top of this. So, a real range of stories today, Mike — a lot of X slash Twitter departures, a lot of concern about safety — but thanks again for your analysis and for your thoughts. And for those who are listening and enjoyed our chat through this week's stories and want to leave a review for today's episode: go to your podcast platform of choice, leave us a review, give us a rating — it really helps us get discovered on the big wide world web. And we'll speak to you all next week.

Announcer:

Thanks for listening to Ctrl-Alt-Speech. Subscribe now to get our weekly episodes as soon as they're released. If your company or organization is interested in sponsoring the podcast, contact us by visiting ctrlaltspeech.com. That's C-T-R-L Alt Speech dot com. This podcast is produced with financial support from the Future of Online Trust and Safety Fund, a fiscally sponsored multi-donor fund at Global Impact that supports charitable activities to build a more robust, capable, and inclusive trust and safety ecosystem.
