Ctrl-Alt-Speech
Ctrl-Alt-Speech is a weekly news podcast co-created by Techdirt’s Mike Masnick and Everything in Moderation’s Ben Whitelaw. Each episode looks at the latest news in online speech, covering issues regarding trust & safety, content moderation, regulation, court rulings, new services & technology, and more.
The podcast regularly features expert guests with experience in the trust & safety/online speech worlds, discussing the ins and outs of the news that week and what it may mean for the industry. Each episode takes a deep dive into one or two key stories, and includes a quicker roundup of other important news. It's a must-listen for trust & safety professionals, and anyone interested in issues surrounding online speech.
If your company or organization is interested in sponsoring Ctrl-Alt-Speech and joining us for a sponsored interview, visit ctrlaltspeech.com for more information.
Ctrl-Alt-Speech is produced with financial support from the Future of Online Trust & Safety Fund, a fiscally-sponsored multi-donor fund at Global Impact that supports charitable activities to build a more robust, capable, and inclusive Trust and Safety ecosystem and field.
You Can't Antitrust Anyone These Days
In this week's round-up of the latest news in online speech, content moderation and internet regulation, Mike and Ben cover:
- Meta wins FTC antitrust trial over Instagram, WhatsApp deals (CNBC)
- Commission eyes further simplification of tech rules after DSA review (Euractiv)
- Inside Europe's 'Jekyll and Hyde' tech strategy (Digital Politics)
- NetChoice sues Virginia to block its one-hour social media limit for kids (The Verge)
- Tech Giants Sue California Over Social Media Access Law (2) (Bloomberg Law)
- TikTok to give users power to reduce amount of AI content on their feeds (The Guardian)
- The Most Frustrating Word for Trust & Safety Professionals (LinkedIn)
Ctrl-Alt-Speech is a weekly podcast from Techdirt and Everything in Moderation. Send us your feedback at podcast@ctrlaltspeech.com and sponsorship enquiries to sponsorship@ctrlaltspeech.com. Thanks for listening.
Mike, let's be honest. Finding a prompt to start every episode of this podcast is getting to be a pain in the ass.
Mike Masnick:No, no, it's wonderful. Come on. We learn all sorts of new things about new services, new apps. It's great.
Ben Whitelaw:I, it's a pain in my life. It really is, but I'm gonna stick with it for now 'cause I know you like it, and it seems to be a consistent part of what we do. So we'll continue until otherwise stated. But I found a treasure trove of prompts, I believe, and that is the Apple 2025 App Store Awards. It's like a kind of gold mine,
Mike Masnick:There we go. We learn about new apps.
Ben Whitelaw:Other app stores are available. I feel like we have to say at the start of the podcast, this is not a sponsored segment, although it definitely should be. Um, anyway, one of the
Mike Masnick:If anyone wants to sponsor the prompt, please, please get in touch.
Ben Whitelaw:please. anyway, one of the apps in this year's long list is a photo sharing app called Retro, which is, kind of like Instagram for hipsters, basically.
Mike Masnick:wasn't that what Instagram originally was?
Ben Whitelaw:yeah, I think so. I think it's Instagram as was. But the kind of nice feature about it is that you can take your photos, which need no caption. There's none of the kind of posturing that comes with Instagram captions. You can just upload your photo, but you can also turn those photos into postcards that you send to your friends. So one of the prompts on the app is to send a postcard. So who would you send a postcard to and what would it say?
Mike Masnick:Oh my goodness. Um, well, you know, I was out last week, and I traveled to Boston, and I went to a very interesting one-day conference about trust and safety, dealing with misinformation. That was the topic of the event. And we were in this building, up on the top floor, and we had a beautiful view of the river that runs through Boston, from the Back Bay area of Boston over to Cambridge. We had this beautiful view of Cambridge, so I would probably take a picture of that, send it to you and show you what you were missing while you were recording an excellent podcast with our guest host Kenji last week, which I then listened to on the flight back. It was very enjoyable, but, you know, I had a little trust and safety related travel. What about you?
Ben Whitelaw:I'm glad you sent me a postcard. I'm sad it wouldn't say wish you were here, but don't worry about that. Um, I would send a postcard to a probable listener of Ctrl-Alt-Speech and long-time topic of the podcast, Mark Zuckerberg.
Mike Masnick:Oh yeah,
Ben Whitelaw:and I would say, Mark, this is probably the first time that you have relaxed in 2025, um, enjoy this week from wherever you are. You must be mightily relieved after what we're gonna talk about very shortly. Hello and welcome to Ctrl-Alt-Speech, your weekly roundup of the major stories about online speech, content moderation, and internet regulation. It's November the 20th, 2025, and this week we're talking about the big Meta antitrust case, the EU's digital regulation progress report, and US state laws facing platform pushback. My name is Ben Whitelaw. I'm the founder and editor of Everything in Moderation, and I'm joined by Mike Masnick, who doesn't send his co-host postcards from wherever he is in the world, which, you know, I don't want people to judge you for, Mike, but it's quite telling. I was waiting by the door.
Mike Masnick:well, you know, everything is digital these days, so, you know, it's, it's an internet era. I'm sorry that you live in that old fashioned analog era, but, uh, what, what can we do?
Ben Whitelaw:Well, talking of which, I mean, I'm quite old fashioned, I think in many ways. And I, I've actually got a postcard in front of me that I, thought I'd maybe get your thoughts on,'cause I, I want to do something with it. Okay. we didn't talk about this before we
Mike Masnick:Yeah, this is, I am, I have no idea where we're going.
Ben Whitelaw:No, I found a podcast in a thrift shop,
Mike Masnick:Postcard, not a podcast. It would be cool if you found a podcast in a thrift shop, but,
Ben Whitelaw:God. I found a postcard in a thrift shop, in a charity shop in the UK, hidden in a frame that I bought. And the postcard depicts a scene of the Moscow Kremlin, as it's called on the back of the postcard. So it's a kind of snowy scene with the Kremlin with some trees in front of it, and it says Russian Winter on it. And there's some writing on it, and I'm gonna kind of explain a bit about it, 'cause I want to get your thoughts on how I can reach this person.
Mike Masnick:Okay.
Ben Whitelaw:Using the internet that you say is, is so powerful and, new. so the, postcard says, dear mom and dad, weather very cold food is beyond description. Metro Underground is very different from the northern line. Mr. Gorbachev is now president. See you soon. Love Linda. There is a stamp on it with the date, 1984,
Mike Masnick:Okay.
Ben Whitelaw:and there's no address.
Mike Masnick:Oh my goodness.
Ben Whitelaw:So this, postcard has been written but not sent over 20 years ago,
Mike Masnick:30 years ago. Yeah.
Ben Whitelaw:And the woman
Mike Masnick:40 years ago.
Ben Whitelaw:40 years ago, can't even do the maths. And I wonder, I want to find Linda, basically. I want to find her to see if she wants me to post her postcard to her or to her mom and dad. I just think it's an amazing find. And you know, in the early days of Twitter, there were always those really lovely stories of people finding.
Mike Masnick:Yeah.
Ben Whitelaw:long lost friends, or reuniting teddies lost on a train with their childhood owners, that kind of thing. Do you think that's possible? What's your hope for me finding Linda?
Mike Masnick:I mean, I think it is, there's not much to go on there, but that is a fun one. Um, I would certainly post it to Bluesky and see what we can get. Obviously it's a little bit smaller audience than some other platforms, but Bluesky is one that I think would delight in trying to help you find Linda and find out about her magical trip to Moscow,
Ben Whitelaw:yeah,
Mike Masnick:years ago. I would be down for that. And I think
Ben Whitelaw:yeah. Okay.
Mike Masnick:that's kind of interesting.
Ben Whitelaw:Maybe we make that part of the Ctrl-Alt-Speech, uh, mission, uh, and, and I, I.
Mike Masnick:But that's, I think that's actually, like, you know, this is a cool reminder of what the internet was supposed to be good for. What we liked about the early internet was exactly those kinds of things. And so let's see if we can track down Linda.
Ben Whitelaw:Yeah. And if any listeners know Linda or are Linda, get in touch with us: podcast@ctrlaltspeech.com. And if you have any ideas about how we can reunite the postcard with Linda, please let me know. That's our postcard segment done. Quite an extensive one. Um, we should let listeners know, Mike, that we don't have an episode next week 'cause it's Thanksgiving. But we do have something exciting coming out on the feed next week, which is a great conversation that I had with a really experienced trust and safety professional. Definitely look out for that. It's something a bit different, but we think listeners will like it. And it's worth noting that we've also had a number of episodes that have been sponsored in recent weeks. We've had some really great conversations on the back of our weekly news roundup, and we'd like that to continue, wouldn't we?
Mike Masnick:Yes, absolutely. So, you've heard some of the conversations we've had with AVADA or CCIA in recent weeks, definitely get in touch. And, you know, we talked about sort of the model that we used for sponsorships earlier this year. There was an episode in August, if anyone wants to listen to that, or on the website, ctrlaltspeech.com, we have a page about the sponsorships. But you know, we're always looking for creative ways to keep the podcast going and do creative, compelling sponsorships that people actually wanna listen to, that are valuable and interesting to our audience, not just pitching mattresses or whatever it is that podcasts do these days. And so we've been really thrilled with the sponsors that have recently stepped up, a lot of it from us talking about this, and we have some more coming. I know we have some more scheduled as well, but as we start to look into the new year, we are certainly looking for more sponsors to step up and join us on the episode and have a conversation. And so please, please look into that.
Ben Whitelaw:Yeah, definitely. If anybody wants to get in touch, as Mike said, the website is ctrlaltspeech.com and our email address is podcast@ctrlaltspeech.com. Let's go into our stories now, Mike. We found it tricky to kind of decide what to cover today. There's a lot of stuff going on. We're gonna start with a story that we actually talked about back in April, which was hinted at at the top of the episode, the big Meta antitrust story. And at that point, the trial was just about to kick off. And if you remember, you made a prediction, which, I don't know if you remember.
Mike Masnick:I don't remember exactly what I said.
Ben Whitelaw:Well, I think basically it was, and you know, this is a nice thing to have played back to you, but you were right. You were essentially right. The trial panned out as expected. For listeners who didn't listen to that episode or haven't been tracking what's been happening this week, just help break down what's happened and what the implications of that big trial are.
Mike Masnick:I mean, this was a case that goes back like five years. You know, there were all these cases that were brought sort of towards the end of the first Trump administration and then were really picked up and run with by the Biden administration and Lina Khan running the FTC. And there was this big focus on sort of bringing down the big tech companies and using antitrust as the tool to do that. And there was, I think, a fair bit of criticism of those earlier cases, and I think fairly so, because they were rushed and sloppy and really seemed designed specifically to try and punish these companies for being big, as opposed to figuring out if there was an actual monopoly issue and an actual antitrust issue. And that came back to bite the FTC and the US government in this case directly. Because the issue was effectively: was Facebook, in its purchase of Instagram and WhatsApp (it was Facebook at the time), violating antitrust rules and sort of keeping its monopoly in place? And the first step of any antitrust case is always, what is the market, and is this provider actually a monopoly provider of that market? And the FTC and DOJ's definition of the market in this case was so bizarre, and just struck everybody as obviously wrong, in an almost embarrassing way. So, you know, they sort of picked a few small companies like MeWe, which I'm not even sure still exists. I mean, it was out there for a little while, it was one of these ones that was out there, and they said, oh, those are the competitors in the space and they're too small. What was not mentioned as potential competitors to Facebook and Instagram was TikTok and YouTube. They just sort of ignored those and tried to define this market of, like, social media as keeping in touch with friends. And so when sort of confronted on this, they said, oh, you know, TikTok and YouTube are not about keeping in touch with friends, and we're just talking about this market of keeping in touch with friends and family. And, I mean, it's just such obvious nonsense. And the judge, it's Judge James Boasberg, who's been around forever. He is, you know, I was gonna say ancient, that's not very nice, but he's been a judge for a long time and he is a pretty serious, thoughtful, fairly conservative, in a judicial sense, judge. And, you know, this case, like all antitrust cases, goes on forever, but there had been some problems earlier on in the case. The fact that it only went to trial so many years later is, you know, partly because the case was so weak early on and they had to file an amended complaint, but they never really got around this issue of how you define the market. And that's where the case ended, basically. The judge looked at it and said, the market definition is wrong, and obviously Facebook and Instagram compete with TikTok and YouTube, and that's only grown over time. And part of the argument is that, you know, if you're trying to bring an antitrust case, it's not just at the moment when the case was brought, but where we are now and today and going into the future. Because if the service no longer has a monopoly, then there's not much remedy here, because there's no monopoly to fix. And the argument is that when you define this market properly, it includes TikTok and YouTube, and in that situation, Meta's properties are not a monopoly in that market and therefore there's no antitrust issue.
And you know, I know that a lot of people got really mad about this, but generally they were people who were just mad about Meta, and like, you can be mad about Meta. I don't like Meta in particular, and I don't like Facebook or Instagram, I don't really use them. I think it would be nice if they were less powerful, but I think there are other ways to deal with that rather than antitrust. Antitrust is such a blunt force instrument for trying to deal with these issues. And here, you know, it was one of these things where I think a lot of people in power just wanted to damage Mark Zuckerberg and Meta, and the only tool that they saw was antitrust. And so they brought a case and it was a weak case, and it took five years and who knows how many hundreds of millions of dollars for the judge to be like, you know, this is just a really bad case. And he is very thorough in the ruling and just walks through it bit by bit, like, look, under this law, the arguments that you're making are just not valid. And so, you know, it's a huge win for Meta.
Ben Whitelaw:And kind of talk us through how the FTC would've tried to define the market. Were they trying to make it small enough that Facebook was an oversized player in it?
Mike Masnick:Yeah. So that's the thing that always happens. Like, every antitrust case goes through this process, because the definition of the market is everything. Because in order to be a monopoly player in the market, to violate antitrust rules, there has to be a monopoly. And, you know, the question is always, well, what market are you monopolizing? And you can go broad or narrow, and like, the broadest market would be the market for attention, right? You could argue that that is a market, but then, you know, that's gonna be almost impossible to monopolize. And so then you sort of try and narrow it down, but it has to be at some level of reality, right? It's like, the tools that you use on your phone for entertainment, or whatever. There has to be some sort of definition of the market that makes sense. And here what the judge was saying was that that did not happen, or the FTC's argument was just, on its face, silly. And in fact, this happens in some antitrust cases, because it's such an integral part of making the case. And you know, the FTC or the government or whoever's bringing the case is always going to try and argue for the most narrow market, because that's how you would define a market in which somebody has a monopoly. You know, we saw the same thing to some extent with the Google case. I mean, there are multiple Google cases, but like, there are questions of, is it the ad market that you're talking about? And then, if you just talk about the overall ad market, that's huge. If you talk about the digital ad market, that's different. And so there are all different parts to how you define it. And in an ideal situation, the government is gonna wanna bring a case that basically defines the market as the weird thing that this one company does, because therefore you absolutely have a monopoly on it, right? But that's also silly. And so part of the trick to being a judge in an antitrust case is sort of balancing those two things and figuring out, what is the actual market here, is there real competition? And you know, what the judge found here is, like, there is obviously competition. And in fact, it's kind of interesting, he calls out the ban of TikTok in India. And in India, what happened was that users then flocked to Instagram and Facebook, and so it's like, oh, look, there's evidence that there was competition: when TikTok was around, more people were using TikTok. And so there are all these different elements in there that he goes to, to point out that there clearly are competitors. And also there's this element of it being a really dynamic market. And that was something that I thought was interesting too, because it came out in the Google case, even though in the Google case, the one that's concluded now, they were found to be a monopolist, which I thought was a little bit of a close call, at least on that one. I think there's another Google case that was a stronger antitrust case. But they did find that there was a monopoly over the sort of tying, we don't have to get into it, but like the search tying stuff. But when it came to remedies, the judge sort of admitted, you know, the market has really shifted. When we started this case, Google search, you could argue that it was a monopoly, but now, like, the AI stuff has become so big and so popular and it's really chipping away at Google search's dominance.
And so the fact that you have these dynamic markets suggests that maybe antitrust isn't the necessary remedy to make sure that there's continued competition. And so both of these cases, I think, sort of show the fairly dynamic nature of the internet market. That's not to say that there aren't potential antitrust issues, but again, this was just a very poorly argued case and a poorly set up case, and it ended up with Meta, I'm sure, very happy, even if, you know, they spent five years arguing over this.
Ben Whitelaw:Yeah, I wonder if Zuckerberg got on his hydrofoil and had a spin around in celebration. Um, I mean, I've gotta say, Mike, this is a bit of a personal blow, in the sense that I always thought that antitrust was one of the ways to help alleviate some of the serious trust and safety issues that we talk about on the podcast every week. Right? We come back time and time again to the issue of scale being the challenge, and how it's only after platforms get to a certain scale and size that it becomes particularly hard to mitigate harm in the ways that we see and hear about in the stories we cover. And there's also, you know, when we talk about regulation, this idea of size being a kind of moat for platforms, and how some of these bigger platforms are asking for regulation as a way to kind of maintain their stronghold. So I always thought that antitrust was a way to address some of that. Do you see any pattern between the kind of Google cases and this Meta case that suggests that that isn't the case anymore?
Mike Masnick:Yeah, so, you know, my views on antitrust are a little bit nuanced, like everything. Um, I think antitrust is a useful tool for dealing with markets that are not competitive. I have never been convinced that it was the right tool for internet companies. That's not to say I don't think some of the internet companies are too big. I do. That's not to say I don't think we need more competition. I do. But I always felt that antitrust was probably the wrong tool for that. And I think these two cases, the concluded Google case and then this Meta case, kind of show why. Not only did it take five years for both of those cases to actually come to a decision, but the market had shifted so much in that time that, you know, it was the wrong tool, and it's such a blunt tool for this kind of thing. I think that there are better approaches, including regulatory and policy approaches, that encourage more competition and encourage changes to the market more naturally than using a tool like antitrust.
Ben Whitelaw:Right. You mentioned the kind of other ways to deal with it that aren't antitrust. What would you suggest? What options are there available, I guess?
Mike Masnick:I mean, there's a lot. And some of this is in this paper I keep threatening to release and then never actually finish, uh, though I'm really trying desperately to finish it. So I don't want to preview everything that's in that paper, but obviously, like, I don't know if this needs the bell, but, you know, I believe an approach with interoperability and the ability for there to be more competition within a single network is actually really important. And I think that there are ways that you can create incentives for there to be more competition. The sort of larger summary of the paper that I will one day release, with a whole bunch of very specific policy proposals, is building out incentives rather than mandates. You know, I think that it's very easy for regulators to look for mandates, to say, you have to do this and you have to do that, and you have to do all of these specific things to get us to our goals. And antitrust is a form of that. It's using litigation because we have this law saying, you have to do this, you have to be this size, you can't take over a market. I think the problem with sort of mandates in all these things is that both the companies figure out ways around it and the markets figure out ways around it, so that they're not as effective in the long run. And we're gonna talk about this, I think, in our next story as well, in terms of the way Europe is now looking at the various mandates and regulations that they've put in place, where they become ineffective as they just sort of create, you know, something that isn't helpful for anyone, maybe for good reason and maybe for thoughtful reasons. But I think if you can structure incentives so that the companies themselves actually have incentives to be better players in the world, that are better for the users, including enabling more competition specifically across the board, which then leads to more innovation, I think that is a better result. But it's one that involves more nuance and complexity and sort of thinking through second order effects. I think it's very easy to say, I see this, this is bad, this company's too big, we have to break it up. Or, I see this, they're collecting too much data, therefore we have to put laws in place specifically saying, don't collect so much data. Everyone is doing this sort of first order thing: I see something that's bad, let's ban it. And same thing with, like, banning phones in schools or social media in school. With all of these things, you only think through one step ahead and you don't think through the eventual consequences of that. And I would like to see a world in which we think through more comprehensively, what is the world that we want to see and how do we get there? Just telling companies do this or do that is not necessarily going to work. And that's sort of the problem that we keep coming up against with almost all of the regulations that we see that fail, and things like these antitrust trials. And like, I hesitate to, well, I don't hesitate that much to criticize 'em, but I think that they're being done for good reason. And I think that the complaints that people have, that push these laws and push for antitrust settlements and whatnot.
I understand where they're coming from and I agree with their sense of the problem, but I disagree with their solutions, because I don't think they're effective. I think they, you know, look good and they feel good upfront, which is what happened with this particular case, but they're not effective on the backend. And I would prefer to have solutions that actually are effective and change the world in a better way.
Ben Whitelaw:Yeah, work all the way through. Yeah, I'm sure there are second order effects that I haven't thought about, um, in terms of what would've happened with a successful antitrust case against Meta. So I think it's a fair point. When you do get around to publishing that paper, we'll have something to talk about here on the podcast. Uh, you mentioned the kind of EU approach to regulation, Mike, and that's something that we're gonna talk a bit about now, I suppose, whilst there is a certain approach to regulation taking shape in the US which involves quite a lot of litigation, which again we're gonna come onto in the podcast. The EU has kind of taken stock this week as it reviews how the Digital Services Act has been performing since it was launched about 18 months ago. So it has put out an evaluation report, essentially, about how it interacts with other forms of European legislation. And it's an interesting kind of moment in time, because the DSA has got a lot of stick in some quarters, has had a lot of pressure from people in the US administration, and it's become a bit of a talking point in its own right, and this is essentially the EU doing a bit of a progress report, really, on how it's going so far. It comes at a really interesting time, I think, in European policymaking, because there is a kind of broader existential crisis, it is fair to say, about competition and innovation in Europe. Like, the kind of eternal question of why there are no successful tech startups in Europe is something that has been occupying a lot of Brussels recently. And it comes on the back of a big report by Mario Draghi, who is a former ECB head, who said that, you know, Europe was lagging behind the US and China because it had created this kind of myriad of forms of regulation that was stifling companies in the European bloc. So on the back of that, the Commission went through a process that it calls a kind of digital omnibus, which I think sounds like a Doctor Who Christmas special, but is essentially a package of changes to how existing regulations work, to simplify them and streamline them so that companies don't have as many kind of check boxes to tick and forms to fill in. And actually that came out yesterday, which we're not gonna go into, but it kind of incorporates changes to the much discussed AI Act and GDPR, which we occasionally touch on, and essentially that package of changes, from what I've read, goes further than expected in terms of kind of simplifying things. So there's an existential moment within the EU around its digital policymaking. And so this report comes out at a time where, you know, you might expect the EU to say, actually the DSA is a little overly complex or a little cumbersome. There are parts of the report that say that the DSA could be improved, but broadly speaking, it says that the DSA complements most of the EU laws out there. And I'll give you a few nuggets before I ask you to weigh in and get your thoughts, Mike. So the report talks about how most of what the DSA has rolled out so far has worked in practice. So, for example, the categorization that they set up of the very large online platforms and the very large online search engines has broadly worked.
And the number of users that it has set for those platforms, of 45 million, has worked fairly well, and it talks about Temu and Shein as platforms that didn't have an EU presence at all back when the DSA was being drafted and all of a sudden have got enough users to become VLOPs that are eligible. It did say that the categorization itself could be overly simple, because platforms are increasingly a mixture of search platforms and social platforms and, increasingly, e-commerce as well, so they may need to review that in time, but that has kind of worked fairly well. There is some duplication, it notes, that often means that platforms have conflicting obligations and have to, I guess as Mario Draghi kind of noted, do more to comply with the regulations. So examples it gives are around transparency reporting, uh, reporting about terrorist content, and what it calls dark patterns, which appear not only in the DSA but in other legislation as well. And there's some other evidence in there, but what's notable, Mike, is really the fact that, while other countries kind of, I guess, figure out how they want to regulate platforms, not only has the EU launched the DSA, but it is also in the process of kind of refining it, through this report and through other mechanisms. What did you make of the story from Euractiv, and, you know, the kind of details of the report?
Mike Masnick:Yeah. Um, here's a surprise for you, Ben. Uh, I, I'm, I'm, I'm gonna praise the EU
Ben Whitelaw:What?
Mike Masnick:briefly here.
Ben Whitelaw:Oh my God. It's finally happened 20th of November, 2025,
Mike Masnick:um, the thing that I appreciate here is their willingness to kind of look at how does the law work in practice. And I include the omnibus as well, which, by the way, is worth mentioning because this comes up in so many discussions, especially among Americans: it's not just the GDPR and the AI Act, it also covers the ePrivacy rules. People think the GDPR is the thing that requires all of the cookie approvals, but no, that's the ePrivacy rules, and they're looking at that, including the whole, like, checkbox nonsense. So what I appreciate is the willingness of the EU to take a step back and say, are these laws that we put in place working? That is not something that often happens in the US. We put in place laws and then we wipe our hands and we say, we are done with that, until something else bad happens, you know, a decade or two down the road, and then suddenly it's, we'll consider a different law.
Ben Whitelaw:Congress would still be slapping themselves on the back after 18 months, they would be.
Mike Masnick:Absolutely. Absolutely. You know, except in the worst, worst cases, unless something goes really, really bad. So look, I really appreciate the EU process here. And, you know, I have talked about before how I actually appreciated the process that the EU went through with the DSA and the DMA, in terms of, like, they're pretty thoughtful about actually writing these bills, even if I think that the end result is not what I would've done, and I think it has problems along the lines of the problems I was talking about in our first story, of the kind of mandates. But I appreciate the fact that they're looking at these things. Of course, it's leading to a whole bunch of people screaming and complaining, like, oh my gosh, you're giving in to tech lobbyists, and then the tech lobbyists are saying, oh my God, you're not doing enough, it's still only a step, you know, there's all this sort of back and forth on these things. But I appreciate the fact that they're willing to look at these things and recognize that maybe some of the compliance costs that we've dumped on these companies are really bad and not very helpful, and to sort of figure out, are there ways that we can streamline things? Which, again, was part of the approach more with the DMA than the DSA: can we simplify the regulations so that companies aren't held back? In that case it was more about each member state having different regulations around this. But then there's this recognition that maybe there are ways that we can simplify these laws, still get the supposed benefits of them, without the overregulation and over-compliance costs that lead to the problem that you're talking about, which is that there are, I wouldn't say no successful European companies, you know, but it's definitely limited, and it is a limiting factor for companies that it's tough to operate in the EU, and I think everybody recognizes that.
Ben Whitelaw:Just to kind of put a number on those compliance costs: there is, buried in the report towards the end, a bullet that says that compliance costs were between 15 and 30% of internal legal and IT resources. So like an additional,
Mike Masnick:Yeah,
Ben Whitelaw:chunk of cash on top of, you know, what it would usually take to run the business, which, I was surprised at kind of how significant that was, frankly. And it speaks to, you know, why companies are pushing back.
Mike Masnick:Yeah. And honestly, I would ask, for what benefit, right? And that's not to say that the laws are totally useless. And I think there are elements of these laws that have been useful and are interesting experiments, but the thing is that they need to be experiments where we can see where they work and where they don't, and then learn from that and adjust. And so that is why I'm willing to praise the EU for trying to look through these things and adjust.
Ben Whitelaw:I sense a "but."
Mike Masnick:Yeah, I mean, I agree, I don't think they're going to go far enough, and I think that they're gonna get pressure to. It's the EU way on these things. You know, I don't think that they can actually comprehend what is necessary to build a successful tech company, and so they're never going to make it as easy as other countries are. And there is this other element, and you sort of mentioned it a little bit, but right now with the US administration, as much as I am not happy with the current US administration and their view of just about everything, they're putting a tremendous amount of pressure on the EU. I think somewhat hypocritically, because a lot of the stuff that they're complaining about the EU doing, and how the DSA is a censorship bill, which is an exaggeration, comes at the same time that they're trying to censor companies here in the US and doing all the things that Jim Jordan falsely claimed that the Biden administration had done, which now the Trump administration is doing happily and eagerly. But I do wonder how much of what is happening in the EU now is actually a result of the pressure from the Trump administration, and JD Vance going and screaming at them and talking about all this stuff. You know, the fact is, if that is why they're doing some of this stuff, the Trump administration is never gonna be satisfied. They're just gonna continue to claim that the EU is terrible and doing awful things, no matter what. And so hopefully that's not why they're doing it. Hopefully they are doing it because of the legitimate concerns and the legitimate look at how do we actually make this better. But I think it's important to take into account the sort of global context as well.
Ben Whitelaw:Yeah, that's a really interesting point. You know, there's an element of what the EU is doing which is designed to allow it to compete and allow companies within the bloc to compete globally, particularly with the US and China, and to be pro-innovation and to be kind of broadly growth focused. There's also an element of what the EU is doing, or countries in the EU are doing, that is almost the opposite, and it's kind of anti-US and anti-China. Mark Scott, who's a really plugged-in former Politico journalist who writes his own Substack, kind of writes about this really nicely in his Substack, Digital Politics, and we'll share that in the show notes. There's also a kind of strong digital sovereignty play happening right now, and countries like France and Germany are trying to make sure that their companies are big on the global stage, that they bring in the best talent, that they have all the kind of incentives to grow, to compete on the international stage. And so there is this almost bifurcation of digital policy in Europe, where some countries are kind of going out on their own, but they're also part of a bloc that wants to, as a group of countries, compete on the global stage. And it's slightly odd. Mark calls it the kind of Jekyll and Hyde strategy, and it's unclear where it'll end up.
Mike Masnick:Yeah, I mean, I think it's always been a challenge with sort of European approaches. It doesn't feel particularly new. You know, there've always been these discussions. I mean, like a decade and a half ago, I remember having these discussions with various European government folks who were coming to Silicon Valley, and it was always the same thing: like, how do we build a Silicon Valley in our country that creates globally relevant companies? And I had a whole lecture that I would give them that would talk about why Silicon Valley is Silicon Valley. And the end result of those conversations would always be, well, but we can't do that. It was like, well, good luck.
Ben Whitelaw:Yeah. What? What did they say they couldn't do?
Mike Masnick:I mean, a lot of it was, so there have been all these studies and, I don't, we don't wanna go down this path 'cause it's too long of a tangent, but a big part of it was actually the fact that sharing of ideas between companies to build an entire industry is actually really important. For all the talk that we had about, like, monopolies before, the evidence is that the success of Silicon Valley is driven by the ability of people to leave companies and go elsewhere with their knowledge, which a lot of people, especially lawyers, freak out about and talk about, oh, you need non-disclosure agreements, you need non-compete agreements, you need all this stuff, and you need patents and trademarks and copyright to sort of lock in all this information. And yet the studies about why Silicon Valley was so successful say it is in large part because of non-competes not being enforceable in Silicon Valley. And so you see different industries over and over again within Silicon Valley that develop, and they develop because multiple companies, often with employees who left one and went to start another, happen, and you begin to build this growing industry. But a lot of it's because of the movement of labor, with the ideas that they take with them,
Ben Whitelaw:Right.
Mike Masnick:that allows for innovation and competition and all of these kinds of things that are actually really important if you don't want to have, like, a single provider, and that helps build up the industry. And the thing that people kept telling me, Europeans especially, was like, well, we don't like that.
Ben Whitelaw:Right.
Mike Masnick:we want people to have a job and sort of stick with it for a long time. And the idea that they would be leaving and take information with them or take other employees with them, that's just not gonna fly in Germany, for example. And so, good luck, you know. But, like, having a single company does not an industry make. It can in some cases, right, but having a single successful company usually doesn't develop the industry if you want to be an industry town of some kind, where there's, you know, widespread innovation. You want multiple companies in that same area competing over the same basic issues, and there you get the sort of innovation race and the idea sharing and people even, you know, copying each other but then building on it: oh, they did this, well, we can do that but better. That is where you get the sort of flywheel effect of innovation that seems to work and then builds out a larger industry.
Ben Whitelaw:Yeah. Good to hear that Europe is still asking that same question that they were asking
Mike Masnick:Yep.
Ben Whitelaw:15 years ago, uh, probably with the same answer and response. Okay, Mike, well, it's an interesting story nonetheless. I think there are a lot of parts to that. It's a fairly wonkish story; the report from the EU on the DSA is not a very easy read, but I think it's an important one to talk to listeners about, so glad we did that. We've got a few other stories to kind of note. We've got a few that you raised, Mike, around state-based laws that are getting some pushback, similar kinds of laws with elements of kind of age verification and data privacy issues that we've touched on, all kind of facing platform pushback, essentially.
Mike Masnick:Yeah. And so, for all the mess of Europe, like, the US is also a mess in terms of passing laws. And Congress is completely immobilized and can't do anything. And so obviously the story of the last five years, at least, is that the states are trying to step up. They probably shouldn't be able to, for a variety of reasons. Maybe that's a controversial statement. But under what's known as the dormant commerce clause, the states are not supposed to regulate things that are sort of interstate commerce, right? That's for the federal government. And the internet is sort of by definition interstate commerce, right? And so it really feels like something that the federal government should be in the position to regulate. But they have completely, Congress doesn't do anything anymore, as you may
Ben Whitelaw:No,
Mike Masnick:And so the states are stepping up and they're pushing all these laws, and we've talked about plenty of them and why they're problematic. And we had, you know, the NetChoice cases that went all the way to the Supreme Court, that put some limits on states, but states keep trying. And the big one now has been, like, the child safety stuff and the age verification. There was the Supreme Court case, the Free Speech Coalition case, earlier this year, which was about age verification, but that was specifically about age verification for adult content, in which that was deemed okay. I still think that ruling had problems, but it was, in theory, limited to that. But a lot of states have taken that to say, oh, age verification is now fair game, and so we're doing that for social media in all different forms. And so California passed a law and Virginia passed a law, and both of them are being challenged. And so that was the story this week. In both cases, I think, um, I might have the details wrong, the laws go into effect early next year, and so they're trying to get injunctions blocking them. The California law is interesting in that we have three different lawsuits filed, by Meta, Google and TikTok I think, rather than them all joining together. There was an earlier case by NetChoice, which is the trade group that represents those guys, which didn't go well, in which the Ninth Circuit judge who wrote the ruling did one of these horrifying comparing-them-to-tobacco kinds of things, which is just, with every ounce of my being, it's like, no, if you're comparing it to tobacco, you've completely lost the plot. Um, but so that didn't go well. So now the individual companies have sued, and there's some overlap between the lawsuits. They may get consolidated into a single case, which will often happen in these situations. Um, but they're sort of arguing slightly different arguments for each one. We don't have to get into the specifics; as the case moves forward that may become more interesting to follow. But they're sort of looking at that. At the same time, NetChoice, again the trade group that has brought a lot of these cases, sued to block another similar law in Virginia, which has a one-hour social media limit for children unless parents approve. And so there's an age verification component in there. In that case, too, there was an oddity where Virginia tries to claim that there's not an age verification requirement, but then Governor Youngkin, who is about to leave office, basically came out and said that you have to verify age, which is, you know, not good for Virginia's side of this argument. Uh, NetChoice is able to play up the fact that the governor admitted the thing that the law says it doesn't require. So, you know, it's interesting, not surprising, but the tech companies are pushing back on these laws and sort of trying to figure out what they are actually required to do. And I think these laws have very serious First Amendment issues. Whether or not the courts recognize that is gonna be the big open question.
Ben Whitelaw:Yeah, and the Virginia one I found very interesting in particular, because it's kind of specified in the law that platforms can, should, use ways of knowing what the user's age is, including prompting the user to put in their date of birth, which obviously is easily gameable, as many will know,
Mike Masnick:as my children will tell you,
Ben Whitelaw:Yeah, as maybe even my very young child could probably almost guess. But the way that the law was written means that it's not stated that that is the way to verify age. So it's very unclear, and that's left NetChoice with, I guess, a worry that they might have to use age assurance technology, age verification technology, which would have the privacy and kind of data issues that we know exist. So again, I think that links to the DSA stories in many ways, because there is this trade-off between having regulation that is broad enough and doesn't specify exact ways of dealing with platform governance, but also leaves enough gaps that platforms want to know, how do I actually deal with this, and can I be sure that you're not gonna make me do something I don't wanna do?
Mike Masnick:Yeah. So I mean, I think it'll be interesting to see how these cases play out. I think the states are gonna just keep trying and the companies are gonna keep pushing back, and eventually, I'm sure, we'll get another case that'll go to the Supreme Court and we will finally get some more rules of the road. I do fear the kind of result of the original NetChoice case in California against this law, that there's just such overwhelming hatred for tech. You know, I think, to go back to the first story, Judge Boasberg in the Meta case actually was willing to look at this and not take the "well, these companies are just bad" view. If you have a thoughtful judge who's willing to actually look through the issues, then hopefully we get sensible rulings outta this. But I fear we might not.
Ben Whitelaw:Right. Okay. Well, hoping for that. Um, another story that crossed our path this week, Mike, was an interesting one on TikTok, who have announced that they're going to give users the power to reduce the amount of AI content they see on their feeds. There's a couple of kind of initiatives that they've launched: the ability to go into your settings and dial back how much AI content you see; they're also kind of ramping up the automatic detection of AI content through systems that they've built, but also integration with an industry-wide initiative called the C2PA, the Coalition for Content Provenance and Authenticity, which is a multi-platform effort to try and detect not just AI content, but content that is mis- and disinformation of the most kind of difficult-to-track kind. And it is an interesting development from TikTok for two reasons. First is that this was announced at the European summit that the company hosts around trust and safety each year, and that to me suggests that, you know, AI content is a trust and safety issue, as we well know. But for a company to announce that and to kind of obviously admit that, and to make that very clear, is notable at a time when so many companies that are building generative AI technology have not made that link, and who probably aren't thinking as much about safety as they should. So that was the first thing that struck me. And then in the context of what other platforms are doing with social media and AI content, it's also very interesting, right? You know, you've got Sora, OpenAI's social media feed, you've got Meta's Vibes, which is almost a hundred percent generated
Mike Masnick:is 100%. Yeah.
Ben Whitelaw:right? And as far as I know, there aren't the kind of controls and settings in those apps to allow you to customize what you're seeing in the same way. So that's concerning for users of those apps and for internet users broadly. But it does show, I guess, a little bit of thought here given by TikTok as to the dangers that come with generative AI content that is, you know, allowed to proliferate without any kind of user controls, and so that's credit to them for doing so.
Mike Masnick:Yeah. I mean, I'm a huge proponent of user controls, and the more user controls, the better, right? Pushing the decision making out to the users themselves, giving them more visibility into what it is that they're seeing, and giving them more ability to say, I don't want to see this kind of thing, is something that I always think is good and I think more companies should do. There's a question of how effective it is, you know, how well will they actually be able to detect the AI. There are questions about how many people will use it; defaults absolutely matter. You know, very few users actually go in and change these settings when given the opportunity. There are also questions about, you know, is it that lots of people say that they don't want AI content, but then we're seeing things like Sora and Vibes, and apparently Sora in particular is doing really well. So there's clearly a market, to get back to market definition, there is clearly a market for AI generated content. And so it'll be interesting to see how this goes, but I think it's good that TikTok is thinking through this stuff, and I think it's a positive development and it's nice to see them give users more controls over their experience.
Ben Whitelaw:Yeah. Agreed. Talking of kind of generative AI content, Mike, I'm being absolutely beset by babies playing with puppies. And I honestly can't tell. My sense is that it's all generative AI, but I've got no way of telling. I dunno why they're always kind of, the babies are playing with flour and giggling. It's an odd genre
Mike Masnick:I don't know what the algorithm has had for you. As someone who, we just fostered three puppies, we just got them all adopted in the last week, I will say puppies are naturally very, very cute. And so, you know, we're already discussing if we're gonna get another batch of puppies to foster. But, uh, you don't need AI to make puppies cute.
Ben Whitelaw:No, gimme the real deal any day. Um, final story before we wrap up, Mike, is actually a LinkedIn post from somebody in the trust and safety community that you read a bit about, and whose musings speak to a lot of what we talk about on the podcast every week.
Mike Masnick:Yeah, this is by Daisy Soderberg-Rivkin, and it's posted on LinkedIn and we'll have it in the show notes. It has a little bit of a clickbait title, I guess, but that's okay, we're okay with that: this is the most frustrating word for trust and safety professionals. And it definitely touches on something that we discuss all the time. And in this case, we're gonna reveal the clickbait: the word that she is upset about is "just." But not just the word "just," but in the context of how people who are outside of trust and safety look at trust and safety and say things like, why don't they just... fill in the blank. Why don't they just verify a kid's age? Why don't they just get rid of all the misinformation? Why don't they just stop the bad people from doing bad stuff? All of these things that all of us who understand trust and safety to some extent, or people who work in trust and safety, know that, like everything, all the big questions in history and society fall into that word "just," right? The "just" implies that there is some simple thing, that if only they turned the knob this way, or flipped the switch that way, you could fix all these things. And the reality is that humanity is messy and humanity does not fit into that "just," right? You know, I just, uh, had this conversation yesterday at another trust and safety related event, about, like, how much nicer the world would be if people were just not assholes, right? But unfortunately you have some percentage who are assholes, and whether they're assholes on purpose or just by circumstance or whatever, trust and safety is really this issue of trying to deal with assholes kind of all the time. And until we solve that, we don't solve trust and safety. And so there is no "just do this." And the problem is that the media, and certainly a lot of policymakers, assume that the reason these things are failing is because the companies don't wanna do just that. And, you know, I thought this was a really nice essay that kind of gets into a lot of things that we talk about, about the difficulties of all this stuff.
Ben Whitelaw:Exactly. And, you know, she makes the point that that's the reason why she, as a trust and safety professional, kind of speaks out and shares her knowledge about the trade-offs and the incentives and the nuance, and she encourages others to do so. And it's also really why we started this podcast, frankly. It's the same reasoning: people need to understand there is no easy answer, that there is no kind of magic wand, and I think Daisy sums that up really, really nicely. Good point to end on, Mike, a fairly hopeful tone to strike at the end of Ctrl-Alt-Speech. Thanks as ever for all of your great responses, for selecting those stories for us today and for sharing what you think about them. We've talked about stories from Euractiv, from the Guardian, from the Verge, from Digital Politics. We couldn't do this podcast without those guys, so go and read and subscribe and support them wherever you can. As we talked about, having more of that kind of nuanced understanding about trust and safety and content moderation and internet regulation is gonna be massive in the future. Thanks for listening. Get in touch with us if you want to hear more, and we'll see you next week.
Announcer:Thanks for listening to Ctrl-Alt-Speech. Subscribe now to get our weekly episodes as soon as they're released. If your company or organization is interested in sponsoring the podcast, contact us by visiting ctrlaltspeech.com. That's CT RL alt speech.com.