
Ctrl-Alt-Speech
Ctrl-Alt-Speech is a weekly news podcast co-created by Techdirt’s Mike Masnick and Everything in Moderation’s Ben Whitelaw. Each episode looks at the latest news in online speech, covering issues regarding trust & safety, content moderation, regulation, court rulings, new services & technology, and more.
The podcast regularly features expert guests with experience in the trust & safety/online speech worlds, discussing the ins and outs of the news that week and what it may mean for the industry. Each episode takes a deep dive into one or two key stories, and includes a quicker roundup of other important news. It's a must-listen for trust & safety professionals, and anyone interested in issues surrounding online speech.
If your company or organization is interested in sponsoring Ctrl-Alt-Speech and joining us for a sponsored interview, visit ctrlaltspeech.com for more information.
Ctrl-Alt-Speech is produced with financial support from the Future of Online Trust & Safety Fund, a fiscally-sponsored multi-donor fund at Global Impact that supports charitable activities to build a more robust, capable, and inclusive Trust and Safety ecosystem and field.
Ctrl-Alt-Speech
Outsourced But Not Out Of Mind
In this week’s roundup of the latest news in online speech, content moderation and internet regulation, Ben is joined by guest host Mercy Mutemi, lawyer and managing partner of Nzili & Sumbi Advocates. Together, they cover:
- Meta can be sued in Kenya for human trafficking and for algorithmic amplification of harm (Open Democracy)
- Billy Perrigo on investigating Facebook's 'ethical' outsourced content moderation in Kenya (Everything in Moderation)
- A first look at Meta’s Community Notes (Indicator Media)
- Get Noted (Columbia Journalism Review)
- The Meaning of Being an African YouTuber: Big Audiences, No Big Money (Fast Company)
- Is TikTok Excluding Africans From its Creator Economy? (OkayAfrica)
- I was tricked, tortured, finally freed: inside a Burmese scam farm (The Times)
- Tanzania announces shutdown of X because of pornography (BBC)
This episode is brought to you with financial support from the Future of Online Trust & Safety Fund.
Ctrl-Alt-Speech is a weekly podcast from Techdirt and Everything in Moderation. Send us your feedback at podcast@ctrlaltspeech.com and sponsorship enquiries to sponsorship@ctrlaltspeech.com. Thanks for listening.
Ben Whitelaw:So Mercy, I've recently downloaded the Truecaller app, which I'm not sure if you're familiar with, but it does caller ID services and also kind of interventions. I've been getting a lot of calls from weird numbers and texts from people asking me to add them on WhatsApp, so I've downloaded Truecaller, and I've noticed that they're getting bigger in Kenya and Nigeria, which is relevant for your guest appearance today on the podcast. The app asked me, as I was onboarding, to identify every call and see who's calling you. That's its kind of, you know, its MO. So I wanted to ask you on your appearance today: who would you like to see? Who's calling you?
Mercy Mutemi:Okay. First of all, not a big fan of Truecaller.
Ben Whitelaw:What?
Mercy Mutemi:Yeah, yeah. Listen, um, I must have been maybe 19 when I first came across Truecaller, and it just always seemed to snitch on you, like from that perspective. You couldn't just call anyone. But what I didn't like about it is it seemed to get records wrong. So at that time I had a boyfriend, and for some people my number would show up with my boyfriend's name on their phone, and that's how they saved me, like, for 10 years plus. So this is like a...
Ben Whitelaw:It's a personal thing. So people have been putting the phone down on you thinking that you were your ex-boyfriend.
Mercy Mutemi:There you go. So that, yeah. But having said that, and seeing that we now live in very repressive times here in Kenya, there's this thing where if you speak up against violations of rights, then you start to receive random calls to threaten you, like you should back down, stop fighting the government, and things like that. So when a call comes from, you know, some paid harasser, I'd like to be able to know who that is so that I
Ben Whitelaw:Yeah.
Mercy Mutemi:not pick the call, first of all. And second of all, if I wanted to take action, I should be able to take action knowing that there's a person, 'cause sometimes they're just anonymous calls. But also, to just mention, I've just realized we live in very different realities, 'cause your biggest problem is cold calls. I'm like, I dunno that, um, any manufacturer is just, like,
Ben Whitelaw:like,
Mercy Mutemi:blind calling. Do you wanna buy our soap today? No. No.
Ben Whitelaw:No. My biggest issue is LinkedIn recruiters who have somehow found my number and are offering me jobs. So, you're right, you're right, we've already started on a different foot, and I'm expecting this to continue throughout the podcast, I'll be honest.
Mercy Mutemi:I love it. I, I love that the difference is clear as day.
Ben Whitelaw:Hello and welcome to Ctrl-Alt-Speech, your weekly roundup of the major stories about online speech, content moderation, and internet regulation. It's June the 19th, 2025, and this week's episode is brought to you with financial support from the Future of Online Trust & Safety Fund. This week we're talking African moderators, an early look at Meta's Community Notes, and Myanmar's scam mills. I'm Ben Whitelaw. I'm the founder and editor of Everything in Moderation, and I'm with a very exciting guest this week, because Mike Masnick is away. I'm with Mercy Mutemi, the award-winning lawyer, she didn't even tell me to say that, and the managing partner of Nzili & Sumbi Advocates. Mercy, great to have you here.
Mercy Mutemi:Thank you for having me. I kinda like how you say my name, but, like, I've never had that pronunciation.
Ben Whitelaw:Oh no,
Mercy Mutemi:I like it.
Ben Whitelaw:Come on. You need
Mercy Mutemi:I like it. No, no, no. It's fine. It's, it's Mutemi.
Ben Whitelaw:Mutemi. Mutemi.
Mercy Mutemi:Yes. There you go. You got it.
Ben Whitelaw:The first rule of being a podcast host is you're meant to check with the guest the pronunciation of their name. I'll be honest, I'm kind of in awe right now. I've referenced you a lot in Everything in Moderation, I'm a big fan of the work you're doing, and we'll get into that today. Just to give a bit of context for listeners: you're based in Nairobi, in Kenya, where your work focuses on advocating for fair and equitable technology in Africa. You're most likely known for three cases in particular that many of our listeners will be familiar with, that Mike and I have talked about a lot, including Daniel Motaung's case and the case of 185 moderators against Meta, over human trafficking, mental health harm, and unlawful termination. We're gonna talk about that throughout the podcast, but you've been working on these cases for over two years now. Is that right?
Mercy Mutemi:Yeah, yeah. Daniel's case since 2022. Actually, two cases since 2022, and then the other one since 2023. So, I mean, time does fly.
Ben Whitelaw:How did you get involved in those cases? Talk to us about the origin of your involvement in them.
Mercy Mutemi:Actually, Daniel's and Abrham's were the first cases I was briefed on, and this just happened to be an accidental meet. Daniel was looking for a lawyer in Kenya. He'd done the TIME magazine exposé, did you read that? By Billy Perrigo. And that had led to him being connected to lawyers in the UK, who were then looking for lawyers here in Kenya. So that's how I got connected to Daniel's case. Then Kiana's is the bigger case, actually, bigger in terms of the number of content moderators. They've now been combined into one case by the court, so it's like one mega case, but the case with more content moderators is actually interesting: we filed Daniel's in 2022, and that triggered a mass termination of contracts towards the end of 2022, beginning of 2023. So that case began in 2023, and it's very clear it was retaliatory action for the first lawsuit. There's a lot of mess that was happening there, lots of violations of labor laws and of human rights laws as well, and we'll get into it a bit. But that, in context, is how I came into these specific cases. Before that, I had already done a master's in internet governance, and I was already taking on cases where my focus has just always been: what does fair and equitable technology for us as Africans look like? There are lots of conversations that are very American-driven when it comes to internet governance and what fairness looks like. So even when you talk about universal access to the internet, or freedom of expression, free speech so to speak, it's very Americanized, and there's a way in which it doesn't translate into a local context. Right? We're talking about the next frontier of AI and algorithm training; we've not even gotten more than half of our population onto the internet. Where are the data sets? That's the kind of stuff that eats me up at night. So at first, before there was Meta, there were lots of lawsuits against the government, mostly because of policy that didn't support fair or equitable technology, that didn't advance universal access to the internet, didn't protect the right to privacy or freedom of expression. And when I step back now, I'm able to look at all these cases as a conglomerate of cases about who has the power in the internet space. It's governments and it's big tech companies, and they're charting it in a way that isn't fair to users and to people, and doesn't put people first. So then I'm able to describe my work as putting people back into the middle of these conversations, and then seeing whether we can change the wave of regulation to favor people.
Ben Whitelaw:In that sense, from how you've described the situation in Kenya and the cases that you've taken on, it feels like Kenya is very similar to lots of other countries, albeit, as you say, with less power in terms of the way that its users are represented. How would you characterize the tech policy scene in the country? And are you one of a number of lawyers who are working on cases that are going through the courts? Can you give us a sense of that?
Mercy Mutemi:Yeah, yeah. I think Kenya is a walking contradiction of sorts. So on one end, I don't know if you're familiar with the term, the Silicon Savannah?
Ben Whitelaw:Yeah. Yeah, yeah,
Mercy Mutemi:I do not like that term. Let me just... I feel like this is a pattern
Ben Whitelaw:of things you don't like.
Mercy Mutemi:I mean, I do like this podcast, so, you know, there are lots of things that I like, but I have very particular feelings around the term the Silicon Savannah. Regardless of my personal feelings, though, I think that in itself is the contradiction, right? It's a name that came up during the M-Pesa boom, you know, this technology around mobile money, and Kenya positioning itself as this leader of technology generally on the African continent. And over the last couple of years, I've really examined what that means and what it translates to from a policy perspective. Does it mean that we want to have homegrown technology in terms of actual infrastructure? Like, you know, we have Safaricom, it's homegrown, we have our own products. I mean, homegrown is in quotes. Does it mean that we want to see an African social media giant arise out of Kenya? What does it really mean? And every time I've questioned it, it seems that the narrow understanding of that term, at least from the government's perspective, is that we are supposed to be the backbone of Silicon Valley.
Ben Whitelaw:Mm. Right.
Mercy Mutemi:We're not to develop our own technology, but to be the stepping stone for Silicon Valley. And that's why I don't like the term the Silicon Savannah, because that cannot be our positioning as a country, that we are just helping Silicon Valley succeed, but at our cost. What that has looked like is the promise of jobs, digital jobs, a million digital jobs being promised through the business process outsourcing model. And when you dig it up, it's basically: you're gonna do algorithm training work for the big tech companies in Silicon Valley, or content moderation work for Silicon Valley. So this center, this beautiful center that's supposed to be the capital of digital technologies, at least for the continent, ends up being the capital city of abuse. And either way, we are having lots of conversations around technology, but they're not positive conversations. A lot of them are harmful conversations. And that, for me, is what has always been quite sad about the situation we're in. From the African perspective, we do have lots going for us, like good connectivity here. I mean, yes, we've had some good examples of technology arising out of here and being used elsewhere. We seem to be very advanced; governments, you know, even have government services online. Those are good examples, but those are very small use cases. What's the bigger picture here? Yeah.
Ben Whitelaw:Yeah, that's a really helpful overview, and it's really what makes the story of Daniel's case, of the 185 moderators, which is referred to as the Kiana versus Meta case, and of the third case, which we're gonna talk about in the course of today's podcast, the Abrham versus Meta case, which is about the Ethiopian civil war, and we'll go into more details on that, so fascinating. These cases are fundamentally about digital labor, in large part, and this is an exploding story not just in Kenya but in other parts of Africa as well, in Nigeria and in Ghana: who does the content moderation, but increasingly, who does the training of these large language models that we're all going to end up using, it seems, inevitably. It's this question of the humans in the loop, which is often the term people use, you know, the humans who are in the process, employed by platforms and by business process outsourcers, these huge companies who put bums on seats to do content moderation at scale. It's a fascinating story, not one that has very happy endings for a lot of the people involved, but it is very important, and it's really why I wanted to get you on the podcast to talk about it. You wrote a few months ago for Al Jazeera, Mercy, about how you felt that the people of Africa, people in Kenya and elsewhere, were digital cannon fodder for Meta, in what you called a war against harmful content that the company has never really committed to fighting. Is that something that you've increasingly come to believe as you work on these cases? What is it that's made you arrive at that point?
Mercy Mutemi:Yeah. There's a very particular story that these three cases tell. So, just as an overview, think about how human content moderation is carried out, and specifically how it's being carried out on the African continent. Number one, it's not multiple centers in different countries. It's one center in one country, and then bringing in people from different African countries. The peculiarity here is that every African country has hundreds or sometimes thousands of tribes. We are not a monolithic society when it comes to language or culture. So if you are to do content moderation, then truly you do need people from different communities to come in and moderate content. So you have content moderators coming from South Africa, from Ethiopia, from Nigeria, coming into Kenya to do content moderation work. And what the two labor-facing cases, for example, focus on is just how badly mismanaged that process was. You are coming to do content moderation work, which we now understand to be very dangerous work, work that inherently carries the risk of mental health harm, without warning that this is the work you're coming to do, without even disclosure that you're coming to do Meta's work, or Facebook content moderation work. It's just vaguely described. And there's this boom on the African continent around digital jobs, so when you tell people there's a digital job in Kenya, they're going to go for it, because it's being framed as an attractive venture for the future. Right? And then it's the model itself that's also very problematic, leaving aside that content moderation work carries inherent risk of mental health harm. You do that work in a model where, one, if you remember what the community guidelines looked like before January 2025, they were numerous, they were granular. It was rule, and then sub-rule, and then sub-rule, and then sub-rule, and how it was being done here was: a moderator had to sit with content and match it to a very specific sub-rule, because they're also training the algorithm to be able to pick this content up in the future. But then, adding to the mix, very unachievable metrics, like: you have 70 seconds to make this decision, you have 60 seconds to make this decision. The whole model becomes so exploitative. And now take the example of Ethiopia going through a conflict, in a country where a lot of misinformation, a lot of incitement is happening on Facebook. A lot of content is being reported and queued to the content moderators for them to take down. But then this is a broken system where, one, you cannot make a wrong decision: it will lead to you being penalized and your salary being reduced. You cannot take longer than the assigned time for a piece: you'll also be penalized for that. You have to review as many videos as you can in a day, but the videos just keep coming in, and there are not enough people on the floor, or even people speaking the language. And what ends up happening is what the Abrham versus Meta case shows us. Number one, you have community standards that are not contextualized to the African continent or to African communities. The issue here is, in times of conflict, there are signs that a conflict is escalating, and I mean, we have to learn from previous genocides that have taken place. Learn from Rwanda. The first step is to dehumanize people, to call people, you know, the names of animals. That's what was happening here.
And the community standards do not acknowledge that, like, when you start to dehumanize people, then this is content that violates community standards. Nor is there a structure where the people on the ground actually doing the work can communicate directly and cause the community standard to be changed. So if the community standard describes the threat, say a direct threat to your life, in English, everything has to be sieved through that standard. If it was in Swahili, then when you translate it, because of the slang in different languages, it doesn't translate to harm. The English version takes priority, and that is in itself a false equivalence, because what matters is how people are receiving it. So in Abrham's case, it's content that clearly violates the community standards, clearly is incitement, it's doxing. You know, when we talk about content moderation, there's always this concern of, oh my God, free speech. I feel like the world all over has agreed that there are some standards, some bare minimums, where we are not debating about terrorist content, we're not debating about incitement-to-violence content, for example. So if such obvious things cannot be taken down from the platform, because, number one, there's nobody on the floor that speaks the language, or the people on the floor are too overworked, or the people on the floor cannot effectively communicate with the platform to change the community standard, then you end up with the social media platform facilitating violence in real life. So when I step back and look at all of this, it's that Facebook does make money off Africans, because advertising is the model. Advertising is the goal here. So these content moderators are cleaning up these platforms so advertising can continue on the front end. What's happening to them is, their lives are getting destroyed by doing this work, number one, and number two, the communities they come from are being destroyed, because the model of doing the work is so poorly structured that it cannot lead to any other positive result than Meta making billions in advertising.
Ben Whitelaw:Yeah, you've really unpacked that neatly. But this is the thing: the fact that platforms don't have offices in major countries and markets where they have millions of users, don't have trust and safety teams there, and don't have the skills or the AI models or the language coverage to be able to moderate the users in that country, is really a thread that runs through all of these different cases. Everything is being done from the West Coast of the US, as you say. That has been a running theme since I've been doing the newsletter, since 2018, and it continues to be an issue that crops up in so much of what we talk about on the podcast. Let's go a bit deeper, Mercy, into that third case, the Abrham versus Meta case, because he has spoken to openDemocracy relatively recently; he's the plaintiff in the case that you started to talk about there. But I want to give an overview of this for listeners who aren't as familiar. So let's go back to 2020. Many of our listeners will know about the Tigray conflict in the northern part of Ethiopia: essentially a civil war, over the course of two years, in the region bordering Eritrea, as a result of clashes between government forces and the Tigray People's Liberation Front, a group of armed forces in the region who, for reasons we won't go into, were not fans of the government at the time. So we've got a civil war. That civil war ended up killing between 80,000 and 380,000 people, according to different estimates, and there was extensive fighting. And within this, there was an anonymous account on Facebook that accused a chemistry professor of funding the Tigrayan rebels, and also of defrauding the university that he worked for. He was doxed, and for the reasons you mentioned, none of those posts were taken down or had anything done with them. We can go into why that was. But he ended up being shot at his home, and the anonymous account was later found to be part of a disinformation network that was seeding disharmony online in various forms. Now, the case you're talking about, Abrham versus Meta, is brought by his son, that professor's son. And I want you to give a bit of an overview for people: what are the reasons for the case, what is he asking for, and where is the case at, before we go into his latest comments to openDemocracy?
Mercy Mutemi:So, Abrham's case is brought by three parties. There's Abrham, there's Fisseha, and there's the Katiba Institute, which is a civil society organization here in Kenya. And the premise of that case is: one, content moderation was happening here in Kenya. So when Abrham's father was being doxed by a post, and people were reporting it using the reporting tool on Facebook, those posts were being queued to content moderators here in Kenya. Right? And ultimately, the decision of do you delete it from the platform, do you not delete it from the platform, was being made in Kenya, and that's why that case is being litigated in Kenya. Fisseha as well: he's from Ethiopia, but he was in Kenya when he was being doxed, when he was facing a lot of incitement content. And
Ben Whitelaw:It's worth noting he's an Amnesty International researcher, right? So he's been doing research in this area and was also subject to doxing, as you mentioned.
Mercy Mutemi:Yes, yes. At that time he was researching for Amnesty International. So the question then becomes, number one: can you be making those kinds of decisions here in Kenya, where we have a constitution that says, this is what freedom of expression is, and these are the very specific categories that freedom of expression does not extend to? We have propaganda for war, we have incitement to violence, we have hate speech, all of which is what the posts qualify as. So the question before the courts is: if you are carrying out these activities here in Kenya, can you then violate Kenya's constitution in the process and make money off it? Because these posts, by their viral nature, they become viral; a post goes viral, advertising revenue increases. So essentially, can a private company make profit off violating human rights law here in Kenya? That is the question before the courts. And the stage the case is at is, we just had a decision in April. Something you mentioned earlier around jurisdiction has become really the biggest issue in all three of these cases. Some facts here are relevant. Meta does not have a physical office in Kenya, as far as we can tell. I mean, there's rumors of like a branding office, you know, where content creators can go and have lunch, there's rumors about that.
Ben Whitelaw:much. Sounds
Ben Whitelaw:Sounds really nice.
Mercy Mutemi:But as much as we can tell, for the purposes of establishment and, legally speaking, jurisdiction, Meta isn't here. Right? But it had content moderators here, and it makes billions in advertising revenue from Kenya. So can Kenyan courts hear a case against them when they're not physically in Kenya?
Ben Whitelaw:And just to clarify, that's because the business process outsourcing company was based in Kenya, right? This is the way that you can have a certain type of work done in a country without having an office there: you essentially hire a BPO to hire moderators, keep them on staff, and make sure that they're supposedly happy. In this case, none of that was really done, according to...
Mercy Mutemi:I think the web gets more complex, because it's Meta in Menlo Park, with a company in San Francisco, Sama, formerly Samasource, striking a deal in the US; Samasource setting up a branch of the business process outsourcing company here in Kenya; all the decisions concerning the content moderation being made in San Francisco by both Samasource and Meta; and then Meta being completely in charge of the actual system that conducts content moderation. So it's really one of those situations that break the mold of our traditional understanding of jurisdiction, and our traditional understanding of employment, even. The jurisdiction piece has been settled; the employment piece is still yet to be determined. So if we were to discuss the jurisdiction a little bit: the question, from the beginning of at least the labor cases, was just Meta saying, actually, you can't sue us in Kenya. We're an American company.
Ben Whitelaw:And
Mercy Mutemi:With Abrham's case, it was the same thing: the harm was felt in Ethiopia, we are an American company, you shouldn't sue us in Kenya. And the decision we just got from the Constitutional and Human Rights Court here in Kenya was that, number one, Meta doesn't enjoy any form of immunity. It can be sued in Kenya, especially if there are actions taking place here in Kenya. The Kenyan courts have the authority to examine those actions and see whether they violate the Constitution and the Bill of Rights. Number two, it doesn't matter where the effect is felt: as long as the action takes place in Kenya, if what you've done here violates the Constitution, then we handle the issue of harm later, but at least the Kenyan court is the correct court to bring these cases to. And I think this has been very precedential and very fundamental, because think about the algorithmic amplification cases, for example, before the Supreme Court in the US. There was a big debate when the Gonzalez case was going on, when the Twitter case was going on, and, I mean, the cases were decided on a different basis, but the issue of Section 230, and the whole immunity afforded by Section 230 to tech companies, was center stage. Can you really sue these companies for their algorithms? Because, you remember, Congress's idea was: we need to protect these nascent technologies; if you allow them to be sued, then we're going to destroy technology. I mean, it's 2025, I don't know that these technologies are still nascent. But for me, what is really paradigm-shifting about the decision in April in Abrham's case was African courts giving us an alternative route towards tech accountability, which is: examine the effect on people. If there were rights that were violated, then there is cause to sue the tech company. Forget about administrative law, forget about Section 230. If there are people that have been harmed, then they deserve a remedy. And that is, for me, mind-blowing. Yeah.
Ben Whitelaw:And am I right in saying that these cases are the first of their kind, in that you have Meta being sued outside of the country because they've lost these appeals? They now face these two cases, the combined one of Daniel Motaung and the 185 moderators, and the Abrham one that you've talked through, in a Kenyan court. At what point in time will that happen?
Mercy Mutemi:So in Daniel's and Kiana's case, they've appealed all the way to the Court of Appeal, which normally would be the last court of appeal, unless the Supreme Court finds that this is a case that it wants to hear. So we don't know yet whether the Supreme Court is going to want to hear an appeal for Meta. Meta definitely wants to appeal all the way to the Supreme Court in the labor cases. But for all intents and purposes, it seems that the labor-facing cases are proceeding to trial, because the Court of Appeal has pronounced itself on them. And in Abrham's case, we've just gotten the decision from the Human Rights Court, and Meta has indicated that it wants to appeal to the Court of Appeal. We haven't seen the appeal yet, so that might still have some journey to travel. But you are right: this is the first time a court outside of the US, for the labor-facing cases, has decided to try Meta outside of the US, 'cause we do know there was a content moderation case happening in the US. It was settled, but at least it wasn't thrown out on jurisdiction. Now, on Abrham's case, I want you to remember the Myanmar case, which, incidentally, in the US was lost at the first stage and is now on appeal. Drawing parallels, it's crazy to think that one of the reasons the Myanmar case was thrown out was Section 230 and the immunity there. I mean, there were other reasons, including limitation of actions. But here, at the first instance, the court is saying: we don't care about those other rules. We just care about the people, about what happened to the people. So that's what's really different here.
Ben Whitelaw:And this, for context, the Myanmar case you're talking about is when the Rohingya sued Facebook for $150 billion, if I remember rightly, for Facebook's role in the Myanmar genocide.
Mercy Mutemi:Genocide. Exactly.
Ben Whitelaw:Yeah. So this was a 2021 case. I actually didn't realize it was on appeal now. So again, what you have is Facebook essentially pushing back in all forms, rather than going to trial in these cases, wherever they are, whether it's Myanmar, the US, or Kenya. One of the reasons we're talking about this now, Mercy, is because Abrham has given an interview to openDemocracy in which he's talked about the case. He is clearly very angry about what's happened. He says it's disgraceful that Meta would argue that they should not be subject to the rule of law in Kenya. African lives matter. And, you know, it seems like this is something he is going to continue to run with. He talks about his father as well. So again, we don't need to go too much into his comments, we can include them in the show notes, but it just reiterates, I think, the importance of these cases and what they could mean if they do end up going to trial and going through those stages that you mentioned.
Mercy Mutemi:Yeah, absolutely. And I think one other thing that the openDemocracy piece points out is really how this was not an isolated incident. This is part of a pattern, and you hear this in Abrham's interview as well. He understands the systemic way in which Meta's role destroyed a lot of communities, but the piece also highlights several other families that were impacted in the same pattern, which, for me, is really what we shouldn't miss about this. It's not just one person in Ethiopia that was affected; entire communities were destroyed by structures that have been set up by tech companies for business purposes. And there's a big conversation there that should never be lost. Yeah.
Ben Whitelaw:Yeah, agreed. Again, I hadn't seen these other examples, as you mentioned, and it feels like the tip of an iceberg, you know, that these cases are potentially hiding a much greater number of people who've suffered as a result of some of these issues. It's worth noting that openDemocracy reached out to Meta for comment; they didn't receive a response. Meta haven't done a lot of commenting on this, I guess as a result of the cases. But yeah, I think that probably wraps us up for that story. Thank you, Mercy, for giving us such a great overview. We'll move now to what looks to be the alternative model of moderation, Mercy, if Meta and other companies have their way, and that is the community notes model. You touched on the January 2025 announcement by Meta in your opening remarks. We know a lot shifted when Mark Zuckerberg donned his gold chain and his expensive watch. I still have his image etched in my brain, I dunno about
Mercy Mutemi:Oh my God. We all need a moment of cleansing to just erase that, because that was crazy. And the three-hour podcast after that.
Ben Whitelaw:Yeah, yeah. If Mike was here, he'd be talking about Joe Rogan and that podcast, you know, for sure. Bless him, he listened to the whole thing, which I didn't have the patience to do.
Mercy Mutemi:I can't.
Ben Whitelaw:It was a big moment. And they said they were moving to a community notes model, much like X/Twitter's Birdwatch-turned-Community-Notes model. But we haven't heard a lot since then, despite them saying that they would give us an update. They've published several reports about the prevalence of harmful content on the platform; just a few weeks ago, they put out their Community Standards Enforcement Report, which said that mistakes had halved in just one quarter. So the tweaks that they'd been making to their dials had meant that there were fewer incorrect moderation decisions happening. Whether we know that for sure, you know, it is just their word, but there was no update, Mercy, on the community notes model. So I was very interested to see that a good friend of Ctrl-Alt-Speech and Everything in Moderation, Alexios Mantzarlis, who leads the trust and safety program at Cornell Tech, did his own research, and we'll include the link in the show notes. He's actually started a great little media outlet called Indicator, which is doing some fantastic work around fighting digital deception and OSINT; he and Craig Silverman, who run it, are doing some fantastic work. But Alexios has basically done some very guerrilla-style community notes research. He asked three people in the US to screenshot all of the community notes that they see as early signups for the program, and they sent these screenshots to Alexios, and he did some analysis. Obviously it's a very small data set, three people, something like 60-odd community notes that he was able to analyze. And what he finds is, you know, there's some good stuff, there's quite a lot of bad stuff, and there's some, frankly, very badly marketed stuff, which I just thought would be kind of
Mercy Mutemi:Yeah.
Ben Whitelaw:So the good stuff, Alexios says, is that some of the community notes are spotting where content is AI-generated or AI-driven. So for people who are really uninitiated, there are some helpful community notes that say: this has been made by ChatGPT. And that's helpful, right? As more and more content is created through LLMs. That's
Mercy Mutemi:bare, bare minimum. That's bare minimum. So
Ben Whitelaw:Bare minimum, yeah. Yeah. It's kind of like best case scenario.
Mercy Mutemi:yeah.
Ben Whitelaw:Then some bad stuff. He notes that some of the notes included outright lies, misinformation themselves. There was a post, I think a Facebook post from LATV, about some recent protests in the US, and it basically included a URL that was related to a different protest from a few years ago. So, you know, you've got misinformation on misinformation, in a way. And then the thing I found really interesting was that a lot of the community notes on Meta at this stage were basically completely not the right thing. People had left community notes not really knowing what they were. Some had left comments, like they were leaving comments under an article; some had used it a bit like a post itself. Much of it was opinion, so these were community notes that just couldn't be fact-checked at all, and actually shouldn't have been community notes in the first place, and shouldn't have been requested as community notes. Alexios is kind enough to say that this is the early stages of Meta's program; we're only three months in. Birdwatch did a lot of testing, almost two years of testing, before Twitter rolled it out to its community. But it doesn't look very promising, I'll say. If we're gonna move to this new model, Mercy, then I'm not sure it's all it's cracked up to be. What did you think?
Mercy Mutemi:Okay. So, first of all, when we do research like this, I need a parallel piece of research on what that same feature looks like in Africa and in other countries. I think that's the first context we have to give, from my reading of this piece. And they've done a really good job, and congratulations also to your friend for the page. If it's being piloted in the US, I just want you to imagine how much less common that feature is, or how much less it's being used, here in Africa. But also, it was framed as: we are abandoning content moderation. To me, what this presents is that you've just replaced who does the content moderation. It doesn't seem like you've abandoned content moderation; the focus is still to have some moderation of some kind. The only thing that's very clever about what they've done is that this is a way that keeps the user longer on the platform as they contribute to the community notes. I mean, genius business decision, because that's the ultimate goal: to keep people on the platform. But other than the utility of it, I mean, come on, you can't be surprised that these anomalies are coming out. First of all, you are hoping that society is going to self-regulate and somehow come to an agreed position on some very controversial issues. When did that ever happen? When did that
Ben Whitelaw:In your dreams?
Mercy Mutemi:When did that ever happen? I mean, if the aspiration was for content to be tagged, like, this is extreme, this is extreme, and this is a middle position, that I would understand. But there's this fallacy that a very polarized society is going to come and argue its way into a common position, and for me, that is a fallacy that you are now asking tech engineers to put into action. I don't know how that's going to be possible. But then as well, it's how we are seeing digital literacy come into play. How educated is the audience that you want to come in and build these community notes? How much literacy do they have around how to spot fake news, how to read? 'Cause the other thing is, for this system to work, people actually have to be very well-read. So I'll give a utility example from what we are seeing with the community notes on X here. I think a lot of African countries are going through what could very likely be like our own African Spring, so to speak, where people want, you know, better governance, better protections for people. So lots of online conversations are very anti-abuse and anti-repression. And what community notes ended up being used for is not pointing out what's accurate about something; it was used as a tool for mobilizing and as a tool for advancing advocacy. And this is how it worked. If a politician or the government posts something, you remember what it used to be like with fake news?
Ben Whitelaw:yeah.
Mercy Mutemi:So someone says something that's clearly so fake, or that's clearly so true and accurate, and then you call that fake news. Like the alternative facts saga, it's the same concept here. So you use community notes to change the narrative, and what you end up with is a false positive, where a lot of people are contributing to the community notes to change the narrative of what is actually happening. So you never quite know whether this is accurate, right? In our case it's positive, but only incidentally, because it's towards a good cause. If you flip that, and it's now a repressive government changing the narrative, denying, for example, that people are being shot during protests, that becomes very damaging. So I think it's not a way to solve the bigger problems that we're seeing with content moderation. I think it's a new problem that we've created, one that seems to have some financial benefit for them, 'cause people now are staying online longer. I dunno whether the statistics support that, but it would seem that if I need to keep leaving comments, I'm staying on platforms longer, so you're making more money off me. But at the same time, as a society, the core problems remain: very extreme content wasn't being moderated correctly, wasn't being taken down from the platform. We are just right where we were. And this is the other thing, from an African continent perspective: you were never good at the content moderation to begin with. You barely put any effort into it. So
Ben Whitelaw:It's a really important point to note that all of our focus is on community notes primarily in the US, and that's where Meta has released data about the effect of its changes to moderation policies. There is this focus on the US, and a lot of the product pilots, as you say, are done in countries in the West; very few of them are piloted in other big markets like Kenya or Nigeria or even South Africa. So there is this massive gap in the data, which I think is really key to note. Also, the point that you make there, that this is just another vector to be gamed, is so important. There's a piece that was published this week about the rush towards community notes that many of the platforms are making, and how a lot of the community notes are being kept in the "Needs More Ratings" purgatory. So basically, these notes work through clever bridging algorithms that need people from different parts of the political spectrum to rate them and to deem them important, but so many of the notes get left in this kind of hinterland where no one sees them. And this goes to show, again, that they're not as useful as we maybe thought they were initially. And also, there are people who are figuring out ways to reverse their categorization. This piece from the Columbia Journalism Review describes how there are Telegram communities that will post a community note in a chat and say, can you rate this upwards, or can you rate this downwards, so that it either gets seen by more people or by fewer people. These are basically posts by a different name, and we are back where we started, in a way.
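For readers curious how the "bridging" scoring Ben mentions can work, here is a minimal sketch in the spirit of the matrix-factorisation approach X has open-sourced for Community Notes. Everything in it, the toy ratings, the learning rate, the helpfulness threshold, is an invented illustration, not Meta's or X's actual system. The key idea is that a note only earns a high intercept (and "helpful" status) when raters on both sides of the learned polarity axis approve of it; notes approved by only one side have their agreement absorbed by the factor terms and stay in "needs more ratings".

```python
# Illustrative sketch of bridging-based note scoring (not a real platform's code).
import numpy as np

rng = np.random.default_rng(0)

# Toy data: ratings[rater][note] = 1 (helpful) or 0 (not helpful), NaN = unrated.
ratings = np.array([
    [1.0, 1.0, np.nan],   # rater 0 (one "side")
    [1.0, np.nan, 0.0],   # rater 1 (same side)
    [1.0, 0.0, 1.0],      # rater 2 (other side)
    [np.nan, 0.0, 1.0],   # rater 3 (other side)
])
n_raters, n_notes = ratings.shape

# Model: rating ~ global mean + note intercept + rater_factor * note_factor.
# The note intercept captures approval NOT explained by ideological alignment
# between rater and note -- that is the "bridging" part.
mu = np.nanmean(ratings)
note_bias = np.zeros(n_notes)
rater_f = rng.normal(0, 0.1, n_raters)
note_f = rng.normal(0, 0.1, n_notes)

lr, reg = 0.05, 0.03
observed = [(u, n) for u in range(n_raters) for n in range(n_notes)
            if not np.isnan(ratings[u, n])]

for _ in range(2000):                 # plain SGD over the observed ratings
    for u, n in observed:
        pred = mu + note_bias[n] + rater_f[u] * note_f[n]
        err = ratings[u, n] - pred
        note_bias[n] += lr * (err - reg * note_bias[n])
        rater_f[u] += lr * (err * note_f[n] - reg * rater_f[u])
        note_f[n] += lr * (err * rater_f[u] - reg * note_f[n])

THRESHOLD = 0.15  # invented cutoff; below it a note stays in "needs more ratings"
for n in range(n_notes):
    status = "helpful" if note_bias[n] > THRESHOLD else "needs more ratings"
    print(f"note {n}: intercept={note_bias[n]:+.2f} -> {status}")
```

Running this, note 0 (approved by raters from both sides) ends up with a clearly positive intercept, while notes 1 and 2 (approved only along partisan lines) do not, which is also why the coordinated up- or down-rating campaigns described above are an attack on exactly this mechanism.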
Mercy Mutemi:Yeah, you've basically converted it into a game. And like when Meta switched its algorithm, people very quickly figured out how to game the algorithm for more engagement and more monetization. The same thing is happening here as well. People are realizing: this is how to cheat the system. And then more effort is being put into cheating the system than into actually doing what you're supposed to do. Yeah.
Ben Whitelaw:Yeah, I know. The alternative that these platforms are essentially positing as the answer is showing no significant benefits, I would say, over what we started with, which is human moderation. So we've really gone deep on some of the big stories related to moderation in Africa, Mercy, and I've really appreciated your experience. For the last 10 or 15 minutes of the podcast, we're gonna go through some of the other stories, which I think are similar and touch on a number of the issues that we've covered today. The first one, which you brought to us, is something that I haven't read about and haven't been paying very close attention to, but it's fascinating: this idea of the monetization programs that creators sign up to being discriminatory against African creators, basically. Mike and I touched on these revenue-sharing programs last week on the podcast, so this is a fascinating addition to that. Talk us through what you found interesting about these stories and how it's all unfolding.
Mercy Mutemi:Yeah. I think my first shock was realizing that, for an African YouTuber based in Africa, for example in Kenya, your revenue is determined by where your viewers are located. For some reason, viewers based in the West fetch higher income than viewers who are based on the continent. So you could have a video with a million views, but if they're all viewers from Kenya, then you don't get to monetize it at the same level as if you had, say, 200,000 viewers from the US and Europe. And for me, that is mind-blowing. That's why one of the stories, and it's not very recent, but I do think it still deserves attention because it's very relevant to date, is a story by Fast Company about a creator based in South Africa. The title is "The Meaning of Being an African YouTuber: Big Audiences, No Big Money", and the author really goes into the details. He focuses on a creator who explains how his video went viral, like, the world over. It was a very popular video about J. Cole, the rapper, and it got 1.1 million views, but the monetary return doesn't reflect that whatsoever. And you realize this is an African thing: it's not just Kenyan creators; creators from South Africa, from Nigeria, are also experiencing this. So for me, that was the first red flag. I think I noticed this maybe six years ago, that YouTube had a discriminatory program. My suspicion, and I do think a lot of research needs to go into these kinds of discriminatory structures, my theory, is that the reason there are discriminatory structures is where the advertisers are located. So let's say YouTube gets the most advertising revenue from one continent: then it's going to prioritize viewers from that continent, because this is the same model we are seeing across all the other social media platforms, right? But if that's the reasoning, and I'm not saying we know for sure that this is YouTube's reason, then for me it's absolute discrimination, because we all join the platform the same way. It doesn't matter what part of the world you are based in, doesn't matter whether you are in Africa or in Europe: we all join YouTube the same way. Which then means that YouTube is using a secondary metric, how many advertisers in your country are advertising on YouTube, to decide which of us gets preferential treatment. And that's classic discrimination.
Ben Whitelaw:It is really interesting. I mean, it goes back to, I guess, the cost per thousand, which is referenced in the article, you know, the classic: how much money does an advertiser pay to get a thousand impressions of their advert? And it probably speaks to a wider issue across the whole internet, in old-school formats like display banners as well as the social media video inserts and the newer formats. But you're right, it's completely variable. It's a completely different experience if you're a creator in Kenya or in Lagos versus if you're in the US. And it's not just the CPMs, which are one thing, and, you know, kind of structural and almost embedded in the internet, to be honest. It's more so who gets access to these revenue programs. How does somebody within the business development team of a platform decide that eight out of 12 creators who get allowed onto a revenue-sharing program are all from the US? Who's making the decisions, right, about who gets to be allowed onto these quite lucrative programs, sometimes with access to support, or increased revenue-sharing capabilities, or, you know, boosts to the content that you provide? It's all very unclear. And I guess somebody from the platforms, if they were here, would say: that's kind of how it works. We are here to make money. If an advertiser's willing to pay more money and they're based in the US, what are we to do about it? What would you say to that?
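To make the cost-per-thousand mechanic Ben describes concrete, here is a back-of-envelope sketch. All the numbers in it, the per-country CPM rates and the 55% revenue share, are invented for illustration; they are not actual YouTube figures:

```python
# Illustrative arithmetic only: the CPM figures and revenue share below are
# invented for the example, not actual platform rates.

def creator_revenue(views_by_country: dict[str, int],
                    cpm_by_country: dict[str, float],
                    revenue_share: float = 0.55) -> float:
    """Estimate a creator's payout when ad rates (CPM = cost per 1,000
    impressions) vary by where the *viewers* are located."""
    gross = sum(views / 1000 * cpm_by_country[country]
                for country, views in views_by_country.items())
    return gross * revenue_share  # the platform keeps the rest

# Hypothetical rates: Western impressions fetch far higher CPMs.
cpms = {"US": 6.00, "UK": 5.00, "KE": 0.80, "NG": 0.60}

# The same 1.1M total views, with very different audiences:
mostly_kenyan = {"KE": 1_000_000, "US": 100_000}
mostly_us = {"US": 1_000_000, "KE": 100_000}

print(creator_revenue(mostly_kenyan, cpms))  # ~ $770
print(creator_revenue(mostly_us, cpms))      # ~ $3,344
```

Under these assumed rates, the identical view count pays out roughly four times more when the audience is mostly American, which is the dynamic the Fast Company piece describes.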
Mercy Mutemi:Yeah, I mean, that's their problem, right? There's two things happening here. There's YouTube placing itself as an advertising platform to advertisers, and that being wholly dependent on Google's ability to market itself to advertisers, right? And then there's Google inviting people to share content on the platform and not requiring any form of tiering. It would've been different if I paid a thousand dollars to join the platform and then got paid more for every view. That would make so much sense, because then at the inception you've already tiered people. But the thing here that I think is quite problematic is that you are using a metric that users have no control over whatsoever. This creator doesn't get to determine how many advertisers advertise on YouTube in their country. That's wholly dependent on how much money Google has decided to spend in advertising itself here in Kenya.
Ben Whitelaw:Yeah.
Mercy Mutemi:So Google's decisions affect this in a very discriminatory way, because the user does not have any control over how their income is going to flow in. That's determined wholly by very discriminatory practices.
Ben Whitelaw:And it has subtle effects as well. You know, this creator talks about focusing on content that is gonna be attractive to US audiences rather than African audiences, right? So this is a kind of subtle, you touched on it earlier, gaming of the algorithm, or gaming of the system more generally, in order to make a living from it, which I think is fascinating. And as the creator economy explodes, and we're seeing more and more people go directly to individuals with big audiences for their news and information, it's just gonna become so much more pronounced. I agree, there needs to be research on these kinds of programs.
Mercy Mutemi:I think there's something you mentioned, perhaps on your last episode with Mike, which was that how the monetization programs are set up ends up controlling what ends up being free speech on these platforms. So if I realize that the only time I'm going to make considerable revenue is if I post content for a Western audience, that changes my entire approach. It doesn't matter what I started out as; I want to make a living off of these platforms. And I think there's something being lost in that. Is this really free speech then, or is it just commercial speech? What's really happening here? And sometimes, like in YouTube's example, you have to go three layers down to discover the discrimination that's there. Think about the TikTok Creator Fund, which is so openly blatant about it: the Creator Fund is open to some countries, oh, but not these countries in Africa. My mind is blown by such openly discriminatory policies, and these companies don't seem to bat an eye putting it down on paper that this is our policy. And sometimes the most ridiculous part is: oh yeah, but you can get on the live and people can contribute and send you money, you just can't benefit from the Creator Fund. Have you seen how big an audience TikTok has in African countries? Look at the statistics: it's one of the fastest-growing platforms here. And let me introduce another aspect to it. American creators are making more bank because of African audiences, because the algorithm has decided that we must all consume Western content. So African eyeballs are making Western creators money, but that is not reciprocated. This is layered.
Ben Whitelaw:Yeah, no, it's true. I would love to read some analysis about this: who gets to decide, and what are the justifications? We've almost become expectant that these programs will only be rolled out in certain countries to begin with, and we don't really question when they're not extended to other countries. So I think it's a great thing to flag. Let's go on to our next story, which we both read and were interested in. This is a story that I found, which builds on something that Mike and I talked about a few weeks back. Listeners might remember we talked about a New York Times story about the rescue of hundreds of people from an online scam mill in Myanmar. It was a huge operation that took place in February and kind of came out of the blue; there's some politics behind it that we won't go into now. But one of the people who was rescued, a Sierra Leonean man called Mustafa Momo, has talked about his experience, and it is a really harrowing account that he's given to the Times of London about what happened. He's a 36-year-old sports teacher. He was reached out to by somebody online who offered him a teaching job, and he went to Bangkok in the expectation that he was gonna be a sports teacher, just like he was in Freetown. It turned out very differently. He was shepherded on a four-hour bus drive across the border to Myanmar. He was forced to sign a contract, which sounded incredibly scary; he was tortured for three days because he initially didn't want to sign it. And he ended up spending nine months there, in which he was essentially scamming people online across various different platforms. He was set a $15,000-a-month scam target, and if he or any of the other people in the mill weren't hitting that target, he was beaten with pipes, or tasered: a huge, huge amount of abuse. He luckily managed to get out. Again, we won't go into the background, but this story really neatly sums up that whole episode, Mercy. It goes back to something we keep talking about, really, which is the structural inequalities that come with being an African on the internet, I would say. And also, the lack of help that this man had when he was found to be in a mill was just shocking, you know? What was your take on it?
Mercy Mutemi:Yeah, I mean, the story was heartbreaking for me. And one of the things that stood out to me was: look at the countries that have been mentioned in the articles, like where his colleagues came from. He's from Sierra Leone; his colleagues are from Uganda, Ethiopia. Right? And the reason that's very concerning for me is that on the African continent, as I mentioned earlier, there's this huge crusade around: apply for jobs online, take on these digital jobs. And in this particular instance, that's how it started for him. He clicked on a link to apply for a teaching job online, and from there it's done with this pretense that this is a professional job: you're doing interviews, here's your ticket, here's your permit, come in, and then everything changes. So the first thing, as an African, is: how much more of this is happening that we are not aware of? If this is just one story, how many more are there? Number two: our governments are literally shepherding us into these kinds of situations, because in Kenya, for example, this idea of digital jobs is being paraded by the government itself. And there have already been lots of conversations about the government's involvement with the human trafficking that's happening in Saudi Arabia with domestic workers. Now there's this other situation where it looks like the government is pushing young people to go for digital jobs, with promises of being paid in dollars and earning a lot of money, without there being proper due diligence channels or proper employment agencies involved. And in this case, if a Kenyan was caught up in this, there wouldn't have been an embassy. Either way, they wouldn't have gotten the help that they needed. And it made me reflect on a lady I represented a while back who was caught up in a similar Chinese-run scam around wire transfers, where they were involved in criminal activity. There was this era of SIM swapping, where people would be made to get as many SIM cards as possible, register them, and then use the SIM cards for crime. Very many times, like in the case I'm highlighting, people don't know what they're getting caught up in. All they want is jobs, and we have a right to jobs, but we also have a right to be protected as we apply for these jobs, as we put ourselves out there and say, okay, I need to do a job, I need to be a responsible adult and carry out a job. And there are so many other people in this framework who could do better in terms of due diligence, in terms of doing their job and enforcing laws and policies, including the platforms themselves. When you see this advert, and people are complaining that this happened to them, I would be very interested to know, for example: has the platform where he got the job advert pulled down those kinds of adverts, or are they still online? You have people here who are saying, this is what happened to me. If there isn't corresponding action by the platform company saying, absolutely not, we will not enable this kind of behavior, shut down this connection; if there isn't corresponding action from the government stepping up and saying, actually, we need to investigate this and create protections, then it just feels like, at the moment, there's some despondence around this issue on the continent specifically.
And I look back at the AU's pronouncements around what our future is going to look like, and even the AI conversation in Rwanda earlier this year. Everything is about digital jobs. Digital jobs, digital jobs. I'm like, stop. Where are the policies to protect people? We've had content moderators go through human trafficking from that perspective. We have people telling you that digital jobs that start out as clicks are taking us to human trafficking camps. These camps come up in different countries; at what point do we actually stop and take action?
Ben Whitelaw:Yeah, totally. I mean, you mentioned the African Union there, the AU. These are large multi-country governmental bodies who are pushing for the kinds of journeys that Mustafa has made. And when it doesn't go to plan, there is very little that anybody can do on an individual level. So, amazing insight, and obviously a terrible, terrible story. Let's finish, Mercy, on a really interesting reading. We were talking before we started the podcast about the Tanzania ban of pornography on X. I'd seen this story, but you've got a completely different reading of it. You've read between the lines, and I want you to share your thoughts with the Ctrl-Alt-Speech listeners.
Mercy Mutemi:Yeah. So the story you're talking about is a story on the BBC about Tanzania announcing a shutdown of X because of pornography. Okay, so first things first: it's important to understand that in the African community, pornography is a big conversation, and the fact that X doesn't seem to care or do anything about it has caused some conversation. I think it's important to just set that straight.
Ben Whitelaw:And when you say conversation, you mean there are very polarized views about whether it's acceptable in any form, right?
Mercy Mutemi:Yeah. I mean, there does seem to be a morality perspective towards it, which is: if there are no pre-entry requirements into a social media platform, then why should there be adult content on that platform? Because then that means children can access this platform. Some of the conversations are reasonable, some of them are extreme. It doesn't matter. But what I found to be quite interesting is the circumstances surrounding this shutdown, for those who might not be following what's happening in Tanzania. Tanzania is in the middle of an electioneering cycle, and the opposition leader, Tundu Lissu, has now been accused of and is standing trial for what seem to be trumped-up charges of treason. In Tanzania, they still have an old constitution; if he's found guilty, he could be sentenced to death. This is quite literally a very shocking trial that's happening there, and it doesn't seem that anything is being done as per the law. And we are all part of the East African Community, right? Kenya is, Tanzania is, Uganda is, and a lot of other countries are part of the East African Community. So there's a lot of solidarity, with people standing with Lissu and going into Tanzania to follow the proceedings when the case came up for trial. And some of the notable figures that went in were two former Chief Justices from Kenya. One gets let in, one is returned at the airport: former Chief Justice Willy Mutunga is returned, former Chief Justice Maraga is let in, Martha Karua is returned. There seems to be a lot of games and politics going on around it, but I think the one that was the most painful is that it culminated in two members of this East African community, Boniface Mwangi and Agather Atuhaire, being kidnapped by the authorities in Tanzania, being tortured and being sexually assaulted. Agather is from Uganda; Boniface is from Kenya. And that gave rise to a very, very bitter conversation, especially on X. There's a whole constituency known as Kenyans on Twitter that is very engaged when it comes to politics and democracy conversations, and now things seem to be escalating, especially on X. They have somehow managed to access Tanzania's police social media handles and are posting as if the police are against the Tanzanian president. They've somehow accessed Uganda's Parliament's website. It's a whole chaotic situation, because if people cannot riot physically, then online spaces, this is what they've become useful for. They help us to do online protest, and this was a form of protest. So this shutdown came in the context of that. And for me, I think the most important piece is that the way shutdowns are framed in Africa is always in this shoddy language. You'll have Ethiopia saying, it's because of examinations, we are shutting down the internet to curb cheating. You have Uganda doing something similar: oh, it's because of examinations, that's why we are shutting down, you know, social media platforms. But in a real sense, the target is to stop free speech around very important issues, because in those conversations lots of Tanzanians were also coming up and saying, actually, this is wrong, it's time for a change.
There's a whole campaign going on in Tanzania around election reforms, and shutting down X at this pivotal moment has far-reaching implications. I think when we approach these conversations around shutdowns, they should be approached with the complexity and the nuances that come with them, because what are the implications when you shut down such an important platform? I'm not by any chance suggesting that X is the most important social media platform. I'm just saying that, for all its wrongs, it has given us a means to converse around political messaging and political conversations. So what is the effect of shutting that down in the middle of an election cycle? That, for me, is the story.
Ben Whitelaw:Yeah. Fascinating. Using pornography as a cover to take out a whole platform. For all that Mike and I joke about Elon Musk and X slash Twitter, this is a really, really fascinating story, and an example of a government censoring what users are saying at a pivotal time. So I really appreciate this unpacking of that story. I had thought it was part of the broader theme we've seen recently of countries pushing back against pornography; actually, this is very different, and that's why the nuance is so important, and why we have people like you, Mercy, on the podcast. Mike and I want to have conversations where we have global experts like yourself talking about online speech issues. We had a guest talking about India a few months ago who gave us a fascinating unpacking of all that's going on there, and this has been very similar. I have really loved talking to you today. And I want to apologize for getting your name wrong. That's how I want to end this, you know? The least that I can do is to apologize. I hope you can forgive me.
Mercy Mutemi:Oh my God. I don't even hold it against you, because I realize what you did was to read it as an English name, you know, like it's "mute me." I'm African, as African as they come; every vowel must be pronounced with a consonant before it. So MU is "mu," TE is "te," MI is "mi." So it's not a big deal. I have to say, I really enjoyed having this conversation with you. It was maddening to relive the reality of the different contexts in which we live. But I think those are very important conversations to have, and I'd encourage more of that global conversation. I think it's easy sometimes to get lost in the singularity of it, and already in the singularity there are lots of problems. But imagine how those problems cascade down to the rest of the world. I think it's very important to have this conversation, so thank you for having me.
Ben Whitelaw:Yeah. I really, really appreciate that. I've learned a lot from you about the Kenyan content moderator cases that you've worked on for so long. I've also learned how to pronounce your name. It's been a big day for me, and I hope we can have you back one day. Thank you for making the time. I just want to give a shout-out to all the outlets that we've talked about on today's podcast: Open Democracy, the BBC, The Times, Indicator Media. We couldn't do the podcast without their excellent reporting and coverage of this important topic. And thank you again, Mercy. It's been fantastic to chat to you. Keep up the good work, and thanks for listening, everyone.
Mercy Mutemi:Thank you.
Announcer:Thanks for listening to Ctrl-Alt-Speech. Subscribe now to get our weekly episodes as soon as they're released. If your company or organization is interested in sponsoring the podcast, contact us by visiting ctrlaltspeech.com. That's C-T-R-L Alt Speech dot com. This podcast is produced with financial support from the Future of Online Trust and Safety Fund, a fiscally sponsored multi-donor fund at Global Impact that supports charitable activities to build a more robust, capable, and inclusive trust and safety ecosystem.