Ctrl-Alt-Speech

Live at TrustCon 2024

Mike Masnick, Ben Whitelaw, Dona Bellow & Alice Hunsberger Season 1 Episode 21

In the first ever live recording of Ctrl-Alt-Speech, Mike and Ben are joined at TrustCon 2024 by Dona Bellow, Senior Manager of Platform Safety Policy at Reddit, and Alice Hunsberger, PartnerHero’s VP of Trust & Safety and Content Moderation, to round up the latest news in online speech, content moderation and internet regulation, including:

- The trust and safety lessons of the CrowdStrike outage
- “Could social media support healthy online conversations? New_ Public is working on it” (Nieman Lab)
- The Tech Coalition’s new Pathways resources for smaller platforms

This episode is brought to you with financial support from the Future of Online Trust & Safety Fund, and by our sponsor TaskUs, a leading company in the T&S field which provides a range of platform integrity and digital safety solutions. In our Bonus Chat at the end of the episode, also recorded live at TrustCon, Mike sits down with Rachel Guevara, TaskUs Division Vice President of Trust and Safety, to talk about her impressions of this year’s conference and her thoughts on the future of trust and safety.

You can also watch the video of the recording on our YouTube channel.

Ctrl-Alt-Speech is a weekly podcast from Techdirt and Everything in Moderation. Send us your feedback at podcast@ctrlaltspeech.com and sponsorship enquiries to sponsorship@ctrlaltspeech.com. Thanks for listening.

Ben Whitelaw:

So hello and welcome to the first ever edition of Ctrl-Alt-Speech live. Um, love it. Love it. Um, Ctrl-Alt-Speech is the weekly roundup of the major stories about online speech, content moderation, and internet regulation, as you all know, seemingly. And, uh, this week's episode is brought to you with the financial support of the Future of Online Trust & Safety Fund and today's sponsor, TaskUs. It is wonderful to see so many faces in the audience, some of which we know already, some of whom are new. My name is Ben Whitelaw and I'm the founder and editor of Everything in Moderation, an independent newsletter about the policies, platforms, products, and people shaping the future of trust and safety. Um, before we start, before we get into proceedings, I want to say a big thank you to the Trust and Safety Professional Association for giving us the chance to do this recording. So can we give a big round of applause to them? Charlotte and her team have been absolutely fantastic this whole way through, and I have nothing but great things to say about all of them. Um, a lot of people over the last few days have talked about their origin story, about how they got into trust and safety, how they came to be here. And actually, a little, little story here. A few people know this, but Ctrl-Alt-Speech started at TrustCon last year. Um, Mike and I had never met before. We traded some emails. And it was actually Alice that introduced us at a TrustCon drinks 12 months ago. So we wouldn't be here if it wasn't for Alice, for TSPA, for TrustCon. So, um, really, really glad to be here 12 months on. Alice will claim credit for this whole thing. Just, just ignore her.

Alice Hunsberger:

You're very welcome. Please do not ignore me. Lots of important things to say.

Ben Whitelaw:

Um, for those of you listening on the feed, this will go out as a recording on Friday, as usual. We've got a brilliant interview with TaskUs's Rachel Guevara. Um, for those people in the room, you'll want to download that and listen. It's a really great interview. Um, so unfortunately you're going to listen to this and then you're going to listen again. It's good for the numbers. Uh, we appreciate you supporting the podcast. Um, and it goes without saying that if you or your organization want to do a live recording with us, or you want to sponsor the podcast, please get in touch. We're always looking to work with folks here in the industry, and we'd love to hear from you. And I've got a really great set of panelists on my left-hand side here. They really need no introduction, but I'm going to do a bit of that anyway, because they're brilliant, and I'll start at the end. Mike Masnick is the founder and editor of Techdirt, which was founded maybe longer ago than some people in this room have been alive. Is that, is that

Mike Masnick:

unfair? I guess it's nice to make me sound so old,

Dona Bellow:

but it's true. It is

Mike Masnick:

true. We started in 1997. So actually, is there anyone in the audience who was born after 1997? You're lucky.

Ben Whitelaw:

You're so lucky.

Mike Masnick:

Oh, wait, wait, we have someone. Oh my goodness. I think I have to retire now.

Ben Whitelaw:

Um, you will know a lot of Techdirt's work, but Mike also does some fantastic work over at the Copia Institute, which has produced games such as Moderator Mayhem and Trust & Safety Tycoon.

Mike Masnick:

Yeah, you can cheer for that.

Ben Whitelaw:

And the person to his right you'll most likely know as a LinkedIn influencer, but she also has a full-time job. Alice Hunsberger is VP of Trust & Safety and Content Moderation at PartnerHero, and was previously VP of Customer Experience at Grindr, and before that at OkCupid. There's nothing she doesn't know about dating and trust and safety, to be honest. But she, she's, she's obviously really a great person to follow on LinkedIn too. Um, so yeah, shout out for Alice as well. Come on. And last, but very much not least, Dona Bellow is the Senior Manager of Platform Safety Policy at Reddit. She's a Trust and Safety Professional Association board member, and she's worked at a whole plethora of organizations, including Google and Twitter and Facebook. Um, so yeah, Dona is an amazing person to have on stage today and we're really glad to have her here. So shout out for her as well. So yeah, you, you guys hopefully know the drill of how the podcast normally runs: we run through a series of stories, for those of you who might not have heard the news that week about trust and safety. We're going to take a story each this week, digest it, unpack it, and then we're going to offer you guys the chance to ask questions at the end of, uh, today as well. So there'll be 10 minutes or so of questions towards the end. Um, so let's kick off, let's get into it. Um, we're going to start with a story that's affected us quite deeply here at TrustCon in many ways. Um, the CrowdStrike, you know, debacle, let's call it, has meant that people haven't been here, unfortunately, that would have been here. But there's been a kind of really interesting kind of trust and safety element to it, which has emerged as things have gone on over the last few days. Mike, you wanted to talk a bit about it.

Mike Masnick:

Yeah, I thought that was really interesting because obviously it's a, it's a big topic for a lot of people here. I mean, how many times have we heard so-and-so was supposed to be here, but unfortunately, they were flying Delta, and that was, you know, not allowed for this week, apparently. Um, in part because of, of CrowdStrike failing, and there's been a lot of discussion around that. There was a Reuters article, which was actually interesting, about how that was impacting trust and safety teams, where, you know, some vendors and some of the companies got hit with some of the problems of, of CrowdStrike and what was, what was happening there. But the, the take that I wanted to talk about, and that I thought was most interesting and would be really good for discussion with actual experts, as opposed to Ben and me, who just play experts on the internet, is this question of what happened with CrowdStrike surprised a lot of people, in part because, you know, it was so core to so many different systems, and that one mistake in terms of the, the update caused, you know, global failure of a whole bunch of different systems, and a whole bunch of different businesses suffered because of that. And there's a lot of stuff, and I've heard it multiple times in the few, few days that we've been here, of people comparing the professionalization and the growth of the trust and safety industry to the cybersecurity industry and what happened there. And so the thing that struck me, that I was most interested in thinking about regarding the CrowdStrike situation, was: are there lessons to be learned? As we build out the trust and safety world, and as vendors and platforms and, and everyone start building out different pieces of it, are there things that we should be aware of and should think through? That, you know, if there's a single point of failure that nobody expects, and somebody, you know, updates it with the wrong thing, are we going to take down the global trust and safety operations of, you know, multiple companies or tons of platforms, or just create a mess? So even though I've introduced that as the, as the sort of theme, I want to turn it over to Alice and Dona and find out, like: What do you think? What should, what should we be scared of, or what should we be aware of, or how should people be preparing for any kind of situation like that?

Alice Hunsberger:

Okay, I can start. I think, interestingly, I feel like right now the trust and safety industry is like the opposite of that, where we're just starting to see software companies starting up for trust and safety. None of them have that kind of market share yet. Lots of them are startups that are still getting funded and not profitable yet. And so for me, I feel like the risk is more on the end of, like, will these companies even exist in a couple of years? And clearly a couple will, but some may not. And so it's, it's like the opposite risk in some ways. But in a world in which there's a couple of giants that really do a lot, I feel like with, um, content moderation, and especially, like, the machine learning classifiers or AI classifiers for speech, you can potentially have, I don't know, if one classifier mislabeled something and all of a sudden the word "the" is harmful and removed across, like, all of these major platforms, that would cause complete chaos. Uh, and then, you know, most certainly you'll see things like that even with just, like, bias creeping in, and smaller, more subtle things that won't be as, like, newsworthy as CrowdStrike, but would equally affect a whole lot of people and, like, potentially really cause harm, uh, to folks who are, you know, getting, getting their content taken down. So that's, that's where I think the risk is.

Ben Whitelaw:

There's something really interesting there about, yeah, the extent to which you should be working with a whole range of different vendors versus a few. You know, is it, is it less risky to focus on a few and have kind of deep relationships with them, where they do a lot for you, or is it less risky to work across the piece? And we talked a bit about this at lunch today. You've got some kind of views, Dona, about the best way to kind of approach that, that question.

Dona Bellow:

I don't know if it's the best way, but your way, your way is always the best way. But in my experience, I was thinking about this idea of, like, the potential for something like this to take down the entire operation of a company. And I think in my experience, what I've seen happening is that it's actually really hard to, like, outsource all of your trust and safety operations to a single vendor. Uh, because a lot of times you will have very specific customization and integration, and that requires a lot of working with the vendor and figuring out which parts make sense to outsource and which parts don't. And I think this builds sort of like a natural safeguard, right? Like in many instances where, I mean, even Cinder this morning was talking about their collaboration, I mean, Patreon was talking about their collaboration with Cinder. Um, and they were talking about the fact that the moderation piece, uh, they're collaborating on, but they're still holding on to the detection piece, right? Because so much has to do with the way that their platform specifically works. And again, I think, like, having these integrations in place, trust and safety teams are very prepared to deal with bugs and things going wrong, because it's literally your day-to-day. Um, and so I think I'm, I'm not as, sort of like, scared about something like this hugely impacting a large trust and safety operation, but there's definitely a risk of, like, something in the system going wrong and having that ripple effect through the industry.

Mike Masnick:

And I, I guess, I mean, I totally agree that it's, they're not there yet. I'm sort of curious though, If there are things that the industry writ large should be thinking about as it grows and changes to prevent that kind of scenario from happening, I mean, obviously, you have some vendors who would like to take over the entire space and would like to be as integral to the trust and safety world as CrowdStrike is to the cybersecurity world. And so are there things that the industry or others can do in sort of preparation to make sure that maybe there aren't single points of failure?

Ben Whitelaw:

I mean, I'll jump in there and say that I was really surprised about the fact that CrowdStrike actually had a kind of 18 percent of the market when it came to endpoint security or whatever it is they do. And, and that's not as much as I thought, given how many companies were affected when the story broke. You know, that 18 percent is way lower than I thought it was going to be. And there's a question there about, I mean, we know, we know that it's 18 percent because there's a lot of research done by market research and intelligence firms. Would we even know that one company had an 18 percent market share when it comes to trust and safety? I don't know if we would. I don't think there are the market intelligence or research firms out there that are mapping this work, and obviously selling that work, because it's not mature enough, as you guys say. And so actually there's something there, which is: do we, do we have the information that we need to even spot the risks? And to kind of preempt what it is we're seeing, in the way that the cyber industry is able to do now. So that's something that I was thinking about, but maybe we're a few years away from that.

Alice Hunsberger:

Yeah. I mean, and I think also, like, checks and balances, for sure. It makes sense, multiple vendors potentially, but then also looking at, um, the risks and sort of thinking: if you have one particular vendor, and it goes away entirely, then what is your, like, pull-the-emergency-cord reaction? Is it shut everything down completely on the site? And, you know, if you don't have the ability to have moderation anymore, like, do you even let people log in? Do you even let people upload anything? Do you put everything on a pause? Do you allow things and then be really reactive for a few days instead of being proactive? Like, what do the rules look like? And then what's the comms plan to tell users? Because most people don't understand that trust and safety is even happening on a platform, and so if all of a sudden everything breaks, and they can't post things anymore, or everything's getting removed, or there's a delay for your photo to be approved or whatever, how do you explain not only what's happening, but, but why? And it's like, we don't, it's not that we hate your picture of your cat. It's just that, like, we can't moderate right now, you know? So, um, those are all things that, like, my paranoid brain thinks about.

Ben Whitelaw:

It is an ugly cat though. Um, is there anything else that you guys would do, as people working in platforms, to kind of preempt this stuff? Do you have processes in place? Did you have, um, ways of kind of trying to figure out what a process would be if this happened?

Dona Bellow:

Yeah, I think what you just mentioned around being able to communicate what's happened, right, like with people, and also having just like an emergency plan. I think in this case, the case of this outage, it was just, like, so many people left stranded with no idea of where to go next. Um, and I think in the case of platforms, we've built some of that resilience with, you know, last-minute events and having a plan to communicate to people. I know at Reddit, even when we have things such as world events happening, we have mechanisms to reach out directly to our moderators to check on them, see what they need, what we can do to support them, right? And so just having this communication channel with people, to make sure that we can hear what they need and we have a way to address it, or even a way to say, you know what, we don't know how to address this yet, but we're thinking about it, we're working on it, we'll get back to you. It's very important.

Alice Hunsberger:

Yeah, I was going to add that so much of the infrastructure for content moderation for platforms has been reliant on humans. And so in the case of a natural disaster, if most of your moderation is happening in one country, and that country no longer has power, and people are fleeing their homes and not working, then you need a redundancy plan. And often the redundancy plan is like, oh, a second location, but sometimes that doesn't work out, or there's a delay, or the capacity is much lower. And so I feel like most platforms do already have an emergency, emergency, emergency plan, because there's been such a reliance on humans and, you know, that really can go wrong and has, especially with climate change. So, yeah, those plans are in place at a lot of places. Yeah, okay.

Ben Whitelaw:

Cool. Um, that's a really interesting start to today's discussion, I think. And, and fingers crossed no one's flights are affected going home, including people in the audience. Um, so let's move on now to our second story, um, which is an interesting long read from, uh, Nieman Lab, which does a lot of really good, interesting work, and increasingly about trust and safety. Um, Dona, you were going to talk us through this piece about New_ Public.

Dona Bellow:

Yes. Um, so this article was written by Sophie Culpepper, and she's a staff writer at the Nieman Journalism Lab. Um, the article is titled "Could social media support healthy online conversations? New_ Public is working on it." And so the article is interesting because it starts with the experience of the writer, um, in a local community, wanting to talk to people about, uh, what she should write about, what kind of stories she should write about. And the things that came up were actually two online communities that they have, where people apparently get together, have all the fun, and discuss social issues, the weather. And sort of, like, they're really acting as these, um, communities, uh, where people can come together, because they no longer have the physical spaces that they had in the past, where maybe the local square is not as active as before, some of the shops have closed, right? And so, um, a lot of the conversation has shifted into these, um, these online communities. Um, they mention, like, uh, one of them is a Facebook group, uh, where women talk about, you know, social issues and socialize, and they enjoy their time together and share their issues. And the other one is actually, like, an email-based group where people again talk about things such as the weather, and also share a lot of information about their civic engagement, what's going on with their local government, and how they can discuss those things, political topics, et cetera. So I thought it was very interesting, just, um, you know, this highlight of the fact that in a lot of smaller communities, which are often, like, cities, I mean, towns, I would say, that are kind of not seeing their, um, their communities thrive, people turn towards the online space to be able to have conversations with each other. Um, and so the article goes on to cover really how, um, increasingly these online spaces become sort of like this de facto center of information, right, and center of connection, and it also brings in this interesting stat from a recent Pew report. Um, and, um, it says, I'm going to read because, um, I'm not good with numbers, um, but it says that, uh, the percentage of U.S. adults that often or sometimes get their local news from online groups has actually jumped from 38 percent in 2018 to 52 percent in 2024. And so that's really just showing how we have sort of like this growth of community-led, um, effort to share civic information, and to have spaces where people feel comfortable, uh, doing that, when they're not feeling like they can get together and have those conversations, um, I guess in a safe manner anymore. Um, and so this is where, um, the conversation around the people at New_ Public comes into play. Um, and, um, if you're not familiar with New_ Public, it's a nonprofit organization that was co-founded by, um, Eli Pariser, um, and, uh, a UT Austin professor named, uh, Talia Stroud. And, um, actually, if you're not familiar with Eli Pariser, he also co-founded a startup called, um, Upworthy. And Upworthy, if you remember, if you're familiar, is sort of like this good-news, positive-story information that comes into your feed and makes you feel good about your feed.

Ben Whitelaw:

I was a big fan.

Mike Masnick:

I was so sad when, when it went off. But, but also somewhat famous for, for perhaps inventing the sort of prototypical clickbait headline of positive news.

Dona Bellow:

But it felt so much like this is the positive news that I'm getting that I didn't know I needed. And now I really need it. Thank you. Um, and so New_ Public, uh, is currently led by, um, Eli Pariser and also, uh, Deepti Doshi, who is a former community organizer, who, uh, also spent several years, um, working in community partnerships at Meta, actually, and is someone who has thought a lot about community-led governance, especially through the context of, uh, Facebook groups, and is bringing that experience into this idea of, um, reimagining our digital public spaces to make sure that people better connect and have spaces that really serve the public good. And I think the underlying premise of this goal with New_ Public is this idea that the current, sort of like, very centralized platforms that we have at the moment, where you have billions of people coming together to talk about, um, a whole range of topics, are just not leading to conducive, productive conversations, and can very quickly escalate to toxicity, uh, because you have so many people engaging at the same time around so many topics, right? Um, and so New_ Public has become very interested in examining, sort of like, what does a more limited-space approach to public conversation look like? Um, and, uh, in this context, they launched a local initiative, uh, to learn more specifically about the moderators of these spaces. Um, the initiative is called the Local Lab. Um, and it's figuring out, like, what does it look like to be a moderator in this space? What kind of support and challenges are you encountering, especially when it comes to managing conflicts and maintaining civility on these platforms?

Ben Whitelaw:

Just, just before we kind of move on, I want to know about your local news groups. I want to just quickly ask: where do you guys get your local news? Because I think this is a really interesting piece, and there's a broader conversation for our audience and for people who listen to Ctrl-Alt-Speech, which is, like, where do you get your news? Are you going to, like, local, local kind of groups and figuring stuff out about what roads are closed?

Alice Hunsberger:

Yeah, I'm, I'm going to be vague, 'cause I don't publicly say where I live, but you can come talk to me afterwards and I'll say privately. I've had death threats before from my job, so I just, like, try to keep that under wraps. But there is, there's a local, um, community group, and it's amazing. And, like, there's actually an interesting sort of testament to how the internet has reshaped local democracy, even. There was a vote in my town that was very contested, and lots of people feeling super passionate on one side and the other. And the rule was that there was supposed to be an in-person debate to talk about both sides, and then everybody was going to vote, um, in person. And, uh, we all showed up. It was like seven o'clock at night. I had a young child at home. Like, we were, you know, nobody wanted to be there, uh, but the vote was super important. And so there were two microphones up at the top of the stage, and they said, okay, line up and we'll debate both sides before the vote. And somebody came up and she was like, we all know, because we've all been talking about this on this online forum already. We've all seen both sides. We've had this debate already on the internet, and, like, we all know. And so I call the vote right now. And everybody in the room cheered, and we voted. And so we didn't have to, like, we didn't have to stay there until 11 o'clock at night, because that community discourse had already happened on this small community forum, and the whole town felt really, really informed, and had had, like, weeks to be informed as opposed to hours. So I thought that was, like, a super fascinating way that it's, like, literally changing, like, 100-, 200-year-old rules about how local democracy works, because of the internet.

Ben Whitelaw:

And in the way that they do their work, one of the things you were really interested in, Dona, is, like, how they talk to moderators as well, and how they involve moderators in some of the kind of thinking they're doing and the research. And, and that is something that I think is very rarely done, um, in some companies and some organizations. And I know Reddit does a bit of that, which I think is where you were like, I resonate with this a lot. Talk about that.

Dona Bellow:

Yeah, I think that, so I think the first part where I very much resonate with the points being made in this article is this idea that, um, we can achieve more scalable moderation through small spaces, right? When you look at moderators, they're the people who are very close to their communities. They understand the context, they understand how people interact, and they, by definition, have a lot more, sort of like, proximity to the issues that are going to happen in those spaces than we do, sitting over here in the headquarters of Reddit, you know. Um, and so, uh, I think that's the first point. And this is something that's reflected very well in this article, right? Like, how do we think about, you know, scalable moderation in the context of these communities? And it has to do with these moderators. And so I think this goes to the point that we also need to regularly engage with moderators to understand what are the pain points that they're facing, whether it is in terms of, like, tools, for example, right, in terms of how to manage the communities. Um, there's an interesting point in the article around, um, potentially leveraging AI to help moderators, you know, with some of the sort of like low-level tasks of moderation. Um, and I think this is something that, um, we also think about, because when you really think about the motivation of someone who starts a community, no one goes into this thinking, oh, I'm just going to be a moderator, this is great, this is why I'm here, right? Most people come to form communities because they have an interest. They have something to say, and they want to share it with people, uh, that they can discuss this with, right? And that's also what makes people very passionate in these communities, because they care a lot about the topics that they talk about. And moderation becomes sort of like this thing that they also have to do if they want to maintain that community and make it thrive. And so it's important to be able to have the tools to do that effectively, so that the effort that has to go into moderation, um, can be supported, and moderators can, like, go back to what they were initially there for, which was, uh, you know, really helping and engaging with communities and making them thrive. And so at Reddit, for example, uh, we have this, um, really cool tool called AutoMod. Um, and, uh, we, you know, have publicly on our site, you know, how to use AutoMod and how it supports, uh, moderators. And they can go in there and set up filters and a lot of automation, to be able to effectively take care of, like, uh, you know, things that they really shouldn't have to spend time on when, um, there is technology to support it.
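A quick aside for readers following along: AutoModerator rules live in a subreddit's configuration and are written in YAML. Here is a minimal sketch of the kind of filter Dona describes; the rule below is a hypothetical example written for illustration, and the trigger phrases and reason text are made up rather than drawn from Reddit's documentation or any real subreddit.

```yaml
# Hypothetical AutoModerator rule: hold comments containing likely
# spam phrases for human moderator review instead of removing them.
type: comment
body (includes): ["buy followers", "dm me for promo"]
action: filter          # "filter" routes the comment to the mod queue
action_reason: "Possible spam phrase: {{match}}"
```

A rule like this handles the repetitive triage automatically while keeping a human moderator in the loop for the final call, which is exactly the "things they shouldn't have to spend time on" point above.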

Ben Whitelaw:

Alice, you're nodding along.

Alice Hunsberger:

My mic is raised to indicate that I have a lot of thoughts on this, but, but please stop me if I've gone on for too long. So, it's interesting, in the beginning you talked about trust and safety origin stories, and, like, this is very much my trust and safety origin story, so I'm going to date myself. So, um, 22 years ago, I founded a little, like, online community message board. And it wasn't a local community. It was a community based in music and punk subculture. Um, and this was pre any social media. I had never heard of moderation before. I had no clue. Like, you know, I was 19, and so I was very naive as well, but, like, we didn't talk about moderation in the same way, and there was no general sense of it. And so I got this crazy crash course in, like, oh, you actually need community guidelines, and you need enforcement, and had to figure out, you know, tooling, and, like, how to IP-ban people. And this was how I got started in trust and safety. Again, completely, like you said, like, accidentally, because I was interested in the community aspect of it, and then moderation came second. Um, and you know, I've learned a lot over the last 22 years. I think a lot of other people also have learned a lot over the last 22 years. And so, like, we don't need to reinvent the wheel at all. We can just share what we've learned. And, and, but yet, like, every year there's going to be more and more people who are learning, hey, I'm just discovering that trust and safety is a thing that exists, or moderation, community moderation. Like, it's very interesting. And I think the more that we can democratize it and have those conversations with people who aren't trust and safety professionals, but give them the things that we've learned as trust and safety professionals, then hopefully also it makes people better internet citizens and more able to have discussions that are healthy and inclusive. And maybe now I'm being naive and channeling my naive 19-year-old self again. But, um, but I think it is all tied together, for sure. And when it's, like, you're posting on this giant platform with, like, the man looking over you, it feels very different than when it's your neighbor saying, hey, I know you live two blocks away from me, knock it off, you know? Um, and, and yet the enforcement is often similar.

Ben Whitelaw:

Yeah.

Dona Bellow:

Oh, sorry, I just wanted to jump in on the point you raised around, um, you know, sort of sitting there and trying to figure out all these things and not necessarily knowing what was going on and how to tackle some of these issues. And I think this is another part of the article, and of the work that New_ Public does, that I found interesting, which is really to connect the moderators and those people who are taking care of these online communities so that they can share best practices, right? So it's not, like, the random, you know, 19-year-old sitting over there and trying to figure out what to do. And I think that's very, very important, right? When I started in trust and safety, I did find that this work was very isolating in many ways. It is hard. And, um, you know, having spaces like TrustCon has been great to meet other people and have this conversation, right. And this is another thing that, um, again, working at Reddit, we think a lot about. Um, we have this really cool event called, like, Mod World. And we just get a bunch of mods together, and they can come and propose topics to talk about. And it's like their own little conference to share best practices and talk about, like, well, this is working in my community because this is the context that I have, and this is maybe how you may want to implement that in your community based on your particular context, right. And so that's, that's super important, to make people feel like it's, it's an inclusive space. Yeah.

Ben Whitelaw:

I was gonna ask you, Mike, about, um, one of the things that kind of underpins New_ Public, and that Dona's spoken about, which is this kind of shared infrastructure that they're trying to develop and that they're thinking about. We've heard a lot about that over the last couple of days as well, um, in various guises, you know, tooling consortiums, open source. There was an announcement from Jigsaw yesterday as well. There's been a number of different announcements from companies who are kind of open-sourcing infrastructure and stuff that might be used by other communities to moderate. What do you think has brought about this focus on that lately?

Mike Masnick:

Yeah, I mean, I think there's a few really interesting things that come out of this particular story as well that are worth thinking about in the context of trust and safety, which is that most all of the discussions that we had about trust and safety for years sort of focused on these larger platforms, and it was sort of viewed through this prism of that's where, you know, where the issues were happening. There were a few cases, Reddit being an example, like, okay, well, that's a larger platform with smaller communities, and they have a way of approaching moderation, which was also really interesting. But at the same time, I've been thinking about, coming from the journalism side of things, there's been a widespread discussion about news deserts and places that don't have any journalism. And one of the things that came out of this article is the fact that in some of these places where there are news deserts, the community itself sort of steps up and creates these forums. And as the article says, they're in all different places. Some of them are in email, some of them are on Facebook, some of them are on different forum software, and all these things are popping up to sort of fill this void. And I think there's been so much less thought about how, how do those communities handle it? And that's where it gets really interesting, that, that these guys are thinking about that, and thinking about, you know, especially not trying to force them off of the platforms that they're on, but saying, like, how can we provide, whether it's tools, training, or information, for the people who are moderating these forums or running these forums, to actually, you know, recognize that they're in this position of moderator, and they're enabling a community as a, as a part of that. And so I think the rise of these other sorts of tools, and the things that are being released as open source, and the things that are being put out into the world, are, you know, are necessary. And the next step is making them work for all of these communities. And there's a big gap right now. You know, the communities that are talked about in this article are not going to use the Jigsaw API, right? I mean, maybe they should, but that's, like, a huge step for, you know, some random person who just wanted to talk about the local news, to say, how do I figure out how to implement this API to make sure that people are not being terrible in my little community? But if we can get to the point where it's sort of, like, click here and go, and we can provide these tools and we can enable you to have these necessary communities, the communities that are forming because the, the former, uh, institutions and infrastructure that we had for community have fallen apart for a variety of different reasons, I think that's going to be really important. And it's something that it would be nice, as, you know, the, the trust and safety industry is thinking about things, to figure out, you know, how can we provide the tools and setup for these kinds of other alternative communities.
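To make that "huge step" concrete: the Jigsaw offering a community moderator would most likely reach for is the Perspective API, which scores text for attributes like toxicity over a REST endpoint. Below is a minimal Python sketch based on the publicly documented v1alpha1 endpoint; the API key is a placeholder you'd have to request yourself, the 0.8 threshold is arbitrary, and the sample comment is invented. Even this much code, plus getting a key approved, is the barrier Mike is describing for a volunteer running a neighborhood forum.

```python
import requests

API_KEY = "YOUR_API_KEY"  # placeholder: requires enabling the Comment Analyzer API
URL = ("https://commentanalyzer.googleapis.com/v1alpha1/"
       f"comments:analyze?key={API_KEY}")

def toxicity_score(text: str) -> float:
    """Return Perspective's TOXICITY probability (0.0-1.0) for a comment."""
    payload = {
        "comment": {"text": text},
        "requestedAttributes": {"TOXICITY": {}},
    }
    resp = requests.post(URL, json=payload, timeout=10)
    resp.raise_for_status()
    return resp.json()["attributeScores"]["TOXICITY"]["summaryScore"]["value"]

# A small forum script might hold high-scoring comments for review.
if toxicity_score("you are all idiots and should leave") > 0.8:
    print("Held for moderator review")
```

The "click here and go" future Mike describes would wrap exactly this kind of call in something a non-programmer can switch on.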

Ben Whitelaw:

Definitely. I think the, the kind of time we've taken to talk about this piece shows how interesting it is in a number of different ways. I'm super conscious of time. I want to make sure that we get to audience questions. So let's move on to our last story now, Alice, that you picked out, um, with somebody who's cropped up a lot at TrustCon: the Tech Coalition. They've released a new resource. Take us through that very briefly and then we'll kind of whiz into questions.

Alice Hunsberger:

Yeah, I'll be super quick. Uh, I thought this was really interesting because it's something that I've said publicly in the newsletter that I write for Ben, um, Trust and Safety Insider: how there's often a gap in knowledge for platforms, small platforms, who want to do better with trust and safety, child safety specifically. Uh, where, you know, you're expected to know how to report to NCMEC, or how to write really good guidelines on, you know, what's allowed and what's not allowed as far as CSAM or other child harms. But with the, the organizations like the Tech Coalition that have all of the knowledge and all of the information, you have to pass a whole series of, um, they're, they're not tests, it's not like a gauntlet, but, like, they have standards for membership, and you have to meet those standards before you're allowed in to talk to them and learn from the groups. And I was like, well, that's, I mean, yes, having a closed forum with vetted people is a helpful way to have people really collaborate. And also, the people who really, really need it might not have $10,000 to pay for membership, and they might not have met all of the standards yet. And aren't those the people who need to know more, for free, immediately, now, to help do a better job? And so they released three resources, um, called Pathways. And it's literally a pathway to become a member if you want to go that route, but it's freely available. So they did, um, a couple on policy, and then one on NCMEC reporting, and worked with NCMEC to make sure that it was all good. So if anybody here listening is at a small platform and wants to do better with child safety, check those out.

Dona Bellow:

I've worked at a 100-person startup before landing at Reddit. And there's also the case that sometimes there's just one person doing all of the things, right? And they don't even know sometimes that there are standards, right, to begin with. And so it's really nice to actually have just that education going out. Yeah.

Mike Masnick:

Yeah, I was, I was going to say, I mean, it sort of ties in nicely with the previous story and the idea that, you know, not all communities are on these giant platforms that are able to do all of these things, and yet these things are really important. And as, as you know, some bad stuff certainly happens within those smaller communities. So, you know, having, having systems and processes in place that, that smaller communities can use to figure out how to, um, you know, manage that is, is really important. So this was great to see.

Dona Bellow:

And just to add to that also, I was thinking about the fact that very often, once, um, we start implementing things on the bigger platforms, what the bad actors do is just find the dark spaces where they can do their business, right? And it often ends up being very small platforms, right? And so equipping those folks with the tools that they need and the knowledge that they need to deal with, like, this influx of people who have been chased away from other places is critical.

Ben Whitelaw:

Yeah. And I'll just say that the Tech Coalition have thought about this really carefully as well, because they have gone to the smaller platforms within their membership. They've also gone to people who failed to complete the tests-slash-checks, um, that, that are needed to become members. And they've gone to people who've kind of been rejected for membership as well. So the reason these resources exist is because they've gone to folks who have not quite made the grade, which, to your point, Mike, is, is I think exactly the kinds of people who would use this resource. Um, before we go to audience questions, just very quickly, in a word each from each of you: um, what other kinds of resources would you like to see created like this for the kind of smaller platforms? You know, if you've worked at startups, Dona, like, what is it you would ideally have that doesn't exist, or that you don't think is very visible? I've really put you on the spot.

Alice Hunsberger:

Yeah, that's a really hard one. Um, I put together, so I think, I mean, I think the answer, it exists now because I did it. No, but there's, um, a lot of, a lot of work that I've been doing this year, specifically resources for LGBTQ inclusion and, um, equality and, and trans rights. And that's something that is, like, pretty nuanced, and a lot of platforms don't spend time thinking about it. And because I was at Grindr for many years, it's all I thought about. But GLAAD also has a lot of good resources, so there actually is stuff now. Um, but that's a gap that I see in a lot of platforms. I have to think about what still doesn't exist. Yeah, okay.

Mike Masnick:

Yeah, I mean, I immediately sort of went to thinking through, like, the community that I manage in the Techdirt comments, which is, uh, an occasionally rowdy bunch. Um, and, understatement, yeah. And, um, you know, we're sort of dealing with them. We've built some tools ourselves, and then we rely on some open-source tools around WordPress, uh, and some other plugins that are there. But, you know, a lot of that is not as well thought out, and not as advanced, because people don't necessarily think through, like, gee, blog comments are a trust and safety issue in their own way. Um, so I would love to see things that are a little bit more creative than just the standard, like, take-down, leave-up options. If there was more ability to do creative stuff in there, um, that would be really handy.

Dona Bellow:

I think for me, I, I think a lot about sort of like international communities, and I feel like a lot of times the way we think about this is sort of like after the fact. After the crisis has happened, we get the learnings of what we should have done better, right? But there has to be, and I know that there is, especially at larger platforms, research that goes into, you know, like, how we should think about proactively engaging with some of these communities. And sharing that more broadly with, like, the smaller platform, where there is one person doing policy who can't possibly research the entire world and what is going on. Uh, having those best practices, like, more readily available, so that we can be in front of those international crises that we don't, uh, always know how to respond to, um, especially when, you know, language is a barrier and all of that. So I think that's, that's a space where I'd like to see more of that.

Ben Whitelaw:

Yeah, great. Some really nice ideas. Um, fantastic. So it's now time, panelists beware, for some audience questions. We're going to hand over to you guys. Um, we want to hear from you about stories that you've read this week that you're interested in: give us a summary, tell us what you thought. Uh, or ask our great team up here some questions as well. And there are some mics going around the room, I think.

Mike Masnick:

Yeah, make sure you speak into the microphone. Otherwise, there'll be a long silence on the podcast. And tell us who you are as well.

Audience Member:

Uh, thanks. Iain Corby from the Age Verification Providers Association. But I'm not going to get into age verification. Alison Boden's here from the Free Speech Coalition, and she and I could fill an entire podcast on that debate. But the fascinating thing I've noticed this week at TrustCon is how close we are to political philosophy in making a lot of the decisions we have to make in the world of trust and safety. And I'm just interested in how conscious you are of that in the work that you do, and how often you go with your political philosophy, or how often you have to, dare I say, control for your political philosophy, so as to get to an answer that perhaps might not be quite so instinctive.

Ben Whitelaw:

I'm not entirely sure I could name my political philosophy. I haven't done a huge amount of thinking about it, but, um, it's a great question. Um, Mike, do you want to take that first?

Mike Masnick:

Um, all right. Thanks, Ben. Sure. Um, yeah. I mean, I think, you know, that's an, that's an interesting point. Um, the, the thing that I've certainly noticed, at least in my experience in talking to folks in and around the trust and safety world, is actually how little political philosophy comes into it, right? It is mostly about creating policy and enforcing policy and figuring that out. I know there's, like, there are, there are public assumptions that are generally wrong, uh, about political philosophy mattering. Um, and, and, and, okay, so somebody is, somebody wants to speak.

Alice Hunsberger:

No, I just, I just, I just want to add that political philosophy is, like, a fairly charged term, and that really what I agree with, and what you're getting at, I think, is that it's values. And I often say that trust and safety is basically, like, the way that corporations bring their company values into reality. And so you can say one thing in your corporate statement, but then actually, like, the policies that you write, the way you deal with your community, the way that you enforce those, really shows what your values truly are. And the values are often politicized, especially right now. And so talking about trans people, like I did 10 minutes ago, is, like, a political statement to one extent, but it's also about values and human rights, and that transcends politics specifically. So political, no; values, absolutely.

Mike Masnick:

Great. Yes. You said everything I meant to say.

Ben Whitelaw:

Okay, cool. Um, great, great start. Um, who else wants to ask a question?

Audience Member:

Hi, John Perino of the Stanford Internet Observatory. Um, my question is a bit broader, just given, um, this is the last session of TrustCon. I'm curious to hear something that maybe gives you hope and motivation, when we're at a conference that, as its lead, Charlotte Willner, calls it, is all about trade-offs and sadness. What gives you hope and motivation at this conference?

Ben Whitelaw:

Dona, do you have something you're not sad about after the last three days? Yes? Please say one thing.

Dona Bellow:

Uh, no, I've actually been pretty energized by some of the conversations that I've been in. Um, you know, we've talked a lot about generative AI and some of the opportunities that are there, and especially in the context of, like, you know, how it can potentially accelerate our work on the policy side. So, yeah, that made me happy. Uh, I work on policy, so I like to optimize for my work, even the resources that I have. Um, but I think, like, on the trade-off conversations, um, that's sort of also the exciting part of the job, right? And for me personally, one of the reasons why I wanted to work at Reddit, and at a company where we actually, um, you know, have sort of this immediate scrutiny from our moderators, where our moderator community is very active, is that, like, we have these immediate partners that are sort of, you know, close, but still external, right, to have those trade-off conversations with and think thoughtfully about the things that we're doing. So I'm, I'm not sad about this. I think it's, it's healthy. I think, you know, it's healthy debate. It helps push us forward. It helps us keep our ears to the ground on what we're supposed to be doing. Yeah, I'm just excited about coming here and having folks who have, like, different ideas. I love the panels where people, you know, get a little spicy about, you know, their opinions, and there is an exchange of debate. Um, and yeah, I hope I answered the question.

Ben Whitelaw:

You guys, what have you been energized by?

Alice Hunsberger:

I mean, I think for me, it's the same as it was last year, which is the sense of community, and people who get it and have seen some stuff and commiserate with each other. Um, you know, often in my work, I felt pretty lonely, stuck between trying to protect my team and advocating for the work that we're doing. And, you know, being able to be in, in person with a lot of people, and, and just, like, be reminded that this is a community, and that this is, uh, an amazing group of people who just, like, we have each other's backs, and it's, it's amazing, and learning so much from other folks too. So, yeah, I, I often feel pessimistic about the world and the way the things are going, and then I'm with this, like, group of amazing, brilliant, kind, compassionate, like, passionate people, and it makes me incredibly hopeful. If anybody can solve all of society's problems, it is this group of people.

Ben Whitelaw:

I'm not sure they've signed up to all problems.

Alice Hunsberger:

That's what trust and safety does, though. Like, we have to solve...

Ben Whitelaw:

You have no choice, you must solve all problems. It's in the small print, Mike.

Mike Masnick:

Yeah, I was going to say, I mean, you know, last year there was a lot of, I think, um, worry, especially sort of coming out of a bunch of layoffs. And there was a lot of discussion about that, and sort of what is happening to this industry. And it felt to me that this year there was a lot more optimism in general, of people just thinking through, like, we're actually coming up with really interesting solutions. The generative AI stuff, really exciting things, you know, some really early things, some further along, very exciting. The, um, the tooling stuff, the open-source stuff that's been released, the other stuff that people are talking about doing. It's like people are planning for the future, and in thoughtful ways, in ways that can have real impact. And that's really exciting for an industry that is sort of used to, like, oh my gosh, we're, we're going to have to deal with some terrible thing again today. To have, like, cool stuff coming, uh, it's kind of exciting.

Ben Whitelaw:

Agreed. I think that is a really nice reflection from all of you. Um, we could go on taking questions all evening. I, I will ask that we have this conversation maybe over a drink later on. Um, I want to round up today's discussion, and I want you to give a round of applause to our great panelists.

Dona Bellow:

And, and a round of applause for Ben.

Ben Whitelaw:

Um, yeah, guys, thank you for coming to the first live recording of Ctrl-Alt-Speech, the last session here at TrustCon. Um, it's been really, really wonderful to see your faces live. Um, if you are listening on the feed, stick around for the interview with TaskUs's Rachel Guevara. Um, for anybody in the room or listening online: rate, review, subscribe, tell your friends, tell your enemies. Um, we'd love to have you back on the podcast, uh, again in the future, and hopefully there'll be a TrustCon next year as well. So thank you very much for joining us.

Dona Bellow:

Thanks everyone.

Mike Masnick:

All right. I am sitting here live at TrustCon with the Division Vice President of Trust and Safety at TaskUs, Rachel Guevara. Welcome to the podcast.

Rachel Guevara:

Thanks so much for having me. I'm excited.

Mike Masnick:

So, uh, we were just discussing before we started recording that both of us have been to all three TrustCons, and we're wrapping up this one, and it's been very hectic, uh, and kind of crazy. But, just, you know, having spent the last three days immersed in, what, 1,350 trust and safety professionals, I guess.

Rachel Guevara:

Yeah, the sea of trust and safety professionals.

Mike Masnick:

What's your impression? What's your sort of, I know it's always tough, like it takes time to like let these things process, but what's your first impression so far of TrustCon?

Rachel Guevara:

Yeah. I mean, listen, seeing the progression of TrustCon over the past three years has been phenomenal, and no easy feat. So shout out to the TSPA and all the work that they put into these events, going from the first year, what, there was like 300 of us here, now to 1,300-plus, with a wait list, I know, of more. Um, so their ability to organically grow this really speaks to the hungriness, I think, of the industry itself, and the level of interest in wanting to come together and collaborate with other industry professionals. And Charlotte and Amanda and Kofing and the team have really done a great job creating the space to do that. Highly organized, lots of positive energy this year. I would say last year was a little bit, just, you know, subdued, because of all the layoffs that were happening. So definitely felt more optimism this year than the previous years, as well as kind of new interest from adjacent types of industries. So seeing a lot of different types of practitioners here this year, from law enforcement, legal teams, mental health practitioners, researchers, NGOs, you name it, they're here, which leads to really, you know, new and interesting views on industry-related problems, and kind of a step away from just, like, the insular tech community, which has been kind of a nice thing to see. So yeah.

Mike Masnick:

It seems like there have been sort of these really interesting conversations and different perspectives, but, like, all towards a common goal, which has been a really, really nice feeling from sort of where I've been sitting. One of the points that came up in the Tuesday keynote was a discussion of how trust and safety is a relatively new field, and people come at it from all different backgrounds and all different perspectives. And I know that your background is in psychotherapy, and so very different than many other people in the field. So can you give me a little bit of your journey into the world of trust and safety?

Rachel Guevara:

Yeah, I mean, I think it's probably like a lot of people's: like, how did I end up here? I don't, I don't know. It wasn't something I was seeking out. It wasn't an industry I knew anything about. And as you mentioned, I'm a psychotherapist by trade, working in pretty traditional mental health care capacities historically, but I have a specialty in trauma recovery. And I'm a social worker. So if you know anything about social work, ironically, it's pretty mission-aligned with the trust and safety industry, around things like social justice and service, and dignity and worth of people, and the importance of human relationships. And so that really resonates with me, uh, in the trust and safety profession. And it's kind of by accident that I've ended up here. I got an opportunity to build a psychological health and safety program for TaskUs about five years ago. And my interest in macro practice and human problems, and wanting to solution those problems, has just kind of snowballed into now leading all of trust and safety practices. So it's been a really cool and interesting journey, but certainly not one that I planned for or was looking for at all. So, yeah.

Mike Masnick:

But yeah, I mean, I think it fits. And, you know, I've talked with people before where it's like, there's something about trust and safety that attracts people who sort of want to help. Like, there's this, like, people sort of rush towards: there's a problem, there's a safety issue, there's a, there's an issue out here. And the field attracts people who are like, I need to help.

Rachel Guevara:

Absolutely.

Mike Masnick:

That makes sense. And so, you know, one of the interesting things, and I've seen how this has evolved over the three years of TrustCon as well, is sort of how the vendor space itself has changed and grown. And there's a lot of talk of, like, what does that mean? Like, you know, what is the position of different vendors? So I'm, I'm curious, sort of, like, what's your take on the vendor space, and TaskUs's position within that?

Rachel Guevara:

Yeah, I mean, it's complicated, and I think, um, TaskUs has done a good job to come to the table wanting to be included in helping solution problems that are in the trust and safety industry. And I think vendors broadly have a, a responsibility to come to the table and share best practices and share challenges. And the reality is that, if we're talking about vendors as, um, you know, being utilized for outsourcing, really the majority of trust and safety practices are now happening in vendors, right? And so if we ignore them as a voice to bubble up issues that are happening on the front line or in the global majority, we're missing a big part of the conversation, in my opinion. And that's where I see vendors playing a pretty significant role. If we're thinking about where a lot of these companies are, you know, sitting out of, or, you know, where they're headquartered, right? I mean, I don't want to stereotype, but they're all here in California, right? And we've got, you know,

Mike Masnick:

Probably within 25 miles of where we're sitting.

Rachel Guevara:

Yeah, and I, I don't know how inclusive that is to address the real, dynamic, and ever-changing issues on the front line. And that's where I really think vendors should be a part of the conversation.

Mike Masnick:

Yeah. Yeah. No, I think that's, that's a really important way of thinking about it. Um, my mind is sort of, like, rolling with all different ideas and different things that I've heard. There have been a lot of topics that have come up this year, and there's been a few themes, I think we could say. What, what things have stood out in terms of the sort of key themes from TrustCon this year?

Rachel Guevara:

Yeah, I think, you know, I've seen a little bit more intentionality on kind of specific topics coming up. Last year felt a little bit more broad, which, neither one's good or bad, just different. I think this year I've noticed a lot more discussions around child safety. Of course, there's ongoing and continued buzz on AI, which people are still trying to kind of sort out, like, the implications of for the trust and safety industry to some degree, in practice versus theory. I also think that, of course, as we got here Sunday, we saw some major announcements in the U.S. political arena. Um, so of course, elections. And certainly we all know that it's a massive year for elections globally, where, like, half of the world's population is voting this year. So I know election integrity is top of mind for everyone, but that kind of propelled it to the forefront to some degree, right? And then I just, I think, generally noticed a lot of optimism this year. Like, we've kind of come full circle, a year post, like, the mass layoffs in this industry, and it seems like people are landing in places that feel really good to them, and there just generally seemed to be more optimism. And I think, you know, we were talking a little bit earlier, like, if I had to distill it down to, like, what is everybody's kind of motivation here, I, I kind of likened it to, um, what we kind of say in social work: when you make safe options inconvenient, you incentivize risky behavior. And I think everybody here, in one way or another, is trying to kind of approach that in a way that everything that we do bakes a level of safety into these business practices, to make sure that we aren't incentivizing risky behaviors and practices.

Mike Masnick:

Yeah, yeah, no, I think that's, that's a really, really good point. And then, just as a final thought: is there anything that you've seen or heard this week at the event that makes you think differently about the future of trust and safety, or where, where things might be heading?

Rachel Guevara:

I mean, I don't know if I think differently about it. I think I've always felt really hopeful about the industry of trust and safety. Yes, of course, businesses wax and wane, and industry interest, you know, from outside entities can bubble up and then kind of go away, depending. But at the core is, like, a lot of people who by nature are helping professionals and have a real passion for helping other people. And yes, of course, that's dynamic and ever-changing with, you know, societal norms, um, you know, innovations, all those sorts of things. It's kind of a dynamic field, which will kind of continue to move the way that it needs to. But the thing that keeps me really enthusiastic about the future of trust and safety is the people, the people that are really passionate about doing the right thing for others, the right thing for society, the right thing in businesses. And that's really special, and I think quite unique, actually, to the trust and safety industry.

Mike Masnick:

Yeah. Great. Well, thank you so much for, for joining me, for sharing your thoughts on TrustCon, and, uh, I assume I'll see you next year.

Rachel Guevara:

Yes, of course. Yeah. We're regulars.

Mike Masnick:

So yeah. Thank you. And, uh, thanks everyone for listening as well.

Announcer:

Thanks for listening to Ctrl-Alt-Speech. Subscribe now to get our weekly episodes as soon as they're released. If your company or organization is interested in sponsoring the podcast, contact us by visiting ctrlaltspeech.com. That's C-T-R-L Alt Speech dot com. This podcast is produced with financial support from the Future of Online Trust and Safety Fund, a fiscally sponsored multi-donor fund at Global Impact that supports charitable activities to build a more robust, capable, and inclusive trust and safety ecosystem.
