
Ctrl-Alt-Speech
Ctrl-Alt-Speech is a weekly news podcast co-created by Techdirt’s Mike Masnick and Everything in Moderation’s Ben Whitelaw. Each episode looks at the latest news in online speech, covering issues regarding trust & safety, content moderation, regulation, court rulings, new services & technology, and more.
The podcast regularly features expert guests with experience in the trust & safety/online speech worlds, discussing the ins and outs of the news that week and what it may mean for the industry. Each episode takes a deep dive into one or two key stories, and includes a quicker roundup of other important news. It's a must-listen for trust & safety professionals, and anyone interested in issues surrounding online speech.
If your company or organization is interested in sponsoring Ctrl-Alt-Speech and joining us for a sponsored interview, visit ctrlaltspeech.com for more information.
Ctrl-Alt-Speech is produced with financial support from the Future of Online Trust & Safety Fund, a fiscally-sponsored multi-donor fund at Global Impact that supports charitable activities to build a more robust, capable, and inclusive Trust and Safety ecosystem and field.
Ctrl-Alt-Speech
Moderation Without Representation
In this week’s roundup of the latest news in online speech, content moderation and internet regulation, Mike is joined by guest host Hank Green, popular YouTube creator and educator. After spending some time talking about being a creator at the whims of platforms, they cover:
- Crash Course Coin (Complexly)
- Everyone Is Cheating Their Way Through College (NY Mag)
- The Professors Are Using ChatGPT, and Some Students Aren’t Happy About It (NY Times)
- How Miami Schools Are Leading 100,000 Students Into the A.I. Future (NY Times)
- We Shouldn’t Have To Explain To The FTC Why Content Moderation Is So Crucial To Free Speech, But We Did (Techdirt)
This episode is brought to you with financial support from the Future of Online Trust & Safety Fund.
Ctrl-Alt-Speech is a weekly podcast from Techdirt and Everything in Moderation. Send us your feedback at podcast@ctrlaltspeech.com and sponsorship enquiries to sponsorship@ctrlaltspeech.com. Thanks for listening.
So Hank, there is this site out there called Complexly. Maybe you've heard of it. Uh, it's an amazing site. There's a tremendous amount of really sort of brilliant and free educational content on so many different subjects, and it has a tagline, which is creating trustworthy stuff that inspires curiosity and lowers barriers to knowledge building. So I am curious, what's inspiring your curiosity today?
Hank Green:Thank you for bending over backwards to make that work. I really appreciate it. I mean, I have such a good job where, like, that's really the whole thing, is trying to find something that's like, what? Like, I just messaged a friend of mine who's a religious scholar. I was like, hey, can I get on the phone and talk to you about why Jesus has long hair? He's like, yeah, no, actually that's interesting. We could talk about why Jesus has long hair. It's, uh, not in the Bible. And I started on a different path yesterday after I found out that the Mona Lisa doesn't look like that. So, like, I had assumed that the way that the Mona Lisa looks to my eyes was the intent and art of Da Vinci. But in fact, and this is very well known, if you know anything about art you know this, there's like layers of varnish on the painting, and that varnish, over time, yellows. And so the original image was much more, sort of, color correct to reality. So I thought Da Vinci intentionally made this sort of weirdly yellow, glowing thing. There's a bunch of examples online of people who have tried to work backwards to see what the Mona Lisa would've looked like when Da Vinci finished it. And it looks nothing like the one that we see. It's way brighter and more blue, and her skin looks more like skin. And I was just like, God, I gotta make a video about this. But I don't know what the video's about yet. 'Cause the, and also the other thing is, that painting that Leonardo da Vinci made doesn't read as impressive to me, because I've been subject to the culture that says this is a very impressive work, and the very impressive work looks like this. And so when it doesn't look like that, I'm like, well, that's, that's good.
Mike Masnick:That's wrong.
Hank Green:So, I don't know. I got a long list. I got a long list. I keep trying to work on a video called, Will Computers Ever Suffer? And that one's not easy. It turns out I'm very curious about whether computers will start suffering at some point. And it seems important.
Mike Masnick:Yes. Yeah, absolutely. right.
Hank Green:So, I don't know. I'm also, curious about a bunch of the stuff we're gonna talk about today.
Mike Masnick:Yes. Yes. Hello and welcome to Ctrl-Alt-Speech, your weekly roundup of the major stories about online speech, content moderation, internet regulation, and whatever else we wanna talk about, including Jesus' hair. Apparently
Hank Green:apparently
Mike Masnick:It's May the 22nd, 2025, and this week's episode is brought to you with some financial support from the Future of Online Trust and Safety Fund. This week we'll be talking about online creators, AI in schools, maybe the FTC, and perhaps a few other things. I am Mike Masnick. Ben is still off learning the wonders of, I guess, age-verifying his child, and we have a special guest host with us this week. I'll note most weeks when we have a guest host, it's someone who comes at these issues from the side of working online
Hank Green:someone who knows stuff
Mike Masnick:It's good enough. No, no, no. But, you know, people who maybe have worked in trust and safety, or as a lawyer, or an academic. This week, we have someone who's slightly different, but certainly has experience and certainly knows what he is talking about. Someone who has a tremendous amount of experience on the side of making cool stuff that goes out into the world over the internet and using these platforms. Hank Green, welcome to the podcast.
Hank Green:Thank you for having me. I'm a fan. I like to listen. One of the reasons that I feel like I became a YouTuber is because I am obsessed with media and how it works and how it has worked historically. And I have always had this little piece of my motivation, and you need motivation in whatever you're doing. A lot of my colleagues early on, especially, their motivation was, this is a stepping stone to TV, or standup comedy, or some preexisting media that they had dreams of becoming part of. And my dream was really like, are you serious? I get to be here at the beginning. Like, I get to watch Lucille Ball happen and I can be friends with her. Uh, and then also what you learn as part of that is the way that media works in these cycles, and if you understand the history, then it all kind of makes sense. So I definitely come at this with a lot less granularity, but I do enjoy seeing it from the 30,000-foot view and being like, oh my God, we're in a weird moment.
Mike Masnick:Yes. Yeah. Yeah. For folks who don't know, if you don't know Hank, you should. First of all, you should know Hank more than you know me, I think. Um, but you do many, many different things, all sort of in and around the idea of content creation, creating a whole bunch of amazing videos. You've then sort of spun off and spawned an entire industry of amazing things that sort of educate people and also benefit society. Um.
Hank Green:I think so. Yeah.
Mike Masnick:And I sort of, you know, when I look at your career and the stuff that you've done, to me it's kind of the epitome of, like, everything that those of us who were internet optimists from the early days sort of hoped to see show up. Like people doing good stuff, making the world a better place using the wonders of the internet. Is that a fair summary of what you try to do?
Hank Green:Yeah. I guess I hadn't really thought of it that way. You know, I definitely was a child of that era, and we get to enjoy some of the fruits of that era still. Like, it's still there. You know, Wikipedia is doing great-ish. Of course, because it is powerful, it is being attacked by the people who will attack any source of power that is not themselves. But it kicks ass. It continues to be, like, one of the greatest creations of humanity. And yeah, we work really hard to really efficiently create useful stuff that helps people be curious and excited about their world and learn and, like, understand how the complexity of it all works. And then also we have this channel called Crash Course, which is used in pretty much every school in America, and it's entirely free. We don't have contracts with the schools; teachers and students decide to use it if they want to use it. You know, the only reason people use it is 'cause they think it's good. So we just have to be really good. That's, uh, how it works. And then of course that does create funding problems, but we figure that out in various ways. If you'd like to financially
Mike Masnick:I was gonna say right now is the time, right?
Hank Green:Yeah, we have our yearly fundraiser happening right now where you can buy a, a $100 or $500 or even a $5,000 coin, which
Mike Masnick:A physical coin. Yeah.
Hank Green:a cryptocurrency
Mike Masnick:Right.
Hank Green:that, it's beautiful. It's hand-minted, with hand-engraved dies, in Arkansas. They're really lovely objects, and this is how we fund a huge portion of Crash Course, through this coin campaign. So if you're looking and you're like, oh my God, wow, it's amazing that these people create content that is used in pretty much every school in America with less money than, like, your dentist's house costs, then help. Help. The only reason we could do it is because, like, 5,000 people give us money every year, and you could be one of them.
Mike Masnick:Yes. There we go. Well, I thought since we have you on the podcast, and because, you know, you have a whole bunch of experience and you have used a bunch of platforms, and obviously YouTube has been sort of a big part of your creation story, I guess.
Hank Green:Yeah. Oh yeah.
Mike Masnick:Um, you know, I thought it would be interesting to kind of talk through what it's like from your side, right? We're talking about online speech all the time. We talk about the platforms. But, you know, we have a chance to talk with someone who is more on the creation side, and sort of what your experience is with the platforms and kind of how you think about this entire space.
Hank Green:Yeah. I mean, it's interesting, because I think people don't even really understand what I am, legally. A lot of times people think I work for YouTube. Um, but YouTubers and TikTokers and Instagram influencers are all small business owners. They may be a sole proprietorship where it's just them, but they are small businesses that attempt to turn the eyeballs that are given to them by the platform into dollars so that they don't have to have another job. And it's a great job if you can make it work. And so that's the sort of legal framework for how it works. We're just all independent contractors, mostly. You know, my employees make YouTube videos, and they're actually employees, but they work for the business, not for a platform. But the platform is the... Just like we were talking before we started rolling about Blockbuster, and how people used to make content to then get rented at Blockbuster, YouTube is, for a lot of people, uh, the best place to make money. They established sort of an economic ecosystem way earlier than anyone else. That ecosystem is really stable. Facebook's way of doing this is to constantly change it, so they're constantly incentivizing some other thing. You know, I just got a note that was like, hey, we're gonna give you a bonus if you make more carousels of photos. And I'm like, that's not who I am. That's not what I do. I'm not gonna become good at doing carousels of photos, but thank you. And for a while they were really incentivizing Reels, and suddenly I was making like $30,000 from doing Instagram Reels. But then, you know, one day that goes down to 10, and then the day after that it's five, and then the next month it goes away completely. You can't build a stable business in a situation like that. But YouTube has always been exactly the same. You get the views, they give you 55% of the ads that play on those views, which has created a pretty stable economic ecosystem. And what I realized, I don't know if you'll like this, but what I realized about, I don't know, eight years ago, is that I lived in a town and I built my business in a town with streets maintained by, and, you know, rules set by, a government that I didn't get to vote for, that deeply influenced my life. And they could come into my house and change the layout of the rooms. Like, I could wake up one morning and the bathroom would be in a different place. And the levers of power for influencing that government, which I am really deeply influenced by, in many ways more than my state government, maybe not more than my city government, and this might not be the case for all people in my state, but for me specifically, um, the levers of power are really different than they are in a democracy, because they're not democracies. They're corporations, and they are run by a board and, uh, a CEO. And those levers, I kind of realized, like, where can I pull? What are the levers we can pull on? And it's not like they don't have levers, they do. But,
Mike Masnick:Yeah.
Hank Green:it is weird to live so much of my life in this little autocracy that is not so little anymore. And how much, as we invited our social lives onto the internet, how much of our living began happening in places that were not in any way democratic and were set up to monetize us and to turn us into dollars, which is not how places usually exist. Usually. I mean, well, I guess, you know, you're turning your citizens into dollars through taxes, and all taxation is theft, and I am a libertarian, and roads are BS, and fire departments also should just be funded by people paying money in, and if they don't pay, they shouldn't get fire service. Um, so, but yeah, figuring that out and kind of being like, oh, well, here I am. And it's not just me, because of course it's very present for me, because I have 70 employees who, you know, may not get paid if I can't figure out how to squeeze the right lemon to get the right lemon juice. That is a wild situation that we didn't realize we were signing up for. It was no one's idea. I mean, it was someone's idea, but we just sort of ended up in a really dramatically different world with a very different power structure. And it's funny, because I talk to the people at these platforms, and that resonates with very few of them. They're like, oh, no, no, we're not, like, in charge. We can't control any of this stuff, and we're so outta control. We have no leverage. And I'm like, and this isn't, like, Neal Mohan, but, you know, people who I see as making really influential decisions that affect my life really deeply. They're like, oh no, oh man, we have so many constraints. And I'm like, all right, I don't believe you, but okay. I don't know if you know how many constraints I have, uh, but
Mike Masnick:Yeah, I wonder why that is, because you would think that they would realize, right? Like, even if it's just, you know, whatever, Google decides that it needs to make more money, and therefore it passes down an order from on high that YouTube has to treat stuff differently, and more stuff has to be monetized, or something different has to happen. Do they not recognize how that then trickles down to you?
Hank Green:I don't know that they always do. And I think that oftentimes they might feel like, ultimately, it's not a thing that they're doing to me. It's like they've been given a directive and they have to take it. And ultimately what they, I think, believe is, I don't know, I'm thinking of some individuals here, and I think it's different person to person, but ultimately there is a sense that if what they are doing, they're doing in order to make money, then, well, of course, we're a business. That's what it's for. And they don't only care about that. And this is one of the levers, is that the actual people who work at these platforms are people who have values that are not defined by their bosses. And they have some power. They had more power. They had a lot more power five years ago, but they have some power, and it's been wild to see that lever kind of get smaller in the last five years.
Mike Masnick:So one other thing. I had written something a few months ago, I don't even remember when it was, that was sort of talking about this, but from a different angle. And I'm sort of curious your take on it, which is, in thinking about the move towards more decentralization on the internet, and new tools and systems that might be more decentralized. And here, I guess I should note, Ben, just before he went on paternity leave, bought a bell to ring about the disclosure that I'm technically on the board of Bluesky, and therefore you should take everything I say as being incredibly biased and horrible; do not trust the things I'm about to say. Um, but in thinking about the importance of decentralization, this sort of realization occurred to me that for the last, let's say 10, 15 years, I described it as kind of like this learned helplessness, where everyone sort of put their lives into the hands of platforms controlled by other people. And if you didn't like something, your only options were to scream about it, and you could
Hank Green:Yeah. Or like go and like live in your own city, not on Twitter.
Mike Masnick:But, but like, it
Hank Green:You would vacate that, that land. You would, you would vacate and say, like, this is, I no longer wanna be part of this conversation.
Mike Masnick:But it felt like, no matter what, yeah, you could vacate, but only to somebody else's land, and then you were just left either screaming at a billionaire to fix things, or the government, or, now those are the same thing since, you know, the billionaires run the government. But can you envision a world where, you know, could there be a YouTube where you didn't have to worry that they would make some sort of change, and where you would actually have more control over it? Or is that an impossible thing?
Hank Green:I think it's probably an impossible thing. One of the things that, and every time in the past this actually has kind of gone away, but in the past, when YouTube has made some big controversial decision that has negatively impacted some creators, but not all, there's been a 'we need to create a YouTube that's owned by creators' moment. And there are things like this, you know. There's streaming platforms, there's Nebula, there's Dropout, there's places where creators have been like, we're gonna make stuff on YouTube, but we're gonna have a place where we actually monetize the content directly and can sort of choose how we wanna do this. But in terms of a really large-scale thing, I think it's very hard, because if you try, like, turns out democracy's difficult. It's slow. It's supposed to be slow. And these companies benefit from their speed. And so if you wanted to have a nice, slow, plodding, deliberative, taking-into-account-200-different-stakeholders version of YouTube, I think it would not be as appealing. For example, I don't think that it would've launched Shorts, because no YouTuber wanted Shorts. We didn't want that. I mean, we're tired. We're so tired. We don't wanna make a different kind of content. But TikTok made them launch Shorts. They had to launch Shorts. If I worked at YouTube, I'd be like, yeah, we gotta do Shorts. Um, so I think that that would be very hard. And of course, YouTube has benefited a great deal from launching Shorts. And also, TikTok moved so much faster, and still does, even now, moves faster than YouTube. But especially at the beginning, the features launched every fricking moment. And one of the benefits that TikTok has, I love this stuff, one of the benefits that TikTok has is that creators have less power on TikTok. Because when I make a YouTube video, even if it's not gonna do well, it's still gonna reach somebody. Like, if I had a video that got 10% of the average views of a video I post on YouTube, I would be astounded. It never happens. Like 30%, maybe, but I'm reaching a lot of the people who are subscribed to my channel. But if I post on TikTok and I reach 3% of my average views, that's totally normal.
Mike Masnick:Huh.
Hank Green:And so this happens all the time. When you make something that isn't actually satisfying what the people of your audience want, the sensitivity is so strong to even a very slight decrease in attention from your audience that you get demoted super
Mike Masnick:And, and this is, this is all based on the algorithm,
Hank Green:Yeah, yeah. And because TikTok gets way more data points, because in the amount of time it takes to watch a YouTube video you've watched 40 TikToks, they know really well. They just have more granularity to be able to decide what to promote and what not to promote. And they don't have to show your stuff to as many people before they decide to never show it to people again. And that's really highly prioritizing the user, as in the best possible user experience, keep them there, the best possible, most addictive user experience. And that means that TikTok has way more power. Because if I wanted to complain about TikTok, I'd need to work really hard to make that compelling for TikTok to show it to anybody. And then secondarily, discovery, this is the best thing about TikTok, discovery is so good. So if you are a new creator who just really wants to get attention and wants to be like your heroes, it's so much easier to get attention. But that means that for established creators, it's so easy to lose it. There's always somebody sitting there waiting to take your place, and it happens instantly. If you start to burn out, you're so disposable. It's much easier for YouTubers to work together to push back against YouTube and say, you can't just do this stuff, you have to give us time, you have to give us information. Whereas on TikTok, you really don't. And then Facebook seems to be kind of middle of the path, where you can push back against them, but they don't care.
Mike Masnick:Sounds about right.
Hank Green:And maybe that, maybe it's like, because I've, I've never really been an instagramer, but like, man, I, I think they, they might like hate me specifically. I don't know what it is, but like, I, like never in my life have I ever been able to fix a problem at Facebook.
Mike Masnick:Interesting.
Hank Green:Yeah.
Mike Masnick:And how much, if at all, you know, all of these things, when these platforms put in place these policies, or they have changes to them, or they offer new things, how much does that directly influence the content that you
Hank Green:Oh yeah. I think about this all the time. It's not just me, it's everybody. There's the speech that is being intentionally quieted by government action. And then there's the speech that's being intentionally quieted by corporate action. And those are the things we think about, but there's also just the stuff that never gets made because it won't get reach. And, you know, the quintessential example is, 10 years ago or so, YouTube decided to stop ranking clicks and start ranking watch time. So if somebody clicked on a video, but it was only a minute long, that's gonna have to have a really high clickthrough rate to compete with something that's 10 or 20 minutes long; far fewer people might need to click on that per impression in order for YouTube to continue recommending it, because it keeps people on the site, and you get more opportunities to advertise to them. And watch time is, like, they just want people to stay on the platforms. This is what everyone prizes now. And when that happened, there were a bunch of creators who were thriving as animators. It's really expensive to make animated content. It takes a lot of time. So it's much easier to make a minute-long animation that lots of people are going to click on and enjoy. But the moment the switch happened, an entire class of artists completely lost their audiences. And YouTube was like, well, that's kind of the cost of doing business. And there was lots of, you know, internal debate, and lots of creator uprising about that, but ultimately what that meant is that content just stopped getting made. It's not like it's out there and not getting views.
Mike Masnick:Right. It's just not even being made at all,
Hank Green:Yeah,
Mike Masnick:For you more directly, how much of what is happening on the platforms helps determine what kind of content you make? Does it?
Hank Green:yeah, yeah. as a creator, like the, game that I play
Mike Masnick:Mm-hmm.
Hank Green:is how do I get people to decide to click on a video, and how do I keep them watching once they have. If you can do that, you can be a YouTuber. But the first step in that funnel is, how do you get YouTube to show someone your thumbnail, and, like, they want different things. But also, the way I can see this, there's, like, I don't really know how it works, and people who do know how it works probably have a problem with what I'm about to say. But it seems like there are groups somehow, like the recommendation algorithm has identified groups of people who are into certain things. And I feel like oftentimes I'm like, if you showed this to another group, they'd be into it, but you never will. It just won't happen, because the way that the recommendation algorithm has clumped things together has meant that there's sort of... it doesn't feel to me like there's a lot of overlap between sections of YouTube in the way that there should be. And we get a lot of demographic specificity on our content. When we make science content on YouTube, we find that it's like 75% male. When I post the exact same TikTok on Shorts and on TikTok, I can see a precise flip in gender, where TikTok will be 70% female watching it and YouTube will be 70% male. And I'm like, what's going on here? Is it that these are different women on these different platforms? Is it that the algorithm is defining something? Is it that there are just way more women on TikTok and men on YouTube? I'm like, I don't think that's the thing that's going on. So I don't know. But I do know that it feels like I have to make a certain kind of content because a certain algorithmic group has been defined. And so I have to create for that audience that has been created for me, because otherwise they will not click on it, and if they don't click on it, then I will reach no audience, or it will be very unlikely that I do.
Mike Masnick:Interesting.
Hank Green:So in many ways I feel like, as creators, what we make is decided by how the systems of these platforms work. And that is largely through the recommendation algorithms, which, from what I can tell, no one knows how they work. They're optimized for goals, uh, and then once that happens, it's pretty black-boxy, and maybe they have some interpretability there now that they didn't have the last time I got a straight answer out of someone. But, um, from what I can tell, it's pretty mysterious.
Mike Masnick:Yeah.
Hank Green:I feel like those black boxes like define reality for a lot of people, and it's like, who is the, who is gonna be the architect of reality and is it gonna be three guys? That's how I, that's how I feel. Wanna talk about AI now?
Mike Masnick:Yeah. I was gonna say that's, that is a nice segue. You're good at this. The next story we did wanna talk about was a little bit about AI. And this was actually a collection of three different articles that appeared in the last week or so that I thought painted an interesting picture and were worth talking about and thinking about. One was a New York Mag piece that said everyone is cheating their way through college, and was all about how all the students are using AI. And then two others that, I thought, you kind of have to read all three of them together. The second was a Kashmir Hill piece in the New York Times about how the professors are using ChatGPT, and some students aren't happy about it. So you're like, okay, the kids are cheating, but so are some of the professors, and the students are not happy that the professors are using it. And then finally there was another New York Times piece about how Miami schools have decided to embrace AI. They had banned it for a while, and then they decided, no, actually, we're going
Hank Green:Got a deal with Google.
Mike Masnick:Yes. They did a deal with Google in particular, and they're sort of trying to train students how to use AI properly. What's, what's your
Hank Green:Man. This is freaking me out, is what it is. Like, it is pretty important for people to know stuff and do things and become capable and develop themselves. Like, build themselves. The greatest creation of all of our lives is that we build ourselves, and I'm like, are we gonna stop building ourselves, or are we gonna just be plugged into... I don't know. The space between what I want right now and my long-term wellbeing is not nothing. There's a gap, and it's so easy to just go for what I want right now. And I know I'm 45. I know I'm middle-aged. But it's happening. It makes me angry, because I'm like, I knew it wasn't gonna happen to me, but now it's happening. And I'm like, well, then it must just be because this is what happens to 45-year-olds. But Mike, it's actually bad. So it's not like, in my case, no, I'm the exception, where I'm not just having a moral panic. This is really fucking bad.
Mike Masnick:So what, specifically, do you consider bad? Because there's, there's a lot of different
Hank Green:And, like, this is the thing, I don't know. Like, when you say 90% of kids are using AI, they're using ChatGPT to, like, help them with homework. I know for a fact that some percent of those kids are using it well.
Mike Masnick:Right.
Hank Green:And some percent of those kids are like, write my essay. And then they're like, it's too many em dashes, take 'em out. And, like, can you make it dumber? Can you just make it a little dumber? And that's how they're learning how to use ChatGPT; they're, like, prompt engineers. And I don't know how to prepare a child for the future right now, because I don't know what the future is like. I don't know what six months from now is like. And so that is a little... Like, I know that these kids are probably gonna have to be little AI project managers in their careers. Not definitely, but probably. Or, like, they might be plumbers and electricians, in which case, good, do that. Because, I'll tell you what, the last thing robots are gonna do is pipes. They're gonna be bad at pipes. They might lay out the pipes for you, but they're not gonna crawl under the building and squeeze around a corner and get a weird wrench to do a weird thing. But there are gonna be a lot of jobs that will be managing the AIs, and so it's good to know that. I do feel like we are in the big mushy moment where we don't know how we're gonna handle this stuff. I think that Miami-Dade being like, let's just have Gemini be for everybody, I'm kind of almost surprised that Google said yes
Mike Masnick:Why?
Hank Green:because that's not what Gemini's for. Like, Gemini's not designed to be a tool for children in schools.
Mike Masnick:Yeah.
Hank Green:And what I want is tools developed specifically for children in schools. I think AI tutors are obviously gonna be a big thing. I have an 8-year-old son. He loves to learn. And, like, you know, he talks to Google Home and he's like, play a fart noise. He talks to ChatGPT and he's like, how many bones does a snake have? And then he has a conversation about snake bones and he learns a bunch of stuff. And I'm like, your dad is a literal science guy. You could ask me about snake bones, and I'll be happy to have that conversation with you. And he's like, yes, but you were on Bluesky. And I'm like, this is a good point. So I do think that these tools should be designed for this, because, like, you've used Gemini, you've used ChatGPT. It's just gonna be a bunch of high school students getting gassed up by AI telling 'em all their ideas are awesome, and they're not. Like, AI has never told me one of my ideas was bad. I'm like, dude, you gotta... they're all bad. Like, no real people in my life are telling me this is a good idea. You need to stop it with the, uh, I almost said the word glazing, which now has two different meanings.
Mike Masnick:but, but has become, yes. Is now associated with, with AI telling you you're wonderful.
Hank Green:Anyway, I worry about this. I worry about, obviously, a lot of, I don't know, what do you call it, when you moderate AI, like alignment. Like, is there trust and safety in this space? You don't want the AI to flirt with the students. Even if the students are flirting with the AI, you need to figure out how the AI is like, ah, that was fucking stupid. Don't do that. Super cringe, dude. Don't fall in love with me. Um, you gotta figure all that stuff out. And I just feel like we haven't yet. And the tools should be designed specifically for teaching.
Mike Masnick:Yeah. Yeah, I mean, I think it's tricky, because, you know, I know that there are a bunch of people who are just like, no, ban it outright, which is what the Miami schools had done, and which was sort of the reaction to that New York Mag piece. But I don't think that's the right thing either, because there is a way to learn, and yes, the tools are changing all the time, and there are all sorts of ways that they're used badly, but there are ways to actually use them well. As you noted, like, some percentage are, but,
Hank Green:And I have no idea what that is. Like, it sounds very typical of a news story about a new thing for there to be someone quoted saying, all of my students are graduating and they're functionally illiterate. And I'm like, yeah, oh, I'm so surprised that a professor thinks that his students are dumb.
Mike Masnick:Right.
Hank Green:Um, like, I remember college. I was also dumb. But I don't think we know. And I think that this is, I think it's a problem. I just feel like so much of modern society is done to us. There's, like, this technology no one asked for, but everybody likes using. I mean, not everybody, but a lot of people like using it.
Mike Masnick:Yeah, I mean, I do think that there are, and I certainly don't have the answers to this at all, but I think there are some things that can be useful. And I've talked about this a few times, where, I thought a cool way, and I've seen a few different variations on this, I've seen teachers using AI in classes saying, have the AI generate an essay on this topic, and then you, the student, have to correct it. Which not only teaches you something and sort of forces you into the teaching chair, but also teaches you that you can't trust the AI; the AI is not going to be great. And I'm sure I've said this before, I don't know if I've even said it on the podcast or not, but when I was in college, I took a ton of statistics classes, you know, higher and higher level statistics. And in theory I knew a whole bunch of things about statistics, and then I had to teach incoming freshmen a statistics class. And I discovered I didn't know statistics at all. And in fact, I learned so much more statistics when I had to teach these students, because they would ask the questions that I had ignored for, whatever, three or four years up to that point. And suddenly I was like, oh. But once I was teaching it, then I actually began to get a real grasp of the subject. And so there's always been this element of, like, wouldn't it be cool if the way to teach people stuff was to force them to teach, effectively? And maybe AI enables that in some way, but that would take intentionality in terms of setting it up.
Hank Green:We have to figure out the ways to use it. I think that learning how to use it is something that I would want if I had a high schooler right now. Something that I would want. And then it's interesting reading this article about the professors using it too. One of the students was like, you used ChatGPT to make my lecture, I want a refund.
Mike Masnick:Yes.
Hank Green:It reminds me that one of the things that we've discovered in making lots of educational content is that you can make a really great video lecture, but most people won't use it, they won't use it as an intentional part of gaining new knowledge, unless they're in formal education. So you're gonna get the exact same lecture, but if you're getting it as part of formal education, it becomes a much more robust thing, for a couple of reasons. And this is something that I think no one thinks a lot about, and they don't think that this is part of the teacher's job, but it's the most... the most important part of the teacher's job is to motivate students to actually do the work. So, what am I here for? I'm here to be a respectable person who, by virtue of my knowledge and my station and my position at the front of this classroom, you know, you are letting down if you do not do this stuff. And I don't know how much that's discussed in, like, your master's of education, 'cause I don't have one of those, but it seems like it's the most important job you do. It's not just that I'm the person who decides the letter that you get at the end of the year, and that's gonna have an effect on your future. For a lot of students, I'd say a majority of students, there's really a component of, like, I have been set up psychologically to be motivated by this person, as if they're a kind of personal trainer for your brain, and you have to show up and be there for that person. And then your classmates are also sort of a secondary set of motivations, especially if it's a more collaborative course, and, you know, if they're, like, workshopping, or showing off in whatever way, the things that you are making. So when somebody's mad that a teacher used AI to help with a PowerPoint presentation, I'm like, you're missing what the job of the teacher is. The job of the teacher isn't to make PowerPoint presentations. There's a very good chance that if they weren't using AI, they would just be getting a sort of out-of-the-can lesson plan and using the PowerPoint presentation that someone made and then sold on Teachers Pay Teachers.
Mike Masnick:Yeah.
Hank Green:And you wouldn't want your refund there, because the job is not making PowerPoints.
Mike Masnick:Right.
Hank Green:Yeah. But I don't know, man. Like, all work right now is up in the air.
Mike Masnick:Yeah, it's interesting, because, I mean, the other thing I've been thinking about, and I'd written about this maybe a year ago, is how I'm using AI with Techdirt articles, which is not to write them. So I will write my entire articles by hand, you know, artisanally, the, the,
Hank Green:Mm-hmm.
Mike Masnick:the old-fashioned way. But then I've built up a series of prompts, and I've trained a system specifically on the articles I think are the best Techdirt articles, and I say, okay, now help me edit these articles. And I find that incredibly powerful and incredibly useful. And people yell at me and say, no, that can't be, and that just means you don't know how to write and you don't know how to edit. And that's bullshit. I actually think the fact that I kind of do know how to write and kind of do know how to edit makes that process even more valuable, because I can tell when it has a good
Hank Green:Yeah.
Mike Masnick:and when it has a terrible idea and I, I have this back and forth and I'll yell at the AI and just be like, no, come on. you've totally missed the point. You're like trying to get me to emphasize something that is not important and then it will, uh, glaze in response and be like, oh, you're
Hank Green:Oh, you're, oh, well. Yes, sir. Yes, sir. Very good. Very good. Yes. You know so much, Mike. I love you. You're so smart.
Mike Masnick:But, there is this weird aspect to that, which is like, the tool is useful to me because I, I know how to do this. I would say, fairly well at this point, but if you don't, then I wonder if it's more challenging. And so
Hank Green:And, like, I wonder, and also importantly, I do not know, I do not know the extent to which writing an intro paragraph into ChatGPT and saying, okay, here's where I wanna go with this essay, can you finish this for me, and then going back in and reading that and editing it and doing a couple more prompts, I don't know if that's doing some of the work that, if I was a student, I would get out of actually, sort of, slaving over it. But here's the crazier thing. I don't know if it might be better.
Mike Masnick:right.
Hank Green:I don't know if I might as a student actually learn more from doing that, or less or different, or
Mike Masnick:Yeah. And I don't think anyone
Hank Green:same. Like, I just have no idea. But I think it's really easy to sort of start from the position of, everybody's gonna be functionally illiterate. I do think that it's also very easy, probably, to get through a lot of school literally without doing any work. Um, so that is a concern, 'cause, you know, I was pretty lazy
Mike Masnick:Yeah.
Hank Green:I'm, I'm glad that I did a bunch of work.
Mike Masnick:Yeah. I mean, there was one other interesting thing, which we've come across 'cause I do have a child in high school, and the school has, you know, they turn in their assignments that are done in Google Docs, and I think they have a contract with Grammarly, so Grammarly is built into the system that they use
Hank Green:Man, I'm so freaking jealous of all these ed tech companies that get paid
Mike Masnick:yes,
Hank Green:schools. We're just giving it away for free. crashcoursecoin.com, everyone, if you'd like to support Crash Course
Mike Masnick:Yes,
Hank Green:budget of less than your dentist's house, we're just, we're used in every school and we get nothing.
Mike Masnick:Uh, but,
Hank Green:If this, look, if this works,
Mike Masnick:yes.
Hank Green:if we get, if literally if we get 50 people buying a coin from this, it's gonna be like, it's gonna be like, oh my God, I'm glad I did that podcast.
Mike Masnick:Excellent. Excellent. Let us know,
Hank Green:Yeah.
Mike Masnick:if you do. Um, but what's interesting is that it has an AI checker built into Grammarly. My son will write whatever the assignment is, and then before he turns it in, Grammarly will run through and give it a score of, like, how likely it thinks it is AI. And then my son, who writes the whole thing himself without AI, freaks out if it gives any percentage, and is like, well, I need to edit this so it doesn't think it's AI. And I'm just like, that's not good.
Hank Green:a, that's a, that is a huge waste of your time.
Mike Masnick:Yeah. But that's, that's what's happened. And like he's gone through and even tried, like removing each sentence and then rerunning the score again to, to see like
Hank Green:me where it is.
Mike Masnick:which sentence is the problem. and that doesn't seem
Hank Green:That's not how it works.
Mike Masnick:a particularly good use of anyone's
Hank Green:No, no, though I'm sure he is learning something.
Mike Masnick:Yeah.
Hank Green:I, this isn't like an AI-in-schools podcast, but I just think it's going to look very different a year from now than it does now. I think a lot of people who are like, I can't believe that you are saying that AI is helping you at all with writing, maybe haven't used it in a year. And that's just how... it just has changed really fast.
Mike Masnick:yeah. I think, I think there are two things that people don't understand. the people who, who have completely written off AI and like, I'm, I'm not, an
Hank Green:Yeah. No, I like, I, I honestly understand that, like, I think that's a, that is a defensible position.
Mike Masnick:the thing that I, I think people don't understand is one, how much better the, the models have gotten because they have, like, it's insane how much better they've gotten, but also like for all the joking and mockery of this concept of like prompt engineering, the prompt matters
Hank Green:do, you do have to, you do have to actually know how to use it. And so like when you first come in, it's like
Mike Masnick:Right. Like, you ask it something very simple, it does a terrible job, and you're like, well, this sucks. And so, it's like that editing thing that I do with Techdirt pieces. I have a pre-written prompt that I start with that is, like, paragraphs long, and it has all sorts of detail. You are looking for this. You're not looking for this. Don't tell me this, do that. And that's what primes it. And it's also primed off of a bunch of Techdirt articles, and saying, this is for Techdirt, this is the audience of Techdirt, these are the things to think about. So there's a lot that goes into it, not just, like, fix this piece.
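(A minimal sketch of the kind of primed editing workflow Mike describes here, assuming an OpenAI-style chat API; the model name, prompt text, and suggest_edits helper are illustrative stand-ins, not his actual Techdirt setup.)

```python
# Illustrative sketch only: a long, pre-written "editor" prompt primes the model
# before it ever sees the draft, rather than just asking it to "fix this piece."
# Assumes the OpenAI Python SDK (pip install openai); the model name and prompt
# text are hypothetical, not the real Techdirt configuration.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

EDITOR_PROMPT = """You are an editor for opinionated tech-policy commentary.
Look for: buried ledes, unsupported claims, repetition, and paragraphs that drift.
Do not look for: a neutral tone, hedging, or a shorter word count for its own sake.
Do not rewrite the piece; return a numbered list of suggested edits with reasons."""

def suggest_edits(draft: str) -> str:
    """Send the standing editor prompt plus a hand-written draft to the model."""
    response = client.chat.completions.create(
        model="gpt-4o",  # illustrative model choice
        messages=[
            {"role": "system", "content": EDITOR_PROMPT},
            {"role": "user", "content": draft},
        ],
    )
    return response.choices[0].message.content

# Usage: suggestions = suggest_edits(open("draft.txt").read())
```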
Hank Green:Yeah,
Mike Masnick:And so, you know, I'm not sure
Hank Green:Yeah. And, like, you could put in the dumbest idea and be like, is this a good idea? And if it thinks it came from me, it's like, yes. But if I'm like, some asshole on Twitter said this to me, it's like, actually, this is a terrible idea. And I'm like, you're... I do worry, about, like any other echo chamber, that it's just there to tell you you are right. And I'm like, we need an AI that actually knows things. And, you know, that would be an interesting thing for pedagogy. Like, imagine the potential advantages of a kind of test where the test is a conversation, where the AI tries to figure out if you actually understand the material rather than,
Mike Masnick:yeah.
Hank Green:the multiple-choice-ification of everything.
Mike Masnick:I mean, I think there's real opportunities there that could be really interesting. I wanted to talk about a couple other stories, and this is, in theory, our lightning-ish round. So we'll
Hank Green:Okay. I'm gonna do it in 20 words or less.
Mike Masnick:try and keep it relatively simple. Um, this week, this was the end of the FTC comment period. I submitted a comment. It is on their investigation, or inquiry, technically, into big tech censorship. And so you, as a user of big tech, and, as we have described, being concerned about it, are you concerned, and do you want the FTC to be stepping in to deal with big tech censorship?
Hank Green:Yeah. You know, I really think that the whole, the government should just be an arm of one man's whims. That's what I've always said, and I think that that's what the FTC was created for. Uh, to just make sure that MSNBC is treating Donald Trump just like ChatGPT would, and saying only ever nice things about how he's such a genius and all of his ideas are very good. Uh, yeah, I do worry about the Constitution right now, I have to say. Should I be worried about the Constitution? It seems like, uh, it seems, I don't know. Where does this go from here? Does the Supreme Court say, wow, that's dumb and you shouldn't do that? Like, who decides that this is illegal?
Mike Masnick:In theory, yes, it could come from the Supreme Court. And I will note that just last year, the Supreme Court kind of ruled that agencies like the FTC have even less power than they thought they did. And the MAGA universe and conservatives cheered that on, uh, because they were not in control. And now it's kind of interesting to see them ignore that and think that they
Hank Green:seem to be the way.
Mike Masnick:Do these kinds of things. Um, I think the FTC comment thing will be interesting, to see what, if anything, Commissioner Ferguson tries to do with all of the comments. I've read through some of them. There's a whole bunch of comments that are just random people being like, TikTok banned me, and so that's bad. And, you know, the funny thing was that, for all the talk about big tech censorship, if you go through some of those comments, people were uploading screenshots of stuff that they got banned for, and the FTC was taking those down as inappropriate. And it's like, oh, the FTC recognizes that you need to do some content moderation. Oh, interesting.
Hank Green:It is such a weird spot we are in. Can I hit you with a theory that I'm sure you've heard before? So, obviously, tech companies have to take stuff down sometimes. And, you know, my position is that they should be able to take down whatever they want, because it's their websites, even if that makes me angry. But also they can't be held liable for the content that they have on them. They can be told to take things down and then take them down, but, like, Section 230 says it can't be your fault that somebody posts something
Mike Masnick:right.
Hank Green:libelous on your platform. And, can I... I think that hosting is very different from editorial decision-making. That's definitely... is that definitely true?
Mike Masnick:Under the law, or conceptually?
Hank Green:Let's just say conceptually.
Mike Masnick:There are, there are different factors there. Um, but I would argue both are protected under the law. And I can give you a quick explanation for that if you want. So.
Hank Green:I, I, yes, please do. Here's my argument. I feel like deciding whether to put something in front of 20 million people is content. That decision itself is you are creating new content by deciding what you place in
Mike Masnick:Right. And so the theory is, and lots of people are talking about this, and one court agrees with you and most of them don't, but,
Hank Green:All right. Well, tell me which one's the good one, and I'll send them the flowers.
Mike Masnick:It's the Third Circuit, and I don't think they're the good one. I think they're wrong.
Hank Green:Oh, no. No.
Mike Masnick:And I think they'll face some trouble eventually, when that issue gets up to the Supreme Court, which it might very soon. Um, so the issue that you have, uh, there's a few different issues with that. And I understand the thinking. I'm not saying this is a dumb idea, because lots of people have it, and you're not a dumb person. It's a reasonable instinct to have, that once they have put something into a recommendation algorithm, they're taking some sort of ownership of that content, to some extent or another.
Hank Green:Yeah. or they're making, they're making an editorial decision and that decision itself is content.
Mike Masnick:Right. Yes. Now, a few things on that one, editorial content is protected by the First Amendment.
Hank Green:Sure. Yeah, yeah, yeah,
Mike Masnick:so you have, that element
Hank Green:Uh-huh. Uh-huh,
Mike Masnick:The second part of it is that,
Hank Green:but not editorial content, but like what if that is
Mike Masnick:What if that content violates the law in some way, right? So let's jump to that. So now here's the second issue, which is that when they are putting it into an algorithm or promoting it in some way or another, what exactly are they doing? What, in addition to the content, you know, what is their contribution to it? This is always the kind of issue that comes up. And what they're really doing is recommending, and sort of saying, I like this, or, more likely, I think you will like this, right?
Hank Green:Yeah.
Mike Masnick:Which is an opinion. And an opinion by itself is also protected speech, right? They're not creating the underlying speech. And if they're recommending it to you, there may be some exceptions here, but most of the time if they're recommending it to you, they do not know that it is illegal and for them to be
Hank Green:all, all they know is that you might like this and they don't really know
Mike Masnick:They don't know that. The machine, the black box, thinks you might like this, and that is an opinion, and an opinion is protected speech. And if there is illegal speech in some form or another, in order for that to create a problem, they have to have actual knowledge, under the law, that the thing they are recommending to you is illegal. Now, there are a lot of people who say, well, they should have known, or, you know, they should have done this, but that's not how it works.
Hank Green:Yeah.
Mike Masnick:In part because you have these black boxes, and black boxes can never quite know that this is illegal. There is also the sort of even further-out-there argument, which is that you can't know it's illegal until a court has adjudicated whether or not it's illegal. And that's a whole other issue. But it's hard to get to the point where the platform and the controller of the algorithm could reasonably be legally responsible. Now, you can make arguments morally that they should do more to understand this, and I'm not gonna disagree with you on that, but I think it's a trickier problem than people recognize. And it also gets to, the algorithms, whether they're good or bad, and the black box and everything, they're trying to recommend stuff that they think you will like. And some of that is based on what they have learned about you, good or bad. And we can complain about how much information they have on you and all of that as well. But also, some of that comes back to you, right? Like, you've clicked on all these things that now suggest you like this content. So does that
Hank Green:only, yeah, the only inputs to the algorithm. I, I always say this to other creators, the only inputs to the algorithm are human decisions. So like it's operating on human choices, whether to keep watching or whether to click, or, you know, whether to like or comment. It's, the only inputs it has are the decisions of either real humans or of course now many bots surfing and pretending to be real humans so that they can do spam.
Mike Masnick:Which is a, a real issue.
Hank Green:I imagine it makes things freaking hard for them. It must make things hard for YouTube. Okay. I don't like that that made sense to me, because I want it to be their fault. Um, and of course it still can be, even if it's not something that could be enforceable by law.
Mike Masnick:Yeah. There are moral arguments for why they should do better. And then my argument, going back to my decentralization thing, is, well, it would be better if we had more control over the algorithms, and that it's not just one giant algorithm that is trying to make Mark Zuckerberg or whoever else richer, but rather that we could select, and we could also give it our intentions and sort of be intentional about it. Which is saying, like, I only want to see the good stuff, 'cause if I am making an explicit choice that I want to see the healthy content... I know my brain in the moment might give me the bad stuff, but if I could tell it and say, like, you know, 'cause I might do that
Hank Green:Yeah.
Mike Masnick:in that
Hank Green:And they, like YouTube, theoretically want that. They wanna stop showing you car crash videos. They wanna show you more how-to-make-a-healthy-meal videos, you know. They want that. They wanna deliver more value, be less brain rot. Of course it's very hard, because of course our brains want to rot. But it's this space between what I wanted and what is best for my wellbeing still being a substantial gap. And I don't know how to close that except by plugging Sam Altman and his little creation into my brain to tell me what to do at all times.
Mike Masnick:Well, on that note,
Hank Green:And there will, there will just be a few different ways to be, you know, you'll have to pick between them, and then we will eventually kill each other about it. Just assume that that's where it's all headed. We'll all sign up to have different AI gods, and if you signed up to have a different one than the people near you, you're gonna hate them. I'm calling it.
Mike Masnick:Okay. Okay. I think we need to close on that,
Hank Green:Does everybody know that I'm also a science fiction author? Like, is that new information for people at the end of the podcast?
Mike Masnick:Oh, man. I, I, yeah. Yeah. Well, I think we should close
Hank Green:sorry. Sorry. Yeah.
Mike Masnick:bringing everybody down.
Hank Green:Look, we gotta know what to watch out for.
Mike Masnick:Yes, yes. And that is what science fiction is useful for. But, thank you for this conversation. I think this was really fun. I really enjoyed it. I think, uh,
Hank Green:It is one of the most important topics that I think has the least attention paid to it, so I'm really glad that you are bringing it lots of attention.
Mike Masnick:I am doing my best. Uh, and so thank you again and, uh, thank you everyone for listening as well.
Announcer:Thanks for listening to Ctrl-Alt-Speech. Subscribe now to get our weekly episodes as soon as they're released. If your company or organization is interested in sponsoring the podcast, contact us by visiting ctrlaltspeech.com. That's C-T-R-L Alt Speech dot com. This podcast is produced with financial support from the Future of Online Trust and Safety Fund, a fiscally sponsored multi-donor fund at Global Impact that supports charitable activities to build a more robust, capable, and inclusive trust and safety ecosystem.