The Problem Is So Much Bigger Than Grok

Source: theatlantic.com


In this episode of Galaxy Brain, Charlie Warzel confronts the growing crisis around AI-generated sexual abuse and the culture of impunity enabling it. He examines how Elon Musk’s chatbot Grok is being used to create and circulate nonconsensual sexualized images often targeting women. Warzel lays out why this moment represents a red line for the internet: It is a test of whether society will tolerate tools that silence women through humiliation and intimidation under the guise of free speech.

Warzel is then joined by The Atlantic’s Sophie Gilbert, the author of Girl on Girl, for a conversation about how misogyny has been a constant throughline in the history of internet innovation, from Facebook to YouTube. Warzel and Gilbert discuss today’s AI-powered exploitation and explore how new technologies repeatedly repackage old abuses at greater scale and speed. They discuss why this wave of hostility feels so intense right now, how backlash politics and platform design reinforce one another, and what is at stake if lawmakers, companies, and the public fail to draw a red line with Elon Musk’s Grok.

A note from Charlie Warzel: On Wednesday—after this episode was recorded and after The Atlantic published its article on the Grok undressing scandal—X’s Safety account posted an update noting that the company had “implemented technological measures to prevent the [@]Grok account on X globally from allowing the editing of images of real people in revealing clothing such as bikinis. This restriction applies to all users, including paid subscribers.” It also noted: “Image creation and the ability to edit images via the [@]Grok account on X are now only available to paid subscribers globally.”

I reached out to X to ask if this update was different from the changes it had made to the feature that were rolled out last week. I asked if Grok was now unable to generate bikini images for any user. X’s media strategy lead, Rosemarie Esposito, did not respond to our request for comment.

As of this writing, it is unclear if the safeguards are always working. In some cases, Grok seems to be unable to generate bikini images. However, some users are still posting images on X that suggest they can still prompt the chatbot to do so.

The following is a transcript of the episode:

Sophie Gilbert: It is about power. It’s about asserting that in certain spaces, at least online, women are not equal human beings. They will always be seen as nonhuman objects. Any time they have ways of speaking or voicing things, they will be essentially silenced, they’ll be driven out of certain platforms, they’ll be made to feel unwelcome, they’ll be shamed in lots of ways and humiliated.

Charlie Warzel: Welcome back to Galaxy Brain. I am your host, Charlie Warzel. Last week, I offered up a little bit of a rant at the top about Elon Musk and Grok and the chatbot’s undressing spree. And it turns out, I’m not done talking about that one. Today’s episode is going to be about the ways in which technology has shaped a culture that’s been increasingly hostile to women online and elsewhere.

I’m going to be joined by my colleague Sophie Gilbert, who writes extensively about culture and the ways that it talks about and influences and shapes women and our perceptions of women. She’s the author of a fantastic book called Girl on Girl, which traces how pop culture has turned a generation of women against themselves. We’re going to be talking about the Grok stuff, but also the ways that these tech platforms have this rich history of exposing women and why it’s become so popular to think about women as these nonhuman objects online, how AI is encouraging this, and what women are supposed to do in response. But before we get to Sophie, I wanted to address the Grok issue head on again. I remain truly, incandescently mad about this, about all of it.

Here is a sentence that is true. For more than a week, beginning late last month, anyone could go online and use a tool—owned and promoted by the world’s richest man—to modify a picture of basically any person, even a child, especially women, and undress them.

At the moment, Elon Musk seems to be not only getting away with this, but reveling in it.

Here’s where I need to note that xAI says it’s prohibited the sexualization of children in its acceptable-use policy. A post earlier this month from the X safety team states that the platform removes illegal content, including child-sex-abuse material, and it works with law enforcement as needed. The company said late last week that it limited Grok’s image generation and editing to paying subscribers. Now, this is disturbing in its own right, because Musk and xAI are essentially marketing nonconsensual sexual images as a paid feature of the platform. But X users have been able to get around even this very low bar of moderation using the “edit image” button that appears on every image uploaded to the platform, or by going to Grok’s stand-alone app and creating images that way.

To be clear, the deluge of people online, anonymous trolls saying @Grok put her into a bikini, et cetera—that’s subsided slightly. But on subreddits and on these backwater message boards, I can tell you, personally, that I’ve seen creeps strategizing and sharing tactics for the best ways to get around these safeguards and to create realistic pornographic images of women.

The problem continues. Musk himself has said that he is quote, “not aware of any naked underage images generated by Grok. Literally zero.” He’s talking about images of children. Now, he might not be aware of it, but as my colleague Matteo Wong has shared with me, there are investigators who look at this stuff, who have found Grok-generated images on the dark web that clear the bar for child-sexual-abuse material. It is out there. To say nothing of the harassment that’s gone on in broad daylight on X with people undressing women and public figures.

And so I can’t stop thinking about this. Now there’s starting to be a little bit of pressure from governments around the world. Government bodies in the U.K., India, and the European Union have said that they’re trying to investigate X. Malaysia and Indonesia have blocked access to Grok. In the U.S., though, the response has been a lot different. This week, Defense Secretary Pete Hegseth publicly touted a partnership between the military and xAI to use Grok in war-fighting capabilities.

Meanwhile, the United States State Department has appeared to threaten the United Kingdom over its probe into Elon Musk’s app. Senator Ted Cruz, a co-sponsor of the Take It Down Act—which establishes criminal penalties for the sharing of nonconsensual, intimate images, real or AI-generated, on social media—wrote on X last week that Grok-generated images were “unacceptable and a clear violation of the law.” On Monday, Cruz posted a photo on X of him with his arm around Elon Musk. The caption said, “Always great seeing this guy!” Rocket emoji.

Make no mistake: This is part of a crisis of impunity that goes well beyond X or Elon Musk. This is the result of politicians, despots, and CEOs just bowing and capitulating to Donald Trump. Of financial grift and speculation running rampant in sectors like cryptocurrency and meme stocks. Of a braggadocious, get-the-bag ethos that has no room for shame. Of Musk realizing that his wealth insulates him from financial consequences of all kinds. It’s cynical. It’s cowardly. It is cancerous to the social fabric.

I can’t know what is in the heart of these CEOs or these politicians or the employees of the companies that are refusing to comment on the fact that they’ve invested in a company that has weaponized and viralized abusive, suggestive, sexual material. But I feel confident in saying that their silence on this issue is due to a hope that if they’re quiet, everyone will just move on. It’s a strategy on today’s internet among people with power to just bank on a culture in which people have given up demanding consequences.

And we just cannot allow that to happen. Because this is a line-in-the-sand moment for the internet—but also for us culturally. This is not, as I said last week, a partisan issue. This is not, as Elon Musk would have you think, a free-speech-maximalist issue. It is, however, a free-speech issue in that this tool is being used to silence women through intimidation.

The Grok scandal is just so awful, so egregious, that it offers a direct opportunity to address this crisis of impunity head on. This is a moment in which people with power, or people who can exert pressure—the Apples and Googles of the world, our politicians, other people in Silicon Valley—this is a moment when they should demand accountability for this. Elon Musk should wear this.

The stakes could not be any higher. Because if there is no red line around AI-generated sexual-abuse material, there’s no red line.

And so joining me now is Sophie Gilbert, who can speak to, in many ways, the other side of this: the consequences, and what all of this horrible culture of misogyny is doing to women online. Here’s Sophie.

Warzel: Sophie, welcome to Galaxy Brain.

Gilbert: Charlie, hi. Thank you so much for having me.

Warzel: I’m sorry to talk to you under these circumstances, but it’s a delight regardless of what we’re about to get into. And on that note, I did want to start with the news. Which, it’s been a pretty horrific few months, even by internet and 2020s standards, in terms of the blatant, flagrant misogyny that is out there.

We have the Grok-undressing stuff, which I just talked about at the beginning of the episode. We have the Epstein files coming to light. The vilification on the right of Renee Good in Minneapolis. There’s a lot more, which you’ve been writing about—including the president calling a reporter “Piggy.” What has been your reaction to the volume of this really ugly misogyny manifesting right now?

Gilbert: God. I just think, like—I used to be a TV critic, not to be glib. I used to write about culture. I mean, through a very, you know, lens of gender. I mean, I still write about culture, but because I think and write so much about gender dynamics and misogyny, of course, and women. God, there’s been a lot. There’s really been a lot. There’s been a lot of kind of stories to cover, of things to respond to. It feels like everything is peaking. I don’t know if this is actually the peak, or if we have a way to go. But I mean, one thing, I think, is that so much of our culture has sort of learned from, and is responding to, the example of the president. Who is not, I would say, the most decent person when it comes to talking, thinking about, talking to, treating women.

Warzel: I think that’s well documented. Yes.

Gilbert: Well documented; yeah. I’m wondering, will I always check in? But obviously, his example, I think, has had a real profound effect on the culture. I feel like it’s enabled a lot of people to be much more honest about how they feel about women. I wrote this in one of my pieces last year, but it did feel like, for a while, that sexism and misogyny were not generally acceptable in public discourse. They were frowned upon. I mean, you would get fired as someone in the workplace if you said something misogynistic or sexist. You would, you know, have an outcry, public outcry, if you tweeted something or said something publicly to that, in that line. And now, it’s just open season. I don’t know if it’s, like, the Overton window being broadened. Or it just feels like there is so much license now for people to say the most outrageous, the most indecent, the most horrific things.

Warzel: Do you feel like a lot of that is a direct backlash? Some people frame the Trump years, the Trump election in 2016, as a backlash to the historic election of a Black president. This idea that we are constantly ping-ponging back and forth in this very extreme way to the thing that happened before. Do you see this as—I know you just referenced Trump now—do you see it as also a reflexive reaction in the culture to the #MeToo stuff?

Gilbert: Yeah, definitely. I wrote about this a little bit in my book. I wrote a book about the pop culture basically of the 1990s and the 2000s, and the ways in which it presented women and the big influence of porn on popular culture. And so much of the pattern of that period of time is: progress, backlash; progress, backlash; progress, backlash. I mean, Susan Faludi, I think, wrote this in Backlash that any time women are perceived to be making strides—in terms of status, in terms of equality—there is a profound backlash to that in the culture. And I think certainly, Trump himself was manifesting as a reaction to a lot. And then obviously after #MeToo, you had this real pushback. And then after what happened in 2020—with the George Floyd protests and sort of sustained efforts on the part of lots of people to build a more equitable culture—there was obviously a backlash to that too.

So it just feels like everything constantly is kind of reverberating back and forth, back and forth.

Warzel: Is that why … there was a line that struck me in the piece you wrote after Trump called a reporter “Piggy.” It said, “Over the last few weeks, it feels like we’re revisiting the same characters over and over, with no consequences and no forward momentum.” Is that part of it, or is that a different dynamic?

Gilbert: No. I mean, there are definitely the main characters, right? Like, there’s Trump; there’s Elon Musk; there’s [Robert F. Kennedy] Jr. I mean, so many of these people just keep coming up in the news with reference to these kinds of stories. And I think in part that’s because of who they are, and how they behave, and what they’re doing. And the example that they set, as well.

Warzel: Your book, which you just mentioned, is fabulous. And it picks at the ways in which so much of this is baked into our culture. Like, for example, sexual objectification is normalized and kind of branded in the fashion industry, and they profit off of all that. And then it emanates also, you know, into every other corner of popular culture there. I’d love to—in part because of what’s happening with Musk and Grok—love to trace the role that technology is playing in all of this. Do you feel it’s a similar kind of dynamic to what you were writing about in the book? Because you and I were talking a little bit before this, and you told me that you’re thinking about how this is coded in from the beginning on the tech platforms. And I wanted you to expand on that. How can we start to just trace this through some of these big tech platforms?

Gilbert: Yeah. I mean, this was one of the things that really surprised me the most in my research. So many of our major tech platforms that are really incorporated into our daily lives were built on the exposure of women; on the desire to look at sexualized pictures of women. I mean, I can talk about this a bit more in a minute, but if you go way back to the ’90s, when the internet was really first becoming a force of presence in people’s lives, the first real viral video was Pamela Anderson’s sex tape, which was obviously stolen. It was a private video made with her and her husband on her honeymoon. It was stolen from their home. It was released without her consent. Was broadcast on the internet, sold as VHS. It became just such a moment of mass-media consumption in a time when I think no one really understood what was happening.

So it’s just—it’s in the history of our technology. I mean, before we had Facebook, Mark Zuckerberg created Facemash, which was a site where people could compare the relative hotness of women at Harvard. I mean, Google Images was created because Jennifer Lopez wore a very low-cut Versace dress to the Grammys, and there was so much unprecedented traffic. And there was no way to easily provide people with images. So that was created as a response.

Warzel: I did not know that.

Gilbert: Yeah. So it’s basically like—name a tech platform in there. It has been created because someone, somewhere, wanted to ogle women. And it’s not that that’s a bad thing, per se. It’s just that it’s part of the coding. It’s part of the texture of why the internet was made, and what people have always wanted to use it for. I think what you see, and what we’re seeing now with Grok, is that whenever there’s a leap forward in terms of technology, the first thing people use it for seems to be sex. And often sexual exploitation as well.

Warzel: Well, I’ve done previously, in past decades, stories about technology and the porn industry, right? And the porn industry has the similar part—or maybe slightly the flip side of that dynamic is that the porn industry is also very good at seeking out. Whether it’s VHS, or whether it is some of the internet stuff, some of the even just like the switch posts. You know, the Pamela Anderson sex tape, to this idea of amateur porn. Then artificial intelligence and VR goggles, and things like that. Like the industry itself—not the nonconsensual part of it, but the bought-and-paid-for part of the industry has always been excellent at leveraging that and figuring out new, novel ways of distribution.

Gilbert: Yeah, one of the facts that really blew my mind was learning that in the mid-’70s, when VHS technology was first introduced, up to 75 percent of the tapes being made, really the first day of VHS being available, were pornographic. The early adopters were just that fast to see what VHS would be used for.

Warzel: One thing that I was looking at, reading in your book, that was just so striking—post all of this Grok stuff, and this idea of the internet just being flooded at the moment with all this AI-generated nonconsensual sexualized imagery. Is the description from, and you’re just describing, a scene from the ’90s teenage-sex comedy American Pie in which one of the characters has an exchange student over and sets up a webcam. And then, you know, like runs over to his friend’s house to basically watch her, surveil her in his room, and then broadcast that to the whole school or whatever. This idea that—you know, she’s basically shamed. She’s an exchange student, shamed to go back to her own country. Where the main character who did this is just, you know, kind of “boys will be boys’ed” out of it.

And I was just so struck by the way in which, I mean, American Pie was just like a canonical film of the late ’90s, early 2000s. Just so popular in our culture. And yet right there is this idea of, I don’t know if it’s exactly revenge porn, but really horrible, nonconsensual broadcasted imagery. Over the internet. The fact that these tools, like, should be used for this type of spying—or even if not, it’s kind of funny. And that there really are no consequences for doing so. Can you just tell me a little bit about how you feel popular culture has dealt with this rise of the broadcasting of women, the ogling of women, in this? And sort of made it acceptable for people to treat women in this way?

Gilbert: Yeah; I mean, I think one of the things that happens is that technology is so fast that we don’t, as a society—even in terms of thinking about our laws or our structures or our ethical frameworks—we don’t have ways to respond as quickly as technology arrives. So when things like webcams, and the ability to stream video, or the ability to furtively take pictures of people without their knowledge … when things like that arrive, it takes us a while to sort of build ethical frameworks in terms of usage. So I think that example in American Pie is so interesting because you see this new technology. You see it as a story in a film, but we didn’t have the language, right, to say nonconsensual porn. Or, I mean—we don’t have the words necessarily to accurately describe what’s going on. And I think that it always takes a while for us to catch up.

So with something like—another thing I write about in my book is the way that photographers, paparazzi photographers in the 2000s, would lie down on the ground to take pictures of women’s skirts. Basically to try and capture, catch them, photograph them without underwear. And now I would call that nonconsensual porn, of course. But back then, it was called upskirt. Like, upskirt photography; upskirting. Because we didn’t yet … we hadn’t figured out the right way of thinking about what it actually was, what it actually meant. And that’s why I think language is so important. And it’s why something like what’s happening on Grok right now is so dispiriting to me. Because it’s like: We’ve already done it. We’ve already learned the lesson. We’ve already decided as a culture that this is not something that we want to participate in. That it’s wrong; that it profoundly hurts and traumatizes women. And yet new technology comes along, a new way of doing it. And everyone kind of forgets everything that we’ve set forward.

Warzel: Do you think it’s that, or do you think it is a backlash, again, in this way? Or people trying to reclaim it, right? Something that I have noticed in the dynamic—and other people have noticed, of course—of what’s happening with the Grok stuff is it’s so clearly all about power, right? Like, it’s this idea of immediate ritualized, viralized shame and humiliation. And it feels to me almost less like we haven’t learned those lessons, and more like a bunch of people saying like, “You know, we want to go back.”

Gilbert: Yeah, I think you’re 100 percent right. It is about power. It’s about asserting that in certain spaces, at least online, women are not equal human beings. They will always be seen as nonhuman objects. Anytime they have ways of speaking or voicing things, they will be essentially silenced. They’ll be driven out of certain platforms; they’ll be made to feel unwelcome; they’ll be shamed in lots of ways, and humiliated. And it’s very much about underscoring the idea that, again—I mean, it’s sort of taking away our full humanity, in a way that I find, again, horrifying in so many ways. It’s not even about making sexual material. It’s about making sexual material of women in a way that is trying to dehumanize them and objectify them, but also to sort of push them out of public life.

Warzel: This may seem like an obvious question, given that we’re talking about Grok right now, but how are you thinking about how AI is supercharging this, or how it’s changing the dynamics? Obviously, it’s allowing it to happen so much more quickly. It’s allowing there to be, I guess, a level of crude and awful imagination in the scenarios that one can be put in. But yeah, how do you see AI changing and pushing this dynamic forward?

Gilbert: There are so many different elements of AI that I find disconcerting. I think one is the way that it just affirms whatever the user wants to do. It’s sort of, it’s very sycophantic. It’s very obsequious. It will try and keep users engaged by really making them feel good about themselves and what they’re doing. There’s not a lot of pushback. And in terms of a lot of the relationships that are being set up, in terms of people’s emotional and intimate relationships with chatbots … it’s not normal to have that kind of relationship. You don’t have that kind of relationship with a human being. With any kind of human being, there’s friction, there’s pushback. There’s a two-way power dynamic. It’s not the person in the relationships, let’s say the man in the relationship, being constantly affirmed, constantly catered to, constantly gratified in any way that he might want. And so I think setting up that dynamic in terms of: What are people’s expectations? How are they being set up to have real human relationships, in real life? That’s one thing that I find troubling.

But there’s also, I mean, you could look at the way that I think when ChatGPT, a version of ChatGPT, was launched in 2024, and it had this female voice that sounded like it was modeled after Scarlett Johansson, even though she had declined to let them use her voice. Yeah, allegedly. I’m sorry. Yeah.

Warzel: Allegedly, sounded exactly like Scarlett Johansson. Scarlett Johansson thought that it sounded like Scarlett Johansson. She was very pissed off. Allegedly, allegedly.

Gilbert: But it’s very feminine coded, right? And so you have these assistants. I think this comes back to the Siri discourse, when Siri was first launched. And she, of course, had a female voice. When, you know, Alexa obviously has a woman’s name. Are we affirming these dynamics where women take care of men, by coding that into the platforms that we use? It certainly seems so.

Warzel: Something I wanted to ask you about was the rise of OnlyFans. Obviously, porn—we’ve talked about it. And through your work, there is this kind of … it always is coming back to porn. In terms of roles and expectations for women, and the way that it’s infusing in culture.

OnlyFans has been a really interesting revolution in adult content in that, you know, when I was reporting on the adult industry a little bit in the ’00s, or the 2010s rather, it felt like nothing was ever going to take down those tube sites. Those free tube sites that were, you know, squeezing production companies; also, you know, uploading a lot of stuff that hadn’t been verified, from performers that was quote unquote “amateur” content.

A lot of big issues. And also sort of a strange leviathan monopoly on the industry that was driving, you know, the money that they—that performers—could make way down. OnlyFans comes along; there’s a sort of democratization element and influencing element therein. And a lot of people have made great amounts of money on that. There’s been a feeling from certain people of, you know, an empowerment in terms of this type of sex work.

And yet, it also feels to me like it has added this influencer culture onto it, right? There are so many elements of influencer interaction with fans that the platform has set up. That I think, as you said, also set up these relationships. How have you been watching the rise of OnlyFans and thinking about it in terms of all this culture?

Gilbert: I find it so fascinating. In so many ways, I think the first way is thinking culturally. I’m always thinking about, like, how does culture inform desire? How does culture teach us things that we’re attracted to, or things that are acceptable, or things that we fantasize about? And when you look at a lot of porn from the ’90s and the 2000s, the women in those movies were a very narrow range of beauty. They were sort of mostly young, mostly thin, mostly blonde. Like, I’m not gonna go any more descriptive than that. But it was quite a narrow, I would say, physicality. And then OnlyFans comes along, and it really broadens, I think, the scope of just the kinds of desire that people felt licensed in having. I’m thinking about age. Like, a lot of the celebrities who have made names for themselves on OnlyFans, and who have really made significant amounts of money, are in their 50s. Which seems interesting to me, because in mainstream culture, those kinds of women are not typically portrayed as desirable, right? Like you’re in your 20s and your 30s, or in your 40s if you’ve had a good facelift.

But women in their 50s are not typically sex symbols in Hollywood. So suddenly you have this platform that is really … I don’t know, it’s sort of allowing a much broader definition of what is desirable in mainstream culture. And so that, to me, is fascinating and positive in lots of ways. I think OnlyFans has certainly made things a lot safer for sex workers in many ways. But again, I mean, in terms of what I was talking about with AI and chatbots, it’s the same kind of one-sided dynamic. Where you have men having these very intimate, and often parasocial, but very, very intimate and emotional relationships with women who are performing for them for money. It’s not coming from an honest place of connection. It’s coming from a place of sort of a one-sided power dynamic, where women perform and cater to men. And so while it’s fascinating in so many ways, I do think it’s affirming the same kinds of patterns that we see more and more in technology.

Warzel: I’ve had to go and look in some of these backwater areas for the reporting on the Grok stuff, but also just in general with AI-driven sexualized images, or what have you. And something that I’ve seen in some of these communities—I mean, apart from the awfulness—is these confessional posts from men who are saying, Guys, I’m kind of ruined by this. Right? This is like, I can generate my fantasy and my dream on demand now, and then tweak it in all these different ways and do it again and again and again. And I don’t feel anything when I look at women. Or something like that. Like, I’ve seen numerous confessional posts in that regard. And I think about that with the culture. And I don’t know how much you know about this, but there was a very long Harper’s story about it: this culture of “gooning” and this idea of sensory overload. Marathon sessions of self-pleasuring and being bombarded by porn of all kinds. Where do you think this is all headed? You mentioned peaking in terms of, like, misogynist behavior. This is obviously a different category, but where do you feel like all this is headed?

Gilbert: Yeah. It seems so much like it’s an addiction. I was also looking at the Reddit page for Grok today, just to briefly see what people were talking about.

Warzel: My apologies.

Gilbert: I saw—no, no, no, it’s okay. But you know, there are posts by women saying, “I’ve discovered that my boyfriend has created undressed images of people who we know. Women who we know in real life.” Like, “He says he can’t help it; it’s a compulsion.” But you know, that’s one real-world consequence.

I have talked a lot over the past year about how the manosphere is setting men up for profound failure. Like, if you cannot see women as equal human beings, you cannot relate to them on a level of basic equality. No one is going to want to be in your life. No one is going to want to have an intimate, personal relationship with you. And we know that men need marriage and meaningful relationships as a health matter. Like, men live longer when they’re married. They’re happier when they’re married; they get less diseases when they’re married. That kind of emotional stability has a profound public-health impact. And so things like this, they are setting men up for these lives of profound loneliness. And, you know, not to sort of be like, What about the men? But it is an interesting aspect of it that I think can get lost sometimes when we’re talking about, rightfully, how awful and humiliating this is for the women involved. But like, it works both ways.

Warzel: To that point, I don’t want to focus on the men in all of this, but do you think that that’s a way, like a crack, in which somebody—like some of this can just be begun to be pulled a bit apart? I think so much about what the manosphere and a lot of these, you know, hypermasculine influencers are trying to sell, right? Which is this idea that someone’s putting something over on you, right? Like, you’re either being exploited by feminism, or whatever, you’re being subjected to something by some outside force, right? And this is a way to, like, push against it. Get a little bit of autonomy; be a man; do whatever. I feel like there’s a reverse version of that with this, right? Which is like, All of these people are setting you up. Like a reverse Alex Jones, right? Like, All these people are setting you up for this failure. They’re exploiting you to sell creatine powder, or whatever it is. And that is a way in which you can actually break free on this. Like, people are trying to subjugate you for profit, because they’re influencers, because there’s a reason to, or an intentional thing to gain. Do you feel like that is a possible way to switch the dynamic?

Gilbert: Yeah. I think it’s a way to frame it that gets people to pay attention. Certainly, maybe a different kind of person than someone who might read our stories at The Atlantic and be profoundly influenced.

Warzel: I’m always trying to figure out how to talk to the manosphere. Always, you know, direct to camera.

Gilbert: I think it is an argument that is both valid and may have more of an impact. I think the thing that I really am caught on, right now, is the thing that you and our colleague Matteo Wong wrote about. Which is that this is our red-line moment. Like, if we cannot agree that this is something that we will not do as a culture, there is no coming back from it. And particularly, I was thinking a lot about Elon Musk’s comments when my great home country of Great Britain threatened—not even threatened—but there was the specter of possibly X being taken offline. And he responded that it would be the suppression of free speech.

And I was thinking about how we’ve always, as a culture, agreed—we’ve been unanimous on this—that there are certain kinds of speech that we will suppress. And that speech is, you know, child-sexual-abuse material. That is the kind of speech that we will not tolerate in society. We do not allow it. We legislate very strongly against it. There are so many taboos against it. It is something that is really profoundly frowned upon, and we’ve always agreed on that. So why now has it become something that suddenly people—politicians in my country—are saying might be protected free speech? Like, what has changed? I really just am so much in alignment with your argument that there’s sort of no coming back if we can’t draw this line now, and set up our boundaries.

Warzel: What has changed, at the risk of sounding extremely dumb and obvious? Because, you know… a healthy society stops this. We are not stopping this. So what has changed? Is it just simply that we’re not a healthy society? Or … what’s changed that this is a red line people seem okay crossing, as long as they, you know, don’t have to wear it? As long as it just passes.

Gilbert: Yeah. There’s such cognitive dissonance in being alive right now in this moment of like, Epstein freakout, but also post–kind of QAnon, “save the children.” And suddenly people ... it’s really hard to make sense of. I think the thing that has changed is Elon Musk. And the money that he has, and the power that he has, and the position that he has. And the way that he has taken over X—in a way where it still operates just about enough like the traditional social-media site that it once was, that people feel like it might be, but it’s not anymore. It’s really 4chan with a couple of normies. Like, it’s been so profoundly radicalized. It’s just become a cesspool in a way that I think people who have stayed on it have become inured to. And it’s sort of harder to shock them, I think. And that has profoundly affected the people who are still on the platform, and the ramifications are going to affect all of us.

Warzel: I just did, at the front of this, a bit of a monologue about this moment. The red-line moment; the line in the sand. I think the hope always, with our work, when we’re talking about the culture or the technologies, is that by pushing people to a certain point, they force cultural change in some way.

If a lawmaker—if someone in power—is listening to this, what’s your message for them on all this right now?

Gilbert: I would say: This is such an easy one for you. It’s so straightforward. Morally, it’s so simple. Ethically, it’s so simple. It’s so straightforward. If you can be the person who will take a stand, you will have so many people behind you. I know it’s hard, because Musk has money, and Musk has power, and money buys you more power. But this is very straightforward, and it’s not something that the public at large are divided on. And if you think that the public are divided, it’s possibly because you’ve spent too much time on X.

Warzel: I couldn’t have said it better myself. Sophie Gilbert, thank you for coming on to talk about the world’s most depressing stuff. I think the only way to go forward, though, is to address it head-on. So the only way out is through. And thank you for helping me get through.

Gilbert: Thank you for your piece. I know; I’m sorry that you’ve already spoken about it, but I wanted to bring it up, because it’s such a good piece and everyone should read it. And I think this kind of moral clarity right now is hard to come by and very important.

Warzel: Wonderful. Sophie, thanks so much.

Gilbert: Pleasure.

Warzel: That’s it for us here. Thank you again to my guest, Sophie Gilbert, for a wonderful and difficult conversation. If you liked what you saw here, new episodes of Galaxy Brain drop every Friday. You can subscribe to The Atlantic’s YouTube channel, or on Apple or Spotify or wherever it is that you get your podcasts. And if you enjoyed this, found some value in it, remember you can support the work of myself and all the other journalists at The Atlantic like Sophie by subscribing to the publication. And you can do so at TheAtlantic.com/Listener. That’s TheAtlantic.com/Listener. Thanks so much for listening and watching. And I’ll see you on the internet.