This is a transcript of a conversation between Jeremy Malcolm, Brandy Brightman and Aurélie Petit held on May 23, 2025 for the podcast Beyond the Filter, which is presented here edited for length and clarity.
Brandy: Hello and welcome to Beyond the Filter, a podcast about censorship. My name’s Brandy…
Jeremy: And my name’s Jeremy. In this episode, we’ll be talking with a special guest, Aurélie Petit, who is a social and visual researcher of animation and technology and a PhD candidate in the Film Studies Department at Concordia University Montreal. She’s just published an article in the Journal of Porn Studies titled “The Limits of Zero Tolerance Policies for Animated Pornographic Media.” Welcome to the podcast, Aurélie.
Aurélie: Hi, thank you for having me.
Jeremy: So congratulations on the article. … First, I wondered if you would relate the story that you open your article with about what happened on Twitch in 2023.
Aurélie: Yes, Twitch, you know, as a response to fans’ criticism that the platform was targeting sexuality where they felt it was unjustified—and a lot of queer people were being targeted, a lot of women, everybody who was showing too much nudity, like, if you could see your cleavage on Twitch, you would be considered basically doing porn—the platform started to really soften its policies over adult content. And especially when it was about animation, you know. They were saying, if you were drawing, sculpting, or making animation on Twitch, and it depicted naked people, that was fine. And in one day, so many people made porn, like animated characters in porn, that the day after, the platform decided to walk back its policy. In one day, which is crazy for a platform like Twitch.
Jeremy: Well, I think that’s not the only example of something like that happening. Didn’t OnlyFans also turn their policies around in a couple of days or so? So, yeah, I think one of the themes of your paper is that these platforms maybe don’t have very well thought-through policies on animated content, versus real content that depicts the same thing. So, what did you discover in the platforms that you looked at? I think there were 30 platforms, right?
Aurélie: Yeah, so, you know, I used this example of Twitch to open the question of, as you say, platforms not knowing what to do with animation. And it ended up having them write policies that are totally weird or go against common sense. And so, I was curious about this question of, how do you moderate animation? And especially, how do you moderate animation on a pornographic platform that actually has to decide, is this “good sex” or is it “bad sex”? And so, I decided to find 30 pornographic websites and I took their policy documents, and, you know, I was focusing on this question of age. So, I wanted to see, when they talk about age and, you know, child pornography, how do they include animation in this discussion?
Jeremy: Yeah, so I noticed it’s interesting that you used the term “animated child pornography,” because child pornography as a term is not really used anymore, is it? We’ve started to move to child sexual abuse material. But speaking for myself, I always feel uncomfortable using the term child sexual abuse material when there’s no child sexual abuse, right? When there’s no child. And in that way, even though child pornography is an antiquated term, it’s more appropriate in some ways.
Aurélie: I also feel uncomfortable talking about child sexual abuse materials, you know, and talking about child exploitation, knowing that, of course, it’s a spectrum. Like, you know, computer-generated imagery, sometimes that is AI, where the child may be used as a model, you know, and it’s maybe part of an exploitation circle or trafficking, you know. But then you also have the Bart Simpsons of the world, as I call them, you know, and it’s like, it’s a spectrum. And what I was able to see in a lot of those policies is like, they don’t make a lot of space for this spectrum, you know? A very problematic, hyper-realistic animated image of a child in a sexual context was, at least on paper, put at the same level as a very cartoonish representation of a child.
Brandy: And I feel like animation is just fundamentally ambiguous, because with live action, you’ve got an actor involved, a person, and they’ve got an unambiguous age. No matter what they look like, no matter what age they’re playing, they have a true age, which you can use to ultimately judge the content. But with animation, they don’t have a true age. They just have however old the creator decides to say they are on that day, and however old they look. And the age of the character may change on the creator’s whim, and you can’t really judge purely on appearance, because in real life you can’t judge age purely on appearance either. And the stylization of animation makes things even more ambiguous. A creator may have a character that they say is usually 15, but whenever that character appears in pornography, they may just claim that they’re 18, even though they draw them exactly the same way.
Jeremy: Aging up.
Aurélie: Yeah. That’s why I’m more interested in talking about problematic representation, you know, than about age and gender. Like, we can have a conversation about, you know, representing young-looking characters in pornography. And it’s a better conversation than trying to argue what age a character is. But, as a community, we can wonder, do we want the imagery of pornography to be mostly young-looking people? Do we want it to be the main representation of animation? Knowing that it’s not always the case, but it’s also often made by, you know, sexist animation studios, sexist creators. Like, it’s a more interesting conversation for me.
Jeremy: Isn’t there also a cultural history behind, particularly, as you mentioned, lolicon and shotacon? Those are particular genres that have a stylized look for their characters that isn’t directly related to age, right? And I think you found only two of the platforms that you looked at actually used that fandom terminology to talk about particular types of content that, by those names, you know, referring to them as shotacon and lolicon content.
Aurélie: Well, again, if we link it to this idea of problematic media representations, the history we have for the beginning of lolicon is that it was a genre that kind of always existed. You can trace it back to the golden age of manga, but then it really solidified when a group of men in Japan started to realize that yaoi [, gay male porn for women, was] taking a lot of space in the fandom, the conventions, so as an answer they decided to really popularize this representation of young girls. So this woman-exclusionary, queer-exclusionary story is very much part of it. So, for me, that’s kind of the history of it that I see. And then, because those representations match perfectly within the patriarchal society, they became the norm. But, at the origin, they were made in a sexual setting, just as a way to make women feel uncomfortable, which they succeeded at, you know. And loli has a complicated history now because, yes, it’s very present in pornography, but also this idea of the young, moe, cute, kawaii girl is also just very popular now. And of course, you also have lesbian and queer creators in Japan who are also making those kinds of manga. And I don’t want to make them totally disappear from the conversation.
Jeremy: That’s part of the problem, isn’t it, with a zero tolerance approach: you can’t really draw those lines. And so one of the main risks of a zero tolerance approach that you mentioned in your paper is that violators can actually be reported to authorities over this sort of content, right? So, what consequences can platform users suffer if they’re reported to authorities over animated content?
Aurélie: Well, you know, the language that platforms use in their policies reveals a lot about the kind of moderation they want to perform for a reader or for online users. So, to say “zero tolerance” is very stern legal language: we’re showing we have zero tolerance over it, we are in compliance with the authorities, we even collaborate with them. It means that they can provide the authorities with all of the data they have about the user. And the problem comes when it’s animation or a cartoon, because they’re going to use the same language and they’re going to group animation with child sexual abuse materials.
Brandy: So I find it crazy to think that you could have someone posting a picture of a real child being abused, and then on the other hand, you could have someone posting a racy picture of, like, Summer from Rick and Morty, and they could get treated exactly the same. That just sounds insane to me.
Aurélie: Yeah, I was talking with a content creator who used Patreon. So Patreon is actually interesting for adult content creators because their policies leave a lot of room for animation and pornography. You know, while they explicitly state “we do not want live action pornography,” they do allow animation, cartoons, and illustrations. But I was talking to this creator who had had his entire account deleted in one day without being able to negotiate, because suddenly they decided he was doing bestiality or it was incest or something, and he was like, “I was not doing anything that I had not done before.” But because they’re using those big terms, it’s very dangerous when it says bestiality, child sexual abuse materials. All of these are like paraphilic sexualities that we have a fair reason to believe should be banned and illegal [in real life]. But for animation, you know, he was doing 3D monster porn.
Jeremy: Yeah, I’ve heard that Patreon treats monster porn as bestiality. And I believe that actually comes not from Patreon themselves, but from their payment processor, which enforces that rule. And of course, that throws up a whole lot of questions about furry fandom and their right to express themselves in visual form.
Brandy: Yeah, I feel like we’re entering another spectrum of ambiguity. We’ve got the age spectrum of ambiguity. And then we’ve got the sentience spectrum of ambiguity, where you’ve got like animals on one side, human on the other, and then in the middle, you’ve got like these anthropomorphized animals and bestial aliens and things. Where do you draw the line there if you’re talking about bestiality?
Aurélie: Yeah, and that’s a problem for a platform. So that’s why I end up not even advocating for better tools, because I don’t think the answer is there, you know. The answer is not to have tools that are going to be more efficient at flagging, because it’s a cultural problem.
Jeremy: Yeah, there is something else that you refer to in your paper, about more reliance being placed on AI for moderation. It’s not going to solve the problems, right? It’s just going to push them to a different level, where we have to assess whether these tools are going to do a better job at making the important differentiations between different sorts of content. Are they going to do a better job than humans?
Aurélie: Mm-hmm. Which leaves even less room for negotiation. It’s exactly what happened with Twitch, you know, to people who had hundreds of thousands of followers. And it’s the same for the person I was talking to who was a Patreon content creator. That’s how he pays his rent. And he lost his platform in one day. And I feel like people are going to assume AI moderators are more objective when they’re really not.
Brandy: Like, I read an article the other day that an AI couldn’t even tell that a stick bug wasn’t a stick, where humans clearly could. I mean, it’s going to get better, but… I think we’ve brought up before that AI has its own biases. Was it in your article that you mentioned that it has a harder time recognizing darker faces than paler faces?
Aurélie: Yeah… But then if it doesn’t work, if animation is always going to be wrongly flagged, then this myth of efficiency of those AI automated detection tools doesn’t exist anymore.
Jeremy: So just to change the topic slightly, I want to ask you about what we should be doing to reduce the harms of content that maybe people don’t want to be exposed to other than reporting them to authorities, which is clearly disproportionate and harmful in itself. What else can we do about people being exposed to content that they don’t want to see?
Aurélie: Well, you know, there are some things that really surprised me when I started to think through this research. I went on Pornhub and I wrote “lolicon.” And then I got an automated pop-up that told me, you know, “if you’re a pedophile and you need help, you can go on this website.” And I felt like I started to see the contours of the problem. Me, I was doing it as a researcher, but I was like, if I am a consumer and I see this, my first answer is going to be, “Pornhub doesn’t know what’s happening.” You know, “they don’t know what they’re talking about. They don’t understand that, like, I’m just looking for cartoons. I’m just looking for an image.” And it’s crazy. And like, I think I would feel very defensive. And there’s a very libertarian approach to, you know, animation and anime porn that has always existed in the anime fandom community, because for a long time, pornographic animation was unfairly targeted. And so people became very protective of it. And it created a lot of discourse that was dismissive of concerns over age and gender that we can legitimately have. Again, it’s not saying, “let’s ban all animated pornography” if we just say, “it’s kind of crazy that all of those girls look so young.” It’s a conversation that can be nuanced and exist. So for me, it would be, first of all, for those platforms to start using those terms used by fans, like shotacon and lolicon, and actually understand and define them. Being able to explain in their policies where they’re using them, and why this content is actually banned. You know, instead of just saying, “we ban all child pornography (real, virtual, drawing, cartoons),” it’s to say, “we ban all child sexual abuse materials. Within those child sexual abuse materials, we include animation. Here’s why.” I think it would be a beginning of an answer.
Jeremy: I think you’ve acknowledged that there is a spectrum, right? There is definitely material that is perhaps AI-generated, trained on real humans, and we don’t want to see it. And then there’s also really creative, artistic content that does reference characters who may be of indeterminate age. And the creators of that content may themselves be queer. They may have legitimate artistic points to make, and yet it’s all bundled together under the same category. Now, my fear is that if we do that, then people who have legitimate content are going to be forced to post that in dark spaces, in encrypted channels, maybe on the Tor network, places like that, where we also find real child sexual abuse material. So I feel like there has to be some middle ground where we can post content that fits within that part of the spectrum on the clear web, rather than pushing it into the dark web, which is only going to associate it more closely with real abuse content.
Aurélie: It’s actually interesting because today I translated another article I wrote for Porn Studies called “The Hentai Streaming Platform Wars,” which I can send to you afterwards if you want. So I was reading it again today; it’s an article I published in December on the ecosystem of pornographic animation online, and all of those streaming platforms that exist. A lot of them are totally black boxes, you know: we don’t know where they come from, we don’t know who is running them, we don’t know how exactly they make money, but they have a lot of content. And a lot of the content is on the margin of legality and probably illegal in a lot of countries. So I think first we would need to have a pornographic platform that is dedicated to animation, because then this kind of platform would allow for much more nuance, because they would only be talking about animation. They would not have to even think about this question of, “but what do you do if it’s a real child?”, and I think it would solve a lot of problems.
Brandy: As someone who consumes a lot of fantasy content, it muddies the water even more. Like, I was watching an anime the other day and one of the characters looks like a child and is often mistaken for a child by a lot of the other characters, but they’re actually basically a hobbit; they’re like a middle-aged man with a family of three. So it just made me start wondering, how would you classify it if someone decided to make hentai of this character? He’s got the maturity and the chronological age of, like, a 35-year-old, but he looks about 11.
Jeremy: So how much of this is about media criticism and like media education, rather than making it about censorship as it is at present?
Aurélie: Zahra Stardust, she’s an Australian activist and tech writer and academic. I remember she said in her recent book Indie Porn that we insist so much on telling people that porn is not real, especially teenagers. But maybe we should start to tell people that actually porn is real, you know, it is an industry. And it’s the same with those representations: instead of saying, “oh, but it’s not real,” it’s to say, actually, it is real; someone is making it. It’s a question I used to ask my students, because I was teaching animation and we had a week on pornography. And I was like, are those characters sex workers? And of course they were all saying, no, they’re not. And I was like, okay, so where are the workers here? You know, where are the people? What do we think is behind this short animation? Someone made it. There was maybe a voice actor involved. To kind of make the industry appear, I think that’s part of education. And, once you start to think of the industry, you start to think also about the political economy of it, and the cultural economy. And like, what are the politics of the creators? What kind of content do you want to consume? And what kind of content do we want to distribute?
Brandy: Are you saying that the thing we should use to judge a piece of content is the character of the creator who made it?
Aurélie: More to understand the context of the production. Because if you come to understand that, like, the Bart Simpson [porn] is a parody, it’s a commentary on how those comics are supposedly super family-friendly, and actually they’re not family-friendly. But they’re not supposed to be pornographic. It gives you some hints to understand them.
Jeremy: Yeah, I think framing these as problematic rather than illegal, and not using terms that impute illegality, is a lot more helpful. It really allows a lot more space for a conversation around these works, rather than shutting down conversation, as we otherwise would do by applying blanket categories of censorship to them.
Aurélie: Can I give you another example that is maybe more related to the question of age? In my thesis right now, I’m looking at this anime called Kite, and it was distributed in the US in the 2000s. But the first time it was distributed, it was heavily edited, and some fans were mad because they were like, “oh, they took off all of the pornography part of it.” But among the pornographic parts that the editors decided to take out during importation, there were a lot of scenes of rape against a child. It was animation, but you had a fan movement to bring back the unedited version. And again, it was a very libertarian attitude, super resistant to the idea of any kind of editing being done. But then you put it in perspective and you’re like, it was a very problematic media representation. Like, how do you think the younger women in anime fandom felt to see fans being like, “it is actually super important for us to be able to see the rape of this child”? Obviously, a fictional child, but still. Sometimes it’s about creative freedom, which is of course super important. But sometimes it’s like, let’s take a step back and actually wonder what we are fighting for. And, again, how does it make all the people feel when we do this?
Jeremy: So, you’ve mentioned your PhD research. Is there anything else in your research that you’re planning to publish articles about? Or is there any other research that is on the horizon for you?
Aurélie: I have another article that is about the metaphor that a lot of AI porn platforms use about, “oh, you can do whatever you want,” but actually there are policies and rules. A lot of those rules are good, but what does that mean when they’re pretending that there are no rules, and then how do you actually apply those rules to content that is not realistic? So I’m working on this and I’m hopefully finishing my thesis in the next couple of months.
Brandy: I feel like, yeah, AI is going to be a big fly in the ointment, because we’ve got live action, and then we’ve got animated. But then AI is going to create this whole other subgenre of things that look live action but don’t actually involve people, which I think are going to have to have their own completely individual set of rules.
Jeremy: Yeah, I mean, one of the problems is that if there are only two categories, real or fictional, then AI is always going to be the thin end of the wedge to regulate all fictional media. And so I think there is some merit in saying maybe there should be a third category of content in the way that we regulate it. And that is going to be incredibly contentious because, as you may know, just in the last few days, the US has put a moratorium on new state-level regulation of AI. So we are kind of stuck in an unregulated state for a while.
Aurélie: If this is the last part where I can give advice, it’s to look at how other communities who have always dealt with those questions have been dealing with them. Actually get interested in the history of the moderation of non-realistic content. Because animation has always existed, and animation has always been regulated. Look at what worked, look at what didn’t work. Talk to those people who are fans and who will actually be the consumers being impacted. And get curious. That’s why I’m also defending using terms like lolicon, which actually speak to this community; they understand what you mean.