Bluesky melts down over Jesse Singal

In the first week of December 2024, controversial journalist Jesse Singal joined upstart social network Bluesky. Bluesky had been experiencing massive user growth since the election result had been called for Donald Trump in November, as many users of X (formerly Twitter) looked to jump ship for a vessel not helmed by Trump ally Elon Musk.

Due to Singal’s record of journalism critiquing the case for youth gender transition and aggressively clashing with his critics, his arrival created immediate tension within Bluesky’s existing user base, which skewed transgender. Many trans users had adopted Bluesky early, at a time when Musk was already making them feel unwelcome on X, for example by labelling the term “cisgender” as hate speech.

Singal soon became Bluesky’s most blocked user, but many didn’t simply want him blocked: they wanted him gone, and were prepared to keep raising the stakes until that happened. One of the tactics employed was to throw additional mud, accusing Singal not only of misconduct in his reporting on trans issues, but also of being a pedophile.

On 13 December, Bluesky’s trust and safety team, led by Aaron Rodericks, made a ruling. Ignoring the pedophilia slurs (which his more level-headed critics on Bluesky recognised as risible and false), and focusing on more credible allegations that Singal had infringed the privacy of Bluesky users, the team ultimately decided that despite the outcry, he had done nothing to violate the Bluesky terms of service and could stay. But was this the right decision?

The case against Singal

I am the father of a wonderful trans daughter, who began her medical transition while she was underage. The policies advocated in Singal’s journalism would have made it more difficult for her to begin that transition. While I cannot speak to the experiences of other trans children, I do know that any such delay to my daughter’s transition would not have been what she needed.

While I accept Singal’s right to conduct journalism that I disagree with, I ultimately supported those who called for Singal to be banned from Bluesky, on the grounds that he had crossed a line by violating a rule against promoting material from hate groups when he posted a screenshot from the deplorable doxxing and cyberstalking community, Kiwifarms.

When Cloudflare banned Kiwifarms in 2022, it identified the website as an exceptional case in which censorship was justified due to its promotion of real-world violence. By allowing Singal to post screenshots from Kiwifarms, Bluesky has emboldened other users to do the same, which can only lead to more doxxing and harassment of trans people.

With that said, Singal and I also have a history. In 2021, I had spoken to Singal’s colleague Katie Herzog on background for an episode of their podcast, Blocked & Reported, in which the two journalists investigated one of my earliest trust and safety clients, MAP Support Club (MSC) — a support group for teenagers and adults who identified as experiencing sexual attraction towards younger children, which they had chosen never to act upon.

Both the sensitivity and the importance of this group were immediately apparent to me when I first assumed its trust and safety role, and I took that role very seriously:

  • I consulted child sexual abuse (CSA) prevention professionals about what safeguards would be needed, and raised funding from the Just Beginnings Collaborative to support those measures.
  • I secured a partnership with survivor-led CSA prevention group Stop It Now, to ensure that its helpline operators were available in the group to provide regular guided group support sessions.
  • I commissioned the development of software that would ensure that no illicit content was uploaded to the club’s chat forum, and collaborated with MSC’s administrators on the strengthening of the group’s safeguarding rules.
  • I engaged an independent team from Nottingham Trent University to conduct an evaluation of the group’s safety and effectiveness.

Singal and Herzog conducted their investigation of the group in the face of a fierce social media backlash against it and its fiscal sponsor, driven by those wrongly convinced of a conspiracy theory that the group was a front for a grooming operation.

But the journalists found otherwise, concluding that “They are genuinely trying to make life better, to reduce the likelihood of children being harmed. And they’re also trying to save the lives of people who have a… shitty lot in life.” Independent academic experts have since reached the same conclusion, and the group remains active today, in partnership with professional clinicians.

Beyond this, the only evidence offered to support the slurs against Singal was a series of articles he wrote that his critics perceived as being too sympathetic to pedophiles, in that they recommended “treating people with pedophilic interests like human beings who can be reasoned and empathized with.” Such a sympathetic framing can be uncomfortable to read, but it does align with the approach taken by mainstream public health professionals.

The case against Rodericks

The movement to have Singal expelled from Bluesky didn’t stop with him, but extended to Bluesky’s Head of Trust & Safety, Aaron Rodericks, over his inaction on the matter and his perceived favouritism towards Singal. And in echoes of an attack previously levelled against Twitter Head of Trust & Safety Yoel Roth by Elon Musk himself, this included spreading false allegations that Rodericks too was a pedophile.

As far as I know, I have never met Rodericks personally. However, although my Bluesky presence was then (and still is) pretty small, he and I had been mutual followers since about early 2024 due to our shared industry connection. At that time, another brouhaha was brewing on Bluesky over how it was enforcing its guidelines against child abuse. While Bluesky was banning users who promoted or excused abuse, many argued that it ought also to ban those who liked or shared suggestive-but-legal artwork of fictional characters resembling children or animals, and those who admitted to struggling with pedophilic impulses.

One user who argued that Bluesky had the balance right and shouldn’t be engaging in a broader crackdown was a queer artist named Terra Wilder, who wrote (from a now-deactivated account):

Ok, what alternative should they have as a space to be social because they’re still human beings. Your idea is entirely unreasonable, they’ll never stop trying to enter spaces. Or should we just kill them all. That’ll definitely stop them from emerging forever. … Nobody wants to handle this as a realistic problem it is just gut disgust and no solution.

This prompted a pile-on of abuse against her in which she herself was accused of pedophilia, with the ultimate outcome that she attempted to take her own life and was admitted to hospital. I expressed outrage at this in an exchange of my own directed at the prominent 35-thousand-follower account that had been leading the pile-on. Referring to Terra and to another user, Jamie, who had been subjected to a similar pile-on, I wrote:

Jamie is right. There is no credible case to be made that either they or the person who was hospitalised were pro-abuse. You just don’t want queer people like them to have community. But they correctly call you on your bullshit: this isn’t about abuse. It’s about you taking offense and lashing out.

In retaliation for these comments, the user in question posted false smears against me based on an article written by Anna Slatz, a notorious transphobic far-right journalist and Kiwifarms user who had once published a Nazi manifesto. The secondary source was a smear website run by a disgruntled former volunteer colleague of mine at the Internet Corporation for Assigned Names and Numbers (ICANN), who was dismissed from that organisation after stalking and abusing several other colleagues.

Rodericks was then drawn into the dispute by association simply because he followed me. Rather than ignoring this bullying, or at least investigating the sources behind it as he should have done, Rodericks capitulated and unfollowed.

Predictably, this wasn’t enough to satisfy these extremists, who continued to push for a mass crackdown on accounts associated (in their minds) with pedophilia or zoophilia, which Bluesky eventually implemented in November. One message from a Bluesky moderator (or perhaps an AI) to a user banned over furry art provided a sweeping justification for art censorship:

Depicting sexual acts between humans and animals, even as art, is deeply problematic. Animals cannot consent, and such depictions promote the exploitation and abuse of animals. Art has a powerful influence and can normalize harmful behaviors, unintentionally endorsing or promoting these acts. Therefore, it’s crucial to avoid representing such.

But predictably again, this crackdown resulted in the accounts of many innocent trans people being targeted, which Bluesky itself acknowledged when it reversed many of the bans by late November. While attempting to placate one faction of users, Rodericks had outraged another. So it goes in this profession.

Six months after Terra Wilder’s hospitalisation, many remain convinced that Rodericks, Jesse Singal and I are all engaged in a joint conspiracy against them aimed at promoting pedophilia and undermining trans people. It would be laughable if the real-world consequences of such misinformation and bullying campaigns were not so serious.

Lessons for Bluesky and other platforms

I have always insisted that platforms have a social as well as a legal responsibility to avoid hosting sexual abuse content or facilitating grooming. In other blog articles, I have outlined some of the practical approaches that they can implement for finding and reporting CSAM, and for making their platforms safe for younger users.

But at the same time, not everything that causes users to cry “pedophile” should be actioned. In fact, much of the time when this word is uttered, it is used purely for its rhetorical effect, by users who are themselves engaged in antisocial behaviours such as targeted harassment.

The users who engage in such pedojacketing abuse may honestly feel that they are justified in doing so. Jesse Singal’s journalism has undoubtedly made it more difficult for those seeking to transition, and his reporting on pedophilia, while scientifically accurate, may be legitimately triggering for abuse survivors, who may feel that he expresses greater sympathy for pedophiles than for them.

Associating professionals with the stigmatised populations that they report on or work with is so common that there’s even a term for it: courtesy stigma. But that doesn’t make it OK. Too often, the journalists and trust and safety professionals who are targeted are themselves marginalised. It is no coincidence that pile-on campaigns targeting these individuals attract participants from far-right groups such as Kiwifarms, even when they may have been initiated by progressives with good intentions.

As other platforms have also discovered, a small but very vocal faction of users who fling pedophilia smears can have an outsized negative influence over a platform’s moderation practices. Bluesky must learn to resist that influence. Its composable moderation architecture already provides effective mechanisms for users who are triggered by content that offends them. Bending further to those who use ugly false smears to get their way is not in the longer-term interests of Bluesky as a healthy community.
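To give a sense of what those mechanisms look like in practice, here is a minimal sketch using the @atproto/api TypeScript client. The handle, app password and list URI below are placeholder values, and the convenience methods shown are assumptions based on the client’s documented user-side controls; this is an illustration of the principle, not the author’s tooling or a recommendation of any particular list.

    // A minimal sketch: user-side controls in Bluesky's composable moderation
    // model, via the @atproto/api client. All identifiers are placeholders.
    import { BskyAgent } from '@atproto/api'

    async function curateMyOwnFeed() {
      const agent = new BskyAgent({ service: 'https://bsky.social' })
      await agent.login({
        identifier: 'alice.example.com',   // hypothetical handle
        password: 'xxxx-xxxx-xxxx-xxxx',   // an app password, never the main one
      })

      // Mute a single account: its posts disappear from this user's feeds and
      // notifications, while the account itself is left untouched.
      const profile = await agent.getProfile({ actor: 'unwanted.example.com' })
      await agent.mute(profile.data.did)

      // Subscribe to a community-maintained moderation list, muting every
      // account on it in one step (assumed convenience method).
      const listUri = 'at://did:plc:examplelistowner/app.bsky.graph.list/example'
      await agent.muteModList(listUri)
    }

    curateMyOwnFeed().catch(console.error)

The design point is that these controls act for the individual reader, or for communities that opt in to shared lists and labelers, without requiring anyone to be removed from the network. That is precisely the pressure valve that should make platform-wide bans a last resort rather than the first demand.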


