As a strong supporter and veteran of community-based platform governance, it gives me no pleasure to observe how badly the volunteers on English Wikipedia are managing the platform’s child safety issues. The root problem is that the Wikimedia Foundation – the legal entity that hosts Wikipedia and its sister projects – has delegated tasks to its community that simply shouldn’t be performed by untrained volunteers.
Wikipedia’s troubled history in child protection is a legacy of the Foundation’s misguided approach. As detailed in a previous article, for long periods it hosted articles such as a catalogue of actual CSAM, and a euphemistic article on “Adult child sex”. Capitalizing on its growing notoriety, in 2010 Larry Sanger, ousted co-founder of Wikipedia, reported the platform to the FBI for allegedly hosting CSAM, while also referencing comments from the Wikimedia Foundation’s then Deputy Director seemingly suggesting that children could consent to sex with adults.
Wikipedia stumbles in addressing the problem
By this time, alarm bells were ringing, and founder Jimmy Wales personally intervened to ensure that a strong correction was made. From a trust and safety standpoint, the correct response to address Wikipedia’s crisis would have involved two steps: first, the development of a clear, evidence-based policy for assessing what content was acceptable on the platform and what was not. Second, ensuring that the platform had properly trained staff to enforce this policy according to its terms.
Wikipedia did at least succeed in executing the first step in this recommended response. The same year that it was reported to the FBI, it developed a written and tolerably clear child protection policy. But it stumbled in taking the second necessary step of following up with trained enforcement, instead entrusting this role to untrained and mostly anonymous volunteers.
As described in my previous article, the most active of these volunteers, who post on “pedophile hunter” forums hosted off Wikipedia, have approached the task with a zealous, vigilante mindset. With little discrimination or tolerance, their targets have included anything that they perceive as evincing a tolerance for “pedophilia”, broadly defined – from a Netflix movie poster to an article on Puberty.
Notionally, certain disputes over child protection bans are subject to review by Wikipedia’s Arbitration Committee (ArbCom), as well as being ultimately subject to the organizational oversight of the Wikimedia Foundation’s legal team. However, in practice neither body has been willing to intervene even in cases of the most egregious misapplication of the child protection policy, such as a series of decisions banning expert editors for adding factual and neutral information about child sexual abuse prevention.
How this became personal
I had been a Wikipedia editor since 2005, periodically writing and editing articles on a range of topics within my areas of expertise, including Internet governance, consumer protection, copyright, international relations, and child protection. In May this year, a colleague alerted me to a pending proposal to delete an article on the term “Minor Attracted Person” (archived here), and I decided to weigh in with my perspective.
There is no doubt that that term, and its abbreviation MAP, have become a divisive weapon in the culture wars over sex and gender, perhaps second only to the term “groomer.” This is a shame, because the professionals who predominantly use the term do so in order to draw an important distinction between people who are attracted towards children, and those who have acted on such an attraction. As Dr Gilian Tenbergen explained to the Wikimedia Foundation, “It is unsurprising that professionals who are trying to prevent people with pedophilia from becoming child abusers would encourage the use of an identifier that isn’t considered synonymous with ‘child abuser’.”
As a professional who works in the field of CSA prevention, I thought that this view deserved to be aired in the discussion, and therefore posted a single paragraph comment referencing the term’s professional usage and arguing against deletion of the article. At that time I assumed that a decision would be taken one way or the other, and that would be the end of the matter. If only I had known what would happen next, I would never have weighed in to begin with. I should perhaps have heeded the words of former Wikipedia administrator Guy Chapman who wrote:
If you have a job or a family, do not get involved with controversial subjects on Wikipedia. There are people out there who are batshit insane and will genuinely try to get you fired from your job, call you at home, solicit burglars to raid your home, and invite predators to look at your kids.
As someone who had previously been subject to alt-right attacks over my work, this cuts close to home. Indeed, the backlash to my comment from Wikipedia’s volunteers was astonishing. In an out-of-control discussion resembling nothing so much as a 4chan thread, one labelled me “vile”, another insinuated that this very blog contained improper “child-related content” (before purging two links to it from the encyclopedia), and another huffed that “persons of this character” are not welcome on Wikipedia.
On that same day, my long-standing account was banned, and several subsequent attempts I made to have the decision substantively reviewed were stonewalled or summarily declined. Several other editors, including researchers who specialize in this field, were also banned in the sweep, and multiple articles were deleted, including “Primary prevention of child sexual abuse” and “Allyn Walker” (I wrote about Allyn’s demonization previously).
Last week, I lodged my final appeal to Wikipedia’s administrators over this improper ban, which I reproduce here in full:
Please consult my user talk page for a brief history. As described there, I previously made a UTRS appeal #74743, which was closed with the suggestion “Due to the complexities therein, it would be best for user to email Trust and Safety.” I subsequently did email Trust and Safety, and set up a face-to-face meeting with Jan Eissfeldt, Global Head of Trust and Safety, which took place on July 13.
He and I discussed a growing concern among public health professionals that English Wikipedia’s child protection policy is being misapplied to censor information about child sexual abuse (CSA) prevention, and to intimidate and silence editors who write on this topic. My own ban from English Wikipedia is an example of the policy being misapplied in this way.
While receptive to this concern, Mr Eissfeldt expressed that there was no action that his office could take to correct the misapplication of the policy on English Wikipedia, which is an independently self-governed project. He suggested instead that a representative of the professionals involved could write a submission to the Wikimedia Foundation concerning its proposed new Child Sexual Exploitation Policy. That policy, and guidelines for its enforcement, would be disseminated to English Wikipedia and the other projects in due course.
Following Mr Eissfeldt’s advice, such a submission was written by a representative of the CSA prevention profession and sent on July 28. It reads in part: “It is entirely appropriate for the Foundation to ensure that its platform is not misused to sexualize minors or to advocate or promote inappropriate adult-child relationships. However, … many in Wikipedia’s community who have attempted to enforce its child protection policy have done so not against child abusers on the platform, but against legitimate editors and even child protection professionals, on the basis of misunderstandings about prevention science. … It is crucial to understand that professionals who address misunderstandings about pedophilia are in no way condoning, minimizing, or excusing the intolerable and illegal act of child sexual abuse. … When editors who contribute information about pedophilia and child sexual abuse are treated with suspicion, hostility, and ad hominem abuse as has been the case on English Wikipedia, this creates a chilling effect that can only dissuade knowledgeable editors from contributing to the Foundation’s projects. This in turn will reduce its quality and allow misinformation to flourish.”
While the new Child Sexual Exploitation Policy, when complete, will hopefully help to resolve these systemic problems on English Wikipedia in the future, the existing dispute over my wrongful ban from English Wikipedia under that policy still remains to be resolved. To summarize the reasons for my appeal:
- As a full-time professional trust and safety consultant and former Executive Director of a child protection organization, I have legitimate professional interest and expertise in this subject matter. I have spoken at professional conferences on the subject, and have professionally advised multiple small and large online platforms on best practices in the fight against child exploitation.
- Wikipedia’s child protection policy is aimed at preventing editors from misusing Wikipedia to pursue relationships with children, from advocating for or justifying such relationships, or admitting to their own pedophilia. Nothing that I have ever done, on or off Wikipedia, amounts to a violation of this policy in any way.
- It’s important that the policy is not interpreted so over-broadly that it would preclude on-wiki discussion of stigma-free interventions used to reduce rates of offending by individuals with pedophilia. Such interventions enjoy a broad scientific consensus.
Scarcely one hour later, and with no explanation, this final appeal was denied.
The dangers of entrusting enforcement to the community
Child protection, more so than other trust and safety issues, is a sensitive and nuanced topic. One of the most sensitive aspects of it concerns how easily bad faith allegations can be levied online, which invoke child abuse simply to provoke a heated emotional reaction. These bad faith allegations are seldom directed against actual child abusers, but rather against LGBTQ+ people, against children in raucous online communities, and even against researchers, clinicians, and trust and safety professionals.
By entrusting enforcement of Wikipedia’s child protection policy to untrained community members, the Wikimedia Foundation has unwisely allowed its policies to become a subject of such culture war wrangling. An important lesson for the Foundation here is that while encouraging community members to produce and manage content is one thing, granting them authority over platform policies and their enforcement is quite another.
In particular, volunteer community members should not ordinarily be empowered to take direct action banning other members, especially not in a sensitive topic area where bad faith allegations and slurs are routinely exchanged. Imagine if verified users of X (Twitter) could ban other users from the platform on the basis of false allegations of pedophilia, without any oversight by trust and safety – this is essentially what the Wikimedia Foundation has accepted as a norm.
This doesn’t mean that community volunteers can’t be entrusted with moderation tasks at all. Indeed, one of my own clients relies heavily on volunteers in its trust and safety workflow. But they are provided with training, and their powers are very tightly limited: they can hide comments, issue warnings, and block users from posting content that would appear in public content feeds. But they can’t hard delete anything, and they certainly can’t ban users from the platform. Furthermore, these volunteers are specifically disallowed from taking action on child protection issues, which must be escalated to a staff member.
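To make the tiered model concrete, here is a minimal sketch of how such a permission policy might be expressed in code. The role names, actions, and function are purely illustrative assumptions of mine, not my client’s actual system; the point is simply that volunteers are confined to reversible actions and that child safety reports are always escalated to staff.

```python
from enum import Enum, auto

class Role(Enum):
    VOLUNTEER = auto()
    STAFF = auto()

class Action(Enum):
    HIDE_COMMENT = auto()
    ISSUE_WARNING = auto()
    BLOCK_FROM_FEEDS = auto()
    HARD_DELETE = auto()
    BAN_USER = auto()

# Reversible actions that trained volunteers may take on ordinary reports.
VOLUNTEER_ACTIONS = {Action.HIDE_COMMENT, Action.ISSUE_WARNING, Action.BLOCK_FROM_FEEDS}

def may_act(role: Role, action: Action, is_child_safety_report: bool) -> bool:
    """Return True if this role may take this action on this report."""
    if role is Role.STAFF:
        return True
    if is_child_safety_report:
        # Child safety reports are never handled by volunteers:
        # they must be escalated to professional staff.
        return False
    return action in VOLUNTEER_ACTIONS

# A volunteer can hide an ordinary comment...
assert may_act(Role.VOLUNTEER, Action.HIDE_COMMENT, False)
# ...but cannot ban users, and cannot touch child-safety reports at all.
assert not may_act(Role.VOLUNTEER, Action.BAN_USER, False)
assert not may_act(Role.VOLUNTEER, Action.HIDE_COMMENT, True)
```

The design choice worth noting is that the escalation rule is checked before the action whitelist, so no future addition to the volunteer action set can accidentally grant volunteers authority over child safety cases.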
The Wikimedia Foundation would do well to follow this model – and there is some prospect that it might. The Foundation is currently consulting on a new Foundation-wide child sexual exploitation policy that would take precedence over Wikipedia’s community-developed policy. Along with this would come guidance on how Foundation projects should self-enforce the policy – and at what points they should defer to the Foundation’s professional trust and safety team.
This will be a welcome development. In the meantime, it saddens me that Wikipedia cannot be regarded as a reliable source of information on child protection, nor as a safe space for sexuality researchers, clinicians, or social workers whose work on child protection follows an evidence-based, public health approach.
Wikipedia’s experience is a cautionary reminder that when trust and safety functions are delegated to the community, the platform’s owner is giving up direct control of them. This means that they can and will be captured to advance particular political or social goals – especially when the functions relate to child protection.
Conclusion
Wikipedia’s record on child protection has swung from one extreme, in which pro-child abuse content was openly hosted for years, to another, in which child sexual abuse prevention professionals are being smeared and silenced. Both are predictable results of entrusting essential trust and safety responsibilities to untrained volunteers.
While Wikipedia has developed a sufficiently clear child protection policy, its enforcement has suffered due to a lack of expertise and an excess of emotion, turning what should be a predictable policy enforcement process into a culture war battle. Importantly, despite reforms instituted from the top down by Jimmy Wales, oversight from the Foundation has been ineffective in this case.
Users can help to moderate content, and there are numerous platforms that engage them effectively to do so. But child protection functions should always be handled by the platform’s own professional staff. The Wikimedia Foundation’s current efforts to establish a comprehensive child sexual exploitation policy, along with guidance on its enforcement, show promise of its commitment to reversing its history of poor judgment in this area.
2 Responses
Wikipedia is not a “platform”.
From the AfD:
“Delete or redirect to pedophilia, which this is not meaningfully distinct from. casualdejekyll 18:41, 13 May 2023 (UTC)”
My opinion has not changed.
Technically it is a platform, but either way, I have no quarrel with editors making good faith decisions to merge or delete articles that are seen as being redundant. But that shouldn’t involve smearing and banning people who disagree.