Wikipedia isn’t replacing its human editors with artificial intelligence just yet, but it is giving them a bit of an AI boost. On Wednesday, the Wikimedia Foundation, the nonprofit that runs Wikipedia, announced that it was integrating generative AI into its editing process to help its largely unpaid volunteer community of moderators, editors, and patrollers reduce their workload and focus more on quality control.
In a statement, Chris Albon, the foundation’s director of machine learning, emphasized that he did not want AI to replace Wikipedia’s human editors or end up generating the site’s content. Rather, AI would be used to “remove technical barriers” and “tedious tasks” that impede editors’ workflows, such as background research, translation, and onboarding new volunteers. The hope, he said, was to give editors the bandwidth to spend more time on deliberation and less on technical support. “We will take a human-centered approach and will prioritize human agency; we will prioritize using open-source or open-weight AI; we will prioritize transparency; and we will take a nuanced approach to multilinguality,” he wrote.
The site already uses AI to detect vandalism, translate content, and predict readability, but until this announcement, it had not offered AI tools to its editors. In recent years, the Wikimedia Foundation has tried to make life easier for its volunteers, from adding new features that improve the editing experience to offering legal protection from right-wing harassment campaigns.
But the amount of information and content in the world is rapidly outpacing the number of active volunteers able to moderate it, and Wikipedia faces a future in which AI could eat it alive. Earlier this month, the Wikimedia Foundation announced a new initiative to create an open-access dataset of “structured Wikipedia content” — that is, a copy of Wikipedia content optimized specifically for machine learning — with the aim of keeping the bots off the site meant for human browsing. In recent years, the number of AI bots scraping the site has grown so drastically that bot traffic has put a strain on its servers and increased bandwidth consumption by 50 percent.