Thanks to the rise of large language models (LLMs) like OpenAI's GPT, AI-generated slop is taking over the user-generated corners of the internet. Spare a thought for Wikipedia editors: on top of the usual job of removing bad human edits, a growing share of their time now goes to removing AI filler.
404 Media spoke to crowdsourced encyclopedia editor Ilyas Lebleu, who helped found WikiProject AI Cleanup. The group is trying to devise best practices for detecting machine-generated contributions. (And before you ask, AI won't help with this.)
A particular problem with AI-generated content in this context is that it is almost always poorly sourced. LLMs can generate large volumes of plausible-sounding text in an instant, so much so that entire fake entries have been uploaded in attempts to sneak hoaxes past Wikipedia's experts.