The future of Wikipedia in the age of AI: A decline in human contributions?
The rise of AI-driven chatbots, particularly ChatGPT, has fundamentally changed how users seek and consume information. With large language models (LLMs) generating encyclopedic content in real time, platforms like Wikipedia face evolving challenges in sustaining voluntary knowledge contributions. A recent study explores how ChatGPT's widespread adoption has impacted Wikipedia's viewership and editorial activity, shedding light on the shifting dynamics of online knowledge sharing and the potential consequences for human-curated information ecosystems.
The study, "Wikipedia Contributions in the Wake of ChatGPT" by Liang Lyu, James Siderius, Hannah Li, Daron Acemoglu, Daniel Huttenlocher, and Asuman Ozdaglar, was presented at the ACM Web Conference 2025. It employs a difference-in-differences (DiD) analysis to measure the effect of ChatGPT’s launch on Wikipedia’s engagement. The study finds that Wikipedia articles with content similar to ChatGPT’s responses saw a greater decline in both viewership and edits compared to dissimilar articles. This finding suggests a potential paradigm shift, where AI-generated content is subtly reshaping how knowledge is curated, consumed, and even contested in digital spaces.
How ChatGPT is reshaping Wikipedia engagement
Wikipedia has long been a cornerstone of open knowledge, maintained by voluntary contributors who ensure the accuracy and breadth of its content. However, the advent of ChatGPT in late 2022 introduced a powerful alternative for quick fact-checking and information retrieval. The study suggests that when users can get direct, AI-generated responses, they are less likely to visit Wikipedia, particularly for topics where ChatGPT produces similar or superior content. This shift raises deeper questions about whether users value information differently when it is presented through an interactive chatbot rather than a traditional encyclopedia format.
By comparing Wikipedia articles that closely align with ChatGPT’s generated responses against those with distinct content, the researchers identified a significant drop in page views for articles ChatGPT could replicate well. While overall Wikipedia engagement remained stable, certain article categories - especially those focusing on commonly searched, factual, and widely covered topics - saw noticeable declines in user activity. This suggests that AI-generated knowledge is not only supplementing but also supplanting human-generated knowledge in some domains, potentially diminishing the perceived necessity of crowdsourced information repositories.
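The article does not spell out how similarity between a Wikipedia page and a ChatGPT response is scored, so the sketch below uses TF-IDF cosine similarity as a generic stand-in; the texts, threshold, and group labels are all hypothetical.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical article/response pairs; the study's actual texts and similarity
# measure are not reproduced here.
pairs = {
    "Photosynthesis": (
        "Photosynthesis is the process by which plants convert light energy "
        "into chemical energy stored in glucose.",
        "Photosynthesis lets plants turn sunlight into chemical energy, "
        "producing glucose and oxygen.",
    ),
    "2023 local election": (
        "The 2023 local election saw record turnout and a surprise win for "
        "the incumbent's challenger.",
        "I don't have reliable details about that election.",
    ),
}

SIMILARITY_THRESHOLD = 0.3  # illustrative cutoff, not from the paper

for topic, (wiki_text, chatgpt_text) in pairs.items():
    tfidf = TfidfVectorizer().fit_transform([wiki_text, chatgpt_text])
    score = cosine_similarity(tfidf[0], tfidf[1])[0, 0]
    group = "similar (treated)" if score >= SIMILARITY_THRESHOLD else "dissimilar (control)"
    print(f"{topic}: cosine similarity {score:.2f} -> {group}")
```

Articles on widely covered factual topics tend to land in the "similar" group under any such scheme, which is exactly where the study observes the sharpest drop in views.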
The effect on Wikipedia editing and knowledge contributions
Beyond passive viewership, Wikipedia's sustainability relies on a community of editors who continually refine and update content. The study investigated whether reduced readership translated into lower editing activity. Interestingly, while edits did decline for ChatGPT-replicable topics, the effect was not statistically significant across all categories. The researchers suggest that contributors are driven by intrinsic motivations, such as reputation-building or a commitment to open knowledge, which may help sustain Wikipedia's editing culture despite shifting readership trends.
However, the study warns of long-term risks if AI-generated content continues to supplant human-curated contributions. Wikipedia's extensive knowledge base serves as a training ground for AI models, and a decline in human-generated edits could diminish the quality and diversity of future AI outputs - a phenomenon known as "model collapse," in which models trained increasingly on their own synthetic output, rather than on independently curated human knowledge, degrade over successive generations. This cyclical dependency - where AI relies on Wikipedia's knowledge base while simultaneously discouraging human contributions - poses a structural challenge that could weaken the very foundation upon which AI models are trained.
Future challenges and adaptation strategies for Wikipedia
To remain relevant in an era of generative AI, Wikipedia must consider adaptive strategies that complement rather than compete with AI-generated knowledge. The study suggests that Wikipedia could benefit from AI-assisted content moderation, where language models assist human editors in fact-checking and enhancing articles rather than replacing them outright. This symbiotic relationship between human expertise and AI-driven efficiency could help sustain Wikipedia’s credibility and editorial integrity.
Additionally, Wikipedia's community-driven nature provides a unique advantage over AI-generated content, as human editors ensure reliability, accuracy, and contextual depth - aspects where AI still struggles. Unlike AI systems, which can generate convincing but incorrect information, Wikipedia requires verifiable sources, a policy that acts as a safeguard against misinformation. The study encourages Wikipedia to explore ways to integrate AI tools responsibly, ensuring that AI serves as an augmentative force rather than an authoritative replacement.
A potential avenue for adaptation could involve dynamic Wikipedia entries that interact with AI models - for example, integrating AI-driven summaries while preserving human editorial oversight. Such an approach could provide a best-of-both-worlds scenario where AI aids in digesting complex information while human editors maintain factual accuracy and ethical considerations.
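As a thought experiment, such a pipeline might keep AI output behind an explicit human-approval gate. The sketch below is purely hypothetical: draft_ai_summary is a placeholder for whatever model might be used, and the review step stands in for Wikipedia's actual editorial processes, which are not described in the study.

```python
from dataclasses import dataclass, field

@dataclass
class PendingSummary:
    """An AI-drafted summary that is not published until a human approves it."""
    article_title: str
    draft: str
    approved: bool = False
    reviewer_notes: list[str] = field(default_factory=list)

def draft_ai_summary(article_text: str) -> str:
    # Placeholder for a call to a language model; deliberately not a real API.
    return article_text.split(".")[0] + "."

def human_review(item: PendingSummary, approve: bool, note: str) -> None:
    """Editors keep the final say: nothing is published without explicit approval."""
    item.reviewer_notes.append(note)
    item.approved = approve

article = ("Photosynthesis converts light energy into chemical energy. "
           "It occurs in the chloroplasts of plant cells.")
pending = PendingSummary("Photosynthesis", draft_ai_summary(article))
human_review(pending, approve=True, note="Summary matches the cited sources.")
print(pending.approved, "->", pending.draft)
```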
Conclusion
The impact of ChatGPT on Wikipedia contributions reflects a broader transformation in how knowledge is created and consumed. While AI has made information more accessible, it also poses challenges for platforms that rely on voluntary human contributions. The study underscores the need for a balanced approach, where AI enhances rather than replaces collaborative knowledge-sharing ecosystems like Wikipedia. By embracing AI augmentation rather than competition, Wikipedia can continue to thrive as a trusted, human-curated knowledge repository in the digital age.
However, this transition requires careful oversight and proactive adaptation, as the risks of over-reliance on AI-driven content could have unintended consequences for the sustainability of human knowledge creation. The future of Wikipedia in an AI-dominated world depends not on resisting generative AI but on strategically evolving alongside it - leveraging AI as a tool while preserving the human intellectual contributions that make Wikipedia uniquely valuable.
First published in: Devdiscourse

