Wikipedia, once a central stop for online information, is now confronting a quieter but significant shift in how people explore the web. Recent figures[1] from the Wikimedia Foundation reveal an eight-percent year-over-year decline in human visits to the encyclopedia, a change linked to the growing role of generative AI and the rise of social video as preferred sources for quick knowledge.
Hidden Traffic and Bot Reclassification
Earlier this year, Wikimedia engineers noticed irregular spikes in visits, especially from Brazil. The surge first appeared to represent genuine user interest, yet a deeper look revealed that many of those visits were from bots disguised as people. After refining its detection systems and recalculating data from March through August 2025, the foundation concluded that a large portion of what had seemed to be human activity was in fact automated scraping for AI and search engines.
Once this adjustment was made, the organization gained a clearer picture of real engagement. The downward trend that followed confirmed a decline in human visits rather than a sudden collapse of interest. It also exposed how intensively bots and crawlers continue to extract Wikipedia content to feed commercial systems, including AI tools and search summaries.
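The reclassified split is visible in Wikimedia's public Pageviews API, which labels aggregate traffic by agent type ("user", "spider", or "automated"). The Python sketch below uses that real endpoint to compare human and automated monthly totals for the recalculated March-through-August 2025 window; the foundation's actual detection heuristics are internal, so this only reads the published classification.

```python
# Sketch: compare human vs. automated traffic via Wikimedia's public
# Pageviews API. The endpoint and "agent" values are real; the
# foundation's internal bot-detection heuristics are not public, so
# this only reports the split Wikimedia itself publishes.
import requests

BASE = "https://wikimedia.org/api/rest_v1/metrics/pageviews/aggregate"
# Wikimedia asks API clients to send a descriptive User-Agent.
HEADERS = {"User-Agent": "pageview-sketch/0.1 (contact: example@example.org)"}

def monthly_views(agent: str, start: str = "2025030100", end: str = "2025083100") -> dict:
    """Return {YYYYMM: views} for en.wikipedia.org over the given window."""
    url = f"{BASE}/en.wikipedia.org/all-access/{agent}/monthly/{start}/{end}"
    resp = requests.get(url, headers=HEADERS, timeout=30)
    resp.raise_for_status()
    return {item["timestamp"][:6]: item["views"] for item in resp.json()["items"]}

if __name__ == "__main__":
    humans = monthly_views("user")       # traffic classified as human
    bots = monthly_views("automated")    # traffic classified as automated
    for month in sorted(humans):
        total = humans[month] + bots.get(month, 0)
        bot_share = 100 * bots.get(month, 0) / total
        print(f"{month}: {humans[month]:,} human views, {bot_share:.1f}% of total automated")
```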
The New Gatekeepers of Knowledge
As search engines adopt AI features that deliver direct answers instead of external links, fewer users arrive at source sites such as Wikipedia. Younger audiences also spend more time on video-driven platforms like TikTok, YouTube, and Instagram for explanations that once came from text-based web searches. Similar drops in referral traffic have been seen across many publishers, showing a wider pattern in how people consume verified information.
Despite the decline, Wikipedia remains central to the digital knowledge economy. Most large language models, from those used in consumer chatbots to academic research tools, rely heavily on its content to ground their answers. Search and social platforms routinely integrate its information into their own systems. In effect, people are still reading Wikipedia every day, though often through layers of AI summaries or visual feeds that obscure the original source.
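To make that dependence concrete, here is a minimal sketch of how a tool might ground an answer in Wikipedia content. The REST summary endpoint it calls is Wikipedia's real public API; stitching the extract into a prompt is a simplified stand-in for the retrieval pipelines production systems actually run.

```python
# Sketch: how an AI tool might ground an answer in Wikipedia content.
# The REST summary endpoint is Wikipedia's real public API; building a
# prompt string this way is a simplified stand-in for production
# retrieval pipelines, not any particular vendor's method.
from urllib.parse import quote
import requests

def wikipedia_summary(title: str) -> dict:
    """Fetch an article's lead summary from Wikipedia's REST API."""
    url = f"https://en.wikipedia.org/api/rest_v1/page/summary/{quote(title)}"
    resp = requests.get(url, headers={"User-Agent": "grounding-sketch/0.1"}, timeout=30)
    resp.raise_for_status()
    return resp.json()

def grounded_prompt(question: str, title: str) -> str:
    """Build a prompt that keeps the original source visible."""
    page = wikipedia_summary(title)
    return (
        "Answer using only the source below, and cite it.\n"
        f"Source: {page['content_urls']['desktop']['page']}\n"
        f"Extract: {page['extract']}\n\n"
        f"Question: {question}"
    )

print(grounded_prompt("When did Wikipedia launch?", "Wikipedia"))
```

Note how the source URL travels with the extract: the layers of summarization the article describes are precisely where that link tends to get dropped.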
Risks to Volunteer Knowledge
While the reach of Wikipedia’s content has never been greater, the path through which readers encounter it has grown more indirect. That separation carries risks: fewer direct visits mean fewer volunteer editors contributing updates or verifying facts, and fewer small donors sustaining the nonprofit’s operations. For a platform that depends entirely on volunteer labor and individual donations, these are not minor shifts but potential structural challenges.
The Wikimedia Foundation argues that companies using its material have a shared responsibility to maintain the health of the ecosystem they depend on. Encouraging users to click through to the original pages not only keeps knowledge transparent but also ensures that the human work behind it continues.
Adapting to the Changing Internet
In response to these changes, Wikipedia is not standing still. The foundation has started enforcing stricter policies on how third parties reuse its material and is designing a new framework for attribution so that AI and search companies can credit content more visibly. Two new internal teams, Reader Growth and Reader Experience, are experimenting with ways to attract new audiences and improve engagement for existing ones.
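The attribution framework itself has not been published, so any concrete shape is speculation. Purely as illustration, a machine-readable record of this kind might point reusers back to the exact revision they drew from; in the sketch below every field name is invented, and the only grounded detail is that Wikipedia's text is licensed under CC BY-SA 4.0.

```python
# Hypothetical sketch only: the foundation has not published its
# attribution framework, and every field name here is invented for
# illustration. The one grounded detail is the license term, since
# Wikipedia's text is distributed under CC BY-SA 4.0.
from dataclasses import dataclass, asdict
import json

@dataclass
class AttributionRecord:
    source_url: str    # canonical article URL
    revision_id: int   # exact revision the reused text came from
    license: str       # Wikipedia text license
    retrieved_at: str  # ISO 8601 timestamp of retrieval

record = AttributionRecord(
    source_url="https://en.wikipedia.org/wiki/Wikipedia",
    revision_id=1234567890,  # placeholder, not a real revision
    license="CC BY-SA 4.0",
    retrieved_at="2025-10-01T00:00:00Z",
)
print(json.dumps(asdict(record), indent=2))  # what a reuser might ship with AI output
```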
Other projects aim to meet people where they already are. The Future Audiences initiative explores how Wikipedia’s material can appear responsibly on newer platforms through short videos, games, or chatbot integrations. The goal is to extend access without weakening the open-knowledge principles that made the site trustworthy.
Sustaining Human-Curated Knowledge
Marshall Miller, a senior director at the Wikimedia Foundation, and his team emphasize that maintaining the integrity of the encyclopedia now depends as much on public behavior as on technology. Clicking through to sources, verifying citations, and discussing the value of human-curated information all help sustain the open web. The foundation is inviting volunteers to test new tools, share feedback, and guide the next stage of Wikipedia’s evolution as it navigates an AI-dominated era.
After twenty-five years, the encyclopedia’s mission remains unchanged: free, accurate, and transparent knowledge for everyone. Yet sustaining that mission now requires cooperation from the same digital systems that have learned so much from it. Whether AI companies and users return that support will determine how freely human knowledge continues to flow on the internet.
Rival Visions of Online Truth
Critics have long argued that Wikipedia’s openness, while its greatest strength, also leaves it vulnerable to bias reflecting the leanings of its most active editors. Articles on politics, culture, and technology often depend on a small circle of contributors whose judgments about sources or wording can tilt an entry toward one interpretation while keeping others buried in technical discussion pages that few readers ever see. This structural imbalance has fueled recurring debates about whether the encyclopedia’s governance reflects a genuinely neutral consensus or simply the loudest voices in its volunteer community.
In recent years, public figures frustrated by what they view as selective moderation or uneven coverage have proposed rival knowledge systems. Among them is Elon Musk, whose idea for “Grokpedia” would combine his AI assistant Grok[2] with an open contribution model that[3], in theory, tracks edits transparently through blockchain-style provenance records and lets readers rate factual reliability in real time. It remains uncertain whether such a system could avoid the same ideological clustering that shaped Wikipedia’s own editor base. Examples of disputed neutrality are easy to find: pages about climate policy, Middle East conflicts, or electric-vehicle economics often see rapid reversions and talk-page battles whenever new information challenges established wording, showing how community editing can safeguard accuracy and entrench group bias at the same time.
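No technical design for Grokpedia is public, but “blockchain-style provenance” usually means a hash chain: each edit record is hashed together with its predecessor, so altering any past entry invalidates everything after it. The sketch below illustrates that general technique; the function names and record fields are invented for illustration.

```python
# Hypothetical sketch of "blockchain-style" edit provenance: each edit
# is hashed together with the previous entry's hash, so rewriting any
# past edit breaks every hash after it. No real Grokpedia design is
# public; this only demonstrates the general technique.
import hashlib
import json

def entry_hash(prev_hash: str, edit: dict) -> str:
    payload = json.dumps({"prev": prev_hash, "edit": edit}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def append_edit(chain: list, editor: str, diff: str) -> None:
    prev = chain[-1]["hash"] if chain else "genesis"
    edit = {"editor": editor, "diff": diff}
    chain.append({"edit": edit, "hash": entry_hash(prev, edit)})

def verify(chain: list) -> bool:
    """Recompute every hash; any tampered entry invalidates the tail."""
    prev = "genesis"
    for entry in chain:
        if entry["hash"] != entry_hash(prev, entry["edit"]):
            return False
        prev = entry["hash"]
    return True

chain: list = []
append_edit(chain, "alice", "added founding date")
append_edit(chain, "bob", "fixed citation")
assert verify(chain)
chain[0]["edit"]["diff"] = "rewrote history"  # tamper with an old record
assert not verify(chain)                      # ...and verification fails
```

Note what such a chain can and cannot do: it makes the edit history tamper-evident, but it cannot say which edit is true, so the ideological clustering problem the article raises would remain untouched.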
The controversy underscores a central paradox of online knowledge: the more open a platform becomes, the more its internal hierarchies of trust determine what the world accepts as fact. Any successor that hopes to replace or refine Wikipedia will still need to confront that same human tendency toward narrative control disguised as consensus.
References
- ^ figures (diff.wikimedia.org)
- ^ Elon Musk, whose idea for “Grokpedia” would combine his AI assistant Grok (www.digitalinformationworld.com)
- ^ contribution model that (www.digitalinformationworld.com)