User:Superb Owl/Future draft
From Wikipedia, the free encyclopedia
Please collaborate on this draft of ideas for the future of Wikipedia; the hope is to turn it into an essay or two depending on feedback.
Short summaries
Search engine
While research is not yet a strength of AI (though some are working on this issue[3]), perhaps a tool could be custom-built for Wikipedians that focuses on reliable sources. It should not be too difficult to build a tool for editors that is more comprehensive and extensive than the custom search engines at Wikipedia:Advanced Source Searching#Custom search engines, which have a WP:Overreliance on Google.
Building consensus
Wiki survey algorithms and implementations could improve bigger discussions (such as elections and other major decisions made by the Wikipedia community). They can help editors speak their minds more freely and openly (avoiding groupthink), encourage more participation by providing a healthier discussion environment, and find consensus more quickly.
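To illustrate the idea, here is a minimal sketch of one common wiki-survey mechanism: voters are shown random pairwise matchups between proposals, and each proposal is scored by the share of matchups it wins. This is an illustrative toy, not the actual algorithm of any particular tool (Pol.is and similar platforms use more sophisticated clustering and statistical models); all names and data below are made up.

```python
import random
from collections import defaultdict

def run_wiki_survey(proposals, voters, rounds=1000, seed=0):
    """Simulate a pairwise wiki survey: each round shows one voter two
    random proposals and records which one they prefer."""
    rng = random.Random(seed)
    wins = defaultdict(int)
    appearances = defaultdict(int)
    for _ in range(rounds):
        a, b = rng.sample(proposals, 2)
        vote = rng.choice(voters)   # a voter is a preference function
        wins[vote(a, b)] += 1
        appearances[a] += 1
        appearances[b] += 1
    # Score = share of matchups won; a higher score suggests broader support.
    return {p: wins[p] / appearances[p] for p in proposals}

# Hypothetical proposals; these toy voters simply prefer shorter titles.
proposals = ["simplify RfA", "add wiki surveys", "expand GA drives"]
voters = [lambda a, b: min(a, b, key=len)]
scores = run_wiki_survey(proposals, voters)
ranking = sorted(scores, key=scores.get, reverse=True)
```

Because every editor answers only a handful of quick matchups rather than reading a whole discussion thread, this format scales to large groups while keeping individual responses private, which is part of why it can reduce groupthink.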
Thoughts on AI
- Fear that the AI skepticism among many editors is a form of unilateral disarmament amid a global information war, in which the adversaries of verifiable, neutral knowledge use the latest technologies while its defenders do not
- Worried that nostalgia for, and love of, Wikipedia's hand-crafted process will mean that only hipsters and 'knowledge artisans' who enjoy hand-crafting articles remain, long past the time when Wikipedia was a convenient source of reliable information. The Arion Press is quite cool, but Wikipedia is not an art project
- Given that only 0.6% of Wikipedia articles are rated Good articles, how do we get that number up to 3%, let alone 6%, without substantial use of AI?
- If AI is so expensive that Wikipedia would have to limit its use, why not prioritize it for content where editor safety is most at risk: controversial topics, especially where transnational repression, stochastic terrorism, and other risks are known?
Focus on content that…
…is scarce in certain media due to capture by powerful interests
- e.g. topics where there is low press freedom or academic freedom in much of the world
- Documenting trends in freedom of speech and academic freedom could also help identify which topics would be most valuable to prioritize.
…benefits the most from radical transparency (contentious/political topics)
As more and more people get information from less transparent sources (social media, AI chatbots, etc.), Wikipedia has a huge competitive advantage as perhaps the most transparent source of information on the internet. Focusing on creating and improving content that is disputed and widely distributed in low-transparency environments could bring appreciative users back to the site and maybe even help grow the editor base.
Where to find articles that would benefit from radical transparency?
Articles about sources are especially valuable, both because of how contentious they can be and because they help Wikipedians decide how much weight to give a particular source on any given topic.
Certain WikiProjects like Freedom of speech and Human rights likely have a higher proportion of articles on contentious/political topics.
…creates high-quality articles in any language (instead of translating between languages)
Credit for this idea goes to a presenter at Wikimania 2025 who has started to focus on articles that do not exist in other languages.
If tools like WikiChat, an AI bot that refines its answers by drawing from the top 25 Wikipedia languages, become more widely available, it makes more and more sense to focus less on translation and more on quality in whichever language is most likely to support the highest-quality article.
…gets more articles to GA/FA status to:
1) help train future AI tools
AI tools, if they are going to be helpful for specific tasks, benefit from high-quality training data and examples. Bringing articles up to these standards could make many tedious tasks a thing of the past sooner than they otherwise would be.
2) help refine community guidelines on best practices
The other benefit is that it gives editors opportunities to surface issues with guidelines and interpretations of rules, which can help clarify and improve the standards Wikipedia will use in the future.
…does not have an unresolved fork
Curious what the conversations have been between MDWiki and Wikipedia:WikiProject Medicine, but until this fork is resolved, contributing there seems less impactful, since the projects will likely have to be merged at some point (especially given the decline in editors across both projects). If MDWiki wants added restrictions on who can edit medical content as a condition of merging the fork, that may be a worthwhile compromise.
Other questions about MDWiki.org vs. Wikipedia’s medical coverage:
- Is there any academic or independent analysis of MDWiki? So much research has been done on Wikipedia's health information that Wikipedia seems (as of September 2025) like the best place to contribute.
- There has never been a successful fork of Wikipedia - what would make MDWiki any different?
- Who backs MDWiki? (990)
- Board: James Heilman chairs the board and reports 10 hrs/wk (the rest of the board reports 1 hr/wk each)
- Contributors/members: User:Doc James and User:Ozzie10aaaa appear to have been the two most prolific medical editors on Wikipedia before 2020, though, at a glance, none of the other members seem to have been active on Wikipedia before