Wikipedia Vetting AI Gets Smarter

Wikipedia is adopting a new artificial intelligence tool to weed out potentially low-quality edits. However, human users will retain the power to decide which edits remain active on the site.

Previous attempts to automatically rank edits and highlight possible malicious activity haven’t worked well, according to the Wikimedia Foundation. It says such tools have placed too much emphasis on how active and experienced an editor is, with the effect of largely presuming a new editor is acting in bad faith. That in turn contributed to a significant increase in the number of newcomers who made one or two edits and then gave up because their contributions were being reverted.

The attempted solution is known as the Objective Revision Evaluation Service (ORES). It pays no attention to who made an edit; instead it tries to rank the edit itself, and more specifically its effect on the page. The algorithm is built around existing human ratings of overall page quality and of individual edits, the idea being to learn the characteristics of good and bad edits.
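The Foundation hasn’t published its feature set in this description, but the approach it outlines (supervised learning from human quality judgments about edits, never about editors) can be sketched with a standard classifier. Everything below, the feature names, the toy training data, and the scikit-learn model choice, is illustrative rather than taken from the real ORES codebase:

```python
# Illustrative sketch only: a supervised "damaging edit" classifier trained on
# human-labeled edits, in the spirit of ORES. Features and model choice are
# hypothetical, not the actual ORES implementation.
from sklearn.ensemble import GradientBoostingClassifier

# Each row describes the *edit itself*, never the editor's identity or tenure:
# [chars added, chars removed, profanities added, refs added, section blanked]
X_train = [
    [350,   12, 0, 1, 0],   # sourced expansion        -> rated "good"
    [  0, 4200, 0, 0, 1],   # blanked a whole section  -> rated "damaging"
    [ 45,    3, 2, 0, 0],   # added profanity          -> rated "damaging"
    [120,   80, 0, 2, 0],   # rewrite with citations   -> rated "good"
]
y_train = [0, 1, 1, 0]      # 1 = damaging, per human reviewers' ratings

model = GradientBoostingClassifier().fit(X_train, y_train)

# Scoring a new edit yields a probability, not a verdict -- humans still
# decide which edits remain on the site.
new_edit = [[10, 950, 1, 0, 1]]
print(model.predict_proba(new_edit)[0][1])  # P(damaging)
```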

The plan is not only to use and refine ORES on Wikipedia itself, but also to make the technology open source so that other people can build their own applications on top of the automated ranking data.
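Because the service is meant to be consumed by third-party tools, the natural integration point is an HTTP scoring API. The sketch below assumes the kind of REST endpoint ORES eventually exposed publicly (scores keyed by wiki, model, and revision ID); treat the exact URL, the example revision ID, and the response layout as assumptions to verify against current documentation rather than a stable contract:

```python
# Hedged sketch: fetching an edit-quality score for one revision over HTTP.
# Endpoint and JSON shape follow ORES's later public v3 API; both should be
# checked against the live documentation before relying on them.
import requests

WIKI = "enwiki"
REV_ID = 642215410  # example revision ID (hypothetical)

resp = requests.get(
    f"https://ores.wikimedia.org/v3/scores/{WIKI}/",
    params={"models": "damaging", "revids": REV_ID},
    timeout=10,
)
resp.raise_for_status()

score = resp.json()[WIKI]["scores"][str(REV_ID)]["damaging"]["score"]
print(score["prediction"])           # True if the model flags the edit as damaging
print(score["probability"]["true"])  # how confident the model is
```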

The ultimate goal is for ORES to act as a form of triage, handling everything that can be automated and leaving Wikipedia reviewers more time for tasks that need a human touch. For example, if reviewers spend less time manually weeding out obvious vandalism, they may have more time to deal appropriately with situations where a new editor has made a “bad” edit in good faith and would benefit from a supportive message explaining why the edit was reverted.
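That triage workflow is straightforward to picture in code: scores route each incoming edit to a queue, and humans work the queues. The thresholds and queue names below are invented for the sketch, and the companion “good faith” probability is an assumed second model output rather than anything described in this article:

```python
# Illustrative triage: route edits by model scores so reviewers spend their
# time where a human touch matters. All cutoffs here are made up.
def triage(p_damaging: float, p_goodfaith: float) -> str:
    if p_damaging < 0.2:
        return "low-priority"        # almost certainly fine; spot-checked only
    if p_goodfaith >= 0.5:
        # Probably a well-meaning newcomer: revert if needed, but pair the
        # revert with a supportive explanation, not a vandalism warning.
        return "mentoring-queue"
    return "vandalism-queue"         # likely bad faith; fast human review

print(triage(0.05, 0.95))  # low-priority
print(triage(0.80, 0.90))  # mentoring-queue
print(triage(0.85, 0.10))  # vandalism-queue
```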

