Wikipedia and fact-checking

Wikipedia's volunteer editor community has the responsibility of fact-checking Wikipedia's content.[1] As one of the largest freely editable websites, Wikipedia allows millions of people to read and edit its articles at no cost, which makes it a potential channel for both misinformation and disinformation.

Wikipedia's practice of flagging unsubstantiated information with "citation needed" tags has become almost synonymous with the need for fact-checking more generally.

Wikipedia therefore strives to provide its readers with the best available verified sources. Fact-checking is one aspect of the broader question of the reliability of Wikipedia.

Various academic studies about Wikipedia and the body of criticism of Wikipedia seek to describe the limits of Wikipedia's reliability, document who uses Wikipedia for fact-checking and how, and identify the consequences of relying on it as a fact-checking resource. Certain classes of low-quality articles, such as those containing self-contradictions, have been studied as targets for detection and improvement.[2]

Major platforms including YouTube[3] and Facebook[4] use Wikipedia's content to confirm the accuracy of the information in their own media collections.

Platforms that fact-check with Wikipedia

Public trust and countering fake news

Wikipedia serves as a public resource for access to accurate information. One example is its coverage of COVID-19, which readers have relied on for trustworthy information.[5] Earning public trust is a major part of Wikipedia's publication philosophy.[6] Various reader polls and studies have reported public trust in Wikipedia's process for quality control.[6][7] In general, the public uses Wikipedia as a counter to fake news.[8]

YouTube fact-checking

At the 2018 South by Southwest conference, YouTube CEO Susan Wojcicki announced that YouTube would use Wikipedia to fact-check videos hosted on the platform.[3][9][10][11] No one at YouTube had consulted anyone at Wikipedia about this development, and the news at the time was a surprise.[9] The intent was for YouTube to counter the spread of conspiracy theories by adding information boxes beneath some videos on topics that attract conspiracy theorists.[9]

Facebook fact-checking

Facebook uses Wikipedia in various ways. Following criticism of Facebook in the context of fake news around the 2016 United States presidential election, Facebook recognized that Wikipedia already had an established process for fact-checking.[4] Facebook's subsequent strategy for countering fake news included using content from Wikipedia for fact-checking.[4][12] In 2020, Facebook began incorporating information from Wikipedia's infoboxes into its own general-reference knowledge panels to provide objective information.[13]

Fact-checking Wikipedia

Fact-checking is one aspect of the general editing process in Wikipedia. The volunteer community develops a process for reference and fact-checking through community groups such as WikiProject Reliability.[8] Wikipedia has a reputation for cultivating a culture of fact-checking among its editors.[14] Wikipedia's fact-checking process depends on the activity of its volunteer community of contributors, who numbered 200,000 as of 2018.[1]

The development of fact-checking practices is ongoing in the Wikipedia editing community.[6] One development that took years was the 2017 community decision to declare a particular news source, the Daily Mail, generally unreliable as a citation for verifying claims.[6][15] Through strict guidelines on verifiability, Wikipedia has been combating misinformation.[16] According to Wikipedia guidelines, all material in Wikipedia's mainspace must be verifiable.

Wikipedia's core content policies are verifiability, no original research, and a neutral point of view. Material must also comply with the copyright policy. Other key principles address plagiarism and notability. Linking to sites such as Scribd or YouTube should be done with care to avoid linking to material that violates copyright.

Reliable sources

Wikipedia states that "if no reliable sources can be found on a topic, Wikipedia should not have an article on it." The reliability of a source is judged on three aspects: the work itself, the author of the work, and the publisher of the work.

Reliable sources include:

  • Base articles
  • Published materials
  • Scholarship
  • News organizations
  • Vendor and e-commerce sources
  • Academic and peer-reviewed
  • University-level textbooks
  • Books published by respected publishing houses

Academic and peer-reviewed publications are usually the most reliable sources. Not all newspapers and magazines can be considered reliable, even when they cite credible sources; whether magazines in general count as reliable sources is still under discussion within the Wikipedia community.

Non-reliable sources

  • Questionable sources
  • Self-published sources

It is advisable to be very cautious when sourcing content related to the medical field or a living person.

Wikipedia provides Wikipedia:Reliable sources/Noticeboard as a space to consult and discuss whether a source is reliable.

Self-contradiction articles

An experiment was conducted on detecting self-contradictory articles on Wikipedia using a model called the Pairwise Contradiction Neural Network (PCNN).[17]

The study's contributions are as follows:

  • A novel dataset named WikiContradiction was created; it is the first dataset for self-contradiction tasks on Wikipedia.
  • A novel model, PCNN, was developed and fine-tuned on the WikiContradiction dataset.
  • The empirical results exhibit the PCNN model's promising performance and highlight the most contradicted sentence pairs.
  • The compiled WikiContradiction dataset can be used as a training resource for improving Wikipedia's articles.
  • This can further contribute to fact-checking and claim verification.
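The pairwise approach described above can be sketched in outline: split an article into sentences, score every sentence pair for contradiction, and rank the pairs so the most contradictory ones surface first. The sketch below is illustrative only: the scorer is a toy negation-overlap heuristic standing in for PCNN's fine-tuned neural scorer, and the function names and example article are invented for this example.

```python
from itertools import combinations

NEGATIONS = {"not", "never", "no"}

def toy_contradiction_score(a: str, b: str) -> float:
    """Placeholder scorer: high when two sentences share most of their
    words but differ in negation. The real PCNN uses learned sentence
    representations rather than word overlap."""
    wa = set(a.lower().rstrip(".").split())
    wb = set(b.lower().rstrip(".").split())
    content_a, content_b = wa - NEGATIONS, wb - NEGATIONS
    if not content_a or not content_b:
        return 0.0
    overlap = len(content_a & content_b) / len(content_a | content_b)
    negation_flip = (wa & NEGATIONS) != (wb & NEGATIONS)
    return overlap if negation_flip else 0.0

def rank_sentence_pairs(sentences):
    """Score every sentence pair and return them sorted with the most
    contradictory pair first, mirroring how PCNN highlights the most
    contradicted pairs within an article."""
    pairs = [(toy_contradiction_score(a, b), a, b)
             for a, b in combinations(sentences, 2)]
    return sorted(pairs, reverse=True)

article = [
    "The bridge was completed in 1932.",
    "The bridge was not completed in 1932.",
    "It spans the river near the old mill.",
]
ranked = rank_sentence_pairs(article)
score, s1, s2 = ranked[0]
print(f"{score:.2f}: {s1!r} vs {s2!r}")
```

Note that the number of pairs grows quadratically with article length, which is one reason lengthy documents are difficult for pairwise models (see the limitations section below).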

Limitations

When Wikipedia experiences vandalism, platforms that reuse Wikipedia's content may republish the vandalized content.[18] Vandalism is prohibited on Wikipedia.

Wikipedia suggests these steps for inexperienced editors to handle vandalism: assess, revert, warn, watch, and finally report.[19]

In 2018, Facebook and YouTube were major users of Wikipedia for its fact-checking functions, but those commercial platforms were not contributing to Wikipedia's free nonprofit operations in any way.[18] In 2016, journalists described how vandalism in Wikipedia undermines its use as a credible source.[20]

Self-contradiction limitations

Two main limitations of the PCNN self-contradiction model are the subjectivity of what counts as a self-contradiction and the model's inability to handle lengthy documents.

References

  1. Timmons, Heather; Kozlowska, Hanna (27 April 2018). "200,000 volunteers have become the fact checkers of the internet". Quartz.
  2. Hsu, Cheng; Li, Cheng-Te; Saez-Trumper, Diego; Hsu, Yi-Zhan (2021). "WikiContradiction: Detecting Self-Contradiction Articles on Wikipedia". 2021 IEEE International Conference on Big Data (Big Data): 427–436. doi:10.1109/BigData52589.2021.9671319.
  3. Glaser, April (14 August 2018). "YouTube Is Adding Fact-Check Links for Videos on Topics That Inspire Conspiracy Theories". Slate Magazine.
  4. Flynn, Kerry (5 October 2017). "Facebook outsources its fake news problem to Wikipedia—and an army of human moderators". Mashable.
  5. Benjakob, Omer (4 August 2020). "Why Wikipedia is immune to coronavirus". Haaretz.
  6. Iannucci, Rebecca (6 July 2017). "What can fact-checkers learn from Wikipedia? We asked the boss of its nonprofit owner". Poynter Institute.
  7. Cox, Joseph (11 August 2014). "Why People Trust Wikipedia More Than the News". Vice.
  8. Zachary J. McDowell; Matthew A. Vetter (July 2020). "It Takes a Village to Combat a Fake News Army: Wikipedia's Community and Policies for Information Literacy". Social Media + Society. 6 (3): 205630512093730. doi:10.1177/2056305120937309. ISSN 2056-3051. Wikidata Q105083357.
  9. Montgomery, Blake; Mac, Ryan; Warzel, Charlie (13 March 2018). "YouTube Said It Will Link To Wikipedia Excerpts On Conspiracy Videos — But It Didn't Tell Wikipedia". BuzzFeed News.
  10. Feldman, Brian (16 March 2018). "Why Wikipedia Works". Intelligencer. New York.
  11. Feldman, Brian (14 March 2018). "Wikipedia Is Not Going to Save YouTube From Misinformation". Intelligencer. New York.
  12. Locker, Melissa (5 October 2017). "Facebook thinks the answer to its fake news problems is Wikipedia". Fast Company.
  13. Perez, Sarah (11 June 2020). "Facebook tests Wikipedia-powered information panels, similar to Google, in its search results". TechCrunch.
  14. Keller, Jared (14 June 2017). "How Wikipedia Is Cultivating an Army of Fact Checkers to Battle Fake News". Pacific Standard.
  15. Rodriguez, Ashley (10 February 2017). "In a first, Wikipedia has deemed the Daily Mail too "unreliable" to be used as a citation". Quartz.
  16. "Wikipedia:Verifiability", Wikipedia, 2022-04-18, retrieved 2022-04-19
  17. Hsu, Cheng; Li, Cheng-Te; Saez-Trumper, Diego; Hsu, Yi-Zhan (2021-12-15). "WikiContradiction: Detecting Self-Contradiction Articles on Wikipedia". 2021 IEEE International Conference on Big Data (Big Data). Orlando, FL, USA: IEEE: 427–436. doi:10.1109/BigData52589.2021.9671319. ISBN 978-1-6654-3902-2.
  18. Funke, Daniel (18 June 2018). "Wikipedia vandalism could thwart hoax-busting on Google, YouTube and Facebook". Poynter. Poynter Institute.
  19. "Wikipedia:Vandalism", Wikipedia, 2022-04-16, retrieved 2022-04-19
  20. A.E.S. (15 January 2016). "Wikipedia celebrates its first 15 years". The Economist.

Further reading

  • Wikipedia:WikiProject Reliability, the English Wikipedia community project which self-organizes fact-checking