Media Bias/Fact Check

Media Bias/Fact Check
Founded: 2015
Headquarters: Greensboro, North Carolina
Owner: Dave M. Van Zandt[1]
URL: mediabiasfactcheck.com
Current status: Active

Media Bias/Fact Check (MBFC) is an American website founded in 2015 by Dave M. Van Zandt.[1] It considers four main categories and multiple subcategories in assessing the "political bias" and "factual reporting" of media outlets.[2][3]

It is widely used but has been criticized for its methodology.[4] Scientific studies that use its ratings[5] note that they show high agreement with an independent fact-checking dataset from 2017,[6] with NewsGuard,[7] and with BuzzFeed journalists.[8]

Methodology

Four main categories are used by MBFC to assess the political bias and factuality of a source: (1) use of wording and headlines, (2) fact-checking and sourcing, (3) choice of stories, and (4) political affiliation. MBFC additionally considers subcategories such as bias by omission, bias by source selection, and loaded use of language.[2][9] A source's "Factual Reporting" is rated on a seven-point scale from "Very high" down to "Very low".[10]

[Chart showing the degree of bias and factual ratings given to Consumer Reports]

Political bias ratings are American-centric[9][11] and comprise "extreme-left", "left", "left-center", "least biased", "right-center", "right", and "extreme-right".[12]

The category "Pro-science"[3] is used to indicate "evidence based" or "legitimate science". MBFC also associates sources with warning categories such as "Conspiracy/Pseudoscience", "Questionable Sources" and "Satire".[3]

Fact checks are carried out by independent reviewers who are associated with the International Fact-Checking Network (IFCN) and follow the IFCN Fact-checkers' Code of Principles, which was developed by the Poynter Institute.[13][9]
A source may be credited with high "Factual Reporting" and still show "Political bias" in its presentation of those facts, for example, through its use of emotional language.[14][15][16]

Reception

Media Bias/Fact Check is widely used in studies of mainstream media, social media, and disinformation.[17][6][18][19] The occurrence and patterns of misinformation differ depending on the platform involved. Media Bias/Fact Check has been used in both single- and cross-platform studies of platforms including TikTok, 4chan, Reddit, Twitter, Facebook, Instagram, and Google Web Search.[20]

A comparison of five fact-checking datasets frequently used as "groundtruth lists" suggested that choosing one groundtruth list over another has little impact on the evaluation of online content.[6][18] In some cases, MBFC has been selected because it categorizes sources using a larger range of labels than other rating services.[6] MBFC offers the largest dataset covering biased and low-factual news sources. Over a four-year span, the percentage of links that could be categorized with MBFC was found to be very consistent. Research also suggests that the bias and factualness of a news source are unlikely to change over time.[6][18]

When MBFC factualness ratings of "mostly factual" or higher were compared with an independent fact-checking dataset's "verified" and "suspicious" news sources, the two datasets showed "almost perfect" inter-rater reliability.[6][18][21] A 2022 study that compared the prevalence of misinformation in URLs shared on Twitter and Facebook during March and April of 2019 and 2020 reports that scores from Media Bias/Fact Check correlate strongly with those from NewsGuard (r = 0.81).[7] Another study reports high agreement between ratings from Media Bias/Fact Check and BuzzFeed journalists.[8]

The site has been used by researchers at the University of Michigan to create a tool called the "Iffy Quotient", which draws data from Media Bias/Fact Check and NewsWhip to track the prevalence of "fake news" and questionable sources on social media.[22][23]

Writers at the Poynter Institute, which owns PolitiFact,[24] have stated that "Media Bias/Fact Check is a widely cited source for news stories and even studies about misinformation, despite the fact that its method is in no way scientific."[4] In 2018, a writer in the Columbia Journalism Review described Media Bias/Fact Check as "an armchair media analysis"[25] and characterized its assessments as "subjective assessments [that] leave room for human biases, or even simple inconsistencies, to creep in".[26] A study published in Scientific Reports wrote: "While [Media Bias/Fact Check's] credibility is sometimes questioned, it has been regarded as accurate enough to be used as ground-truth for e.g. media bias classifiers, fake news studies, and automatic fact-checking systems."[17]

References

  1. ^ a b "About". Media Bias/Fact Check. Retrieved 2019-03-30.
  2. ^ PMID 37239568.
  3. ^ .
  4. ^ a b Funke, Daniel; Mantzarlis, Alexios (December 18, 2018). "Here's what to expect from fact-checking in 2019". Poynter.
  5. ^ PMID 37719749.
  6. ^ Retrieved 8 June 2023.
  7. ^ .
  8. ^ .
  9. ^ a b c "Methodology". Media Bias/Fact Check. 7 June 2023. Retrieved 8 June 2023.
  10. .
  11. ^ Baly, Ramy; Karadzhov, Georgi; Alexandrov, Dimitar; Glass, James; Nakov, Preslav (2018). "Predicting Factuality of Reporting and Bias of News Media Sources". Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. Brussels, Belgium: Association for Computational Linguistics. pp. 3528–3539.
  12. ^ Main, Thomas J. (February 1, 2022). "Both the Right and Left Have Illiberal Factions. Which Is More Dangerous?". The Bulwark. Retrieved February 18, 2022.
  13. ^ "PIEGraph FAQ". University of North Carolina at Chapel Hill.
  14. .
  15. ^ Solender, Andrew (12 June 2018). "How One Website Sets Out to Classify News, Expose 'Fake News'". InsideSources. Retrieved 7 June 2023.
  16. .
  17. ^ .
  18. ^ "Despite the varied labeling and validation procedures used and domains listed by fake news annotators, the groundtruth selection has a limited to modest impact on studies reporting on the behaviors of fake news sites."
  19. .
  20. .
  21. .
  22. ^ Dian Schaffhauser. "U-M Tracker Measures Reliability of News on Facebook, Twitter -- Campus Technology". Campus Technology. Retrieved 2018-12-03.
  23. ^ Paul Resnick; Aviv Ovadya; Garlin Gilchrist. "Iffy Quotient: A Platform Health Metric for Misinformation" (PDF). School of Information - Center for Social Media Responsibility. University of Michigan. p. 5.
  24. ^ "Who Pays For PolitiFact? | PolitiFact". www.politifact.com. Retrieved 14 June 2023.
  25. ^ S2CID 244413957.
  26. ^ Tamar Wilner (January 9, 2018). "We can probably measure media bias. But do we want to?". Columbia Journalism Review.
