Media Bias/Fact Check (MBFC) is an American website founded in 2015 by Dave M. Van Zandt.[1] It considers four main categories and multiple subcategories in assessing the "political bias" and "factual reporting" of media outlets.[2][3]
It is widely used, but has been criticized for its methodology.[4] Scientific studies[5] using its ratings note that they show high agreement with an independent fact-checking dataset from 2017,[6] with NewsGuard,[7] and with BuzzFeed journalists.[8]
Methodology
MBFC uses four main categories to assess the political bias and factuality of a source: (1) use of wording and headlines, (2) fact-checking and sourcing, (3) choice of stories, and (4) political affiliation. It additionally considers subcategories such as bias by omission, bias by source selection, and loaded use of language.[2][9] A source's "Factual Reporting" is rated on a seven-point scale from "Very high" down to "Very low".[10]
Political bias ratings are American-centric[9][11] and run from "extreme-left" through "left", "left-center", "least biased", "right-center", and "right" to "extreme-right".[12]
The category "Pro-science"[3] is used to indicate "evidence based" or "legitimate science". MBFC also associates sources with warning categories such as "Conspiracy/Pseudoscience", "Questionable Sources" and "Satire".[3]
Fact checks are carried out by independent reviewers who are associated with the International Fact-Checking Network (IFCN) and follow its Fact-checkers' Code of Principles, which was developed by the Poynter Institute.[13][9]
A source may be credited with high "Factual Reporting" and still show "Political bias" in its presentation of those facts, for example, through its use of emotional language.[14][15][16]
Reception
Media Bias/Fact Check is widely used in studies of mainstream media, social media, and disinformation.[17][6][18][19] The occurrence and patterns of misinformation differ depending on the platform involved. Media Bias/Fact Check has been used in both single- and cross-platform studies of platforms including TikTok, 4chan, Reddit, Twitter, Facebook, Instagram, and Google Web Search.[20]
A comparison of five fact-checking datasets frequently used as "groundtruth lists" has suggested that choosing one list over another has little impact on the evaluation of online content.[6][18] In some cases, MBFC has been selected because it categorizes sources using a larger range of labels than other rating services.[6] MBFC also offers the largest dataset covering biased and low-factual news sources. Over a four-year span, the percentage of links that could be categorized with MBFC was found to be very consistent, and research suggests that the bias and factualness of a news source are unlikely to change over time.[6][18]
When MBFC factualness ratings of "mostly factual" or higher were compared to an independent fact-checking dataset's "verified" and "suspicious" news sources, the two datasets showed "almost perfect" inter-rater reliability.[6][18][21] A 2022 study comparing the prevalence of misinformation in URLs shared on Twitter and Facebook during March and April of 2019 and 2020 reports that scores from Media Bias/Fact Check correlate strongly with those from NewsGuard (r = 0.81).[7] Another study reports high agreement between ratings from Media Bias/Fact Check and BuzzFeed journalists.[8]
The site has been used by researchers at the University of Michigan to create a tool called the "Iffy Quotient", which draws data from Media Bias/Fact Check and NewsWhip to track the prevalence of "fake news" and questionable sources on social media.[22][23]
Writers at the Poynter Institute, which develops PolitiFact,[24] have stated that "Media Bias/Fact Check is a widely cited source for news stories and even studies about misinformation, despite the fact that its method is in no way scientific."[4] In 2018, a writer in the Columbia Journalism Review described Media Bias/Fact Check as "an armchair media analysis"[25] and characterized its assessments as "subjective assessments [that] leave room for human biases, or even simple inconsistencies, to creep in".[26] A study published in Scientific Reports noted: "While [Media Bias/Fact Check's] credibility is sometimes questioned, it has been regarded as accurate enough to be used as ground-truth for e.g. media bias classifiers, fake news studies, and automatic fact-checking systems."[17]