DP Etiquette

First rule: Don't be a jackass.

Other rules: Do not attack or insult people you disagree with. Engage with facts, logic and beliefs. Out of respect for others, please provide sources for the facts and truths you rely on if you are asked for them. If emotion is getting out of hand, get it back in hand. To avoid dehumanizing people, don't call people or whole groups disrespectful names, e.g., stupid, dumb or liar. Insulting people is counterproductive to rational discussion; insults make people angry and defensive. All points of view are welcome: right, center, left and elsewhere. Just disagree, but don't be belligerent or reject inconvenient facts, truths or defensible reasoning.

Saturday, March 28, 2020

Assessing Fact Accuracy of Information Sources: MBFC


Saddlebill stork

I routinely use Media Bias/Fact Check (MBFC), primarily to check the fact reliability of a site. It is also useful for getting a feel for how biased a site is. I've checked probably about 40 different sites over the years using MBFC, and their fact accuracy ratings seem to correspond reasonably well with the content a source puts out. For that reason, I tend to accept their ratings as a reasonably good indicator of a source's quality. For example, the New York Times gets a high accuracy rating with a center-left bias.

Some very biased sources get high or very high fact accuracy ratings, but the extreme sites tend to get mixed or lower fact ratings. Propaganda sites such as RT News tend to get mixed or lower fact accuracy ratings. MBFC comments: “Overall, we rate RT Questionable based on promoting pro-Russian propaganda, promotion of conspiracy theories, numerous failed fact checks and a lack of author transparency.”  RT was given a very low fact accuracy rating.

Given all of the sites out there and all of the misinformation, I tend to distrust and ignore sites with a mixed, low or very low fact accuracy rating. Occasionally one of those sites gets the facts of a story right, but the info needs to be verified by other sources, and that takes time.

Often, when a source a person relies on has a low or very low fact accuracy rating, I point that out and link to the MBFC assessment. The most common response is a direct attack on MBFC as a biased, lying, amateur and/or bullshit operation funded by George Soros, the Koch brothers, Hitler, Stalin, etc. This discussion is here to give me a link I can use when someone's sacred ox gets gored and complaining instantly erupts over how awful MBFC is.

Criticisms addressed
Criticism 1: David Van Zandt has been criticized as a Democratic, Republican or whatever-else propagandist because he is supposedly the head of the New School. Van Zandt says this about that: “Dave is a registered Non-Affiliated voter who values evidence based reporting. For the record, he also is not the President of the New School, that is a different Dave Van Zandt.” Van Zandt founded MBFC in 2015.

Criticism 2: Donors control the fact and bias ratings thus the entire MBFC site is nothing but a steaming pile of lies and biased propaganda that either must be ignored or civilization will collapse. I wrote to van Zandt yesterday asking about who his main donors are. He responded with this: “Long story short is we do not have large donors to list. We primarily (95%) generate revenue through 3rd party advertising (ie. Google Adsense, we don't pick the ads). We will not be found on Charity Navigator because we are not a charity. We are a for profit or at least break even enterprise.”

Here's that part of the email string.


Van Zandt also emailed me that he put up a page on funding after I raised the issue of funding and its use as an excuse to dismiss MBFC as unreliable: “You're welcome! I seriously thank you. Your mail was the one that got me motivated enough to put up a funding page. Here it is"

https://mediabiasfactcheck.com/funding/

Dave”

Criticism 3: A post in the Columbia Journalism Review bitterly criticized Van Zandt as an amateur armchair analyst who doesn't know diddly about squat and should be shot dead, with his estate billed for the bullet and the assassin's expenses and service charges. Well, OK, the article didn't say anything about being shot dead, but its tone was consistent with that. It was a vicious attack by an arrogant academic, Tamar Wilner, who takes herself far too seriously.

It turns out that the CJR hit piece criticized Van Zandt’s bias ratings, not his fact accuracy ratings. The CJR article, “We can probably measure media bias. But do we want to?”, includes this: “The armchair academics: Amateur attempts at such tools already exist, and have found plenty of fans. Google “media bias,” and you’ll find Media Bias/Fact Check, run by armchair media analyst Dave Van Zandt. The site’s methodology is simple: Van Zandt and his team rate each outlet from 0 to 10 on the categories of biased wording and headlines, factuality and sourcing, story choices (“does the source report news from both sides”), and political affiliation.


A similar effort is “The Media Bias Chart,” or simply, “The Chart.” Created by Colorado patent attorney Vanessa Otero, the chart has gone through several methodological iterations, but currently is based on her evaluation of outlets’ stories on dimensions of veracity, fairness, and expression.

Both efforts suffer from the very problem they’re trying to address: Their subjective assessments leave room for human biases, or even simple inconsistencies, to creep in. Compared to Gentzkow and Shapiro, the five to 20 stories typically judged on these sites represent but a drop of mainstream news outlets’ production.”

I wrote to the CJR editors and complained about the crappiness of Wilner’s hit piece. They never responded. I take that as evidence that my criticisms of Wilner are valid.

MBFC says this about its bias ratings: “When determining bias, there isn’t any true scientific formula that is 100% objective. There are objective measures that can be calculated, but ultimately there will be some degree of subjective judgement to determine these. On each page we have put up a scale with a yellow dot that shows the degree of bias for each source. Each page also has a “detailed report” section that gives some details about the source and an explanation of their bias. When calculating bias we are not just looking at political bias, but also how factual the information is and if they provide links to credible, verifiable sources. Therefore, the yellow dot may indicate political bias or how factual a source is, or in many cases, both.”

When I compare MBFC’s bias ratings with how I would rate a site, the two are about the same most of the time. In my opinion, Ms. Wilner’s criticism doesn't amount to a hill of beans. She demands a high level of precision in something that is inherently subjective. Van Zandt admits this, and that’s about the best that can be done.

Also, bias is much less important than fact accuracy. For most people, it is easier to spot and deal with bias, e.g., loaded words and phrases, than it is to spot flawed reasoning, lies and misleading partisan statements about facts, which are often subtle.


Conclusion
In my opinion, MBFC is a reliable source for getting a good feel for both the fact accuracy and the bias of many news and information sites. People who don't want to believe MBFC are free to make that choice. I will continue to rely on MBFC.
