Study after study has found that partisan beliefs and bias shape what we believe is factually true.
Now the Pew Research Center has released a new study that takes a step back. Its researchers wondered: How good are Americans at telling a factual statement from an opinion statement when they don't have to acknowledge that the factual statement is true?
By factual, Pew meant an assertion that could be proven or disproven by evidence. All the factual statements used in the study were true, to keep the results more consistent, but respondents didn't know that.
An opinion statement, in contrast, is based on values and beliefs of the speaker, and can't be either proven or disproven.
Pew didn't provide people with definitions of those terms — "we didn't want to fully hold their hands," Michael Barthel, one of the authors of the study, told NPR. "We did, at the end of the day, want respondents to make their own judgment calls."
The study asked people to identify a statement as factual, "whether you think it's accurate or not," or opinion, "whether you agree with it or not."
The researchers found that most Americans could correctly identify more than three of the five statements in each category, slightly better than you'd expect from random luck.
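That "random luck" baseline is easy to sanity-check: with five statements and only two possible labels, a pure guesser's number of correct answers follows a Binomial(5, 0.5) distribution. A minimal sketch (the function name here is ours, not Pew's):

```python
from math import comb

def p_at_least(k, n=5):
    """Probability of getting at least k of n statements right
    by coin-flip guessing between "factual" and "opinion"."""
    return sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n

print(p_at_least(3))  # 0.5     -- half of pure guessers would clear three of five
print(p_at_least(5))  # 0.03125 -- roughly 3% would get a perfect score by luck
```

Comparing that 3 percent chance-level perfect score with the quarter of adults who actually classified all five facts correctly helps show what "slightly better than random luck" means in practice.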
(You can see how your evaluations stack up in Pew's quiz.)
In general they found people were better at correctly identifying a factual statement if it aligned with or supported their political beliefs.
For instance, 89 percent of Democrats identified "President Barack Obama was born in the United States" as a factual statement, while only 63 percent of Republicans did the same.
Republicans, however, were more likely than Democrats to recognize that "Spending on Social Security, Medicare, and Medicaid make up the largest portion of the U.S. budget" is a factual statement — regardless of whether they thought it was accurate.
And opinions? Well, the opposite was true. Respondents who shared an opinion were more likely to call it a factual statement; people who disagreed with an opinion were more likely to correctly call it an opinion.
Pew was able to test that trend more precisely with a follow-up question: Whenever a respondent called a statement an opinion, the survey asked whether they agreed or disagreed with that opinion.
If the opinion was actually an opinion, responses varied.
"If it wasn't an opinion statement — it was a factual statement that they misclassified — they generally disagreed with it," Barthel says.
Some groups of people were also more successful, in general, than others.
The "digitally savvy" and the politically aware were more likely to correctly identify each statement as opinion or factual. People with a lot of trust in the news media were also significantly more likely to get a perfect score: While just over a quarter of all adults got all five facts right, 39 percent of people who trust news swept that category.
But, interestingly, there was much less of an effect for people who said they were very interested in news. That population was slightly more likely to identify facts as facts — but less savvy than non-news-junkies at calling an opinion an opinion.
The results suggest that confirmation bias is not just a matter of people rejecting facts as false; it can also mean refusing to recognize facts as claims that could be proven or disproven at all.
But Barthel saw a silver lining: In almost all cases, he said, a majority of people did classify a statement correctly — even with the trends revealing the influence of their beliefs.
"It does make a little bit of difference," he said. "But normally, it doesn't cross the line of making a majority of people get this wrong."