Why people think they’re right, even when they’re wrong
Study finds people may incorrectly believe they have all the facts
Ohio State University
Researchers found that people naturally assume they have all the information they need to make a decision or support their position, even when they do not.
The researchers called it the “illusion of information adequacy.”
“We found that, in general, people don’t stop to think whether there might be more information that would help them make a more informed decision,” said study co-author Angus Fletcher, a professor of English at The Ohio State University and member of the university’s Project Narrative.
“If you give people a few pieces of information that seem to line up, most will say ‘that sounds about right’ and go with that.”
The study was published today in the journal PLOS ONE. Fletcher completed the work with co-authors Hunter Gehlbach, an educational psychologist at Johns Hopkins University’s School of Education, and Carly Robinson, a senior researcher at Stanford University’s Graduate School of Education.
The study involved 1,261 Americans who participated online.
They were split into three groups, each of which read an article about a fictional school that lacked adequate water. One group’s article gave only reasons why the school should merge with another that had adequate water; a second group’s article gave only reasons for staying separate and hoping for other solutions; and a third, control group’s article presented all the arguments, both for merging and for staying separate.
The findings showed that the two groups who read only half the story – either just the pro-merging or just the anti-merging arguments – still believed they had enough information to make a good decision, Fletcher said. Most of them said they would follow the recommendations in the article they read.
“Those with only half the information were actually more confident in their decision to merge or remain separate than those who had the complete story,” Fletcher said.
“They were quite sure that their decision was the right one, even though they didn’t have all the information.”
In addition, participants who had half the information said they thought most other people would make the same decision they did.
There was one piece of good news from the study, Fletcher said. Some of the participants who had read only one side of the story later read the arguments for the other side. And many of those participants were willing to change their minds about their decision, once they had all the facts.
That may not work all the time, especially on entrenched ideological issues, he said. In those cases, people may not trust new information, or they may try to reframe it to fit their preexisting views.
“But most interpersonal conflicts aren’t about ideology. They are just misunderstandings in the course of daily life,” Fletcher said.
These findings offer a complement to research on what is called naïve realism, the belief people have that their subjective understanding of a situation is the objective truth, Fletcher explained. Research on naïve realism often focuses on how people have different understandings of the same situation.
But the illusion of information adequacy shows that people may share the same understanding – if they both have enough information.
Fletcher, who studies how people are influenced by the power of stories, said people should make sure they have the full story about a situation before they take a stand or make a decision.
“As we found in this study, there’s this default mode in which people think they know all the relevant facts, even if they don’t,” he said.
“Your first move when you disagree with someone should be to think, ‘Is there something that I’m missing that would help me see their perspective and understand their position better?’ That’s the way to fight this illusion of information adequacy.”
Journal
PLoS ONE
Method of Research
Experimental study
Subject of Research
People
Article Title
The illusion of information adequacy
Article Publication Date
9-Oct-2024
Neglecting “unknown unknowns” may influence decision-making
Study proposes a new bias: the tendency to assume one has adequate information to make a decision
New experimental data support the idea that people tend to assume the information they have is adequate to comprehend a given situation, without considering that they might be lacking key information. Hunter Gehlbach of Johns Hopkins University and colleagues present these findings in the open-access journal PLOS ONE on October 9, 2024.
When navigating alternative perspectives, people may demonstrate psychological biases that influence their ability to understand others’ viewpoints. For instance, in the bias of naïve realism, people presume their own subjective perspective is objective truth.
Gehlbach and colleagues now propose the existence of a related bias, which they call the illusion of information adequacy: the failure to consider the possibility that one might be missing key information. For instance, one driver might honk at a car stopped in front of them, only to then see a pedestrian crossing the road—a possibility they hadn’t considered.
To demonstrate the illusion of information adequacy, the researchers presented 1,261 study participants with a hypothetical scenario in which they had to recommend whether two schools should be merged or not, as well as answer questions about their perceptions. Some participants received information about the benefits of merging, some about the benefits of staying separate, and some about both.
In line with the illusion of information adequacy, participants who—unbeknownst to them—lacked either the pro-merge or the pro-separate information tended to assume that the information they had was just as adequate as others’ information, that they were just as well equipped to make a thoughtful recommendation, and that most others would make a similar decision. Indeed, people lacking pro-merge information tended to recommend the schools remain separate, and vice versa.
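To see the logic of that result concretely, here is a minimal toy model in Python (my sketch, not the study’s materials or analysis; the condition labels and argument lists are hypothetical). Simulated readers who see only one consistent side of the story end up maximally confident, while readers with the full story do not:

```python
# Toy model of the "illusion of information adequacy" (illustration only,
# not the study's data or analysis). Each simulated reader sees a subset of
# arguments; confidence tracks how one-sided the seen arguments are, not
# how complete they are.

ARGUMENTS = {  # hypothetical argument lists for the fictional school scenario
    "merge": ["shared water supply", "pooled resources", "larger facilities"],
    "separate": ["community identity", "shorter commutes", "a possible new well"],
}

def simulate_reader(condition):
    """condition: 'pro_merge', 'pro_separate', or 'full' (hypothetical labels)."""
    if condition == "pro_merge":
        seen = [("merge", a) for a in ARGUMENTS["merge"]]
    elif condition == "pro_separate":
        seen = [("separate", a) for a in ARGUMENTS["separate"]]
    else:  # full information: both sides
        seen = [(side, a) for side, args in ARGUMENTS.items() for a in args]

    merge_share = sum(side == "merge" for side, _ in seen) / len(seen)
    if merge_share > 0.5:
        recommendation = "merge"
    elif merge_share < 0.5:
        recommendation = "separate"
    else:
        recommendation = "undecided"
    confidence = abs(merge_share - 0.5) * 2  # 1.0 when one-sided, 0.0 when balanced
    return recommendation, confidence

for cond in ("pro_merge", "pro_separate", "full"):
    rec, conf = simulate_reader(cond)
    print(f"{cond:>12}: recommends {rec!r} with confidence {conf:.1f}")
```

Both one-sided conditions print confidence 1.0 while the full-information condition prints 0.0, a toy analogue of the half-informed participants being the more confident ones.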
Notably, a subgroup of participants who later received the information they initially lacked tended to stick with their original decisions. However, this subgroup’s combined final recommendations did mirror the recommendations of the subgroup that initially received all the information.
The authors suggest that the ability to navigate other perspectives might be improved by encouraging people to consider whether they may be lacking key information. Meanwhile, additional research could deepen understanding of this type of bias.
The authors add: “A major source of misunderstanding and conflict in our daily lives arises from this paradox: We know that, in theory, there are plenty of things that we don't know we don't know. Yet, in practice, we almost always behave as though we have adequate information to voice our opinions, make good decisions, and pass judgment on others. A little more intellectual humility about what we do and don't know would serve us well.”
#####
In your coverage please use this URL to provide access to the freely available article in PLOS ONE: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0310216
Citation: Gehlbach H, Robinson CD, Fletcher A (2024) The illusion of information adequacy. PLoS ONE 19(10): e0310216. https://doi.org/10.1371/journal.pone.0310216
Author Countries: U.S.A.
Funding: Dr. Hunter Gehlbach received start-up funds from Johns Hopkins University School of Education. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
Are ideas contagious?
The COVID-19 pandemic gave the global medical community the opportunity to take giant strides forward in understanding how to develop vaccines and implement public health measures designed to control the spread of disease. But the crisis also offered researchers the chance to learn more about another kind of contagion: ideas.
Mathematician and assistant professor of biology Nicholas Landry, an expert in the study of contagion, is exploring how the structure of human-interaction networks affects the spread of both illness and information, with the aim of understanding the role social connections play not only in the transmission of disease but also in the spread of ideas and ideology.
In a paper published this fall in Physical Review E with collaborators at the University of Vermont, Landry explores a hybrid approach to understanding social networks that involves inferring not just social contacts but also the rules that govern how contagion and information spread.
“With the pandemic, we have more data than we’ve ever had on diseases,” Landry said. “The question is: What can we do with that data, and how much data do you need to figure out how people are connected?”
The key to making use of the data, Landry explained, is to understand their limitations and how much confidence we can have when using epidemic models to make predictions.
Landry’s findings suggest that reconstructing underlying social networks and their impacts on contagion is much more feasible for diseases like SARS-CoV-2, mpox, or rhinovirus, but may be less effective for understanding how more highly infectious diseases like measles or chickenpox spread.
However, for extremely viral trends or information, Landry suggests it may be possible to track their spread with more precision than is achievable for diseases, a finding that could better inform future efforts to understand the pathways of both contagion and misinformation.
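To give a concrete sense of what reconstructing a network from contagion data involves, here is a minimal sketch (a simplified heuristic under assumed parameters, not Landry’s actual method, which relies on more careful statistical inference). It simulates simple susceptible-infected cascades on a known random network, then guesses edges by checking which node pairs tend to become infected one time step apart:

```python
# Simplified sketch (not the paper's method): simulate SI cascades on a
# known network, then guess edges from infection-time adjacency.
import itertools
import random

import networkx as nx  # assumes the networkx package is installed

def simulate_si(G, beta, steps, seed_node):
    """Run one susceptible-infected cascade; return {node: infection_time}."""
    times = {seed_node: 0}
    for t in range(1, steps + 1):
        newly_infected = set()
        for u in list(times):
            for v in G.neighbors(u):
                if v not in times and random.random() < beta:
                    newly_infected.add(v)
        for v in newly_infected:
            times[v] = t
    return times

def score_pairs(nodes, cascades):
    """Score each node pair by how often their infections are one step apart."""
    scores = {}
    for u, v in itertools.combinations(nodes, 2):
        scores[(u, v)] = sum(
            u in times and v in times and abs(times[u] - times[v]) == 1
            for times in cascades
        )
    return scores

random.seed(1)
G = nx.erdos_renyi_graph(30, 0.1, seed=1)  # toy "true" social network
cascades = [
    simulate_si(G, beta=0.3, steps=10, seed_node=random.choice(list(G)))
    for _ in range(300)
]
scores = score_pairs(list(G), cascades)
guessed = sorted(scores, key=scores.get, reverse=True)[: G.number_of_edges()]
hits = sum(G.has_edge(u, v) for u, v in guessed)
print(f"correctly guessed edges: {hits}/{G.number_of_edges()}")
```

Pushing beta toward 1 makes each cascade saturate almost immediately, which erases the timing signal the scoring relies on; that is the intuition behind reconstruction working better for moderately transmissible diseases than for highly infectious ones like measles.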
Journal
Physical Review E
Article Title
Reconstructing networks from simple and complex contagions
Article Publication Date
7-Oct-2024