A new study shows that people who are overconfident in their knowledge are more likely to hold anti-scientific views.
The researchers surveyed thousands of people about their views on current scientific topics such as climate change, Covid, vaccination, homeopathy and genetically modified foods.
They found that people who most disagree with the scientific consensus on these issues know less but think they know more.
Scientists warn that being overconfident makes us less likely to change our minds about a subject, even when we are presented with overwhelming scientific evidence.
The team says the consequences of “anti-consensus views” on these topics are “terrible” and include property destruction, malnutrition, illness, financial hardship and death.
The researchers asked people about their views on “anti-consensus” scientific topics – those that are commonly divisive these days, such as Covid and vaccinations. Pictured are anti-vaccination activists protesting in Albany, New York on June 14, 2020.
The new study was led by Professor Nicholas Light, a behavioral scientist at Portland State University’s School of Business in Oregon.
SUBJECTS WITH “ANTI-CONSENSUS” VIEWS
– Climate change
– Nuclear power
– Genetically modified foods
– The Big Bang
– Homeopathic medicine
“Basically, the people who are most radically opposed to consensus are the most self-confident in their knowledge,” he explained.
“There can be a problem of overconfidence that gets in the way of learning because if people think they know a lot, they have minimal motivation to learn more.”
One problem may be that people consciously or subconsciously consider their views to be more important than scientific truth.
Thus, changing their minds may first require steps to reduce their confidence in their own knowledge.
“People with more radical anti-science views may first have to learn about their relative ignorance of issues before being taught the specifics of established scientific knowledge,” said Professor Light.
“The challenge then is to find suitable ways to convince opponents of the consensus that they are probably not as knowledgeable as they think.”
Humans are constantly striving to better understand the world, the team says, but doing so often requires a willingness to change or abandon old truths.
For example, in 1543, the Polish mathematician Nicolaus Copernicus put forward the theory that the Earth, along with other planets, revolves around the Sun.
The theory was radical at the time because most people considered the Earth to be the center of the universe.
Since then, scientific evidence on various subjects has been so consistent, overwhelming, or clear that a scientific consensus has formed, but some subjects create “anti-consensus views”.
Polish mathematician Nicolaus Copernicus (pictured here) introduced the theory that the Earth, along with other planets, revolves around the Sun.
For example, there are significant differences of opinion between scientists and the public about whether genetically modified foods are safe to eat, whether humans have evolved over time, or whether climate change is due to human activities.
For the study, Prof. Light and his colleagues surveyed US citizens, assessing both their actual knowledge of scientific topics and their confidence in that knowledge.
For example, participants were asked about their readiness to get vaccinated against Covid and their knowledge of how such a vaccine would work.
In general, the team found that the further people’s attitudes toward an issue deviated from the scientific consensus, the higher they rated their own knowledge of it, but the lower their actual knowledge turned out to be.
For example, the less a person agreed with the Covid vaccine, the more they thought they knew about it, but their actual knowledge was likely lower.
Overall, the team found that people who were most strongly opposed to consensus were the most overconfident in their knowledge when it came to five of the eight subjects.
“Our results show that this pattern is quite general,” said Professor Light. “However, we didn’t find it for climate change, evolution, or the big bang theory.”
The researchers found that as people’s subjective knowledge (their estimates of their own knowledge) grows, so does their resistance to scientific consensus. Pictured are four of the eight questions; all four showed a link between resistance to consensus and overconfidence in one’s knowledge.
The extent to which attitudes toward an issue are tied to political or religious identity may influence whether this pattern appears.
“For example, on climate change, positions in line with science tend to be held by liberals, while on issues such as genetically modified foods, liberals and conservatives tend to diverge in support or opposition,” said Professor Light.
“It may be that when we know our in-groups feel strongly about an issue, we don’t think much about what we actually know about it.”
In their paper, published in the journal Science Advances, the researchers warn of the “terrible” consequences that “anti-consensus views” could potentially have.
For example, death and illness can be caused by a refusal to vaccinate or dependency on homeopathic remedies, while a refusal of genetically modified (GM) foods can lead to malnutrition.
The Royal Society notes that since the first widespread commercialization of GM foods in the 1990s, there has been no evidence of harmful effects associated with the consumption of any approved GM crop.
SOCIAL MEDIA THREATENS ‘SCIENTIFIC CREDIBILITY’, REPORT SAYS
The report shows British confidence in science after the Covid pandemic is high, but misinformation on social media continues to pose a “threat to scientific credibility”.
The 3M State of Science Index, released in June, shows that 90% of people in the UK trust science in 2022, up from 85% in 2019.
This statistic is also comparable to the 88% of Europeans and 89% of people worldwide who trust science in 2022.
Some 57% of Britons say they now value science more in the wake of the pandemic, likely thanks to scientists’ efforts to develop Covid vaccines.
However, misinformation is “widespread” on social media and threatens the future of public understanding of science, the report says.