Tuesday, March 02, 2021

Study examines what makes people susceptible to fake health news

Source credibility and how information is presented have little influence; tagging unverified content makes a difference

UNIVERSITY OF KANSAS

Research News

LAWRENCE -- A new study from University of Kansas journalism & mass communication researchers examines what influences people to be susceptible to false information about health and argues big tech companies have a responsibility to help prevent the spread of misleading and dangerous information.

Researchers shared a fake news story with more than 750 participants that claimed a deficiency of vitamin B17 could cause cancer. They then measured whether how the article was presented -- including author credentials, writing style and whether the article was labeled as "suspicious" or "unverified" -- affected how participants perceived its credibility and whether they would adhere to the article's recommendations or share it on social media. The findings showed that information presentation did not influence how people perceived it and that only respondents' social media efficacy played a role in whether they said they would share it.

Hong Tien Vu, assistant professor of journalism & mass communications, and Yvonnes Chen, associate professor of journalism & mass communications at KU, co-wrote the study. They will present their work, funded by a KU General Research Fund grant, at the 2021 International Communication Association Conference.

Vu and Chen shared with respondents eight versions of an article, verified as false, that claimed a lack of vitamin B17, which does not exist, could be a cause of cancer. One version carried a doctor's byline with a short description of her medical credentials. In another, the author was described as a mother of two with a background in creative writing who worked as a lifestyle blogger. Some versions followed a journalistic style, while others used more casual language.

"We wanted to test two skills that are often employed in media literacy training programs around the world, author credentials and writing style, as well as flagging," Vu said. "The results suggest relying on audience members to do the work to determine fake news may be a long way to go. When people have to evaluate the credibility of information, it requires mental work. When surfing the web in general, we tend to rely on big tech companies to verify information."

Respondents who showed higher levels of social media efficacy, or were more savvy in using the technology, evaluated information more carefully and reported they would be less likely to share the article. Health orientation, or whether respondents were interested in or sought out health information, did not play a role in discerning the accuracy of information. Health orientation is significant, however, because those highly interested in health information are more likely to share news they find, whether credible or not, the authors said.

Results showed that author credentials and how the story was written did not significantly affect how people perceived its credibility, whether they would adhere to its recommendations or whether they would share it. However, those who saw the article presented with any sort of flag stating it was not verified information were significantly less likely to find it credible, adhere to its recommendations or share it.

While the study took place before the COVID-19 pandemic, its findings are especially relevant now, as misinformation and politicized information about the pandemic have proliferated. The study also shows that seemingly innocuous misinformation can be dangerous.

"One problem with fake news studies is the topic becomes so politicized," Vu said. "Fake news can be about something that is not politicized or polarizing as well. Talking about vitamin B17 seems to be harmless, but people believed it. People can spend time, money and efforts on trying to find a cure, and that can be very dangerous if you don't follow a doctor's advice and come across false information."

The fact that any sort of flag significantly affected readers' perceptions and intentions to share shows how important it is for big technology companies, such as social media platforms, to verify information or label content that is false, unverified or dangerous, the authors wrote.

"Whenever we see information that has been flagged, we immediately raise our skepticism, even if we don't agree with it. Big tech companies have a very important role to play in ensuring a healthy, clean information environment," Vu said.
