John W. Ayers, PhD, MA1; Brian Chu, BS2; Zechariah Zhu3; Eric C. Leas, PhD, MPH4; Davey M. Smith, MD, MAS1; Mark Dredze, PhD5; David A. Broniatowski, PhD, SM6
JAMA Intern Med. Published online June 7, 2021. doi:10.1001/jamainternmed.2021.2498
The dangers of misinformation spreading on social media during the COVID-19 pandemic are well known.1 However, software that allows individuals to generate automated content and share it through counterfeit accounts (or “bots”)2 to amplify misinformation has been overlooked, including how such software can be used to disseminate original research in ways that undermine scientific communication.
To explore automated misinformation, we analyzed conversations concerning the publication of the Danish Study to Assess Face Masks for the Protection Against COVID-19 Infection (DANMASK-19)4 in public Facebook groups, a platform known to be susceptible to automated misinformation.3 We selected DANMASK-19 because it was widely discussed (the fifth most shared research article of all time as of March 2021, according to Altmetric5) and because it addressed masking, an important public health measure for controlling the pandemic.
Methods
Sample
We obtained the names of 563 Facebook groups in which a link to the publication of DANMASK-19 on the Annals of Internal Medicine website was posted and downloaded all available posts (N = 299 925) from these groups using CrowdTangle (crowdtangle.com). We limited our study period to the 5 days following the publication of DANMASK-19 (November 18, 2020, through November 22, 2020) because media interest is typically greatest initially. This study was exempted as not human participants research by the University of California, San Diego Human Research Protections Program. Additional details are provided in the eAppendix in the Supplement.
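For readers who wish to assemble a comparable corpus, a minimal sketch of retrieving posts programmatically from CrowdTangle is shown below. The endpoint, query parameter names, and token handling are assumptions based on CrowdTangle's public API documentation rather than the pipeline used in this study, and the code is illustrative only.

# Sketch (R): pull posts from CrowdTangle for the study window.
# The endpoint and parameters below are assumptions; consult the
# CrowdTangle API documentation and supply your own API token.
library(httr)
library(jsonlite)

ct_token <- Sys.getenv("CROWDTANGLE_TOKEN")   # hypothetical environment variable

resp <- GET(
  "https://api.crowdtangle.com/posts",        # assumed endpoint
  query = list(
    token     = ct_token,
    startDate = "2020-11-18",
    endDate   = "2020-11-22",
    count     = 100
  )
)

posts <- fromJSON(content(resp, as = "text", encoding = "UTF-8"))$result$posts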
Measures
When identical links are posted in close succession, it suggests that automated software was used.2,3 We identified the subsets of Facebook groups that were most or least likely to be affected by automation by calculating how often identical links were posted to pairs of Facebook groups and the time that elapsed between these posts, for all links (n = 251 656) shared during the study period. Adapting past operationalizations,3 a pair of Facebook groups was considered most affected by automation if it (1) hosted identical links 5 or more times and (2) at least half of these links were posted less than 10 seconds apart. Comparatively, Facebook groups in which the total time elapsed between identical links was at or above the 90th percentile of time between postings were considered least affected by automation. Facebook groups that were most affected by automation had a mean (SD) of 4.28 (3.02) seconds between shares of identical links compared with 4.35 (11.71) hours for those least affected by automation.
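As a simplified illustration of this heuristic, the sketch below flags pairs of groups by co-share timing, assuming a data frame of posts with group, link, and timestamp columns. The column names, the use of dplyr, and the pair-level simplification are illustrative assumptions, not the study's actual code.

# Sketch (R): flag group pairs whose identical links were co-posted
# within 10 seconds at least 5 times; column names are illustrative.
library(dplyr)

# posts: one row per post, with columns
#   group (character), link (character), posted_at (POSIXct)
pair_gaps <- posts %>%
  inner_join(posts, by = "link", suffix = c("_a", "_b")) %>%
  filter(group_a < group_b) %>%                       # count each group pair once
  mutate(gap_sec = abs(as.numeric(difftime(posted_at_a, posted_at_b, units = "secs"))))

pair_summary <- pair_gaps %>%
  group_by(group_a, group_b) %>%
  summarise(
    n_identical = n(),                                # identical links co-posted
    frac_rapid  = mean(gap_sec < 10),                 # share posted <10 s apart
    total_gap   = sum(gap_sec),                       # total elapsed time
    .groups = "drop"
  ) %>%
  mutate(
    most_automated  = n_identical >= 5 & frac_rapid >= 0.5,
    least_automated = total_gap >= quantile(total_gap, 0.9)   # top decile of elapsed time
  )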
To quantify the extent to which Facebook groups were subject to misinformation, all posts that linked to DANMASK-19 in the groups most or least affected by automation were qualitatively coded by 2 authors (B.C. and Z.Z.) for 2 types of misinformation: (1) whether the primary conclusion of DANMASK-19 was misrepresented (eg, mask wearing harms the wearer) and (2) whether conspiratorial claims were made about DANMASK-19 (eg, claims of covert political/corporate control). A separate outcome for posts including neither form of misinformation was also computed. Table 1 presents example posts. Coders disagreed on 3.9% of labels (Cohen κ = 0.76); disagreements were resolved by consensus with the first author (J.W.A.).
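Intercoder agreement of this kind can be summarized with Cohen's κ. A minimal sketch using the standard formula, assuming two equal-length vectors of coder labels, follows; it is illustrative and not the authors' code.

# Sketch (R): Cohen's kappa for two coders' labels on the same posts.
cohen_kappa <- function(coder1, coder2) {
  lv  <- union(coder1, coder2)                            # align label levels
  tab <- table(factor(coder1, lv), factor(coder2, lv))
  p_obs <- sum(diag(tab)) / sum(tab)                      # observed agreement
  p_exp <- sum(rowSums(tab) * colSums(tab)) / sum(tab)^2  # chance agreement
  (p_obs - p_exp) / (1 - p_exp)
}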
Analysis
The percentage of posts that linked to DANMASK-19 that included each type of misinformation or neither type was calculated separately for the sets of Facebook groups most and least affected by automation along with prevalence ratios comparing these percentages. Statistical significance was set to P < .05 and 95% confidence intervals were bootstrapped. Analyses were computed with R, version 3.6.1 (R Foundation).
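As an illustration, a percentile bootstrap for a prevalence ratio, assuming two logical vectors marking whether each post contained a given type of misinformation, could look like the sketch below. The resampling scheme and function name are assumptions; the authors' exact procedure may differ.

# Sketch (R): prevalence ratio with a percentile bootstrap 95% CI.
# x_most, x_least: logical vectors, one element per post, TRUE if the
# post contained the misinformation type of interest.
prevalence_ratio_ci <- function(x_most, x_least, n_boot = 10000, seed = 1) {
  set.seed(seed)
  pr <- mean(x_most) / mean(x_least)
  boot <- replicate(n_boot, {
    mean(sample(x_most, replace = TRUE)) / mean(sample(x_least, replace = TRUE))
  })
  c(estimate = pr, quantile(boot, c(0.025, 0.975)))
}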
Results
A total of 712 posts that provided direct links to DANMASK-19 were shared in 563 public Facebook groups. Of these, 279 posts (39%) were in Facebook groups most affected by automation, of which 17 were deleted and unavailable for further analysis. Sixty-two posts (9%) were in Facebook groups least affected by automation, of which 3 were deleted.
Among posts made to groups most affected by automation, 19.8% (95% CI, 14.9%-24.5%) claimed masks harmed the wearer, 50.8% (95% CI, 44.6%-56.5%) made conspiratorial claims about the trial, and 43.9% (95% CI, 37.4%-49.6%) made neither claim (Table 2). In contrast, among posts made to groups least affected by automation, 8.5% (95% CI, 1.7%-15.2%) claimed masks harmed the wearer, 20.3% (95% CI, 10.2%-30.5%) made conspiratorial claims about the trial, and 72.9% (95% CI, 59.3%-81.4%) made neither claim.
The percentage of posts linking to DANMASK-19 that claimed that masks harmed the wearer was 2.3 (95% CI, 1.0-6.5) times higher in Facebook groups most affected by automation than in groups least affected by automation; conspiratorial claims were also more frequent in groups most affected by automation (prevalence ratio, 2.5; 95% CI, 1.5-4.5). Making neither claim was more common in Facebook groups least affected by automation (prevalence ratio, 0.6; 95% CI, 0.5-0.7).
Discussion
A campaign that presumably used automated software6 promoted DANMASK-19 in Facebook groups to disseminate misinformation. Limitations of the study include that the entities responsible for organizing this automated campaign could not be determined, that only public Facebook groups were studied, and that only a single high-profile article was evaluated over a span of a few days.
Scientific journals are easy targets for automated software. Possible approaches to prevent misinformation spread through the automated dissemination of research articles include legislation that penalizes those behind automation, greater enforcement by social media companies of rules prohibiting automation, and counter-campaigns by health experts.
Article Information
Accepted for Publication: April 15, 2021.
Published Online: June 7, 2021. doi:10.1001/jamainternmed.2021.2498
Corresponding Author: John W. Ayers, PhD, MA, #333 CRSF, 9500 Gilman Drive, La Jolla, CA 92093 (ayers.john.w@gmail.com).
Author Contributions: Drs Ayers and Broniatowski had full access to all of the data in the study and take responsibility for the integrity of the data and the accuracy of the data analysis.
Concept and design: All authors.
Acquisition, analysis, or interpretation of data: Ayers, Chu, Zhu, Leas, Smith, Broniatowski.
Drafting of the manuscript: Ayers, Chu, Zhu, Leas, Smith, Broniatowski.
Critical revision of the manuscript for important intellectual content: All authors.
Statistical analysis: Chu, Leas, Broniatowski.
Obtained funding: Smith.
Administrative, technical, or material support: Chu, Smith, Dredze, Broniatowski.
Supervision: Smith, Dredze, Broniatowski.
Conflict of Interest Disclosures: Dr Ayers reported owning equity in Directing Medicine, Health Watcher, and Good Analytics outside the submitted work. Dr Leas reported receiving consulting fees from Health Watcher and Good Analytics for similar work outside the context of these analyses. Dr Smith reported receiving an endowment from the John and Mary Tu Foundation and grants from the National Institutes of Health during the conduct of the study, and consulting fees from FluxErgy, Bayer, Arena Pharmaceuticals, and Linear Therapies outside the submitted work. Dr Dredze reported personal fees from Bloomberg LP and Good Analytics outside the submitted work. No other disclosures were reported.
Funding/Support: This work was supported by the Burroughs Wellcome Fund, National Institutes of Health grant AI036214, a grant from the John S. and James L. Knight Foundation to the George Washington University Institute for Data, Democracy, and Politics, and the John and Mary Tu Foundation.
Role of the Funder/Sponsor: The funding organizations had no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, or approval of the manuscript; and decision to submit the manuscript for publication.
Data Sharing Statement: The data used in the study are public in nature and are available from CrowdTangle. CrowdTangle prohibits providing raw data to anyone outside of a CrowdTangle user’s account. Anyone with a CrowdTangle account may access these corresponding data. Researchers may request CrowdTangle access at https://help.crowdtangle.com/en/articles/4302208-crowdtangle-for-academics-and-researchers.
Additional Contributions: We thank Benjamin M. Althouse, PhD, ScM (Gates Foundation), Alicia Nobles, PhD (University of California, San Diego), and Christopher Longhurst, MD (University of California, San Diego), for discussions and editing. They were not compensated for their contributions.
References
1.
Saitz R, Schwitzer G. Communicating science in the time of a pandemic. JAMA. 2020;324(5):443-444. doi:10.1001/jama.2020.12535
2.
Broniatowski DA, Jamison AM, Qi S, et al. Weaponized health communication: Twitter bots and Russian trolls amplify the vaccine debate. Am J Public Health. 2018;108(10):1378-1384. doi:10.2105/AJPH.2018.304567
3.
Giglietto F, Righetti N, Rossi L, Marino G. It takes a village to manipulate the media: coordinated link sharing behavior during 2018 and 2019 Italian elections. Information, Communication & Society. 2020;23(6):867-891. doi:10.1080/1369118X.2020.1739732
4.
Bundgaard H, Bundgaard JS, Raaschou-Pedersen DET, et al. Effectiveness of adding a mask recommendation to other public health measures to prevent SARS-CoV-2 infection in Danish mask wearers: a randomized controlled trial. Ann Intern Med. 2021;174(3):335-343. doi:10.7326/M20-6817
5.
Altmetric. Article detail page for effectiveness of adding a mask recommendation to other public health measures to prevent SARS-CoV-2 infection in Danish mask wearers. Accessed December 17, 2020. https://www.altmetric.com/details.php?domain=www.acpjournals.org&citation_id=94531651
6.
Allem JP, Ferrara E. Could social bots pose a threat to public health? Am J Public Health. 2018;108(8):1005-1006. doi:10.2105/AJPH.2018.304512