Katie Canales, Insider
YouTube said it is banning all content claiming that approved vaccines do not work or are harmful.
For the first time, the ban covers vaccines for illnesses other than COVID-19.
The ban is a departure from the industry's historical hands-off approach to content moderation.
YouTube is banning all anti-vaccine content on its platform, including misinformation about approved vaccines for common illnesses in addition to COVID-19, the company said Wednesday.
The Google-owned social media platform will remove any video claiming that well-known vaccines approved by federal health officials are harmful, it said in a blog post first reported by The Washington Post. That includes content claiming vaccines cause autism, cancer, or infertility, or that they allow recipients to be tracked via microchip.
YouTube had already banned false information about coronavirus vaccines in October 2020. The company said it will still allow discussion of vaccine policies, new vaccine trials, and personal accounts of receiving the vaccine.
A YouTube spokesperson also confirmed to Insider that the company will remove the accounts of high-profile anti-vaxxers like Robert F. Kennedy Jr., the nephew of former President John F. Kennedy, and anti-vaccine activist and author Joseph Mercola.
Kennedy was one of 12 people a recent report from the Center for Countering Digital Hate identified as the most prolific spreaders of COVID-19 disinformation online.
Wednesday's expansion of the vaccine rules marks a major change in how the company moderates content on its service.
"Developing robust policies takes time," Matt Halprin - YouTube's vice president of global trust and safety - told the Post. "We wanted to launch a policy that is comprehensive, enforceable with consistency and adequately addresses the challenge."
YouTube and other social media companies have long taken a hands-off approach to moderating content. But in recent years, regulators and the general public have increasingly pressured platforms to more actively police disinformation, especially amid the pandemic and the 2020 presidential election.
Facebook and Twitter have also moved to limit the spread of COVID-19 vaccine misinformation online. Still, false content has leaked through: private groups devoted to discussing and taking unproven COVID-19 treatments such as the antiparasitic drug ivermectin proliferated, Insider reported in early September.
Companies also began cracking down on former President Donald Trump's false statements in 2020, thrusting the topic of social media platforms' content moderation into an ongoing political war.
Read the original article on Business Insider