YouTube announced an update to its community guidelines this week: the platform will no longer tolerate misleading and inaccurate content about vaccines.
The platform had previously banned content that contained false claims about COVID-19 vaccines under its COVID-19 misinformation policy, but this new change now covers misinformation about all vaccines.
“We’ve steadily seen false claims about the coronavirus vaccines spill over into misinformation about vaccines in general, and we’re now at a point where it’s more important than ever to expand the work we started with COVID-19 to other vaccines,” the company said in a blog post announcement.
This is a much-needed and proactive move on YouTube’s part, no doubt. But what took so long? The pandemic has been going on since 2020, but the anti-vaccine movement has been gaining traction for much longer.
YouTube has long been a breeding ground for conspiracy theorists and radicals to spread misinformation, and anti-vaccine rhetoric has always been a part of that. According to a report by NBC News, YouTube was "slow to act" because the videos drove notable traffic to the site. YouTube says it didn't act sooner because it was focused at first on misinformation about coronavirus vaccines specifically.
The video platform has been home to anti-vaccine sentiment since the onset of the pandemic, and widespread outrage over its role in spreading vaccine misinformation was the catalyst for its strictest policy yet. There is so much conspiratorial, invalid content regarding vaccines that YouTube has already scrubbed more than a million videos that violate the new policy.
The videos removed include those from prominent figures like Florida Gov. Ron DeSantis, Sen. Rand Paul, Joseph Mercola, Erin Elizabeth and Sherri Tenpenny, as well as major vaccine opponent Robert F. Kennedy Jr. and his "Children's Health Defense" organization.
Though today's parents and kids are among the first "digitally native" generations, identifying misinformation and propaganda on YouTube isn't always easy. However, organizations like Common Sense Media offer many resources to help children and families spot "fake news." Hopefully, with the new community guidelines, future generations won't be as susceptible to misinformation.
In the blog post, YouTube says users are still permitted to share content related to their personal experiences with the COVID vaccine (and others), but only if those videos still adhere to the community guidelines and do not promote “vaccine hesitancy.”
The company is forthright about the time it will take to fully enforce its new policy as it expands to all vaccine misinformation.
“Developing robust policies takes time,” Matt Halprin, YouTube’s vice president of global trust and safety, tells The Washington Post. “We wanted to launch a policy that is comprehensive, enforceable with consistency and adequately addresses the challenge.”