In a bid to curb vaccine misinformation, Google-owned video platform YouTube has announced that it is expanding its medical misinformation policies.
The new guidelines cover currently administered vaccines that are approved and confirmed to be safe and effective by local health authorities and the World Health Organisation (WHO).
“Today’s policy update is an important step to address vaccine and health misinformation on our platform, and we’ll continue to invest across the board in the policies and products that bring high-quality information to our viewers and the entire YouTube community,” the company said in a blogpost.
As per YouTube, content that falsely alleges that approved vaccines are dangerous and cause chronic health effects, claims that vaccines do not reduce transmission or contraction of disease, or contains misinformation about the substances in vaccines will be removed.
This would include content that falsely says that approved vaccines cause autism, cancer or infertility, or that substances in vaccines can track those who receive them.
“Our policies not only cover specific routine immunisations, such as those for measles or Hepatitis B, but also apply to general statements about vaccines,” the company said.
According to YouTube, its Community Guidelines already prohibit certain types of medical misinformation.
“We have long removed content that promotes harmful remedies, such as saying drinking turpentine can cure diseases,” the company said.
“When the Covid-19 pandemic hit, we built on these policies and worked with experts to develop 10 new policies around Covid-19 and medical misinformation,” it added.
Since last year, YouTube has removed over 130,000 videos for violating its Covid-19 vaccine policies.