YouTube on September 29 announced a ban on all anti-vaccine content. The platform already has policies in place to prevent misinformation regarding COVID-19 vaccines, but is now expanding them to cover all vaccines approved by local health authorities and the World Health Organisation (WHO) for use against any disease. Along with WhatsApp and Facebook, YouTube had been a major source of vaccine misinformation long before the pandemic hit, making this new ban all the more significant.

What type of content is banned?

"Specifically, content that falsely alleges that approved vaccines are dangerous and cause chronic health effects, claims that vaccines do not reduce transmission or contraction of disease, or contains misinformation on the substances contained in vaccines will be removed," YouTube said.

YouTube said that this policy covers not only routine immunisations, like those for measles or Hepatitis B, but also applies to general statements about vaccines. Here is a list of examples of disallowed content from YouTube's Vaccine misinformation policy page:

- Claims that vaccines cause chronic side effects, such as cancer or diabetes
- Claims that vaccines do not reduce the risk of contracting illness
- Claims that vaccines contain substances that are not on the vaccine ingredient list, such as biological matter from foetuses (e.g. foetal tissue, foetal cell lines) or animal byproducts
- Claims that vaccines contain substances or devices meant to track or identify those who've received them
- Claims that vaccines alter a person's genetic makeup
- Claims that the MMR vaccine causes autism
- Claims that vaccines are part of a…
