Facebook says it will start removing debunked misinformation about the upcoming COVID-19 vaccines in the coming weeks. The removals fall under Facebook’s policy on coronavirus content that could lead to “imminent physical harm”, which includes false claims about the safety, efficacy, ingredients or side effects of the vaccines.
For instance, claims that the COVID-19 vaccines contain microchips or anything else not on the official vaccine ingredient list will be removed, as will conspiracy theories about specific populations being used to test the vaccines without consent. “We will not be able to start enforcing these policies overnight. Since it’s early and facts about COVID-19 vaccines will continue to evolve, we will regularly update the claims we remove based on guidance from public health authorities as they learn more,” Facebook said.
Facebook’s action against vaccine misinformation has so far been limited to prohibiting ads that discourage vaccination, or promote vaccine-related content that has been identified as a hoax by the WHO or the US Centers for Disease Control and Prevention.
YouTube has said it will remove false claims about such vaccines, while Twitter has said it is working through its plans before a medically approved vaccine becomes available.
Over the past few weeks, major pharmaceutical companies including Pfizer and AstraZeneca have said they have developed effective COVID-19 vaccines. The UK has approved Pfizer’s vaccine and said it would start vaccinating the most vulnerable people as soon as early next week.
In India, Phase-3 clinical trials for Covaxin, a vaccine being developed by Bharat Biotech in partnership with the Indian Council of Medical Research (ICMR), commenced in several states this week. Pune-based Serum Institute of India is also conducting clinical trials of vaccine candidate Covishield, developed by AstraZeneca and Oxford University. The government is discussing its vaccine distribution strategy.
In August, a report by activist group Avaaz found that health misinformation attracted as many as 3.8 billion views on Facebook over the past year, peaking during the COVID-19 pandemic with 460 million views in April alone. Despite Facebook’s efforts to curb such misinformation, it managed to add warning labels to only 16% of posts containing health misinformation. The report found that misinformation content, though fact-checked, could escape detection by Facebook by being republished, in full or in part, and translated into other languages.
An article titled “Gates’ Globalist Vaccine Agenda”, posted by an anti-vax organisation, had 3.7 million interactions on Facebook. It was later republished, in part or in its entirety, and quoted from and linked to in posts on many more websites. These posts accumulated over 4.7 million views across six different translations.