The most significant, sweeping change occurred in early 2018, when Facebook announced its pushback against branded content, which left many businesses sweating. In a moment of transparency from Mark Zuckerberg himself, Facebook seemingly laid down the gauntlet against marketers and brands at large:

> "One of our big focus areas for 2018 is making sure the time we all spend on Facebook is time well spent. We built…"
>
> Posted by Mark Zuckerberg on Thursday, January 11, 2018

The 2018 update to the Facebook algorithm was designed to center content around individuals’ friends and family members, rather than prioritizing spam from businesses. This put legitimate companies and brands in a bind, as they have had to adapt their Facebook marketing strategies accordingly.

Facebook algorithm changes and milestones

Fast forward to present day, and the Facebook algorithm is still evolving. Below is a quick snapshot of some of the changes Facebook has made recently. In response to the debate over COVID-19 vaccine misinformation, Facebook stated:

> In recent weeks, there has been a debate about whether the global problem of COVID-19 vaccine misinformation can be solved simply by removing 12 people from social media platforms. People who have advanced this narrative contend that these 12 people are responsible for 73% of online vaccine misinformation on Facebook. There isn’t any evidence to support this claim. Moreover, focusing on such a small group of people distracts from the complex challenges we all face in addressing misinformation about COVID-19 vaccines.
>
> That said, any amount of COVID-19 vaccine misinformation that violates our policies is too much by our standards - and we have removed over three dozen Pages, groups and Facebook or Instagram accounts linked to these 12 people, including at least one linked to each of the 12 people, for violating our policies. We have also imposed penalties on nearly two dozen additional Pages, groups or accounts linked to these 12 people, like moving their posts lower in News Feed so fewer people see them or not recommending them to others. We’ve applied penalties to some of their website domains as well, so any posts including their website content are moved lower in News Feed. The remaining accounts associated with these individuals are not posting content that breaks our rules, have only posted a small amount of violating content, which we’ve removed, or are simply inactive.
>
> In fact, these 12 people are responsible for just about 0.05% of all views of vaccine-related content on Facebook. This includes all vaccine-related posts they’ve shared, whether true or false, as well as URLs associated with these people. The report upon which the faulty narrative is based analyzed only a narrow set of 483 pieces of content over six weeks from only 30 groups, some of which are as small as 2,500 users. They are in no way representative of the hundreds of millions of posts that people have shared about COVID-19 vaccines in the past months on Facebook. Further, there is no explanation for how the organization behind the report identified the content they describe as “anti-vax” or how they chose the 30 groups they included in their analysis. There is no justification for their claim that their data constitute a “representative sample” of the content shared across our apps.
>
> Focusing on these 12 individuals misses the forest for the trees. We have worked closely with leading health organizations since January 2020 to identify and remove COVID-19 misinformation that could contribute to a risk of someone spreading or contracting the virus. Since the beginning of the pandemic, across our entire platform, we have removed over 3,000 accounts, Pages and groups for repeatedly violating our rules against spreading COVID-19 and vaccine misinformation, and removed more than 20 million pieces of content for breaking these rules.
>
> None of this is to suggest that our work is done or that we are satisfied. Tracking and combating vaccine misinformation is a complex challenge, made more difficult by the lack of common definitions about what constitutes misinformation, and the reality that guidance from scientific and health experts has evolved and will continue to evolve throughout the pandemic. That’s why we’re continuing to work with external experts and governments to make sure that we are approaching these issues in the right way and making adjustments if necessary. In the meantime, we will continue doing our part to show people reliable information about COVID-19 vaccines from health experts and help people get vaccinated.