Facebook Corporate Blog: “Today, we’re launching new ways to inform people if they’re interacting with content that’s been rated by a fact-checker as well as taking stronger action against people who repeatedly share misinformation on Facebook. Whether it’s false or misleading content about COVID-19 and vaccines, climate change, elections or other topics, we’re making sure fewer people see misinformation on our apps. Since launching our fact-checking program in late 2016, our focus has been on reducing viral misinformation. We’ve taken stronger action against Pages, Groups, Instagram accounts and domains sharing misinformation and now, we’re expanding some of these efforts to include penalties for individual Facebook accounts too.”
“Starting today, we will reduce the distribution of all posts in News Feed from an individual’s Facebook account if they repeatedly share content that has been rated by one of our fact-checking partners. We already reduce a single post’s reach in News Feed if it has been debunked. We currently notify people when they share content that a fact-checker later rates, and now we’ve redesigned these notifications to make it easier to understand when this happens. The notification includes the fact-checker’s article debunking the claim as well as a prompt to share the article with their followers. It also includes a notice that people who repeatedly share false information may have their posts moved lower in News Feed so other people are less likely to see them.”
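The blog post thus describes a two-tier demotion: a per-post reduction when a single post is debunked, plus an account-wide reduction once an account repeatedly shares rated content. One way to picture this is as ranking multipliers applied at scoring time. The sketch below is purely illustrative: the strike threshold, the demotion factors, and all names (`record_fact_check`, `effective_score`, and so on) are assumptions for exposition, not Facebook’s disclosed implementation.

```python
from dataclasses import dataclass

# Hypothetical values -- the blog post does not disclose thresholds or factors.
REPEAT_SHARER_STRIKES = 3      # strikes before account-level demotion kicks in
ACCOUNT_DEMOTION_FACTOR = 0.5  # multiplier applied to every post by the account
POST_DEMOTION_FACTOR = 0.2     # stronger multiplier for the debunked post itself

@dataclass
class Account:
    user_id: str
    fact_check_strikes: int = 0  # posts rated false by fact-checking partners

@dataclass
class Post:
    author: Account
    rank_score: float            # base News Feed ranking score
    rated_false: bool = False    # set when a fact-checking partner rates the post

def record_fact_check(post: Post) -> None:
    """A fact-checking partner rated this post: flag it and add a strike."""
    post.rated_false = True
    post.author.fact_check_strikes += 1

def effective_score(post: Post) -> float:
    """Apply post-level demotion, then account-level demotion for repeat sharers."""
    score = post.rank_score
    if post.rated_false:
        score *= POST_DEMOTION_FACTOR
    if post.author.fact_check_strikes >= REPEAT_SHARER_STRIKES:
        # Account-wide penalty: hits *all* of the account's posts,
        # not only the ones that were individually rated.
        score *= ACCOUNT_DEMOTION_FACTOR
    return score
```

The key distinction the announcement draws is captured in the second branch: once the strike threshold is crossed, the multiplier lowers the distribution of every post from the account, matching the stated policy of reducing “the distribution of all posts in News Feed from an individual’s Facebook account,” rather than only the individual debunked posts.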