Facebook has announced that it will push down all posts from users who repeatedly share misinformation and fake content across its platforms, expanding its fact-checking penalties from Pages, Groups, Instagram accounts, and domains to individual accounts.
The social network said in a statement late on Wednesday that the new rule applies to false or misleading content about Covid-19 and vaccines, climate change, elections, and other topics, so that fewer people see misinformation across its family of apps.
“Starting today, we will reduce the distribution of all posts in News Feed from an individual’s Facebook account if they repeatedly share content that has been rated by one of our fact-checking partners. We already reduce a single post’s reach in News Feed if it has been debunked,” the company said.
The company currently notifies people when they share content that a fact-checker later rates, and it has now redesigned these notifications to make it easier to understand when this happens.
The notification includes the fact-checker’s article debunking the claim, as well as a prompt to share that article with their followers.
“It also includes a notice that people who repeatedly share false information may have their posts moved lower in News Feed so other people are less likely to see them,” the social network said.
The company launched its fact-checking program in late 2016.
“We’ve taken stronger action against Pages, Groups, Instagram accounts and domains sharing misinformation and now, we’re expanding some of these efforts to include penalties for individual Facebook accounts too,” the company noted.