In its long-running effort to be remembered as something other than the world’s largest misinformation megaphone, Facebook has tried a variety of strategies, from spinning misleading PR narratives to making actual UI changes.
It announced a new strategy today: not only will posts containing misinformation be made less visible, but so will the individual users who share them.
For several years, the social media giant has worked on fact-checking partnerships to disincentivize the spread of viral misinformation, labelling offending posts rather than removing them.
In some cases, it has taken small steps toward reducing the visibility of content found to be false or polarizing, such as discontinuing recommendations for political groups during the 2020 election.
Individual users, on the other hand, have been free to post whatever they wanted with no repercussions. That is no longer the case.
“Beginning today, we will reduce the distribution of all posts in News Feed from an individual’s Facebook account if they repeatedly share content rated by one of our fact-checking partners,” the company said in a press release.
While clearly false posts are already demoted in News Feed rankings, users who regularly share misinformation will now see all of their content, not just the offending posts, pushed further down the feed.
It remains to be seen what tangible impact the expanded enforcement will have. Individual Facebook users were previously exempt from this kind of scrutiny, while Instagram users were not, and vaccine misinformation has nonetheless spread on the photo-sharing app. As I’ve previously argued, no matter how sophisticated its systems are, Facebook is simply too large to monitor.