It has been reported that Facebook assigns a trustworthiness score to some members to help it manage misinformation problems, such as members repeatedly flagging/reporting stories as fake simply because they disagree with the content.
It is not publicly known exactly how the score is calculated, but the Washington Post has recently reported that Facebook’s ‘Misinformation Team’ will be making use of the metric, part of a system that has taken a year to develop.
It is understood that the system, which Facebook denies amounts to a reputation score, is part of an initiative announced two years ago to find ways of dealing with fake news and fighting misinformation.
These include both making dubious/fake content appear lower in users’ news feeds, and stopping people from indiscriminately flagging news as fake in order to control and influence news and opinions.
Repeat Flaggers In The Spotlight
The scoring system will have a focus on stopping some Facebook members from simply flagging/reporting stories they don’t agree with.
Some commentators have speculated that this part of the scoring system works by correlating any false news reports with the decisions of independent fact-checkers, and by giving higher scores (and presumably higher news feed positions) to a user who makes a single complaint that is substantiated, than to a user who makes lots of complaints, only some of which are substantiated.
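The heuristic the commentators describe could be sketched roughly as follows. This is a purely hypothetical illustration, not Facebook’s actual (undisclosed) algorithm: a user’s score reflects the proportion of their flags that independent fact-checkers later substantiated, so one substantiated flag outranks many flags of which only some check out.

```python
# Hypothetical sketch of a flag-trustworthiness heuristic.
# Nothing here reflects Facebook's real, undisclosed scoring system.

def trust_score(substantiated: int, total_flags: int) -> float:
    """Laplace-smoothed substantiation rate, in the range (0, 1)."""
    if total_flags < 0 or substantiated > total_flags:
        raise ValueError("invalid flag counts")
    # Smoothing keeps a single lucky flag from scoring a perfect 1.0
    # and gives users with no history a neutral 0.5 starting point.
    return (substantiated + 1) / (total_flags + 2)

# A user whose single complaint was substantiated scores higher than
# a serial flagger with only 5 of 40 complaints substantiated.
careful_user = trust_score(substantiated=1, total_flags=1)    # ≈ 0.67
serial_flagger = trust_score(substantiated=5, total_flags=40)  # ≈ 0.14
```

Under this sketch, a higher score would presumably translate into a user’s reports carrying more weight, and their feed being less affected by unsubstantiated mass-flagging campaigns.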
Not The First Time
Facebook is not the first platform to use such scoring systems for its members. For example, Uber scores customers based on the ratings drivers give them, Twitter has reportedly used a reputation score to help recommend which members to follow, and a pilot scheme in China allocates a ‘social credit’ score to citizens based on their online behaviour.
The Facebook scoring system has been criticised by some, who point out that Facebook’s own trustworthiness is unregulated, that the scoring system is automated and not transparent, and that it could amount to another way of Facebook using people’s data in ways they may not expect or want (bearing in mind the Facebook / Cambridge Analytica scandal).
What Does This Mean For Your Business?
We are used to the idea that decisions affecting businesses are made by algorithms and automatic scoring systems, e.g. search engine rankings. If the new Facebook scoring system works as it should and for the purpose that Facebook has stated, then it may contribute to better management of misinformation, which can only benefit the economy and businesses.
Unfortunately, whether Facebook can be trusted to use our data responsibly behind the scenes is a sore subject at the moment, and it could be said that mistrust of Facebook and its motives with this move is both expected and healthy. Since the Cambridge Analytica revelations, and findings that Facebook was used to distribute dubious, politically influential posts of Russian origin in the run-up to the US election, Facebook has to at least be seen to be doing more to manage misinformation on its platform.
Unfortunately for Facebook, the scoring system is unlikely to appeal to President Trump, who has warned that it is dangerous for tech/social media companies such as Facebook to regulate themselves. Some commentators have suggested that this concern is partly based on a fear that conservative voices may be silenced by such measures.