
Fake news is all over the web, and there is no single solution to end it.
Despite the many measures the social giant Facebook has taken to prevent the spread of fake news, from algorithms to human curators, the company still hasn't been able to eliminate it. Among its strategies, Facebook also assigns what it calls "trust values" to users.
By rating users' trustworthiness, Facebook aims to determine whether a given news story might be fake and whether a user's reports can be trusted.
According to a Facebook spokesperson, the trust rank can be assigned to any Facebook user as a decimal score between zero and one.
The trust rank, however, applies only to users who have opted to flag a post as "false news." Once a user flags something, Facebook generates a score reflecting that user's record of reporting content as potentially fake. The metric thus judges a person based on his or her history of reports.
When a person consistently flags a news source as fake even though Facebook itself doesn't judge the source to be untrustworthy, Facebook judges that person to be untrustworthy.
On the other hand, if a large number of "trustworthy" users flag a post, Facebook may push that post higher in the queue for review by fact-checkers.
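The mechanism described above can be sketched as a simple trust-weighted queue. This is an illustrative assumption only: Facebook keeps its real criteria secret, and the function names, scores, and ordering logic here are hypothetical.

```python
# Illustrative sketch only; Facebook's actual scoring criteria are secret.
# Each flagger has a hypothetical trust score in [0.0, 1.0], as described above.

def review_priority(flagger_trust_scores):
    """Sum the trust scores of the users who flagged a post.

    A post flagged by many trustworthy users accumulates a higher
    priority; a pile of flags from unreliable reporters counts for little.
    """
    return sum(flagger_trust_scores)

def queue_for_fact_checkers(posts):
    """Order post ids so the most credibly flagged come first.

    `posts` maps a post id to the trust scores of the users who flagged it.
    """
    return sorted(posts, key=lambda pid: review_priority(posts[pid]), reverse=True)

posts = {
    "story-a": [0.9, 0.8, 0.7],  # flagged by trustworthy users
    "story-b": [0.1, 0.2],       # flagged by users with poor report histories
}
print(queue_for_fact_checkers(posts))  # "story-a" reaches fact-checkers first
```

The design point is that flags are weighted rather than counted: two credible reporters can outrank a brigade of bad-faith ones.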
The company uses several signals to identify which people on the site are more trustworthy than others, and has insisted on keeping the criteria of this trust rank secret, in case people try to game the system.
This trust ranking was a previously unreported rating system. At the F8 conference, CEO Mark Zuckerberg told a group of reporters that the company was working on a ranking system in which users ranked as untrustworthy would be less visible in News Feeds.
Tessa Lyons, the Facebook product manager in charge of fighting misinformation, said that Facebook, like others in the tech industry, has long relied on its users to report problematic content. But as Facebook gave people more reporting options, it saw more users falsely reporting items as untrue.
It’s "not uncommon for people to tell us something is false simply because they disagree with the premise of a story or they’re intentionally trying to target a particular publisher," said Lyons.
Users' trustworthiness here isn't meant to be an absolute indicator of a person's credibility. Facebook stressed that the trust ranking is only one of its many strategies for fighting fake news and false reporting. The score is one measurement among thousands of behavioral clues that Facebook takes into account as it seeks to understand risk.
It seems like Facebook has grown too big, even for itself to handle.