The News Feed is Facebook's biggest source of both revenue and controversy. Because of that, the social giant is making yet another change to its algorithms.
In an announcement, the company said that it's leveraging more signals, such as user surveys, to get more context about the posts people really want to see and who they want to see them from.
"Today, we are announcing two ranking updates based on surveys we’ve conducted: one prioritizes the friends someone might want to hear from most and the other prioritizes the links a person might consider most worthwhile," explained Ramya Sethuraman, Product Manager, Jordi Vallmitjana, Product Manager, and Jon Levin, Technical Program Manager, in a Facebook Newsroom post.
The move marks another evolution of Facebook, which is shifting from a public platform where users can connect and share just about everything to a more closed-off personal space that values privacy.
And with Facebook focusing on meaningful interactions between people, the social media company can kill two birds with one stone, as it can also distance itself from the string of issues that have plagued its platform in the past, such as fake news, hate speech, extremist content and more.
All of these have eroded the public's trust, and Facebook wants to change that.
Facebook CEO Mark Zuckerberg previously said that his company’s direction would shift from being a social network where people broadcast information to large groups of people - a town square - to a service that functions as a digital equivalent of the living room, where people communicate with smaller trusted groups.
By taking criteria like surveys into account in its News Feed algorithms, Facebook is considering users' opinions about what content it should show.
Usually, the social network uses information gathered from users, like how often they interact with a given friend, how many mutual friends they have and whether they mark someone as a close friend, to predict the best content to show.
This time, in addition to understanding these signals, "we’ve begun surveying people on Facebook to ask them to list the friends they are closest to."
"We look at the patterns that emerge from the results, some of which include being tagged in the same photos, continuously reacting and commenting on the same posts and checking-in at the same places — and then use these patterns to inform our algorithm."
"This direct feedback helps us better predict which friends people may want to hear from most," explained Facebook.
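Conceptually, the signals described above can be combined into a single per-friend affinity score. The following Python sketch illustrates the idea; the signal names, weights, and values are invented for illustration only, and Facebook has not published its actual formula.

```python
# Hypothetical illustration: these signal names and weights are NOT
# Facebook's real values; they are assumptions for demonstration.

def friend_affinity(signals, weights=None):
    """Combine per-friend interaction signals into one ranking score."""
    if weights is None:
        weights = {
            "interactions": 0.4,        # reacting/commenting on posts
            "mutual_friends": 0.2,
            "shared_photo_tags": 0.2,   # tagged in the same photos
            "shared_checkins": 0.1,     # checking in at the same places
            "marked_close_friend": 0.1, # explicit close-friend flag
        }
    return sum(weights[k] * signals.get(k, 0.0) for k in weights)

# Two friends with made-up signal values normalized to [0, 1]
alice = {"interactions": 0.9, "mutual_friends": 0.5, "shared_photo_tags": 0.8}
bob = {"interactions": 0.2, "mutual_friends": 0.7}

# Friends with higher affinity would be prioritized in the feed
ranked = sorted([("alice", friend_affinity(alice)),
                 ("bob", friend_affinity(bob))],
                key=lambda pair: pair[1], reverse=True)
```

Survey results like the ones Facebook describes would, in a scheme like this, inform how the weights are tuned rather than replace the behavioral signals themselves.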
The move follows a previous effort, in which Facebook started to deprioritize unoriginal content, misinformation, and other forms of ‘borderline content‘ that bring down “the overall quality of discourse” on its platform.
Facebook's algorithms rely on filtering methods to serve recommendations tailored to users' preferences. Incorporating user surveys can be difficult for the company, whose systems mostly work by extrapolating similarities between users' tastes in order to make new predictions.
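Extrapolating from similarities between users' tastes is the core idea of collaborative filtering. A minimal user-based sketch follows; the engagement matrix, user IDs, and post IDs are entirely made up, and this is not Facebook's implementation.

```python
import math

# Toy user-post engagement matrix (1.0 = engaged, 0.0 = ignored).
# All values are fabricated for illustration.
ratings = {
    "u1": {"p1": 1.0, "p2": 1.0, "p3": 0.0},
    "u2": {"p1": 1.0, "p2": 1.0, "p3": 1.0},
    "u3": {"p1": 0.0, "p2": 0.0, "p3": 1.0},
}

def cosine(a, b):
    """Cosine similarity between two sparse engagement vectors."""
    items = set(a) | set(b)
    dot = sum(a.get(i, 0.0) * b.get(i, 0.0) for i in items)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def predict(user, item):
    """Predict a user's interest in a post from similar users' engagement."""
    num = den = 0.0
    for other, prefs in ratings.items():
        if other == user or item not in prefs:
            continue
        sim = cosine(ratings[user], prefs)
        num += sim * prefs[item]
        den += abs(sim)
    return num / den if den else 0.0
```

The difficulty the article alludes to is that survey answers are explicit, sparse feedback, while a system like this is built around dense implicit behavioral data, so the two sources have to be reconciled.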
With more than 2 billion active users on its platform, Facebook has become a critical tool for publishers and users alike.
But the tech giant’s massive influence over culture, the internet and politics means a profit-at-all-costs agenda risks hurting user engagement. So, by favoring posts from close friends and other content users might like, the changes are intended to encourage users to interact more with the platform.
According to Facebook: