Exaggerated And Misleading Health Claims Are Not Welcome On Facebook

People have been using the internet, and social media in particular, to find information about healthcare and remedies.

While some of that information is legitimate and genuinely helpful, much of it consists of plainly false claims. This is why Facebook, the web's largest social network, said it has made changes to its algorithm to reduce the visibility of posts that contain misleading health information or exaggerated, sensational health claims.

According to Facebook Product Manager Travis Yeh:

“We know that people don’t like posts that are sensational or spammy, and misleading health content is particularly bad for our community.”

“Pages should avoid posts about health that exaggerate or mislead people and posts that try to sell products using health-related claims.”

As the largest social network, Facebook is one of the places where bogus health claims spread most widely.

Some people share these kinds of posts to make money by deceiving others. And even when a remedy genuinely helps one person, that doesn't mean it is safe or appropriate for everyone else.

The social media company said that the updated rules apply to posts promoting things like miracle cures, magic weight loss pills and anti-vaccine arguments “based on unsound health claims.”

Since banning the posts outright may not be the best way to curb the issue, Facebook is instead changing the way they are ranked in its News Feed. By ranking them far lower, fewer people will see them.
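For illustration only, here is a minimal Python sketch of what demotion-based ranking looks like in general terms. The post structure, scoring function and penalty factor are all hypothetical assumptions, not Facebook's actual system:

```python
from dataclasses import dataclass

# Hypothetical post record: base_score stands in for whatever relevance
# signal a feed would normally rank by; flagged marks a post that a
# classifier considers a sensational health claim.
@dataclass
class Post:
    title: str
    base_score: float
    flagged: bool = False

# Assumed penalty factor; the real demotion strength is not public.
DEMOTION_FACTOR = 0.1

def rank_feed(posts):
    """Sort posts by score, demoting (not removing) flagged posts."""
    def score(post):
        return post.base_score * (DEMOTION_FACTOR if post.flagged else 1.0)
    return sorted(posts, key=score, reverse=True)

feed = [
    Post("Local news update", 0.6),
    Post("Miracle cure melts fat overnight!", 0.9, flagged=True),
    Post("Community event photos", 0.5),
]

for post in rank_feed(feed):
    print(post.title)
# The flagged post still appears in the feed, just far lower than it
# otherwise would, which is the point of demotion over removal.
```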

This is similar to the way Facebook limited the influence of clickbait posts in the past.


To make this happen, Facebook's algorithms have been trained to identify phrases commonly used in these types of posts and predict which ones may include sensational health claims.
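As a rough sketch of the idea, phrase-based detection can be as simple as checking a post's text against a curated list of suspect phrases. The phrase list and matching logic below are invented for illustration; Facebook has not published the phrases or the classifier it actually uses:

```python
# Hypothetical list of phrases associated with sensational health claims.
SUSPECT_PHRASES = [
    "miracle cure",
    "magic weight loss",
    "doctors hate this",
    "vaccines cause",
]

def looks_like_health_claim(text: str) -> bool:
    """Return True if the post text contains any of the suspect phrases."""
    lowered = text.lower()
    return any(phrase in lowered for phrase in SUSPECT_PHRASES)

print(looks_like_health_claim("This miracle cure melts fat overnight!"))  # True
print(looks_like_health_claim("Community bake sale this weekend"))        # False
```

A post flagged this way would then be demoted in ranking rather than removed, as described above.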

Since the change targets only specific posts containing specific keywords, Facebook assures users that the update won't have any major impact on their News Feeds.

The move is another step forward by a company under growing pressure to rid its platforms of fake news and misinformation. Health claims were highlighted in part because of concerns raised in media reports.

This isn’t the first time Facebook has tried to improve the quality of its News Feed by limiting health-claim posts.

Along with platforms like YouTube and Pinterest, Facebook has implemented updates to its algorithms to limit the spread of anti-vaccination content, which often proliferates online and can contribute to the kind of vaccine skepticism and hesitancy that has allowed some diseases to resurface.

Back in March 2019, Facebook announced that it’s working with groups like the World Health Organization and the Centers for Disease Control and Prevention to identify incorrect vaccine claims so that it can “take action” against those who spread them on its platform.

Published: 06/07/2019