Social Media Is 'Giving People A Voice,' And 'Basically Giving People The Ability To Share What They Want'

Mark Zuckerberg
CEO and founder of Meta Platforms, Inc.

As the internet becomes increasingly crowded, social media has become home to countless different opinions and thoughts, aggregated into feeds that compete to be seen.

Meta Platforms Inc. is the company behind Facebook, Instagram, Threads, WhatsApp, Meta Quest, Ray-Ban Meta smart glasses, Orion augmented reality glasses, and other digital platforms, devices, and services. This makes it one of the most powerful and influential entities the internet has ever seen.

And Mark Zuckerberg, its founder and CEO, is steering the tech titan's decisions about how it delivers information.

This time, in a move that has sparked widespread controversy, Meta announced the termination of its third-party fact-checking program in the United States. This decision comes after a long history of debate surrounding the role of social media platforms in curbing the spread of misinformation.

Mark Zuckerberg in a podcast with Joe Rogan.

In a podcast conversation that lasted nearly three hours, Mark Zuckerberg explained to Joe Rogan why he's pivoting his social media platforms back to, as he puts it, their roots of free expression.

"You only start one of these companies if you believe in giving people a voice."

"The whole point of social media is basically giving people the ability to share what they want. It goes back to our original mission to give people the power to share and make the world more open and connected."

Meta first introduced its fact-checking program in 2016, when the company was still called "Facebook."

The primary reason for introducing the program was to counter the spread of misinformation and false information on its platforms.

Most notably, the company launched the program at a critical time, around the 2016 U.S. presidential election, when online discourse in the U.S. was at its most chaotic and polarized.

At the time, Meta was facing significant criticism and public scrutiny over the spread of fake news and misinformation on its platform, amid widespread concerns that false information was influencing public opinion and the democratic process.

To counter accusations that Facebook was being weaponized to spread fake news during the U.S. presidential election, Meta introduced its third-party fact-checking program, partnering with reputable media organizations to verify the accuracy of content shared on its platforms.

This eventually led to the formation of the Oversight Board, more moderation, and other levers that helped users control what content they saw and alert Meta when they believed content was toxic or misleading.

The thing is, things didn't always work, and the policies have not sat well with everyone. Some critics argue that the policies are not strong enough, others believe they lead to too many mistakes, and still others say the controls are too politically biased.

Zuckerberg said that fact-checking not only makes misinformation, disinformation, and malinformation less visible; it also buries facts.

Meta summarized this in a post on its website:

"Meta’s platforms are built to be places where people can express themselves freely. That can be messy. On platforms where billions of people can have a voice, all the good, bad and ugly is on display. But that’s free expression."

In other words, Zuckerberg's argument is that the company's fact-checking program has not been effective in addressing misinformation and has stifled free speech.

Instead, Meta wants to replace the program with a system of "Community Notes," which is essentially a crowd-sourced moderation tool that allows users to provide context and flag misleading content.

The feature, which relies on users' willingness to correct what is wrong, is similar to the approach used by Elon Musk's social media platform X.

Using Community Notes, Zuckerberg hopes that Meta's platforms can be less politically biased and less excessive in their censorship.

Meta believes this approach will reduce bias and promote free expression while still addressing misinformation.

With accountability shifting with the political tides, especially after Donald Trump became the U.S. president-elect, set to replace Joe Biden, Community Notes, which relies on user-generated content to flag and provide context for potentially misleading posts, allows the company to take a more hands-off approach.

The decision to abandon fact-checking has been met with strong criticism from various sources, including fact-checking organizations and journalists.

Political figures, including President Joe Biden, called the move "shameful."

Concerns have been raised about the potential for a significant increase in the spread of false information, particularly during elections and other critical periods. Critics also argue that community notes may not be sufficient to address complex or nuanced misinformation campaigns.

Some have also expressed that the move could lead to a rise in misinformation and propaganda, especially in countries with low levels of digital literacy.


In other words, critics argue that relying on users to fact-check content may not be effective in combating false information. After all, Meta is much, much larger than X, and an order of magnitude more influential than any other social platform out there.

Meta's decision raises questions about the future of fact-checking on social media platforms.

As the digital landscape continues to evolve, the debate over the role of social media platforms in addressing misinformation is likely to intensify.

Meta's decision to abandon its fact-checking program marks a significant shift in the company's approach to misinformation. The consequences of this decision remain to be seen, but it is clear that the fight against misinformation on social media is far from over.

Then, there is the fact that Mark Zuckerberg is the CEO of a for-profit company, who needs to keep his company relevant, even as it is bombarded by the likes of TikTok and the rise of Large Language Models, which occupy more and more of people's time.

Zuckerberg certainly knows that fake news spreads a lot faster than facts. And if Meta ever lowers its filters, even a slight increase in fake news swirling on its platforms could translate to more user time spent scrolling, clicking, commenting, and viewing ads. And more engagement equals more profit.