Google 'Project Owl' Is The Search Engine's Way to Fight Fake News And Hate Speech


The search engine giant Google has announced some major changes to its algorithms. Announced on April 25th, 2017, the changes are meant to fight fake news and hate speech in its search results.

The update comes with the codename 'Project Owl'. It gives consumers two new ways to report what they see as problems in Google's search results. The company is also using teams of human evaluators, not just algorithms, in its effort to get its search results to show more reliable information.

Google crawls and indexes the web on a daily basis, or whenever it deems necessary. Since the web contains both good and bad content, Google's search engine also picks up wrong information to show to its users. As a result, the search engine has come under fire for results that surface offensive, fake, or even outrageous information about various topics.

For instance, Google prominently displayed links to pro-Nazi sites in response to queries about the Holocaust. And when people started typing phrases about women or racial groups, Google would suggest queries like "Why are women so dumb?"

Results like these certainly don't appeal to most people, and Google has attracted a lot of negative media attention.

Although Google prefers not to interfere in how its algorithms work, the problem has become big enough that the company needs to tackle it head-on.

Google - auto-complete

While Project Owl works mostly behind the scenes and away from users, the biggest visible change is to Google's auto-complete function.

Google's auto-complete is the search engine's suggestion feature, which offers to complete a query, whether it is misspelled or incomplete. With Project Owl, Google is for the first time allowing users to weigh in on how its algorithms work by telling Google when they see something wrong or objectionable.
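To make the idea concrete, here is a minimal, purely illustrative sketch of how an auto-complete system might suggest completions for a typed prefix while filtering out predictions users have flagged. The data, function names, and filtering logic are invented for illustration; Google's actual system is far more complex and not public.

```python
# Toy auto-complete: suggest completions for a typed prefix and
# filter out predictions that users have reported as inappropriate.
# This is a hypothetical sketch, not Google's actual implementation.

def autocomplete(prefix, predictions, reported):
    """Return predictions matching the prefix, minus reported ones."""
    prefix = prefix.lower().strip()
    return [p for p in predictions
            if p.lower().startswith(prefix) and p not in reported]

predictions = [
    "why is the sky blue",
    "why is the ocean salty",
    "why is the sky blue at noon",
]
reported = {"why is the sky blue at noon"}  # user-flagged prediction

print(autocomplete("why is the sky", predictions, reported))
```

In this toy version a reported prediction simply disappears from the suggestion list; in reality, Google reviews reports before changing what is shown.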

An example can be seen below. The tool comes with a small link that says "Report inappropriate predictions":

Google - auto-complete

If users click the link, a box pops up that lets them tell Google directly what is wrong with the prediction. The second reporting option covers the information boxes Google displays on its results page in response to many queries.

There are also more ways to tell Google about misleading information. For example, if a Google Knowledge Graph box appears in response to a question like "why is the sky blue," users can now report misleading information in that box as well.

Google - snippet

Rolling out to all users gradually, the feedback options let Google start giving more weight to what it calls "more authoritative" information. Previously, Google's results could include false information that stayed on top simply because it was popular. Now Google is relying more on authoritative sites, giving them a chance to appear ahead of ones that are popular but misleading.
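The shift described above, from ranking by popularity alone to favoring authoritative sources, can be sketched as a simple weighted scoring function. The scores, weights, and field names below are entirely invented for illustration and are not Google's actual ranking signals.

```python
# Hypothetical sketch of authority-weighted ranking: blend an
# authority score with a popularity score so that authoritative
# pages can outrank popular but misleading ones.
# All values here are made up for illustration.

def rank(results, authority_weight=0.7):
    """Sort results by a weighted mix of authority and popularity."""
    def score(r):
        return (authority_weight * r["authority"]
                + (1 - authority_weight) * r["popularity"])
    return sorted(results, key=score, reverse=True)

results = [
    {"url": "viral-hoax.example", "popularity": 0.9, "authority": 0.1},
    {"url": "encyclopedia.example", "popularity": 0.5, "authority": 0.9},
]

print([r["url"] for r in rank(results)])
```

With a high authority weight, the less popular but more authoritative page comes out on top, which is the behavior the article attributes to the update.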