Google's Panda Algorithm to Lower the Rank of Low Quality Websites

Google Panda update

Google introduced a new algorithm change for its search engine ranking. The change aims to lower the rank of "low-quality sites" and return higher-quality sites near the top of the search results.

Google's Panda has received several updates since the original rollout in February 2011, and the effect went global in April 2011.

The change produced a surge in the rankings of news websites and social networking sites, and a drop in rankings for sites containing large amounts of advertising. It affected the rankings of almost 12 percent of all search results.

Google Panda is just as serious a matter for online marketers as it was when it was first released. Panda is a change to Google's search results algorithm whose goal is to lower the rank of low-quality sites with thin content (e.g. "content farms").

Google Panda was built through an algorithm update that used artificial intelligence in a more sophisticated and scalable way than was previously possible. Panda looks for similarities between websites that people rated as high quality and those rated as low quality.

By scanning previously indexed websites, the Panda update evaluates how relevant a piece of information is, how it ranks in the results page, and how it relates to similar information on other sites.

With the new algorithm, older ranking factors, such as PageRank, have been downgraded in importance.

Recovering from the Panda

Since the Panda update was introduced, Google's blog has been full of complaints. Almost immediately after the initial launch, there were significant ramifications for online marketers that continue to hold.

To help affected publishers, Google published an advisory on its blog giving some direction for self-evaluating a website's quality. It provided a list of 23 bullet points answering the question "What counts as a high-quality site?", intended to help webmasters "step into Google's mindset".

Google says it only takes a few poor-quality or duplicate-content pages to hold down traffic on an otherwise solid site. Google recommends removing those pages, blocking them from being indexed by Google, or rewriting them. However, the head of webspam at Google, Matt Cutts, warns that rewriting duplicate content so that it is original may not be enough to recover from Panda. The rewrites must be of sufficiently high quality.
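For the "blocking them from being indexed" option, one common mechanism (shown here as a sketch; the page URL is a placeholder) is the robots meta tag, which asks search engines not to add the page to their index:

```
<!-- Placed in the <head> of each thin or duplicate page -->
<meta name="robots" content="noindex">
```

A robots.txt Disallow rule is sometimes used instead, but it only blocks crawling, not indexing, so a noindex tag on the page itself is generally the more reliable way to keep a poor-quality page out of the index.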

High-quality content brings "additional value" to the web. Content that is general, non-specific, and not substantially different from what is already out there should not be expected to rank well: "Those other sites are not bringing additional value. While they’re not duplicates they bring nothing new to the table."

Avoiding the Panda Updates

To avoid an unnecessary plunge in the search engine results pages, site owners should make their websites useful, both to their visitors and to the search engine. Below are some tips:

  • Create authoritative and engaging content.
  • Have natural links.
  • Keep the website free of errors.
  • Have simple sitemaps.
  • Keep content relevant to the page that delivers it.
  • Maintain a reasonable ratio of advertisements to content, and of text to images.
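The "simple sitemaps" tip above can be illustrated with a minimal XML sitemap following the sitemaps.org protocol. This is a sketch with placeholder URLs and dates, not a complete sitemap:

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per canonical page; keep each file under 50,000 URLs -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2011-04-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/articles/quality-content</loc>
  </url>
</urlset>
```

Listing only canonical, substantive pages in the sitemap keeps thin or duplicate pages from being handed to the crawler in the first place.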

The Panda update also aims to curb black-hat SEO practices and other violations. The update helps newly created websites with useful content compete better on the World Wide Web.