'Evergreen' Chromium-Powered Googlebot Allows Google To Better Crawl Modern Websites

When it comes to the web, there is a lot to be seen and learned.

Even Google, which probably has the most complete database of the web's information, is still learning. As the most popular search engine, Google crawls the web link by link, jumping from one page to another to gather all the information it needs.

Websites, however, especially newer ones, can be built with a variety of technologies. For Google's web crawlers, which need to read and fetch content, some web technologies and web design/development techniques can prevent them from seeing everything.

As a result, some information may not be fetched, or may be missed entirely.
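To illustrate, here is a minimal, hypothetical sketch of a page that renders its main content entirely on the client side; the endpoint and element names are made up. A crawler whose rendering engine lacks the JavaScript features the page relies on would execute nothing and see an empty page:

```javascript
// Hypothetical page whose main content only exists after JavaScript runs.
// A crawler whose rendering engine lacks support for the features used here
// (ES2017 async/await, the Fetch API) would fail to run this script and
// index an empty <div> instead of the article text.
async function renderArticle() {
  const response = await fetch('/api/article/42'); // endpoint name is made up
  const article = await response.json();
  document.querySelector('#content').textContent = article.body;
}

document.addEventListener('DOMContentLoaded', renderArticle);
```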

For this reason, at its 2019 Google I/O developer conference, the company said that it is launching an upgraded version of its web crawler, Googlebot, which is "evergreen".

What this means is that the crawler will always run the latest version of Chromium, the open-source browser project that Google's popular Chrome web browser is built on.

"Moving forward, Googlebot will regularly update its rendering engine to ensure support for latest web platform features," wrote Martin Splitt, from Google Webmasters Trends Analyst team, on a blog post.

The goal is to make Googlebot more capable of crawling the ever-changing web.

More and more websites are being created and updated, and with their numerous features and functionalities, their pages behave more like web applications than the websites of five to ten years ago. Using the upgraded Googlebot, Google can make sure the crawler handles most modern websites with fewer issues.

For example, the crawler can now use thousands of features that a modern browser supports, including ES6 and newer JavaScript features, IntersectionObserver for lazy loading, and Web Components v1 APIs.
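As a rough sketch of one of those features, this is what the common IntersectionObserver lazy-loading pattern looks like (the data-src attribute and selector are popular conventions, not part of any standard); a crawler without this API would never fire the callback, and the real image URLs would never load:

```javascript
// A minimal sketch of lazy loading images with IntersectionObserver.
// Each image starts with a placeholder and keeps its real URL in data-src.
const observer = new IntersectionObserver((entries, obs) => {
  for (const entry of entries) {
    if (entry.isIntersecting) {
      const img = entry.target;
      img.src = img.dataset.src; // swap in the real image URL
      obs.unobserve(img);        // each image only needs to load once
    }
  }
});

// Observe every image that defers its real URL in a data-src attribute.
document.querySelectorAll('img[data-src]').forEach((img) => observer.observe(img));
```

With the evergreen Googlebot rendering pages on a recent Chromium, images deferred this way can be discovered without a special fallback for the crawler.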

With this update, Google should essentially be able to access and consume more of the web's content without requiring special workarounds.

As a result, webmasters and web owners no longer need to spend as much time building workarounds just so Google can index their content. Instead, they can focus on creating the quality content Google needs to rank them higher.

But still, Googlebot can't see everything.

According to Google, "there are still some limitations, so check our troubleshooter for JavaScript-related issues and the video series on JavaScript SEO."

Google has listed some Search-related JavaScript issues that webmasters and web owners should be aware of.

In the JavaScript SEO video series on YouTube, Martin Splitt explains how JavaScript influences SEO, and how webmasters and web owners can optimize their JavaScript-powered websites to be friendlier to Google.

What matters is that, with this update, websites can live their lives to the fullest and better reach their goals and purposes.

And that is by creating helpful content and services for their visitors and audience, and by spending less time on technical details and Search-related issues that should be Google's headaches in the first place.

But still, webmasters and web owners should never forget how to build search engine-friendly websites that Google loves crawling and indexing.

Published: 10/05/2019