
Why the evergreen Googlebot is such a big deal [Video]


The evergreen Googlebot[1] was a huge leap forward in Google's ability to crawl and render content. Prior to this update, Googlebot was based on Chrome 41 (released in 2015) so that the search engine could index pages that would still work for users on older versions of Chrome. The drawback, however, was that sites built with modern browser features might not have been supported. This discrepancy created more work for site owners who wanted to take advantage of modern frameworks while still maintaining compatibility with Google's web crawler.

Always up-to-date. “Now, whenever there is an update, it pretty much automatically updates to the latest stable version, rather than us having to work years on actually making one version jump,” said Martin Splitt, search developer advocate at Google, during our crawling and indexing session of Live with Search Engine Land[2]. Splitt was part of the team that worked on making Googlebot “evergreen,” meaning that the crawler will always be up-to-date with the latest version of Chromium; he also unveiled it at the company’s I/O developer conference in 2019.

Twice the work. Before the advent of the evergreen Googlebot, one common workaround was to use modern frameworks to build a site for users, but to serve alternate code for Googlebot. This was achieved by identifying Googlebot’s user agent, which included “41” to represent the version of Chrome it was using.

This compromise meant that site owners had to create and maintain an alternate version of their content specifically for Googlebot, which was laborious and time-consuming.
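The workaround described above can be sketched as a simple user-agent check. This is an illustrative example, not code from the article: the function name and the abbreviated user-agent string are hypothetical, but the pre-evergreen Googlebot's user agent did include "Chrome/41", which is the token sites keyed on.

```javascript
// Hypothetical sketch of the pre-evergreen workaround: detect Googlebot
// by the "Chrome/41" token in its user agent, then serve pre-rendered
// HTML to the crawler while regular users get the modern JavaScript app.
function isLegacyGooglebot(userAgent) {
  // The old Googlebot identified itself as both "Googlebot" and Chrome 41.
  return /Googlebot/.test(userAgent) && /Chrome\/41\./.test(userAgent);
}

// Abbreviated example of the old user-agent string (for illustration only):
const oldGooglebotUA =
  "Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 " +
  "(compatible; Googlebot/2.1; +http://www.google.com/bot.html)";

const modernBrowserUA =
  "Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko) Chrome/90.0.4430.93 Safari/537.36";

console.log(isLegacyGooglebot(oldGooglebotUA));   // true  -> serve pre-rendered HTML
console.log(isLegacyGooglebot(modernBrowserUA)); // false -> serve the normal app
```

The fragility of this approach is exactly the risk discussed next: any logic hard-coded against "Chrome/41" silently stops matching once the user-agent string changes.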

Googlebot’s user agent, revisited. Part of the difficulty in updating Googlebot’s user agent to reflect the latest version of Chromium was that some sites were using the above-mentioned technique to identify the web crawler. An updated user agent could have resulted in a situation where a site owner who wasn’t aware of the change served no code to Googlebot at all, meaning their site would not get crawled, and subsequently not indexed or ranked.

To prevent disruption of its services, Google communicated the user agent change in advance and worked with technology providers to ensure that sites would still get crawled as usual. “When we actually flipped . . . pretty much no fires broke out,” Splitt said.

Why we care. The evergreen Googlebot can access more of your content without the need for workarounds. That also means fewer indexing issues for sites running modern JavaScript[3]. This enables site owners and SEOs to spend more of their time creating content instead of splitting their attention between supporting users and an outdated version of Chrome.


About The Author

George Nguyen is an Associate Editor at Third Door Media. His background is in content marketing, journalism, and storytelling.


Search Engine Land is the leading industry source for daily, must-read news and in-depth analysis about search engine technology.