
Common oversights that can impede Google from crawling your content [Video]


“I don’t know why people are reinventing the wheel,” said Martin Splitt, search developer advocate for Google, during our crawling and indexing session of Live with Search Engine Land[1]. As more and more techniques are developed to provide SEOs and webmasters with flexible solutions to existing problems, Splitt worries that relying on these workarounds, instead of sticking to the basics, can end up hurting a site’s organic visibility.

“We have a working mechanism to do links . . . so, why are we trying to recreate something worse than what we already have built-in?” Splitt said, expressing frustration over how some developers and SEOs are diverging from the standard HTML link in favor of fancier solutions, such as using buttons as links and forsaking the href attribute for onclick handlers. These techniques can confuse web crawlers, increasing the likelihood that your links get skipped entirely.
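To illustrate the anti-pattern Splitt describes, here is a minimal sketch (the markup and paths are invented for this example, not taken from the session) comparing a standard crawlable link with its JavaScript-only substitutes:

```html
<!-- Crawlable: the crawler reads the href and discovers /products -->
<a href="/products">Our products</a>

<!-- Fragile: no href, so navigation exists only inside the script -->
<span onclick="location.href='/products'">Our products</span>

<!-- Also fragile: a button is not a link element, and a crawler
     has no reliable way to know the handler performs navigation -->
<button onclick="window.location='/products'">Our products</button>
```

The first form is the “built-in” mechanism Splitt refers to; the other two may work for human visitors yet leave crawlers with no URL to follow.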

Another common issue arises when SEOs and developers use the robots.txt file to block search engines from the resources their pages need, while still expecting their JavaScript to fetch that content for the web crawler. “When you block us from loading that, we don’t see any of your content, so your website, as far as we know, is blank,” Splitt said, adding, “And, I wouldn’t know why, as a search engine, would I keep a blank website in my index.”
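A hypothetical robots.txt sketch of the scenario Splitt warns about (the `/api/` path is invented for this example): if a page renders its content client-side from a blocked endpoint, Googlebot honors the disallow and sees an effectively blank page.

```text
# Hypothetical robots.txt: pages on this site fetch their content
# as JSON from /api/ via JavaScript after the initial page load.
User-agent: *
Disallow: /api/

# Googlebot obeys the disallow, never fetches /api/..., renders the
# page without its content, and may drop the "blank" page from the index.
```

The fix in this sketch is simply not to disallow resources that rendering depends on, so the crawler can load the same content a browser would.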

Why we care. “Oftentimes, people are facing a relatively simple problem and then over-engineer a solution that seems to work, but then actually fails in certain cases and these cases usually involve crawlers,” Splitt said. When simple, widely accepted techniques already exist, site owners should opt for those solutions to ensure that their pages can get crawled and subsequently indexed and ranked. The more complex a workaround is, the higher the chances are that the technique will lead to unforeseen problems down the road.


About The Author

George Nguyen is an Associate Editor at Third Door Media. His background is in content marketing, journalism, and storytelling.


Search Engine Land is the leading industry source for daily, must-read news and in-depth analysis about search engine technology.