Is Your Website Accidentally Telling Google To Delete You From Their Search Results? It Happened To Craigslist; It Might Be Happening To You.

Is your website telling Google to ignore you? It might be if you make changes to your website without consulting a Technical SEO expert. You see, beyond the wonderful world of optimizing content and attracting links, there is a hidden layer called software. That's right, very complex code is what makes search engines and your website function, and sometimes they don't play nicely together without you even realizing something is wrong.

Craigslist made a simple, innocent change to their website, one that 9 out of 10 web developers would never suspect could affect their search rankings. In fact, 9 out of 10 SEOs wouldn't have realized it either. But it had massive ramifications: this innocent change caused Google to delete Craigslist from its search results every midnight. This wasn't a bug in Craigslist or in Google. It was simply a software engineer not realizing how Google interprets a site's instructions to search bots, or Google trying to solve a problem without anticipating this kind of site change.

SearchTempest, a Craigslist search engine, was the first to discover this. From its post, it seems SearchTempest is powered by Google Custom Search, and because Google was deleting Craigslist every night, its Craigslist tool stopped functioning properly. SearchTempest switched to Bing's tool and wrote a blog post about it. Matt Cutts, the head of Google's webspam team, investigated and uncovered the problem.

In simple terms, Google didn't want to show old, stale posts from Craigslist and other classified websites in its search results, so it built rules that a website can use to tell Google which posts are fresh and which are stale. Apparently, Craigslist changed how their website caches posts without considering how Google's rules would interpret that change, and as a result Craigslist's website was actually instructing Google to remove its content every day at midnight.

If you can handle the technical details, read on; if not, skip to the bottom.
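Before the full explanation, here is the core distinction in a nutshell: Google's "unavailable_after" rule is an instruction to remove a page from search results after a given date, while the standard "Expires" HTTP header is only a caching hint. The sketch below is purely illustrative, with made-up dates and a made-up listing; it is not Craigslist's actual code.

```python
# Illustration only -- the dates are invented and Craigslist's real
# implementation is not public.
#
# Signal 1: Google's "unavailable_after" extension (a removal instruction).
# It can be sent as a robots meta tag or as an X-Robots-Tag HTTP header and
# tells Google: "after this date, drop the page from main search results."
#
# Signal 2: the standard "Expires" HTTP header (a caching hint).
# It tells browsers and proxies when a cached copy goes stale; by itself it
# says nothing about removing the page from a search index.

from datetime import datetime, timedelta, timezone
from email.utils import format_datetime  # RFC 822-style date formatting

listing_closes = datetime.now(timezone.utc) + timedelta(days=30)

# Removal instruction for Googlebot (meta-tag form):
robots_meta = (
    '<meta name="googlebot" '
    f'content="unavailable_after: {format_datetime(listing_closes)}">'
)

# The same instruction as an HTTP header:
x_robots_tag = f"X-Robots-Tag: unavailable_after: {format_datetime(listing_closes)}"

# Ordinary cache-freshness header -- NOT a removal instruction:
expires_header = f"Expires: {format_datetime(listing_closes, usegmt=True)}"

print(robots_meta)
print(x_robots_tag)
print(expires_header)
```

Notice how similar the two signals look: both are just a date attached to a page. That resemblance is exactly why, as Matt Cutts explains below, Google once started treating one like the other.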

Here is Matt Cutts' explanation: “

Hi Nathan, my name is Matt Cutts and I’m an engineer in the search quality group at Google. Thanks for asking about this; it helped the indexing team uncover an issue in how we’re indexing Craigslist, and we’re in the process of fixing it right now.

To understand what happened, you need to know about the “Expires” HTTP header and Google’s “unavailable_after” extension to the Robots Exclusion Protocol. As you can see at https://googleblog.blogspot.com/2007/07/robots-exclusion-protocol-now-with-even.html, Google’s “unavailable_after” lets a website say “after date X, remove this page from Google’s main web search results.” In contrast, the “Expires” HTTP header relates to caching, and gives the date when a page is considered stale.

A few years ago, users were complaining that Google was returning pages from Craigslist that were defunct or where the offer had expired a long time ago. And at the time, Craigslist was using the “Expires” HTTP header as if it were “unavailable_after”–that is, the Expires header was describing when the listing on Craigslist was obsolete and shouldn’t be shown to users. We ended up writing an algorithm for sites that appeared to be using the Expires header (instead of “unavailable_after”) to try to list when content was defunct and shouldn’t be shown anymore.

You might be able to see where this is going. Not too long ago, Craigslist changed how they generated the “Expires” HTTP header. It looks like they moved to the traditional interpretation of Expires for caching, and our indexing system didn’t notice. We’re in the process of fixing this, and I expect it to be fixed pretty quickly. The indexing team has already corrected this, so now it’s just a matter of re-crawling Craigslist over the next few days.

So we were trying to go the extra mile to help users not see defunct pages, but that caused an issue when Craigslist changed how they used the “Expires” HTTP header. It sounded like you preferred Google’s Custom Search API over Bing’s so it should be safe to switch back to Google if you want. Thanks again for pointing this out.”
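If you want to see which of these signals your own pages send right now, checking the response headers is enough. Here is a minimal sketch using only Python's standard library; the URL is a placeholder, and this is my own illustration rather than anything from Google, Craigslist, or SearchTempest.

```python
import urllib.request
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime


def check_search_signals(url: str) -> None:
    """Print the caching/removal signals a URL sends in its response headers."""
    with urllib.request.urlopen(url) as response:
        headers = response.headers

    robots = headers.get("X-Robots-Tag")
    expires = headers.get("Expires")

    if robots and "unavailable_after" in robots.lower():
        print(f"Removal instruction found in X-Robots-Tag: {robots}")
    elif robots:
        print(f"X-Robots-Tag present (no unavailable_after): {robots}")
    else:
        print("No X-Robots-Tag header sent.")

    if expires:
        try:
            expires_at = parsedate_to_datetime(expires)
            if expires_at < datetime.now(timezone.utc):
                print(f"Expires date is already in the past: {expires}")
            else:
                print(f"Expires (caching hint only): {expires}")
        except (TypeError, ValueError):
            print(f"Could not parse Expires header: {expires}")
    else:
        print("No Expires header sent.")


if __name__ == "__main__":
    # Placeholder URL; swap in one of your own pages.
    check_search_signals("https://example.com/")
```

If the Expires date is already in the past, or an unavailable_after directive shows up where you never intended one, that is exactly the kind of accidental "please remove me" signal this whole episode was about.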


The Lesson Here is That You Shouldn’t Make Changes To Your Website Without Consulting a Technical SEO Expert Who Understands How Google’s Bots Will Interact With Your New Website Code. 

The simple reality is that even if you think SEO is dead and you only want to focus on creating killer content, there are many technical concerns you need to be aware of, or else you might accidentally tell Google to DELETE your website from its search results.

It amazes me how many people build websites or make big changes to their websites without considering the search engine and SEO ramifications. SEO is not just a tactic you employ to get traffic. It is a holistic layer that should be integrated into, and considered with, every single thing you do online, and for that matter offline as well.

About David Melamed

David Melamed is the Founder of Tenfold Traffic, a search and content marketing agency with over $50,000,000 of paid search experience and battle-tested results in content development, premium content promotion and distribution, Link Profile Analysis, Multinational/Multilingual PPC and SEO, and Direct Response Copywriting.

