Weekly SEO Recap: AJAX crawling and Panda

After this week’s launch of our first Basic SEO training, it’s now time to sit back and relax. For about two minutes. The next milestone is our upcoming Yoast SEO release, which, because of the sheer amount of new features in it, we’ve decided to call Yoast SEO 3.0. Luckily, the search engines took it reasonably slow on us this week, so this recap was easy to write. As I was writing it, it turned out I was giving mostly advice on dealing with Google Panda, but that can never be a bad thing, I guess.


Google discontinues AJAX crawling

Several months after hinting at it, Google has stopped encouraging people to use its AJAX crawling guidelines from 2009. I have to say, Google is handling this better than it normally treats its APIs. Sites built using the methods from 2009, including the use of so-called _escaped_fragment_ URLs, will still be indexed, but Google recommends that when you change (parts of) your site, you remove the need for these.
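For context, the deprecated 2009 scheme told crawlers to rewrite "hash-bang" URLs (`#!`) into query URLs carrying the `_escaped_fragment_` parameter, which the server was expected to answer with a rendered snapshot. A minimal sketch of that rewrite (the helper name is mine, and the percent-encoding details are simplified):

```python
from urllib.parse import quote

def to_escaped_fragment(url: str) -> str:
    """Rewrite a #! URL into the crawlable _escaped_fragment_ form."""
    if "#!" not in url:
        return url  # no hash-bang fragment, nothing to rewrite
    base, fragment = url.split("#!", 1)
    sep = "&" if "?" in base else "?"  # append to an existing query string if present
    return f"{base}{sep}_escaped_fragment_={quote(fragment, safe='=&')}"

print(to_escaped_fragment("https://example.com/page#!key=value"))
# https://example.com/page?_escaped_fragment_=key=value
```

Google’s advice now is simply to stop producing URLs that need this rewrite in the first place.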

Google has a history of dropping support for APIs at far too short notice for normal companies to comply, so this is actually a rather good gesture. What annoys me is that they don’t announce a specific cut-off date. All they say is:

If your current setup is working fine, you should not have to immediately change anything. If you’re building a new site or restructuring an already existing site, simply avoid introducing _escaped_fragment_ urls.

Some corporate websites out there have been built using this AJAX crawling system and won’t go away or be changed for years to come. A specific cut-off date would be very useful for sites like these, even (or especially) when that date is two or three years in the future.

Google Panda: delete content the right way

It was as if the devil was playing with it. On October 6th, I posted about Google Panda and how we suggest sites deal with it. In that post I included advice to delete low-quality pages. Two days later, Gary Illyes of Google tweeted the following in response to Jennifer Slegg of The SEM Post:

Of course, people were confused since I’d just said something “else”. He later went on to clarify, after several SEOs pressed him. That’s when it became clear why he said what he said:

“We see too many people cut the good”. That’s why he says don’t delete. Don’t throw the baby out with the bathwater. If someone still might find content useful, don’t delete it. But that’s very similar to what I said.

So how do you fix your Google Panda problems?

I don’t want to reiterate what I said in this post about Google Panda, which you should read if you’re into this, but let’s make sure it’s well understood. Fixing your site’s Google Panda issues is about several things:

  1. Improve content that’s not good enough but might be useful.
  2. Remove content that will never be useful or that you can’t properly maintain.
  3. Improve your site structure so that content that might be useful can be (more easily) found.
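The three steps above amount to a content audit. As a rough illustration, here is a triage sketch; the function name, signals, and thresholds are all my own assumptions for the example, not rules from Google or from our Panda post:

```python
def panda_triage(word_count: int, monthly_pageviews: int, maintainable: bool) -> str:
    """Rough per-page audit decision: remove, improve, or surface the page."""
    if monthly_pageviews == 0 and not maintainable:
        return "remove"           # step 2: never useful, or you can't maintain it
    if word_count < 300:
        return "improve"          # step 1: thin, but might still be useful
    return "link internally"     # step 3: good content that needs to be found

print(panda_triage(word_count=120, monthly_pageviews=40, maintainable=True))
# improve
```

In practice you’d feed this from your analytics and CMS data, and you’d pick signals that fit your site, but the point stands: every page gets one of these three treatments.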

It’s not one of those three things, it’s all of them. I’ve seen plenty of sites full of auto-generated content: auto-generated tag pages, internal search result pages, etc. Those are pages that no one is ever going to find interesting. You can’t keep those pages, you really should delete them.
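When you do delete such pages, answering with a `410 Gone` status tells crawlers the removal is deliberate and permanent. A minimal sketch (a hypothetical WSGI app with made-up URL prefixes, purely for illustration):

```python
# URL prefixes of removed auto-generated sections (assumed for this example).
REMOVED_PREFIXES = ("/tag/auto-", "/search/")

def app(environ, start_response):
    """Tiny WSGI app: 410 Gone for deleted auto-generated pages, 200 otherwise."""
    path = environ.get("PATH_INFO", "/")
    if path.startswith(REMOVED_PREFIXES):
        start_response("410 Gone", [("Content-Type", "text/plain")])
        return [b"This page was removed permanently."]
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"OK"]
```

A `404` works too, but `410` is the clearer signal that the page isn’t coming back.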

That’s it, see you next week!
