A brief history of Google’s algorithm updates

These days, the way we do SEO is somewhat different from how things were done ca. 10 years ago. There’s one important reason for that: search engines have been continuously improving their algorithms to give searchers the best possible results. Over the last decade, Google, as the leading search engine, introduced several major updates, and each of them has had a major impact on best practices for SEO. Here’s a — by no means exhaustive — list of Google’s important algorithm updates so far, as well as some of their implications for search and SEO.

2011 – Panda

Obviously, Google was around long before 2011. We’re starting with the Panda update because it was the first major update in the ‘modern SEO’ era. Google’s Panda update tried to deal with websites that were purely created to rank in the search engines, and mostly focused on on-page factors. In other words, it determined whether a website genuinely offered information about the search term visitors used. 

Two types of sites were hit especially hard by the Panda update:

  1. Affiliate sites (sites which mainly exist to link to other pages).
  2. Sites with very thin content.

Google periodically re-ran the Panda algorithm after its first release, and included it in the core algorithm in 2016. The Panda update has permanently affected how we do SEO, as site owners could no longer get away with building a site full of low-quality pages.

2012 – Venice

Venice was a noteworthy update, as it showed that Google understood that searchers are sometimes looking for results that are local to them. After Venice, Google’s search results included pages based on the location you set, or your IP address.

2012 – Penguin

Google’s Penguin update looked at the links websites got from other sites. It analyzed whether backlinks to a site were genuine, or if they’d been bought to trick the search engines. In the past, lots of people paid for links as a shortcut to boosting their rankings. Google’s Penguin update tried to discourage buying, exchanging or otherwise artificially creating links. If it found artificial links, Google assigned a negative value to the site concerned, rather than the positive link value it would have previously received. The Penguin update ran several times after it first appeared, and Google added it to the core algorithm in 2016.

As you can imagine, websites with a lot of artificial links were hit hard by this update. They disappeared from the search results, as the low-quality links suddenly had a negative, rather than positive impact on their rankings. Penguin has permanently changed link building: it no longer suffices to get low-effort, paid backlinks. Instead, you have to work on building a successful link building strategy to get relevant links from valued sources.

2012 – Pirate

The Pirate update was introduced to combat the illegal spreading of copyrighted content. For the first time, Google counted (a high number of) DMCA (Digital Millennium Copyright Act) takedown requests against a website as a negative ranking factor.

2013 – Hummingbird

The Hummingbird update saw Google lay the groundwork for voice search, which was (and still is) becoming more and more important as more devices (Google Home, Alexa) use it. Hummingbird pays more attention to each word in a query, ensuring the whole search phrase is taken into account, rather than just particular words. Why? To understand a user’s query better and be able to give them the answer, instead of just a list of results.

The impact of the Hummingbird update wasn’t immediately clear, as it wasn’t directly intended to punish bad practice. In the end, it mostly enforced the view that SEO copy should be readable, use natural language, and shouldn’t be over-optimized for the same few words, but use synonyms instead. 

2014 – Pigeon

Another bird-related Google update followed in 2014 with Google Pigeon, which focused on local SEO. The Pigeon update affected both the results pages and Google Maps. It led to more accurate localization, giving preference to results near the user’s location. It also aimed to make local results more relevant and higher quality, taking organic ranking factors into account. 

2014 – HTTPS/SSL

To underline the importance of security, Google decided to give a small ranking boost to sites that correctly implemented HTTPS to make the connection between website and user secure. At the time, HTTPS was introduced as a lightweight ranking signal. But Google had already hinted at the possibility of making encryption more important, once webmasters had had the time to implement it. 

2015 – Mobile Update

This update was dubbed ‘​Mobilegeddon​’ by the SEO industry as it was thought that it would totally shake up the search results. By 2015 more than 50% of Google’s search queries were already coming from mobile devices, which probably led to this update. The Mobile Update gave mobile-friendly sites a ranking advantage in Google’s mobile search results. In spite of its dramatic nickname, the mobile update didn’t instantly mess up most people’s rankings. Nevertheless, it was an important shift that heralded the ever-increasing importance of mobile.

2015 – RankBrain

RankBrain is a state-of-the-art Google algorithm that employs machine learning to handle queries. It can make guesses about words it doesn’t know, finding words with similar meanings and offering relevant results. To keep improving, the RankBrain algorithm analyzes past searches and determines which results were best.

Its release marks another big step for Google to better decipher the meaning behind searches, and serve the best-matching results. In March 2016, Google revealed that RankBrain was one of the three most important of its ranking signals. Unlike other ranking factors, you can’t really optimize for RankBrain in the traditional sense, other than by writing quality content. Nevertheless, its impact on the results pages is undeniable.

2016 – Possum 

In September 2016 it was time for another local update. The ​Possum update​ applied several changes to Google’s local ranking filter to further improve local search. After Possum, local results became more varied, depending more on the physical location of the searcher and the phrasing of the query. Some businesses which had not been doing well in organic search found it easier to rank locally after this update. This indicated that this update made local search more independent of the organic results.

Read more: Near me searches: Is that a Possum near me? »

2018 – (Mobile) Speed Update

Acknowledging users’ need for fast delivery of information, Google implemented this update that made page speed a ranking factor for mobile searches, as was already the case for desktop searches. The update mostly affected sites with a particularly slow mobile version.

2018 – Medic

This broad core algorithm update caused quite a stir for those affected, leading to some shifts in ranking. While a relatively high number of medical sites were hit with lower rankings, the update wasn’t solely aimed at them and it’s unclear what its exact purpose was. It may have been an attempt to better match results to searchers’ intent, or perhaps it aimed to protect users’ wellbeing from (what Google decided was) disreputable information.

Keep reading: Google’s Medic update »

2019 – BERT

Google’s BERT update was announced as the “biggest change of the last five years”, one that would “impact one in ten searches.” It’s a machine learning algorithm, a neural network-based technique for natural language processing (NLP). The name BERT is short for: Bidirectional Encoder Representations from Transformers.

BERT can figure out the full context of a word by looking at the words that come before and after it. In other words, it uses the context and relations of all the words in a sentence, rather than one-by-one in order. This means: a big improvement in interpreting a search query and the intent behind it.

Read on: Google BERT: A better understanding of complex queries »

Expectations for future Google updates

As you can see, Google has become increasingly advanced since the early 2010s. Its major updates early in the decade focused on battling spammy results and sites trying to cheat the system. But as time progressed, updates increasingly focused on giving desktop, mobile and local searchers exactly what they’re looking for. While the algorithm was advanced to begin with, the additions over the years, including machine learning and NLP, make it absolutely state of the art.

With the recent focus on intent, it seems likely that Google Search will continue to perfect its interpretation of search queries and style the results pages accordingly. That seems to be their current focus as they work towards their mission “to organize the world’s information and make it universally accessible and useful.” But whatever direction it takes, being the best result and working on having an excellent site will always be the way to go!

Keep on reading: Should I follow every change Google makes? »

Feeling a bit overwhelmed by all the different names and years? Don’t worry! We made a handy infographic that shows when each Google update happened and briefly describes what the purpose was.

Google's algorithm updates 2011-2020

The post A brief history of Google’s algorithm updates appeared first on Yoast.

Weekly SEO recap: it’s not Panda, not Penguin, it’s… Brands!

This week we had quite a bit of news. But… I’ve written about it all already. So I’m gonna be a lazy guy and point you straight at the three posts I wrote this week on Google changes:


The week started with us not knowing much yet, but it’s good to read this to get an idea of what everybody thought had happened:

Google update: Real time Penguin? Or something else?

Then we realized it wasn’t really Penguin, so we moved on. We got the news that Google had made Google Panda a part of its core algorithm:

Google Panda part of Google’s core algorithm

And finally, we figured out what had changed in Google and what this update was really about: brand terms. Read this post for the full view:

Google core algorithm update: brand terms

In all honesty, this is what we’d call, in Dutch “een storm in een glas water”, which translates as: “a storm in a glass of water”, basically: much ado about nothing.

That’s it, see you next week!


Google core algorithm update: brand terms

Over the weekend we saw an incredibly big shuffle in Google search results. I wrote about it earlier this week, as we were researching what happened. I’ll be honest: we’re still researching. But let me update you on what we know and don’t know about this Google core algorithm update.


What we know

We know a few things about this update now. Despite all the promises about a Google Penguin update early this year, this is not it. It’s also not Google Panda. But there’s news about Google Panda anyway, which I’ve written a separate post on:

Read more: ‘Google Panda part of Google’s core algorithm’ »

How do we know this is not Penguin or Panda? Googler Gary Illyes said so. New Google star Zineb Ait tweeted that this update didn’t have a name, but was “just one of the frequent quality updates” (in French).

What did this Google core algorithm update change?

So… What changed? We don’t know. The changes are… Weird. We’ve been using a couple of datasets to look at this update, but most of all we’re looking at SearchMetrics. They publish a list of winners and losers every week, and this week the changes seem to have happened mostly for news sites, specifically for brand terms. For instance, check this list of keywords that the biggest loser in the US, The Atlantic, lost its positions for:

Keywords the Atlantic lost traffic for

Almost all of these are brand terms.

Bartosz has written a good post (with tons of interesting screenshots if you don’t have access to SearchMetrics), that touches on some of the things I had seen too. He calls it a “brand bidding update”, which I don’t think is right. I do agree with him that the change was in the type of results that Google shows for brand queries. The switch seems to have been from news articles to more “timeless” URLs.

Slugs and/or site structure?

You won’t believe this, and it’s a correlation only (so don’t say I’ve said this is true), but I’m seeing a high correlation between the keyword(s) being the first word(s) of the slug (the URL) and the ranking. It can’t be that simple though. It’s very possible it has to do with a better site structure for the winners versus the losers. Some of the biggest winners are category URLs on sites that have good optimization for their categories and good internal linking, like Mashable. So… This might be a good time to revisit your Category SEO tactics:

Read more: ‘Using category and tag pages for your site’s SEO’ »

Visibility impacted, but traffic?

SearchMetrics (and many similar tools) calculate a score based on the amount of traffic for a specific term and the position you’re ranking on. The idea is that if you rank, for instance, #3 for a term, you’ll receive a certain proportion of the traffic for that term. This is a very valuable way of looking at a site as a site’s visibility score usually has a high correlation to a site’s traffic.
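The scoring idea described above can be sketched in a few lines of code. Note that the click-through rates per position used here are hypothetical round numbers for illustration, not SearchMetrics’ actual model:

```python
# Sketch of a SearchMetrics-style visibility score. The CTR-per-position
# values below are hypothetical, not SearchMetrics' actual numbers.
CTR_BY_POSITION = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def visibility_score(rankings):
    """rankings: list of (monthly_search_volume, position) for a site's keywords."""
    return sum(volume * CTR_BY_POSITION.get(position, 0.01)
               for volume, position in rankings)

# A site ranking #1 for a 10,000/month term and #3 for a 5,000/month term:
print(visibility_score([(10000, 1), (5000, 3)]))  # 3500.0
```

The navigational-query problem discussed below is exactly the weak spot of a model like this: it assumes position #3 always captures its share of clicks, which is false when nearly everyone wants the #1 navigational result.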

The problem with this visibility score is when searches are mostly navigational. For instance, we rank in the top 7 for [google analytics], but we get close to 0 traffic for that term. The reason is that 99.9% of people searching for [google analytics] actually want to go to Google Analytics.

This means that the actual changes in terms of traffic for this update, even though the changes in visibility are huge, will differ highly per term and will, very often, be negligible. This is in my opinion something in the SearchMetrics visibility score that has to be changed, and something I’ve discussed with my good friend (and SearchMetrics founder and CTO) Marcus Tober before.


The impact of this Google core algorithm update on the search results and visibility was huge, but the impact on actual traffic might not be as big. There are definitely things we’ll need to figure out over the coming weeks and months though, like how important site structure and URLs are. Interesting times!


Google Panda part of Google’s core algorithm

Panda has become a part of Google’s core algorithm. This post discusses what that means for SEOs and webmasters. Amidst (or actually slightly before) all the kerfuffle about the changes to Google’s core algorithm this week, Jennifer Slegg posted a huge article about Google Panda that’s really worth a look. It was vetted by people on Google’s webmaster team, and really does contain a lot of nice info. Most of it wasn’t new to me, but the fact that Panda had become a part of Google’s core algorithm was.


Bit of history

Google Panda, when it first came out (and was still called the “Farmer update”), was an algorithm that was run every once in a while. “Once in a while” was a few times a year, and being hit by Panda would thus mean that you could be waiting months before you got “redeemed”.

This meant that we could point at a specific “update” of Panda. This, in turn, meant we could tell by the date on which a site’s traffic changed whether that change was related to Google Panda or not, and could study what had changed in how Panda perceived sites. This has now changed slightly.

“Panda is an algorithm that’s applied to sites overall and has become one of our core ranking signals.”

Not real time

The fact that Google Panda is now part of the core algorithm doesn’t mean that it updates in real time. It just means that it’s no longer a spam filter that’s applied after the core ranking algorithm has done its work. What it does mean is that we can’t say “this was Panda version X” anymore. We can still keep track of changes (like the one happening last weekend) and say “this was because of the early January 2016 changes”. We will have to figure out, over time, whether we can reliably tell if an update was in fact an update of the Panda part of the algorithm or not.

One of the questions I still have is whether they’ve changed how they run Panda in terms of timing too. I could understand it if they ran it bit by bit on sections of their index instead of running it over their entire index all at once. This would mean there would be no pinpointing of dates at all anymore. Whether that’s the case is still unclear.

Much ado about nothing

In all, for normal users, this doesn’t change all that much if anything at all. A lot of it is just “SEO Semantics”. You have to make a site that’s high quality, with a good site structure and a good user experience. Easy does it, right?

If you haven't read it yet, read: ‘Google core algorithm update: brand terms’ »

What happened to my rankings!?

One of the most asked questions in our site reviews is “What happened to my rankings!?”. Although we can use all kinds of great tools to analyze your website, this question remains one of the toughest to answer.

Google isn’t always clear about why some sites rank better than others. In this article, I’d like to illustrate how we investigate drops in rankings or traffic, using an example from our Platinum SEO reviews.


Analyzing the drop

Perhaps something happened to your website at a certain point that caused your rankings to drop or your traffic to plummet. Using Google Analytics, for instance, you should try to pinpoint the date of the change. The tool we usually use for that is Searchmetrics.com. Because it works from historically collected data (which makes it less useful for newer sites), Searchmetrics can tell you at what point things changed. Here’s a real life example:


That simply hurts. This website took a hit around May 18, perhaps a bit earlier already. Luckily, they were able to gain back a lot of their traffic, but things haven’t been quite like before.

What caused the drop

What we do in this case, is align the graphs in Searchmetrics with Google’s algorithm updates. Usually we turn to Moz, as their overview is accurate and has some convenient links to articles about that drop.

Panda 4.0 (#26) — May 19, 2014
Google confirmed a major Panda update that likely included both an algorithm update and a data refresh. Officially, about 7.5% of English-language queries were affected. While Matt Cutts said it began rolling out on 5/20, our data strongly suggests it started earlier.

Hm, that sounds about right. Right? There was a Payday Loan update on the 16th as well, so let’s not jump to conclusions. What could have triggered a Panda penalty in this case?
The website at hand is packed with news, has built up quite some authority over the years, and its link profile looks awesome (see image on the right). In MajesticSEO, which we use alongside Searchmetrics for deeper insights on backlinks, that deep purple color indicates loads of linking domains. One might argue about the trust flow of these sites, but a lot of them circle around 20-30 and that seems alright (there is always room for improvement, right?). So this shouldn’t have anything to do with that drop.

Panda is about quality. So we checked a number of things, like content quality, design quality, code quality. All seemed right. At that moment, we were tearing our hair out and drinking heavily, as we just could not figure out what happened.

Ghost banners

Just the other day, I was at Conversion Hotel and an obvious subject was ghost buttons. Don’t use these, they’re scary. The website we were analyzing might have had a penalty because of another bad spirit: ghost banners. Google is a machine. Google recognizes certain sized elements on your website that have a link and might perhaps be a banner. Think along the lines of a popular posts section, a list of product images or (even worse) links to related articles on other websites. If your website is packed with these, Google could quite easily mistake them for banners.

Does this mean changing something like a sidebar would help you get rid of that possible Panda penalty? That isn’t guaranteed. Too many factors we can’t influence might play a part in this. What we do in our Platinum SEO review, to the best of our knowledge and using our combined experience, is analyze your website and identify every possible issue it might have. The review leaves you with an extensive to-do list. If you follow up on that list, or have someone follow up on it, you will be able to serve your website in the best way possible to Google and other search engines.

There is no number one ranking guarantee. You have a part in that optimization, and Google has a part in picking up on changes. But you’ll be able to give it your very best go, using loads of know-how we provide and loads of market knowledge you already have.

Our Platinum SEO review gives you a complete overview of all the things you can do SEO wise to improve your website for search engines.

And it is available now (until December 11) for a mere $1,799 instead of the usual $2,499. So if you have experienced any strange drops in your rankings or traffic, we’d be happy to have a look!

Keyword density in a post-Panda world

The Yoast SEO plugin helps you to optimize your text for the keyword you want to be found for. In Yoast SEO 3.0 we made some big changes to our content analysis. In this post we’ll discuss the adaptation of our Yoast SEO keyword density check and the possibility to optimize for multiple keywords in Yoast SEO Premium.


Keyword stuffing is not a great SEO strategy. You’ll be hit by Google Panda (or another update) in no time. Optimizing your text for specific keywords, however, is something you definitely can do! This is the reason we have our focus keyword functionality in Yoast SEO. If you go too far though, over-optimization is around the corner. Over-optimization can be seriously dangerous, which is why our Yoast SEO plugin has some safeguards (in the form of red bullets) against doing so.

Keyword density check is much stricter now

Google prefers nice, readable texts. Your text should be well structured and attractively written. Texts with a high keyword density do not read nicely. They are, in fact, terrible to read!  Instead of using your focus keyword over and over, you should use synonyms if possible. Google actually recognizes synonyms to keywords now. With Google’s ability to recognize synonyms, optimizing for a single focus keyword becomes more and more silly.

We therefore decided to update the keyword density check in Yoast SEO. In the new Yoast SEO the keyword density for your post has to be between 0.5 and 2.5%. In the old analysis, you could get away with a keyword density of as high as 4.5%. We’ve come to the conclusion that that’s just too high in this post-Panda world!
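To make the calculation concrete: keyword density is simply the share of words in a text taken up by the focus keyword. Here’s a minimal sketch for a single-word keyword; the plugin’s actual tokenization is more sophisticated than this:

```python
def keyword_density(text, keyword):
    """Percentage of words in the text that match a single-word focus keyword.

    Simplified sketch: splits on whitespace and strips basic punctuation,
    unlike the plugin's real analysis.
    """
    words = text.lower().split()
    if not words:
        return 0.0
    matches = sum(1 for word in words if word.strip(".,!?:;") == keyword.lower())
    return 100.0 * matches / len(words)

text = "SEO tips: good SEO copy reads naturally and avoids keyword stuffing."
density = keyword_density(text, "seo")
print(round(density, 1))        # 18.2 (2 matches out of 11 words)
print(0.5 <= density <= 2.5)    # False: far above the new guideline
```

Run against our new 0.5–2.5% guideline, that example sentence would clearly trigger a red bullet.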

If you want to check your old posts and make sure their keyword densities are within our new guideline, you can do so. When you upgrade you’ll see (or have seen) a notice about recalculating SEO scores. This is one of the things we recalculate at that point. This does mean that a post that was green before can now suddenly turn red… If you can’t find that notice, you can find the tool under SEO → Tools.

Multiple keywords

In Yoast SEO Premium we have a new feature which enables you to optimize for more than one focus keyword. You could use this to optimize for two related keywords, allowing you to rank in Google on different keywords. You could also use it to optimize for two synonyms. Optimizing a post for two or three synonyms simultaneously while still requiring a 1% keyword density as a minimum would lead to over-optimization and thus angry Pandas. This was one of the reasons to lower our “required” keyword density to 0.5%. We are actually working on some new functionality now, allowing you to treat synonyms and multiple keywords differently in our Content SEO analysis. As that has multiple implications, it’ll take a while to get right.

Your SEO strategy should never focus on one single keyword. You really do need a proper keyword strategy. Sometimes it’s useful to try to make a single post or page rank for multiple (related) keywords. Perhaps you have a shop selling ballet accessories and are writing a post about ballet shoes. But you’d also like this post to rank for [dance shoes], as [dance shoes] is a more general (and common) search term. Our multiple keywords functionality is really well suited to help you optimize for more than one keyword like this. It also allows you to focus on multiple angles and words, reducing the risk that you over-optimize your texts.

Until the end of the year, Yoast SEO Premium, which has this multiple keywords functionality, costs only $69 per year for support, upgrades and updates.

How to properly delete a page from your site

Whenever you delete a page (or post) from your site, you also delete one or more URLs. That old URL, when visited, will usually return a ‘404 not found’ error, which is not the best thing for Google or your users. Is that what you really wanted to happen? You could redirect that deleted page to another page, or maybe – if you really want the content gone from your site – serving a 410 header would actually be a better idea. This post explains the choices you have and how to implement them.

Did you know Yoast SEO Premium has an awesome redirect manager that makes the redirection of deleted posts a breeze? Try it out!

Redirect or delete a page completely?

The first thing you have to work out is whether or not the content you deleted has an equivalent somewhere else on your site. Think of it this way: if I clicked on a link to the page you deleted, would there be another page on your site that gives me the information I was looking for? If that’s true for most of those following the link, you should redirect the deleted URL to the alternative page.

In general, I’d advise you to redirect a page even when only a handful of the visitors would benefit from it. The reasoning is simple: if the other option is for all your visitors to be sent to a “content not found” page, that’s not really a great alternative either…

Create a redirect

There are several types of redirects, but a 301 redirect is what’s called a permanent redirect, and this is what you should use when you redirect that deleted page URL to another URL. Using a 301 redirect means Google and other search engines will assign the link value of the old URL to the URL you redirected your visitors to.

Deleting content completely

If there really is no alternative page on your site with that information, you need to ask yourself whether it’s better to delete it or keep it and improve it instead. But if you’re absolutely sure you want to delete it, make sure you send the proper HTTP header: a ‘410 content deleted’ header.

404 and 410 HTTP headers

The difference between a 404 and a 410 header is simple: 404 means “content not found”, 410 means “content deleted” and is, therefore, more specific. If a URL returns a 410, Google knows for sure you removed the URL on purpose and it should, therefore, remove that URL from its index much sooner.
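To make the three status codes concrete, here’s a minimal sketch of a server that answers them with Python’s standard library. The URLs and redirect targets are made up for the example; your CMS or web server config would normally handle this for you:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical cleanup map: moved pages get a 301, deliberately removed ones a 410.
REDIRECTS = {"/old-post": "/new-post"}   # deleted page with an equivalent elsewhere
GONE = {"/retired-post"}                 # deleted page with no equivalent

class CleanupHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in REDIRECTS:
            self.send_response(301)                      # moved permanently
            self.send_header("Location", REDIRECTS[self.path])
        elif self.path in GONE:
            self.send_response(410)                      # gone: deleted on purpose
        else:
            self.send_response(404)                      # not found: unknown URL
        self.end_headers()

    def log_message(self, *args):                        # keep the example quiet
        pass

# To serve it: HTTPServer(("localhost", 8000), CleanupHandler).serve_forever()
```

The logic mirrors the advice in this post: redirect when an equivalent exists, send 410 when you removed the content on purpose, and let everything else fall through to 404.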

Our Yoast SEO Premium plugin for WordPress has a redirects module which lets you set 410 headers. The redirect manager is the perfect tool for working with redirects, automatically asking you what you want to do with a URL when you delete it or change the permalink. Of course, you can set any type of redirect.

The problem with serving 410 content deleted headers is that Google’s support for it is incomplete. Sure, it will delete pages that serve a 410 from its index faster, but Google Search Console will report 410s under “Not found” crawl errors, just like 404s. We’ve complained to Google about this several times but unfortunately, they have yet to fix it.

Collateral damage when deleting a page

When you delete one or more posts or pages from your site, there’s often collateral damage. Say you deleted all the posts on your site that have a specific tag. That tag now being empty, its archive’s URL will also give a 404. Even when you handle all the URLs of those posts you deleted properly (by redirecting or 410ing them) the tag archive will still give a 404, so you should make sure to deal with that URL too.

Even when you didn’t delete all the posts in a tag, the tag archive might now have 5 instead of 12 posts. If you display 10 posts per page in your archives, page 2 of that archive will now no longer exist, and thus give a 404 error. These aren’t the biggest problems in the world when you delete one or two posts, but if you’re dealing with a Google Panda problem and because of that deleting lots of poor content, creating a lot of 404s like this can take your site down even further, so proceed with care!
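The archive arithmetic above is easy to check for yourself: with 10 posts per page, a tag archive has ceil(posts / 10) pages, so any page number beyond that starts returning a 404. A quick sketch:

```python
import math

def archive_pages(post_count, per_page=10):
    """How many pages a paginated archive produces for this many posts."""
    return math.ceil(post_count / per_page)

# The example above: a tag archive drops from 12 posts to 5 after a cleanup.
before, after = archive_pages(12), archive_pages(5)
print(before, after)  # 2 1
# Every page number that existed before but not after now returns a 404:
print(list(range(after + 1, before + 1)))  # [2]
```

The same check, run across all your tag and category archives before a big Panda cleanup, tells you exactly which paginated URLs will need redirects afterwards.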

Read more: Which redirect should I use? »


Weekly SEO Recap: AJAX crawling and Panda

After this week’s launch of our first Basic SEO training, it’s now time to sit back and relax. For about two minutes. The next milestone is our upcoming Yoast SEO release which, because of the sheer amount of new features in it, will be called Yoast SEO 3.0. Luckily, the search engines took it reasonably slow on us this week, so this recap was easy to write. As I was writing it, it turned out I was giving more advice on dealing with Google Panda, but that can never be a bad thing I guess.


Google discontinues AJAX crawling

Several months after hinting at it, Google has stopped encouraging people to use its AJAX crawling guidelines from 2009. I have to say, this is handled better than Google normally treats its APIs. Sites built using the methods from 2009, including the use of so-called _escaped_fragment_ URLs, will still be indexed, but Google recommends that when you change (parts of) your site, you remove the need for these.

Google has a history of dropping support for APIs at far too short notice for normal companies to comply, so this is actually a rather good gesture. What annoys me is that they don’t announce a specific cut-off date. All they say is:

If your current setup is working fine, you should not have to immediately change anything. If you’re building a new site or restructuring an already existing site, simply avoid introducing _escaped_fragment_ urls.

Some corporate websites out there have been built using this AJAX crawling system and won’t go away or be changed for years to come. A specific cut-off date would be very useful for sites like these, even (or maybe even: especially) when that date is 2 or 3 years in the future.
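For those who never worked with the 2009 scheme: it mapped “hash-bang” URLs to a special query parameter that the crawler requested instead. A simplified sketch of that mapping (an illustration of the convention, not Google’s exact escaping rules):

```python
from urllib.parse import quote

def escaped_fragment_url(url):
    """Map a 2009-scheme '#!' URL to the form Google's crawler requested.

    Simplified: everything after '#!' moves into an _escaped_fragment_
    query parameter, percent-encoded.
    """
    if "#!" not in url:
        return url  # not a hash-bang URL; nothing to rewrite
    base, _, fragment = url.partition("#!")
    separator = "&" if "?" in base else "?"
    return base + separator + "_escaped_fragment_=" + quote(fragment, safe="")

print(escaped_fragment_url("https://example.com/app#!profile/42"))
# https://example.com/app?_escaped_fragment_=profile%2F42
```

Sites built on this scheme had to serve a pre-rendered HTML snapshot at the _escaped_fragment_ URL, which is exactly the machinery Google now recommends phasing out.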

Google Panda: delete content the right way

It was as if the devil was playing with it. On October 6th, I posted about Google Panda and how we suggest sites deal with it. In that post I included advice to delete low-quality pages. Two days later, Gary Illyes of Google tweeted the following in response to Jennifer Slegg of The SEM Post:

Of course, people were confused, since I’d just said something “else”. He later went on to clarify, after several SEOs pressed him. That’s when it became clear why he said what he said:

“We see too many people cut the good”. That’s why he says don’t delete. Don’t throw the baby out with the bathwater. If someone still might find content useful, don’t delete it. But that’s very similar to what I said.

So how do you fix your Google Panda problems?

I don’t want to reiterate what I said in this post about Google Panda, which you should read if you’re into this, but let’s make sure it’s well understood. Fixing your site’s Google Panda issues is about several things:

  1. Improve content that’s not good enough but might be useful.
  2. Remove content that will never be useful or that you can’t properly maintain.
  3. Improve your site structure so that content that might be useful can be (more easily) found.

It’s not one of those three things, it’s all of them. I’ve seen plenty of sites with auto-generated content, auto-generated tag pages, search pages, etc. Those are pages that no one is ever going to find interesting. You can’t keep those pages; you really should delete them.

That’s it, see you next week!


Google Panda & low quality pages

While reviewing websites, we quite often find sites that have already fallen victim to Google Panda’s algorithm, or that run the risk of getting “Pandalized”. A site runs the risk of being hit by Panda when it has a lot of low quality pages. In this post I want to give you some quick insights into what Google Panda is and how you can prevent your site from getting hit by it.

What is Google Panda?

A quick intro by yours truly on what Google Panda is:

How to fix low quality pages

All in all, Panda usually affects your site because low quality pages make up too large a proportion of your overall number of pages. So you need to fix your low quality pages for that ratio to become healthy again. You can fix problems with low quality pages in two ways:

  1. You improve the content and quality of these pages. This means adding more well-written content, but it often also means making sure you don’t have too many ads on the page and improving its UX.
  2. You remove these low quality pages, or block search engines’ access to them. This last method is often called a “Panda diet”.

Identifying which low quality pages to fix

Whether you decide to improve your content or to remove your low quality pages, you need to know where to start. The best way to identify pages that need fixing is to look at pages on your site that have very few visitors and/or a very high bounce rate. These are pages that aren’t ranking, or that aren’t doing anything for your site’s overall performance because people leave them too quickly.

There are a couple of tools that can help you identify which low quality pages need fixing. The one I like most for small to medium-sized sites is Screaming Frog. Once you’ve opened it, you can connect to a site’s Google Analytics data and then have Screaming Frog crawl the site, after which you can sort by bounce rate. There are also filters to show you pages with a bounce rate above 70% (which really is too high) and pages that have no GA data because nobody visited them…

It’s important to note that sometimes a bounce doesn’t mean something’s wrong. On our knowledge base, if a page has a high bounce rate, the reason usually is that the page solved the visitor’s problem, so people didn’t do anything else because they didn’t have to. So you really have to go through the pages you’ve identified with these methods and evaluate their individual quality.
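If you’d rather do this triage outside of a crawler, the same filtering can be sketched in a few lines of Python. This assumes a hypothetical analytics export with page, pageviews and bounce-rate columns; the thresholds are the ones discussed above, and the results are candidates to review, not pages to delete blindly:

```python
import csv
import io

# Hypothetical analytics export: page URL, pageviews, bounce rate (%).
SAMPLE = """page,pageviews,bounce_rate
/popular-guide/,5400,38
/old-tag-page/,3,92
/never-visited/,0,0
/kb-article/,120,81
"""

def low_quality_candidates(report, max_views=10, max_bounce=70):
    """Flag pages with almost no traffic or a very high bounce rate.

    These are only *candidates*: a high-bounce page may simply have
    answered the visitor's question, so review each one by hand."""
    flagged = []
    for row in csv.DictReader(report):
        views = int(row["pageviews"])
        bounce = float(row["bounce_rate"])
        if views <= max_views or bounce > max_bounce:
            flagged.append(row["page"])
    return flagged

print(low_quality_candidates(io.StringIO(SAMPLE)))
# ['/old-tag-page/', '/never-visited/', '/kb-article/']
```

Note how the knowledge-base-style page (`/kb-article/`) gets flagged purely on bounce rate; that’s exactly the kind of page you’d keep after a manual look.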

Improve or remove?

Once you’ve identified which pages on your site need fixing, you need to start thinking about whether you want to improve or remove them. Usually, you’ll end up doing a bit of both. Pages that target keywords that truly matter to you should simply be improved. If other pages are targeting keywords that aren’t interesting enough to your business, or aren’t targeting anything at all, get rid of them. You can choose to noindex them, thereby preventing Google from showing them in its search results, but I honestly think you’re usually better off deleting pages like that.
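For reference, noindexing a page comes down to a standard robots meta tag in the page’s head. A sketch (the `follow` value, which lets link value keep flowing through the page, is the usual choice):

```html
<!-- Keeps this page out of search results while still
     letting crawlers follow the links on it. -->
<meta name="robots" content="noindex, follow">
```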

Weekly SEO Recap: Google app indexing & rel=author

At Yoast, we’ve been ridiculously busy getting ready for all the product launches we’ve got coming up, like our new eBook and our upcoming Basic SEO training course. Luckily, we still have time to look at the news, because Google has been rather busy, and there was other news to cover too, regarding Apple’s new releases.

Google Panda & Penguin

So… Google Panda 4.2 is apparently “still rolling out”. This makes it harder for people like us to diagnose whether a site got hit by Panda, but luckily not impossible (the signs are usually relatively clear to the trained eye). You’d hope that more info would come with that announcement, but there was nothing else.

Google Penguin, on the other hand, seems to be truly becoming “real-time”. Gary Illyes of Google said at SMX that he “hoped” it would be ready by the end of the year.

I have to say that it’s getting harder and harder to trust Gary specifically when he says things like this, because it’s been kind of hit and miss. We’ll have to see what comes of it.

Google wants your app data

In other Google news, Google seems to understand that it’s slowly missing the boat. They now say that they’ll give you a ranking boost if you use app indexing. They’re afraid that if they don’t get everyone to adopt app indexing, which allows Google to index the contents of mobile apps, they won’t be a complete search engine anymore, and platforms like Facebook might beat them at some point.

The problem with remarks like this from Google is knowing whether they’re actually true. It’s very easy for them to say they’ll give you a ranking boost; it’s now up to the global SEO community to prove whether they did or not.

Rel=author making a comeback?

In what I’d clearly call the weirdest news of the week, Gary Illyes also said you shouldn’t remove your rel=author markup. I was personally involved in getting that markup onto millions of sites (by adding it to our plugins and to WordPress core), and I took it out the day Google dropped the author highlight. I’d be happy to add it back in, but I’ll need some more info before we do, so I’ve reached out to an engineer at Google to see if he could comment.

OS X 10.11: El Capitan

When a new OS X comes out, experienced Mac users will often go straight to Ars Technica for its review of the newest version to see what’s new. You should too. I read it, and the pinned tab feature was the one that made me think “hah, I might need to do that”. So I added a pinned tab icon to Yoast.com this morning, and then wrote a quick tutorial on adding one of these so-called mask icons just now.
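The pinned tab icon itself is a one-line addition: a link element pointing at a single-colour SVG, plus a tint colour for Safari to apply. A sketch (the file name and colour here are placeholders):

```html
<!-- Safari pinned-tab (mask) icon: the SVG must be monochrome;
     Safari tints it with the given color when the tab is pinned. -->
<link rel="mask-icon" href="/pinned-tab-icon.svg" color="#990000">
```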

Not exactly SEO, but branding is incredibly important for your long term search rankings too.

That’s it, see you next week!
