This week we had quite a bit of news. But…. I’ve written about it all, already. So I’m gonna be a lazy guy and point you straight at the three posts I wrote this week on Google changes:


The week started with us not knowing much yet, but it’s good to read this to get an idea of what everybody thought had happened:

Google update: Real time Penguin? Or something else?

Then we realized it wasn’t really Penguin, so we moved on. We got the news that Google had made Google Panda a part of its core algorithm:

Google Panda part of Google’s core algorithm

And finally, we figured out what had changed in Google and what this update was really about: brand terms. Read this post for the full view:

Google core algorithm update: brand terms

In all honesty, this is what we'd call in Dutch "een storm in een glas water", which translates to "a storm in a glass of water": basically, much ado about nothing.

That’s it, see you next week!


Over the weekend we saw an incredibly big shuffle in Google search results. I wrote about it earlier this week, as we were researching what happened. I’ll be honest: we’re still researching. But let me update you on what we know and don’t know about this Google core algorithm update.

Google Core Algorithm update

What we know

We know a few things about this update now. Despite all the promises about a Google Penguin update early this year, this is not it. It’s also not Google Panda. But there’s news about Google Panda anyway, which I’ve written a separate post on:

Read more: ‘Google Panda part of Google’s core algorithm’ »

How do we know this is not Penguin or Panda? Googler Gary Illyes said so. New Google star Zineb Ait tweeted that this update didn't have a name, but was "just one of the frequent quality updates" (in French).

What did this Google core algorithm update change?

So… What changed? We don't know. The changes are… weird. We've been using a couple of datasets to look at this update, but most of all we're looking at SearchMetrics. They publish a list of winners and losers every week, and this week the changes seem to have happened mostly for news sites, specifically for brand terms. For instance, check this list of the keywords that the biggest loser in the US, The Atlantic, lost its positions for:

Keywords the Atlantic lost traffic for

Almost all of these are brand terms.

Bartosz has written a good post (with tons of interesting screenshots if you don't have access to SearchMetrics) that touches on some of the things I had seen too. He calls it a "brand bidding update", which I don't think is right. I do agree with him that the change was in the type of results that Google shows for brand queries. The switch seems to have been from news articles to more "timeless" URLs.

Slugs and/or site structure?

You won’t believe this, and it’s a correlation only (so don’t say I’ve said this is true), but I’m seeing a high correlation between the keyword(s) being the first word(s) of the slug (the URL) and the ranking. It can’t be that simple though. It’s very possible it has to do with a better site structure for the winners versus the losers. Some of the biggest winners are category URLs on sites that have good optimization for their categories and good internal linking, like Mashable. So… This might be a good time to revisit your Category SEO tactics:

Read more: ‘Using category and tag pages for your site’s SEO’ »
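Back to the slug observation: if you want to eyeball this correlation on your own ranking data, a tiny check like the sketch below (Python, with made-up example URLs) is enough. Remember, it's a correlation, not a confirmed ranking factor.

    from urllib.parse import urlparse

    def keyword_leads_slug(keyword, url):
        # True if the slug starts with the keyword's words, e.g. 'dance-shoes-guide' for [dance shoes]
        slug = urlparse(url).path.rstrip("/").split("/")[-1]
        slug_words = slug.lower().split("-")
        keyword_words = keyword.lower().split()
        return slug_words[:len(keyword_words)] == keyword_words

    print(keyword_leads_slug("dance shoes", "https://example.com/dance-shoes-guide/"))  # True
    print(keyword_leads_slug("dance shoes", "https://example.com/2016/01/shoe-news/"))  # False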

Visibility impacted, but traffic?

SearchMetrics (and many similar tools) calculates a score based on the amount of traffic for a specific term and the position you're ranking at. The idea is that if you rank, for instance, #3 for a term, you'll receive a certain proportion of the traffic for that term. This is a very valuable way of looking at a site, as a site's visibility score usually correlates strongly with its traffic.
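To make that idea concrete, here's a minimal sketch (Python) of how such a position-weighted score could be computed. The click-through rates per position are rough assumptions of mine for the sake of the example; SearchMetrics' actual weighting is their own.

    # Illustrative position-weighted visibility score; CTR values are assumed, not SearchMetrics' weights.
    ASSUMED_CTR = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05,
                   6: 0.04, 7: 0.03, 8: 0.02, 9: 0.02, 10: 0.01}

    def visibility_score(rankings):
        # rankings: list of (monthly search volume, position) pairs for the keywords a site ranks for
        return sum(volume * ASSUMED_CTR.get(position, 0.0) for volume, position in rankings)

    print(visibility_score([(10000, 3), (500, 1)]))  # 1150.0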

The problem with this visibility score is when searches are mostly navigational. For instance, we rank in the top 7 for [google analytics], but we get close to 0 traffic for that term. The reason is that 99.9% of people searching for [google analytics] actually want to go to Google Analytics.

This means that the actual changes in traffic for this update, even though the changes in visibility are huge, will differ highly per term and will, very often, be negligible. In my opinion this is something that has to change in the SearchMetrics visibility score, and something I've discussed with my good friend (and SearchMetrics founder and CTO) Marcus Tober before.

Conclusion

The impact of this Google core algorithm update on search results and visibility was huge; the impact on actual traffic might not be as big. There are definitely things we'll need to figure out over the coming weeks and months though, like how important site structure and URLs are. Interesting times!

 

Panda has become a part of Google's core algorithm. This post discusses what that means for SEOs and webmasters. Amidst (or actually slightly before) all the kerfuffle about the changes to Google's core algorithm this week, Jennifer Slegg posted a huge article about Google Panda that's really worth a look. It was vetted by people on Google's webmaster team and really does contain a lot of nice info. Most of it wasn't new to me, but the fact that Panda had become a part of Google's core algorithm was.

Google Panda became a part of Google's Core algorithm

Bit of history

Google Panda, when it first came out (and was still called the “Farmer update”), was an algorithm that was run every once in a while. “Once in a while” was a few times a year, and being hit by Panda would thus mean that you could be waiting months before you got “redeemed”.

This meant that we could point at a specific "update" of Panda. This, in turn, meant we could see by the date on which a site's traffic changed whether that change was related to Google Panda or not, and could study what had changed in how Panda perceived sites. This has now changed slightly.

“Panda is an algorithm that’s applied to sites overall and has become one of our core ranking signals.”

Not real time

The fact that Google Panda is now part of the core algorithm doesn't mean that it updates in real time. It just means that it's no longer a spam filter that's applied after the core ranking algorithm has done its work. What it does mean is that we can't say "this was Panda version X". We can still keep track of changes (like the one happening last weekend) and say "this was because of the early January 2016 changes". We will have to figure out, over time, whether we can reliably see if an update was in fact an update of the Panda part of the algorithm or not.

One of the questions I still have is whether they've changed how they run Panda in terms of timing too. I could understand it if they ran it bit by bit on sections of their index instead of running it over their entire index all at once. This would mean there would be no pinpointing of dates at all anymore. Whether that's the case is still unclear.

Much ado about nothing

In all, for normal users, this doesn't change all that much, if anything at all. A lot of it is just "SEO semantics". You have to make a site that's high quality, with a good site structure and a good user experience. Easy does it, right?

If you haven't read it yet, read: ‘Google core algorithm update: brand terms’ »

One of the most asked questions in our site reviews is "What happened to my rankings!?". Although we have all kinds of great tools at our disposal to analyze your website, this question remains one of the toughest ones to answer.

Google isn't always clear about why some sites rank better than others. In this article, I'd like to illustrate how we investigate drops in rankings or traffic, using an example from our Platinum SEO reviews.

we'll check what happened to your site's ranking

Analyzing the drop

Perhaps something happened to your website at a certain point that caused your rankings to drop or your traffic to plummet. Using Google Analytics, for instance, you should try to pinpoint the date of the change. The tool we usually use for that is Searchmetrics.com. Searchmetrics is based on historically collected data (so it's not that useful for newer sites) and tells you at what point things changed. Here's a real life example:

Searchmetrics graph showing the rankings drop

That simply hurts. This website took a hit around May 18, perhaps even a bit earlier. Luckily, they were able to gain back a lot of their traffic, but things haven't been quite the same since.
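If you'd rather pinpoint the date from your own analytics export instead of a third-party tool, a quick sketch like the one below (Python/pandas) does the trick. The file and column names are assumptions; adjust them to whatever your export contains.

    import pandas as pd

    # Assumes a CSV of organic sessions per day with 'date' and 'sessions' columns.
    traffic = pd.read_csv("organic_sessions.csv", parse_dates=["date"]).set_index("date")
    weekly = traffic["sessions"].resample("W").sum()   # aggregate per week to smooth daily noise
    change = weekly.pct_change()                       # week-over-week change
    print("Sharpest drop in the week ending:", change.idxmin().date(), f"({change.min():.0%})")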

What caused the drop

What we do in this case is align the graphs in Searchmetrics with Google's algorithm updates. Usually we turn to Moz, as their overview is accurate and has some convenient links to articles about each update.

Panda 4.0 (#26) — May 19, 2014
Google confirmed a major Panda update that likely included both an algorithm update and a data refresh. Officially, about 7.5% of English-language queries were affected. While Matt Cutts said it began rolling out on 5/20, our data strongly suggests it started earlier.

Hm, that sounds about right. Right? There was a Payday Loan update on the 16th as well, so let’s not jump to conclusions. What could have triggered a Panda penalty in this case?
The website at hand is packed with news, has built up quite some authority over the years and its link profile looks awesome (see the Majestic example image on the right). In MajesticSEO, which we use alongside Searchmetrics for deeper insights on backlinks, that deep purple color indicates loads of linking domains. One might argue about the trust flow of these sites, but a lot of them hover around 20-30 and that seems alright (there is always room for improvement, of course). This shouldn't have anything to do with the drop.

Panda is about quality. So we checked a number of things, like content quality, design quality, code quality. All seemed right. At that moment, we were tearing our hair out and drinking heavily, as we just could not figure out what happened.

Ghost banners

Just the other day, I was at Conversion Hotel and an obvious subject was ghost buttons. Don't use these, they're scary. The website we were analyzing might have had a penalty because of another bad spirit: ghost banners. Google is a machine. Google recognizes certain-sized elements on your website that have a link and might perhaps be a banner. Think along the lines of a popular posts section, a list of product images or (even worse) links to related articles on other websites. If your website is packed with these, Google could quite easily mistake them for banners.
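If you want to audit your own templates for this, a rough heuristic like the sketch below (Python, requests + BeautifulSoup) flags linked images in common ad dimensions. To be clear: this is only an illustration of the idea, not how Google actually classifies banners, and the sizes listed are simply well-known ad formats.

    import requests
    from bs4 import BeautifulSoup

    # Common display-ad dimensions; a linked image in one of these sizes *might* look like a banner.
    BANNER_SIZES = {("728", "90"), ("300", "250"), ("160", "600"), ("336", "280")}

    def banner_like_links(url):
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        hits = []
        for link in soup.find_all("a"):
            for img in link.find_all("img"):
                if (img.get("width"), img.get("height")) in BANNER_SIZES:
                    hits.append((link.get("href"), img.get("src")))
        return hits

    print(banner_like_links("https://example.com/"))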

Does this mean changing something like a sidebar would help you get rid of that possible Panda penalty? That isn't guaranteed. Too many factors we can't influence might play a part in this. What we do in our Platinum SEO review, to the best of our knowledge and using our combined experience, is analyze your website and identify every possible issue it might have. The review leaves you with an extensive to-do list. If you follow up on that list, or have someone follow up on it, you will be able to serve your website the best way possible to Google and other search engines.

There is no number one ranking guarantee. You have a part in that optimization, and Google has a part in picking up on changes. But you’ll be able to give it your very best go, using loads of know-how we provide and loads of market knowledge you already have.

Our Platinum SEO review gives you a complete overview of all the things you can do SEO wise to improve your website for search engines.

And it is available now (until December 11) for a mere $1,799 instead of the usual $2,499. So if you have experienced any strange drops in your rankings or traffic, we’d be happy to have a look!

The Yoast SEO plugin helps you to optimize your text for the keyword you want to be found for. In Yoast SEO 3.0 we made some big changes to our content analysis. In this post we'll discuss the changes to our Yoast SEO keyword density check and the possibility to optimize for multiple keywords in Yoast SEO Premium.

keyword stuffing is not a good idea in a post-Panda world

Keyword stuffing is not a great SEO strategy. You'll be hit by Google Panda (or another update) in no time. Optimizing your text for specific keywords, however, is something you definitely can do! This is the reason we have our focus keyword functionality in Yoast SEO. If you go too far though, over-optimization is around the corner. Over-optimization can be seriously dangerous, which is why our Yoast SEO plugin has some safeguards (in the form of red bullets) against doing so.

Keyword density check is much stricter now

Google prefers nice, readable texts. Your text should be well structured and attractively written. Texts with a high keyword density do not read nicely. They are, in fact, terrible to read! Instead of using your focus keyword over and over, you should use synonyms if possible. Google actually recognizes synonyms for keywords now. With Google's ability to recognize synonyms, optimizing for a single focus keyword becomes more and more silly.

We therefore decided to update the keyword density check in Yoast SEO. In the new Yoast SEO the keyword density for your post has to be between 0.5 and 2.5%. In the old analysis, you could get away with a keyword density of as high as 4.5%. We’ve come to the conclusion that that’s just too high in this post-Panda world!
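In essence, the check boils down to counting how often your focus keyphrase appears relative to the total number of words. The sketch below (Python) is a deliberately simplified approximation of that idea; the plugin's real analysis is more involved than this.

    import re

    def keyword_density(text, keyword):
        # occurrences of the focus keyphrase as a percentage of the total word count
        words = re.findall(r"\w+", text.lower())
        occurrences = text.lower().count(keyword.lower())
        return 100.0 * occurrences / len(words) if words else 0.0

    sample = ("Ballet shoes come in many styles. Pick ballet shoes that fit snugly, "
              "and replace your ballet shoes once the soles wear out.")
    density = keyword_density(sample, "ballet shoes")
    print(f"{density:.1f}%", "OK" if 0.5 <= density <= 2.5 else "outside the 0.5-2.5% range")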

If you want to check your old posts and make sure their keyword densities are within our new guideline, you can do so. When you upgrade you’ll see (or have seen) a notice about recalculating SEO scores. This is one of the things we recalculate at that point. This does mean that a post that was green before can now suddenly turn red… If you can’t find that notice, you can find the tool under SEO → Tools.

Multiple keywords

In Yoast SEO Premium we have a new feature which enables you to optimize for more than one focus keyword. You could use this to optimize for two related keywords, allowing you to rank in Google for both. You could also use this to optimize for two synonyms. Optimizing a post for two or three synonyms simultaneously while still requiring a 1% keyword density as a minimum would lead to over-optimization and thus angry Pandas. This was one of the reasons to lower our "required" keyword density to 0.5%. We are actually working on some new functionality now, allowing you to treat synonyms and multiple keywords differently in our content SEO analysis. As that has multiple implications, it'll take a while to get right.

Your SEO strategy should never focus on one single keyword. You really do need a proper keyword strategy. Sometimes it's useful to try to make a single post or page rank for multiple (related) keywords. Perhaps you have a shop in ballet accessories and are writing a post about ballet shoes. But you'd also like this post to rank for [dance shoes], as [dance shoes] is a more general (and common) search term. Our multiple keywords functionality is really well suited to help you optimize for more than one keyword like this. It also allows you to focus on multiple angles and words, reducing the risk that you over-optimize your texts.

Until the end of the year, Yoast SEO Premium, which has this multiple keywords functionality, costs only $69 per year for support, upgrades and updates.

When you delete a page (or post) from your site, you delete one or more URLs too. That old URL, when visited, will usually return a 404 not found error. But is that what you wanted? Maybe that page should be redirected somewhere. If not, and you deliberately deleted that content, serving a 410 header would actually be a better idea. This post explains the choices you have and how to make them.


Redirect or delete a page completely?

The first choice you have to make is whether or not the content you deleted has a proper equivalent on your site. Think of it this way: if I clicked on a link to the page you deleted, would there be another page on your site that gives me the information I was expecting? If that’s true for a specific page on your site for a majority of the people clicking on that link, you should redirect the deleted URL to that page.

In general, I’d advise you to redirect a page even when only a smaller portion of the visitors would benefit from that redirect. The reasoning is simple: if the other option is for all your visitors to get a page saying “content not found”, that’s not really a good alternative either…

Create a redirect

When you redirect that deleted page's URL to another URL, make sure the redirect you use is a so-called 301 redirect. There are several types of redirects, but a 301 redirect is what's called a permanent redirect. That way, Google and other search engines will assign the link value of the old URL to the URL you redirected the visitors to.
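A quick way to verify that the redirect you set up really is a permanent one is to request the old URL and look at the raw response. The sketch below (Python, using the requests library, with a placeholder URL) does exactly that.

    import requests

    # Check the old URL yourself: a permanent redirect should answer 301, not 302.
    response = requests.head("https://example.com/old-page/", allow_redirects=False)
    print(response.status_code)               # expect 301
    print(response.headers.get("Location"))   # the URL you redirected visitors to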

Delete a page completely (?)

If there is no page on your site with that information, you need to ask yourself the question: should I really be deleting that page? Or should I just make it better? If you decide to delete it nonetheless, make sure you send the proper HTTP header: a 410 content deleted header.

404 and 410 HTTP headers

The difference between a 404 and a 410 header is simple: 404 means “content not found”, 410 means “content deleted” and is thus more specific. If a URL returns a 410, Google is far more certain you removed the URL on purpose and it should, therefore, remove that URL from its index. This means it will do so much quicker.
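To show that difference in its barest form, here's a minimal sketch (Python's built-in http.server, with made-up paths) that answers 410 for deliberately deleted URLs and 404 for everything else. On a WordPress site you wouldn't do this by hand; you'd use a redirects manager like the one mentioned below.

    from http.server import BaseHTTPRequestHandler, HTTPServer

    DELETED_ON_PURPOSE = {"/old-campaign/", "/retired-product/"}  # example paths

    class Handler(BaseHTTPRequestHandler):
        def do_GET(self):
            # 410 "Gone" for content we deliberately removed, 404 "Not Found" for everything else
            self.send_response(410 if self.path in DELETED_ON_PURPOSE else 404)
            self.end_headers()

    HTTPServer(("", 8000), Handler).serve_forever()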

If you're using WordPress and our Yoast SEO Premium plugin, the redirects module in this plugin is capable of serving 410 headers. The redirects manager is the perfect tool for working with redirects. It automatically asks you what you want to do with a certain URL when you delete it or change the permalink. You can set any type of redirect and it even ties into Google Search Console, so you can fix errors directly from the WordPress backend.

The problem with serving 410 content deleted headers is that Google’s support for it is incomplete. It will delete pages that serve a 410 from its index faster. Yet, in Google Search Console, Google will report 410s under “Not found” crawl errors, just like 404s. I’ve complained to Google about this several times but unfortunately, they have yet to fix it.

Collateral damage when deleting a page

When you delete one or more posts or pages from your site, there's often collateral damage. Say you deleted all the posts on your site that have a specific tag. That tag now being empty, its archive URL will also give a 404. Even when you handle all the URLs of those posts you deleted properly (by redirecting or 410ing them), the tag archive will still give a 404, so you should make sure to deal with that URL too.

Even when you didn't delete all the posts in a tag, the tag archive might now have 5 instead of 12 posts. If you display 10 posts per page in your archives, page 2 of that archive will no longer exist, and thus give a 404 error. These aren't the biggest problems in the world when you delete one or two posts, but if you're dealing with a Google Panda problem and are therefore deleting lots of poor content, creating a lot of 404s like this can take your site down even further, so proceed with care!
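The arithmetic is simple, but easy to forget when you're deleting in bulk. Here's a small sketch (Python, using the example numbers from above and assuming a WordPress-style /page/N/ URL scheme) of which archive pages vanish:

    import math

    def vanished_archive_pages(archive_url, posts_before, posts_after, per_page=10):
        # which /page/N/ URLs stop existing after a clean-up
        pages_before = math.ceil(posts_before / per_page)
        pages_after = max(math.ceil(posts_after / per_page), 1)
        return [f"{archive_url}page/{n}/" for n in range(pages_after + 1, pages_before + 1)]

    # A tag archive that went from 12 posts to 5: page 2 of that archive no longer exists.
    print(vanished_archive_pages("https://example.com/tag/news/", 12, 5))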

Read more: ‘Which redirect should I use?’ »


After this week's launch of our first Basic SEO training, it's now time to sit back and relax. For about two minutes. The next milestone is our upcoming Yoast SEO release which, because of the sheer number of new features in it, will be called Yoast SEO 3.0. Luckily, the search engines went reasonably easy on us this week, so this recap was easy to write. As I was writing it, it turned out I was giving more advice on dealing with Google Panda, but that can never be a bad thing, I guess.


Google discontinues AJAX crawling

Several months after hinting at it, Google has stopped encouraging people to use its AJAX crawling guidelines from 2009. I have to say, this is handled better than Google normally treats its APIs. Sites built using the methods from 2009, including the use of so-called _escaped_fragment_ URLs, will still be indexed, but Google recommends that when you change (parts of) your site, you remove the need for these.
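For anyone who never worked with the 2009 scheme: it mapped "pretty" hash-bang URLs to an ugly query-string form that Googlebot could request. The sketch below (Python) shows the gist of that mapping; the exact escaping rules are in Google's old specification.

    from urllib.parse import quote

    def escaped_fragment_url(url):
        # map a hash-bang URL to the crawlable form Googlebot used to request
        if "#!" not in url:
            return url
        base, fragment = url.split("#!", 1)
        separator = "&" if "?" in base else "?"
        return f"{base}{separator}_escaped_fragment_={quote(fragment)}"

    print(escaped_fragment_url("https://example.com/app#!products/shoes"))
    # https://example.com/app?_escaped_fragment_=products/shoes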

Google has a history of dropping support for APIs at far too short notice for normal companies to comply, so this is actually a rather good gesture. What annoys me is that they don't announce a specific cut-off date. All they say is:

If your current setup is working fine, you should not have to immediately change anything. If you’re building a new site or restructuring an already existing site, simply avoid introducing _escaped_fragment_ urls.

Some corporate websites out there have been built using this AJAX crawling system and won’t go away or be changed for years to come. A specific cut-off date would be very useful for sites like these, even (or maybe even: especially) when that date is 2 or 3 years in the future.

Google Panda: delete content the right way

It was as if the devil was playing with it. On October 6th, I posted about Google Panda and how we suggest sites deal with it. In this post I included advice to delete low quality pages. Two days later, Gary Illyes of Google tweeted the following in response to Jennifer Slegg of The SEM Post:

Of course, people were confused since I'd just said something "else". He later went on to clarify, after several SEOs pressed him. That's when it became clear why he said what he said:

“We see too many people cut the good”. That’s why he says don’t delete. Don’t throw the baby out with the bathwater. If someone still might find content useful, don’t delete it. But that’s very similar to what I said.

So how do you fix your Google Panda problems?

I don’t want to reiterate what I said in this post about Google Panda, which you should read if you’re into this, but let’s make sure it’s well understood. Fixing your site’s Google Panda issues is about several things:

  1. Improve content that’s not good enough but might be useful.
  2. Remove content that will never be useful or that you can’t properly maintain.
  3. Improve your site structure so that content that might be useful can be (more easily) found.

It's not one of those three things, it's all of them. I've seen plenty of sites with auto-generated content, auto-generated tag pages, search result pages, etc. Those are pages that no one is ever going to find interesting. You can't keep those pages; you really should delete them.

That’s it, see you next week!


While reviewing websites, quite often we find sites that have already become a victim of Google Panda's algorithm, or run the risk of getting "Pandalized". A site runs the risk of being hit by Panda when it has a lot of low quality pages. In this post I want to give you some quick insights into what Google Panda is and how you can prevent your site from getting hit by it.

What is Google Panda?

A quick intro by yours truly on what Google Panda is:

How to fix low quality pages

In all, Panda usually affects your site because you have too many low quality pages as a proportion of your overall number of pages. So you need to fix your low quality pages in order for that ratio to become healthy again. You can fix problems with low quality pages in two ways:

  1. You improve the content and quality of these pages. This means adding more well written content, but it often also means making sure you don't have too many ads on the page and improving the UX of the page.
  2. You remove these low quality pages or disallow search engines access to them. This last method is often called a "Panda diet".

Identifying which low quality pages to fix

Whether you decide to improve your content or to remove your low quality pages, you need to know where to start. The best way to identify pages that need fixing is to look at pages on your site that have very few visitors and/or a very high bounce rate. These are pages that aren’t ranking or that aren’t doing anything for your overall site’s performance because people go away from these pages too quickly.

There are a couple of tools that can help you identify which low quality pages need fixing. The one I like most for small to medium-sized sites is Screaming Frog. When you've opened it, you can connect to a site's Google Analytics data and then have Screaming Frog crawl the site, after which you can sort by bounce rate. There are also some filters to show you pages with a bounce rate above 70% (which really is too high) and pages that have no GA data because nobody visited them…
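If you prefer to work from a raw export rather than inside the crawler, the same shortlist can be produced with a few lines (Python/pandas). The file and column names here are assumptions; adjust them to whatever your export contains.

    import pandas as pd

    # Assumes a crawl export with analytics data attached, e.g. columns: url, sessions, bounce_rate.
    pages = pd.read_csv("crawl_with_ga.csv")
    candidates = pages[(pages["bounce_rate"] > 70) | (pages["sessions"] == 0)]
    print(candidates.sort_values("bounce_rate", ascending=False)[["url", "sessions", "bounce_rate"]])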

It's important to note that sometimes, a bounce doesn't mean something's wrong. On our knowledge base, if a page has a high bounce rate, the reason usually is that it solved the visitor's problem. People didn't do anything else because they didn't have to. So you really have to go into the pages you've identified with these methods and evaluate their individual quality.

Improve or remove?

Once you've identified which pages on your site need fixing, you need to start thinking about whether you want to improve or remove them. Usually, you'll end up doing a bit of both. Pages that target keywords that truly matter to you should just be improved. If other pages are targeting keywords that are not interesting enough to your business, or they're not targeting anything at all, get rid of them. You can choose to noindex them, thereby preventing Google from showing them in its search results, but I honestly think you're usually better off deleting pages like that.

At Yoast, we've been ridiculously busy getting ready for all the product launches we've got coming up, like our new eBook and our upcoming Basic SEO training course. Luckily, we also still have time to look at the news, because Google has been rather busy and there was other stuff to tell too, regarding new Apple releases.

Google Panda & Penguin

So… Google Panda 4.2 is apparently "still rolling out". This makes it harder for people like us to diagnose whether a site got hit by Panda, but luckily not undoable (the signs are usually relatively clear to the trained eye). You'd hope that more info would come with that, but there was nothing else.

Google Penguin, on the other hand, seems to be truly becoming "real-time". Gary Illyes of Google said at SMX that he "hoped" it would be ready by the end of the year.

I have to say that it’s getting harder and harder to trust specifically Gary when he says things because it’s been kind of hit and miss. We’ll have to see what comes of it.

Google wants your app data

In other Google news, Google seems to understand that it’s slowly missing the boat. They now say that they’ll give a ranking boost if you use app indexing. They’re afraid that if they don’t get everyone to include app-indexing, which allows Google to index the contents of mobile phone apps, they won’t be a complete search engine anymore and platforms like Facebook might beat them at some point.

The problem with remarks like this from Google is knowing whether it’s actually true. It’s very easy for them to say that they’ll give you a ranking boost, it’s now up to the global SEO community to prove whether they did or not.

Rel=author making a comeback?

In what I'd clearly call the weirdest news of the week, Gary Illyes also said you shouldn't remove your rel=author markup. I was personally involved in getting that markup on millions of sites (by adding it to our plugins and to WordPress core). I took it out the day Google dropped the author highlight. I'd be happy to add it back in, but I'll need some more info before we do, so I've reached out to an engineer at Google to see if he could comment.

Mac OS X 10.11: El Capitan

When a new OS X comes out, experienced Mac users will often go straight to Ars Technica for their review of the newest version to see what's new. You should too. I read it and the pinned tab feature was the one that made me think "hah, I might need that". So I added a pinned tab icon to Yoast.com this morning, and then wrote a quick tutorial on adding one of these so-called mask-icons just now.

Not exactly SEO, but branding is incredibly important for your long term search rankings too.

That’s it, see you next week!


I write this knowing that next week I'll be in Munich with many friends from the SEO industry for SEOktoberfest, a conference unlike any other. We'll talk shop with some of the best and brightest minds in our industry, and to say I'm excited is an extreme understatement.

Google loses antitrust case in Russia

Headlines like these are coming more and more often. This week, Google lost an antitrust case in Russia. This follows similar cases in India, Europe and many other countries throughout the world. It's not all that surprising, given Google's power, that governments everywhere are becoming nervous. We'll see what all these cases together end up doing to how Google organizes itself and how it behaves. Google's recent change to its corporate structure, creating Alphabet Inc, should probably be seen in light of these issues too.

In other legal news, Google was the suing party in a case against another firm impersonating Google in their robo-calls.

Structured markup and HTTPS as a ranking factor

We wrote about this not too long ago, saying structured markup is not a ranking factor. Obviously, Google reserves the right to start using it for ranking. I would say that too if I were Google and wanted to get more people to add that kind of markup to their pages. We'll believe it when we see the first research that confirms they're using it for ranking. This piece of user research did show that having snippets in position 2 beats being in the first spot. What I did find curious was that they incorporated an author image in the research, something that basically doesn't exist anymore in the normal search results.

In a similar vein, Google said that HTTPS acts as a "tie breaker". This is what I'd describe as a "weak ranking factor" at this point. Even if it is only a tie breaker, you should probably still do this for a myriad of other reasons: consumer trust, more reliable analytics data, etc. My thoughts on this are essentially the same as in January 2014.

Over-optimization a common issue?

In a recent hangout, Google's Gary Illyes once again stressed the importance of quality content:

… what I see often is people try to rank for queries they don’t have high quality and great value content for, and that’s the problem. Sooner or later the algorithms will catch that, you don’t have to overthink it, it’s simple content analysis and they will adjust the rank for the site, and that’s it.

Of course, this is much in line with our own thinking on the topic. If you're in doubt about your content strategy, our book Content SEO is a must-read. In that same hangout, Gary also discussed the slowness of the Panda rollout, though we've seen reports that Google rolled back parts of Panda as well; we discussed those last week.

That’s it, see you next week!

