Site migrations are probably not on most people’s fun list. Alas, sometimes they’re necessary to ensure the continued health of a website. Once you’ve decided you need to do a migration, it’s important to make sure you know what you’re doing and make a plan for how to approach things beforehand. Whether you’re moving from HTTP to HTTPS, switching your TLD, or moving to another domain: think about what you need to change, and make sure you can easily pinpoint the cause if something goes wrong.

In any case, take into account that you may experience lower traffic in the first few weeks after your migration, simply because your new URLs need to be properly indexed again. If everything goes well and you don’t break things, your rankings will improve over time. Let’s discuss a case where this might happen in some more detail in today’s Ask Yoast!

Anthony Spitery emailed us about his situation:

Our website ranks well in Google but it’s a subdomain of another URL that is no longer registered. We want to move to another host and we’re wondering what the SEO impact would be if we turned the subdomain into the primary domain. Do we lose our ranking?

Watch the video or read the transcript further down the page for my answer!

The impact of a site migration

“Well, yes, you’ll lose rankings, because you have to migrate it, so you’ll have to take a bit of a hit. It’s probably a better idea in the long run, though, so I would still do it. But you have to realize that for somewhere between three and six months you will take a loss in traffic. That loss in traffic can vary: I’ve seen less than 10%, but I’ve also seen more than 40%. So, it can be quite a painful experience.

But it’s worth it in the long run, especially if that other domain is not used anymore, because otherwise, that reflects poorly on your business as well. So I would take the hit, and do it. Good luck.”

Ask Yoast

In the series Ask Yoast, we answer SEO questions from our readers. Do you have an SEO-related question? A pressing SEO dilemma you can’t find the answer to? Send an email to, and your question may be featured in one of our weekly Ask Yoast vlogs.

Note: you may want to check our blog and knowledge base first; the answer to your question could already be out there! For urgent questions, for example about the Yoast SEO plugin not working properly, please contact us through our support page.

Read more: Domain names and their influence on SEO »

The post Ask Yoast: Site migration and rankings appeared first on Yoast.

As a website gets bigger, it’s often hard to prevent pages from becoming duplicates or near-duplicates of each other. This can cause duplicate content issues. If you have two similar pages, and they are both eligible to rank for a certain keyphrase, the search engine simply doesn’t know which of the two URLs it should send the traffic to. To solve this, you can select a preferred URL: this is what we call the canonical URL.

Same content, multiple URLs

You might, for instance, have a post or product that is attached to two categories and exists under two URLs, like so:
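The original example here was an image; a hypothetical pair of such URLs (both paths are made up) might look like this:

```
https://www.example.com/toys/ball-of-yarn/
https://www.example.com/gifts/ball-of-yarn/
```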

If these URLs are both for the same product, choosing one as the canonical URL tells Google and other search engines which one to show in the search results.

Canonicals also enable you to point search engines to the original version of an article. Let’s say you’ve written a post for another party that is published on their website. If you’d like to post it on your site too, you could agree on publishing it with a canonical pointing to the original version.

How to detect a canonical URL

You can find a canonical URL in the source of a webpage by searching for rel="canonical". It’s an element only search engines see; your users won’t be affected by it.


An example of a canonical in the source code of one of our posts: it refers to the original version of the article that was first published on another website.
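Such an element sits in the page’s head section; a minimal, hypothetical example (the URL is a placeholder) looks like this:

```html
<link rel="canonical" href="https://www.example.com/original-article/" />
```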

When to redirect, when to use a canonical

Unlike with redirects, users don’t see your canonical. If you can redirect a URL without breaking your site, you should. But if redirecting makes your site illogical, setting the canonical is a viable solution.

Want to learn more? Read our Ultimate guide to canonical URLs.

The post What is a canonical URL? appeared first on Yoast.

Every page on the web has an address, a URL, which stands for ‘Uniform Resource Locator’. Sometimes, content moves from one URL to another URL. That’s when a redirect is needed. A redirect automatically makes a browser go from one URL to another URL.

A redirect can point to any other URL: it doesn’t need to point to the same website. Redirects to another domain are sometimes referred to as cross-domain redirects.

Types of redirects

There are several ways of making a browser redirect. Redirects can be divided into two classes: serverside redirects and client-side redirects. Each of these can then be sub-divided into several types.

Serverside redirects

Serverside redirects are performed directly on the server and result in a tiny bit of content being sent to the browser, in so-called HTTP status headers. The browser then knows where to go and follows the redirect immediately. These HTTP headers contain a code for the type of serverside redirect, and the new location to which the browser should take you.
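Stripped to its essentials, that exchange looks like this (the hostname and paths are placeholders):

```http
GET /old-page HTTP/1.1
Host: www.example.com

HTTP/1.1 301 Moved Permanently
Location: https://www.example.com/new-page
```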

Redirect type: 301
Use case: A permanent redirect, used when a page has moved, or when a page has been deleted and similar content can be found elsewhere.
Browser impact: Browsers will cache a 301 redirect and immediately perform it again next time, without needing to fetch the original URL again, until the cache is cleared.
SEO impact: Search engines follow the redirect and will add the new URL to the index. Links pointing to the old URL will be counted towards the ranking of the new URL.

Redirect type: 302
Use case: A temporary redirect, used when a page needs to be temporarily moved, or when the original URL should always be requested. This is, for instance, the case with language- or geolocation-based redirects.
Browser impact: Browsers will not cache a 302 redirect, so the server will be getting a request for the original URL every time.
SEO impact: Search engines will follow the redirect, but maintain the old URL in their index. Because too many systems use a 302 by default, when a 301 should instead be used, search engines tend to treat long-standing 302s like 301s in many ways.

Redirect type: 307
Use case: An “improved” temporary redirect that will always be treated as temporary by search engines.
Browser impact: Browsers will never cache 307 redirects.
SEO impact: Search engines might not always follow 307 redirects, as they’re deemed temporary.

Redirect type: 308
Use case: Hardly ever used; a 308 means “follow this redirect and never go to the old URL again”.
Browser impact: Browsers will hard-cache 308 redirects.
SEO impact: Similar to a 301.
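As a concrete sketch, on an Apache server (one common setup; the paths here are hypothetical), these serverside redirects are one-line rules:

```apache
# mod_alias: permanent (301) redirect from an old URL to its new location
Redirect 301 /old-page https://www.example.com/new-page

# the temporary (302) variant differs only in the status code
Redirect 302 /campaign https://www.example.com/current-campaign
```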

Client-Side redirects

A client-side redirect is the result of some code that runs in the browser and then redirects the ‘client’ – the browser – to another URL. To be able to run that code, it needs to be sent to the browser first, which makes this an inherently slower solution. Client-side redirects should thus be avoided as much as possible.

There are two types of client-side redirects: the so-called meta refresh, which refreshes the page to another URL after a particular period of time, or a JavaScript redirect, which changes the window’s URL after that code has been run. The SEO impact of both types of client-side redirects is hard to quantify, but usually, it’s not as reliable as serverside redirects.
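In skeleton form, the two client-side variants look like this (the target URL is a placeholder):

```html
<!-- Meta refresh: sends the browser to another URL after 5 seconds -->
<meta http-equiv="refresh" content="5; url=https://www.example.com/new-page">

<!-- JavaScript redirect: changes the window's URL once the script runs -->
<script>window.location.replace('https://www.example.com/new-page');</script>
```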

When to create a redirect

Redirects should be created when:

  • You’re moving from one system to another and change URLs because of that.
  • You’ve deleted a page and there is similar content available elsewhere.
  • You’re merging the content of several pages into one.

Read more: Which redirect should I use? »

The post What is a redirect? appeared first on Yoast.

As much as we advocate holistic SEO here at Yoast, there will always be people turning to the dark side, employing less than savory techniques for their own gain. When someone targets a website with actions intended to lower its ranking in the SERPs, it’s called ‘negative SEO’.

One way people can try to damage a site’s rankings is by getting loads of unnatural, shady links to point to a website. Now, you shouldn’t worry about being the target of a negative SEO attack like that the moment you notice a drop in your rankings! In most cases, the cause is something else. But if you ascertain that there are suddenly a great many shady backlinks to your site, it may be time to take action. Google’s disavow links tool allows you to ask Google not to take certain links into account when assessing your site. But is it OK to use this tool, and is it always necessary? Let’s discuss it in today’s Ask Yoast!

Shant emailed us about this predicament:

I noticed about 18,000 links to my domain in Google Search Console from a few unethical websites. I suspect someone is targeting me with negative SEO, but my rankings are currently not affected. Should I still disavow these 18,000 links to my domain or could this harm my ranking? Or will Google’s algorithm realize this is a negative SEO effort and ignore them?

Watch the video or read the transcript further down the page for my answer!

Dealing with bad backlinks

“Well, if you don’t want those links, then disavowing them doesn’t really hurt you. If you know how to disavow them, by all means do it. And you can disavow at a domain level, so if they only come from a few domains then just disavow those entire domains. If they’re not links you’re proud of, then they’re probably not helping you rank either.

But, if it’s not really hurting your rankings at the moment, then you can also just do nothing because, yes, Google will usually figure out a lot of this by itself and say, “Hey, these domains are really, really shady and we should not allow these links to do anything.” Good luck.”
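For reference, Google’s disavow tool accepts a plain text file; domain-level entries look like this (the domains are invented):

```
# Links from these shady domains should be ignored entirely
domain:shadylinkfarm.example
domain:spamnetwork.example

# Individual URLs can be disavowed too
http://badsite.example/some-page.html
```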


Read more: Clean up your bad backlinks »


The post Ask Yoast: Disavow backlinks from shady sites? appeared first on Yoast.

SEOs are, more often than not, “cat people”. That’s no surprise, given that we spend a lot of time online, and given that The Internet Is Made Of Cats. That’s why the first website which I’m looking at in this series belongs to the Cats Protection charity – a UK organization dedicated to feline welfare.

I’m going to explore their website and see where they have issues, errors, and opportunities with their technical SEO. Hopefully, I’ll find some things which they can fix and improve, and thus improve their organic visibility. Maybe we’ll learn something along the way, too.

Not a cat person? Don’t worry! I’ll be choosing a different charity in each post. Let me know who you think I should audit next time in the comments!

Introducing the brand

The Cats Protection homepage

Cats Protection (formerly the Cats Protection League, or CPL) describe themselves as the UK’s Largest Feline Welfare Charity. They have over 250 volunteer-run branches, 29 adoption centers, and 3 homing centers. That’s a lot of cats, a lot of people, and a lot of logistics.

Their website reflects this complexity – it’s broad, deep, and covers everything from adoption to veterinary services, to advice and location listings.

Perhaps because of that complexity, it suffers from a number of issues, which hinder its performance. Let’s investigate, and try to understand what’s going on.

Understanding the opportunity

Before I dive into some of the technical challenges, we should spend some time assessing the current performance of the site. This will help us spot areas which are underperforming, and help us to identify areas which might have issues. We’ll also get a feeling for how much they could grow if they fix those issues.

The ‘Visibility Index’ score (Sistrix) for, desktop and mobile

I’m using Sistrix to see data on the ‘visibility index’ for, for desktop and mobile devices. We can see that the site experienced gradual growth from 2010 until late 2013, but – other than a brief spike in late 2018 (which coincides with Google’s “Medic” update in August) – seems to have stagnated.

Where they’re winning

As we dig into some of the visibility data, you can see that Cats Protection rank very well in the UK for a variety of cat- and adoption-related phrases. Let’s see some examples of where they’re winning.

Keyword # Page
lost cat 1 /help-and-advice/lost-a-cat
cats adoption 1 /adopt-a-cat
buying a cat 1 /adopt-a-cat/[…]adopt-from-us
cat reproduction 1 /help[…]/reproduction
cat charity 1 /
cat protection 1 /
cat neutering 1 /what-we-do/neutering
neuter cat 1 /what-we-do/neutering
buy a cat 1 /adopt-a-cat/[…]adopt-from-us
cat help 1 /cat-care/help-and-advice
adopting a cat 1 /adopt-a-cat/ready-to-adopt
cat org 1 /
outdoor cats 1 /uploads/[…]outdoor_cats.pdf
heart murmur in cats 1 /uploads/[…]heart_disease.pdf
free cat neutering 1 /what-we-do/neutering
cats 2 /
cat care 2 /cat-care
feral cats 2 /help-and-advice/feral-cat
cat for adoption 2 /adopt-a-cat
cats for adoption 2 /adopt-a-cat
cat body language 2 /help[…]/body-language
plants poisonous to cats 2 /help[…]/dangerous-plants
neutering cats 2 /what-we-do/neutering
found cat 2 /help-and-advice/found-a-cat
old cats 2 /help-and-advice/elderly-cats
cat behaviour problems 2 /[…]behaviour-problems

They’re doing a great job. They’re ranking highly for important, competitive keywords. In almost all cases, the result is a well-aligned page which is a great result, full of useful information.


These kinds of keywords likely account for a large proportion of the visits they receive from search engines. It’s reasonable to assume that they also drive many of their conversions and successful outcomes.

But they’re losing to the competition

It’s not all good news, though. Once you explore beyond the keywords where they’re winning, you can see that they’re often beaten to the top positions on key terms (e.g., “adopt a cat”, where they rank #2).

In cases like this, they’re frequently outranked by one of three main competitors in the search results:

  • Purina – a cat food manufacturer/retailer.
  • Blue Cross – a charity dedicated to helping sick, injured, and homeless animals.
  • The RSPCA – a general animal welfare and rehoming charity.

Now, I don’t want to suggest or imply that any one of these is a better or worse charity, service, or result than Cats Protection. Obviously, each of these (and many more) charities and websites can co-exist in the same space, and do good work. There’s plenty of opportunity for all of them to make the world a better place without directly competing with each other.

Purina’s “Cat Anxiety” page has almost 1,000 words of helpful content.

In fact, for many of the keywords they’re likely to be interested in, the results from each site are equally good. Each of these sites is doing great work in education, support, and charitable activity.

Even Purina, which isn’t a charity, has a website full of high-quality, useful content around cat care.

However, among the major players in this space, Cats Protection has the lowest visibility. Their visibility is dwarfed by Blue Cross and the RSPCA, and the gap looks set to continue to widen. Even Purina’s content appears to be eating directly into Cats Protection’s market share.

The Visibility Index of Cats Protection vs organic search competitors over time

It’d be a shame if Cats Protection could be helping more cats, but fails to do so because their visibility is hindered by technical issues with their website.

To compete, and to grow, Cats Protection needs to identify opportunities to improve their SEO.

Looking at the long tail

Cats Protection probably doesn’t want to go head-to-head with the RSPCA (or just fight to take market share directly from other charities). That’s why I’ll need to look deeper or elsewhere for opportunities to improve performance and grow visibility.

If the site gets stronger technically, then it’s likely to perform better. Not just against the big players for competitive ‘head’ keywords, but also for long-tail keywords, where they can beat poorer quality resources from other sites.

As soon as you start looking at keywords where Cats Protection has a presence but low visibility, it’s obvious that there are many opportunities to improve performance. Unfortunately, there are some significant architectural and technical challenges which might be holding them back.

I’ve used Sitebulb to crawl the site, and I’ve found three critical issues. These areas contribute significantly to the low (and declining) visibility.

Critical issues

1. The site is fragmented

Every individual branch of the charity appears to create and maintain its own subdomain and its own version of the website.

For example, the Glasgow branch maintains what appears to be a close copy of the main website, while the North London branch and the Birmingham branch both maintain their own divergent ‘local’ versions of the site. Much of the content on these sites is a direct copy of what’s available on the main website.

Fragmentation is harming their performance

This approach significantly limits visibility and potential, as it dilutes the value of each site. In particular:

  1. Search engines usually consider subdomains to be separate websites. It’s usually better to have one big site than to have lots of small websites. With lots of small sites, you risk value and visibility being split between each ‘sub site’.
  2. Content is repeated, duplicated, and diluted; pages that one team produces will often end up competing with pages created by other teams, rather than competing with other websites.
  3. The site doesn’t use canonical URL tags to indicate the ‘main version’ of a given page to search engines. This makes this page-vs-page competition even worse.

This combination of technical and editorial fragmentation means that they’re spread too thin. None of the individual sites, or their pages, are strong enough to compete against larger websites. That means they get fewer visits, less engagement, and fewer links.

You can see some examples below where fragmentation is a huge issue for search engines. Google – in its confusion between the multiple sites and duplicate pages – continually switches the rankings and ranking pages for competitive terms between different versions. This weakens the performance and visibility of these pages, and the overall site(s).

Google continually switches the ranking page(s) for competitive keywords

Rankings for “adopt a kitten” continually fluctuate between competing pages

If Cats Protection consolidated their efforts and their content, they might have a chance. Otherwise, other brands will continue to outperform them with fewer, but stronger pages.

Managing local vs general content

While it makes sense to enable (and encourage) local branches to produce content which is specifically designed for local audiences, there are better ways to do this.

They could achieve the same level of autonomy and localization by just using subfolders for each branch. Those branches could create locale-specific content within those page trees. ‘Core’ content could remain a shared, centralized asset, without the need to duplicate pages in each section.

At the moment, their site and server configuration actually appears to be set up to allow for a subdomain-based approach., for example, appears to resolve to the same content as It seems that they’ve just neglected to choose which version they want to make the canonical, and/or to redirect the other version.

They’ve also got some additional nasty issues where:

  • The non-HTTPS version of many of the subdomains resolves without redirecting to the secure version. That’s going to be fragmenting their page value even further.
  • Requests to any subdomain resolve to the main site; e.g., returns the homepage. Aside from further compounding their fragmentation issues, this opens them up to some nasty negative SEO attack vectors.
  • There are frequent HTTPS/security problems when local branches link (or are linked to) including the ‘www’ component and the location subdomain (e.g.,

Incidentally, if Cats Protection were running on WordPress (they’re on a proprietary CMS running on ASP.NET), they’d be a perfect fit for WordPress multisite. They’d be able to manage their ‘main’ site while allowing teams from each branch to produce their own content in neat, organized, local subfolders. They could also manage access, permissions, and how shared content should behave. And of course, the Yoast SEO plugin would take care of canonical tags, duplication, and consolidation.

Canonical URL tags to the rescue

While resolving all of these fragmentation issues feels like a big technical challenge, there might be an easy win for Cats Protection. If they add support for canonical tags, they could tell Google to consolidate the links and value on shared pages back to the original. That way each local site can contribute to the whole, while maintaining its own dedicated pages and information.

That’s not a perfect solution, but it’d go some way to arresting the brand’s declining visibility. Regardless of their approach to site structure, they should prioritize adding support and functionality for canonical URL tags. That way they can ensure that they aren’t leaking value between duplicate and multiple versions of pages. That would also allow them to pool resources on improving the performance of key, shared content.


The great news is, because they’re running Google Tag Manager, they could insert canonical URL tags without having to spend development resources. They could just define triggers and tags through GTM, and populate the canonical tags via JavaScript. This isn’t best practice, but it’s a lot better than nothing!
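As a sketch of what such an injected script could do – assuming (my assumption, not their actual setup) that the canonical should be the https, non-www version of the current URL with query string and fragment stripped; the function names here are mine:

```javascript
// Normalize a raw page URL into the assumed canonical form:
// force https, drop a leading "www.", strip the query string and fragment.
function buildCanonicalHref(rawUrl) {
  const url = new URL(rawUrl);
  url.protocol = 'https:';
  url.hostname = url.hostname.replace(/^www\./, '');
  url.search = ''; // drop tracking parameters
  url.hash = '';
  return url.toString();
}

// Inside a GTM Custom HTML tag, this would run on every page view and
// append a <link rel="canonical"> element to the document's <head>.
function injectCanonical(doc) {
  const link = doc.createElement('link');
  link.setAttribute('rel', 'canonical');
  link.setAttribute('href', buildCanonicalHref(doc.location.href));
  doc.head.appendChild(link);
}
```

In GTM itself this would be wrapped in a Custom HTML tag fired on all pages. Again: a stopgap, not a substitute for server-rendered canonical tags, since Google has to execute the JavaScript before it sees the canonical.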

2. Their best content is buried in PDFs

In many of the cases where Cats Protection is outranked by other charities or results, it’s because some of its best content is buried in PDF files like this one.

A high-quality resource about cat dental health, in a PDF

PDFs typically perform poorly in search results. That’s because they can be harder for search engines to digest, and provide a comparatively poor user experience when clicked from search results. That means that they’re less likely to be linked to, cited or shared. This seriously limits the site’s potential to rank for competitive keywords.

As an example, this excellent resource on cat behavior currently ranks in position #5, behind Purina (whose content is, in my personal opinion, nowhere near as good or as polished), and behind several generic content pages.

The information in here is deeper, more specific, and better written than many of the resources which outrank it. But its performance is limited by its format.

If this were a page (and was as well-structured and well-presented as the PDF), it would undoubtedly create better engagement and interaction. That would drive the kinds of links and shares which could lead to significantly increased visibility. It would also benefit from being part of a networked hub of pages, linking to and being linked from related content.

A great resource in its own right, this PDF links to a bunch of even more in-depth PDF resources!

Amazingly, this particular PDF is also only a summary. It references other, more specific PDFs throughout, which are of equally high quality. But it doesn’t link to them, so search engines struggle to discover or understand the connections between the documents.

This is just the tip of the iceberg. There are dozens of these types of PDFs, and hundreds of scenarios where they’re being outranked by lower quality content. This is costing Cats Protection significant visibility, visits, and adoptions.

Aside from being easier to style (outside of the constraints of rigid website templates and workflows), there’s very little reason to produce web content in this manner. This type of content should be produced in a ‘web first’ manner, and then adapted as necessary for other business purposes.

How bad is it?

To demonstrate the severity of the issues, I’ve looked at several examples of where PDFs rank for potentially important keywords.

In the following table, I’ve used Sistrix to filter down to only see keywords where the ranking URL is a PDF, and it contains the word “new” (i.e., “new cat”, “get a new kitten”). These are likely to represent the kinds of searches people make when deciding to adopt. You can see that Cats Protection frequently ranks relatively poorly and that these PDFs aren’t particularly effective as landing pages.

Keyword # PDF
acclimating a new cat 3 EG02_Welcome_home.pdf
getting a new cat 4 EG02_Welcome_home.pdf
looking after a new kitten 4 EG15_Caring_for_your_kitten.pdf
having a new kitten 5 EG02_Welcome_home.pdf
new kitten 6 EG02_Welcome_home.pdf
new kitten care 6 EG15_Caring_for_your_kitten.pdf
new kitten tips 8 EG02_Welcome_home.pdf
getting a new kitten 8 EG02_Welcome_home.pdf
how to take care of a new kitten 9 EG15_Caring_for_your_kitten.pdf
how to take care of your new kitten 10 EG15_Caring_for_your_kitten.pdf
new kitten advice 12 EG15_Caring_for_your_kitten.pdf

This tiny subset of keywords represents over 1,000 searches per month in the UK. That’s 1,000 scenarios where Cats Protection inadvertently provides a poor user experience and loses to other, often lower quality resources.

And value might not even be getting to those resources…

Many of the links to these resources appear to route through an internal URL shortener – likely a marketing tool for producing ‘pretty’ or shorter URLs than the full-length file locations.

E.g., redirects to, with a 302 redirect code.

This is common practice on many sites, and usually not a problem – except, in this case, resources redirect via a 302 redirect. They should change this to a 301; otherwise, there’s a chance that any equity which might have flowed through the link to the PDF gets ‘stuck’ at the redirect.
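At the HTTP level, the fix is a single status line (the hostname below is a placeholder; the PDF filename is taken from their site):

```http
# What the shortener sends now:
HTTP/1.1 302 Found
Location: https://www.example.com/uploads/EG02_Welcome_home.pdf

# What it should send instead:
HTTP/1.1 301 Moved Permanently
Location: https://www.example.com/uploads/EG02_Welcome_home.pdf
```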

It’s not too late!

The good news is, it’s not too late to convert these into pages and to alter whichever internal workflows and processes currently exist around these resources. That will almost certainly improve the rankings, visibility, and traffic for these kinds of keywords.

Except, upon investigation, you can see from the URL path that it looks like all of these assets were produced in 2013. My guess is that these PDFs were commissioned as a batch, and haven’t been updated or extended since. That goes some way to explaining the format, and why they’re so isolated. Their creation was a one-off project, rather than part of the day-to-day activities of the site and marketing teams.

There’s more opportunity here

The ‘Veterinary Guides’ page on just links out to PDF files.

Because much of the key content is tucked away in PDF files, the performance of many of the site’s ‘hub’ pages is also limited. Sections like this one, which should be the heart of a rich information library, are simply sets of links pointing out to aging PDF files. That limits the likelihood that people will engage with, link to, share, or use these pages.

This is a shame, because Cats Protection could choose to compete strategically on the quality of these assets. They could go further: produce more, make them deeper and better, and refine their website templating system to present them richly and beautifully. This could go a long way toward helping them reclaim lost ground from Purina and other competitors.

At the very least, they should upgrade the existing PDFs into rich, high-quality pages.

Once they’ve done that, they should update all of the links which currently point to the PDF assets, to point at the new URLs. Lastly, they should set a canonical URL on the PDF files via an HTTP header (you can’t insert canonical URL meta/link tags directly into PDFs) pointing at the new page URL.
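On a server running Apache with mod_headers, for instance, that canonical HTTP header could be attached like this (the page URL is hypothetical, and since they’re on ASP.NET the exact mechanism would differ):

```apache
# Send a canonical Link header with this PDF, pointing at its new page version
<Files "EG02_Welcome_home.pdf">
  Header add Link '<https://www.example.com/advice/welcome-home/>; rel="canonical"'
</Files>
```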

Not only would that directly impact the performance and visibility of that content, but it would also help them build relevance and authority in the ‘hub’ pages, like their veterinary guides section.

3. Their editorial and ‘marketing’ content is on the wrong website

While this is primarily a technical audit, bear with me as I talk briefly about content and tone of voice. I believe that technical issues and constraints have played a significant role in defining the whole brand’s tone of voice online, and not for the better.

Because, surprisingly for such an emotive ‘product’, much of the content on the website might be considered to be ‘dry’; perhaps even a bit ‘corporate’.

In order to attract and engage visitors (and to encourage them to cite, link and share content – which is critical for SEO performance), content needs to have a personality. It needs to stand for something and to have an opinion. Pages have to create an emotional response. Of course, Cats Protection do all of this, but they do it on the wrong website.

Much of their emotive content lives on a dedicated subdomain (the ‘Meow blog‘), where it rarely sees the light of day.

Emotive and ‘real world’ content lives on the ‘Meow blog’

It’s another fragmentation issue

The Meow blog runs on an entirely separate CMS from the main site (Blogger/Blogspot). This site is also riddled with technical issues and flaws – not to mention the severely limited stylistic, layout, and presentation options. It gets little traffic or attention, and very few links/likes/shares. Much of its content competes with – and is beaten by – drier content on the main site.

Heartwarming stories about re-homing kittens abound on the Meow blog

But the blog is full of pictures of cats, rescue and recovery stories from volunteers and adopters, and warm ‘from the front line’ editorial content. This is the kind of content you’d want to read before deciding whether or not to engage with Cats Protection, and it should be part of the core user journey. Today, most users miss this entirely.

We can see from the following table, which shows the site’s highest rankings, that their content gets very little traction. The site only ranks in the top 100 results of Google for 258 keywords, and only ranks in the top ten for 5 of those. Nobody who is searching for exactly the kinds of things which Cats Protection should have an opinion on – and be found for – is arriving here.

Keyword | Position | URL
how much does it cost to neuter a cat | 4 | /[…]cost-of-getting-cat-neutered.html
do i need to vaccinate my indoor cat | 6 | /[…]do-indoor-cats-need-boosters-vaccinations.html
do cats need booster shots | 7 | /[…]do-indoor-cats-need-boosters-vaccinations.html
the kittens | 9 | /[…]kitten-watch-kittens-in-new-homes.html
how to show affection to your cat | 9 | /[…]5-ways-to-show-your-cat-you-love-them.html
why doesn’t my cat purr | 10 | /[…]why-doesnt-my-cat-purr-veterinary-faqs.html
do indoor cats need vaccinations | 10 | /[…]do-indoor-cats-need-boosters-vaccinations.html
cat lovers blog | 11 | /[…]top-books-for-cat-lovers.html
becoming a cat behaviorist | 11 | /[…]careers-with-cats-cat-behaviourist.html
cat ideal weight | 14 | /[…]cats-ideal-weight-veterinary-faqs.html

Furthermore, this separation means that where personality is ‘injected’ into core site pages, the stark contrast can make it feel artificial and contrived. It often reads like ‘marketing content’ when compared with the flat tone of the content around it.

Why is the blog on a different system?

Typically, this kind of separation occurs when a CMS hasn’t properly anticipated (or otherwise can’t support) ‘editorial’ content; blog posts and articles which are authored, categorized, media-rich, and so forth. These are different types of requirements and functions from a website which just supports ‘pages’, or which has been designed and built to serve a very specific set of requirements.

As a result, a marketing team will typically create and maintain a separate ‘blog’, detached from the main site. This can cripple the performance of a company’s marketing and reach, as the blog never inherits the authority of the main site (and therefore has little visibility), and fails to deliver against marketing and commercial goals. This often leads to abandonment, and to over-investment in rented channels, like Facebook. Speculatively, it looks like this is exactly what’s happened here.

Conversely, either through poor training and management or outright rebellion, some teams prefer to publish ‘blog-like’ content as static articles within the constraints and confines of their local branch news sections. These attempts, lacking the kind of architecture, framework, and strategy which a successful blog requires, also fail to perform. Here’s an example from the Brighton branch.

The Brighton branch rolling out their own ‘blog’ solution

From a technical perspective, the main site should be able to house this content on the same domain, as part of the same editorial and structural processes which manage their ‘main’ content. If the separation of the blog from the main site is due to technical constraints inherent in the main site, this is a devastating failure of planning, scoping and/or budgeting. It’s limited their ability to attract and engage audiences, to integrate and showcase their personality into their main site content, and to convert more of their audience to donation, adoption or other outcomes.

While fixing some of the technical issues we’ve spotted should result in immediate improvements to visibility, the long-term damage of this separation of content types will require years of effort to undo.

This is something which WordPress gets right at a deep architectural level. The core distinction between ‘pages’ and ‘posts’ (as well as support for custom post types, custom/shared, and post capability management) is hugely powerful and flexible. Other platforms could learn a lot from studying how WordPress solves exactly this kind of problem.

Regardless of their choice of platform(s), Cats Protection need to have a solid strategy for how they seamlessly house and integrate blog content with ‘core’ site content in a way which aligns with technical and editorial best practice.

4. Serious technical SEO issues abound

I understand that, as a charity, Cats Protection has limited budget and resources to invest in their website. It’s unreasonable to expect their custom-built site to be completely perfect when it comes to technical SEO, or to adhere 100% to cutting-edge standards.


However, the gap between ‘current’ and ‘best’ performance is wide – wide enough that I’d be remiss not to point out some of the more severe issues I’ve identified.

I’ve spotted dozens of issues throughout the site which are likely impacting performance, ranging from severe problems with how the site behaves, to minor challenges with individual templates or pages.

Individually, many of these problems aren’t serious enough to cause alarm. Collectively, however, they represent one of the biggest factors limiting the site’s visibility.

I’ve highlighted the issues which I think represent the biggest opportunities – those which, if fixed, should help to unlock increased performance worth many times the resource invested in resolving them.

Many pages and templates are missing title, description or H1 tags

Page titles and meta descriptions are hugely important for SEO. They’re an opportunity to describe the focus of the page, and to optimize your ‘advert’ (your organic listing) in search engines. H1 tags act as the primary heading for a page, and help users and search engines to know where they are, and what a page is about.

Not having these tags is a serious omission, and can severely impact the performance of a page.

Many important pages on the site (like the “Adopt a cat” page) are missing titles, descriptions and headings, which is likely impacting the visibility of and traffic to many key pages.

Source code from the ‘Adopt a cat’ page, which is missing a title and description

As well as harming performance, omitting titles forces Google to make its own decisions – often with terrible results. The following screenshot is from a search result for the ‘Kittens’ page – the site’s main page for donation signups. An empty title tag and a missing <h1> heading tag have caused Google to think that the title should be ‘Cookie settings’.

Google incorrectly assumes the page title should be ‘Cookie settings’

There are hundreds of pages where this and similar issues occur.

A good CMS should provide controls for editors and site admins to craft titles, descriptions, and headings. But the site should also automatically generate a sensible default title, description and primary heading (based on the name and content of the page) for any template or page where nothing has been specified manually. Needless to say, this is one of the core features of Yoast SEO!
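
To illustrate, here’s a minimal sketch of what such a fallback might look like. The function names (`default_title`, `default_description`) and the ‘page name | site name’ pattern are illustrative assumptions, not how any particular CMS or Yoast SEO actually implements it:

```python
def default_title(page_name, site_name, manual_title=None):
    """Fall back to '<Page> | <Site>' when no manual title is set."""
    if manual_title and manual_title.strip():
        return manual_title.strip()
    return f"{page_name} | {site_name}"

def default_description(body_text, manual_description=None, limit=155):
    """Fall back to the first ~155 characters of the page copy."""
    if manual_description and manual_description.strip():
        return manual_description.strip()
    snippet = " ".join(body_text.split())  # collapse whitespace
    if len(snippet) <= limit:
        return snippet
    # cut at a word boundary, then signal the truncation
    return snippet[:limit].rsplit(" ", 1)[0] + "…"
```

A real implementation would also strip markup and handle localization, but the principle is the same: never ship a page with an empty title or description.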

Errors and content retirement processes are poorly managed

Requests for invalid URLs frequently return a 200 HTTP header status. That tells Google that everything is okay – that the page is explicitly not an error. Every time somebody moves or deletes a page, they create more of these ‘error’ pages.

As a result, Google is frequently confused about what constitutes an actual error, and many error pages are incorrectly indexed.

Is this an error page? The 200 HTTP header says not.

This further dilutes the performance of key pages, and the site overall.

A ‘raw’ server error generated from an invalid .aspx URL

Then again, at least this page provides links and routes back into the main content. Other types of errors (such as those generated by requesting invalid URL structures, like this one, which ends in .aspx) return a ‘raw’ server error which, although it correctly returns a 404 HTTP header status, is essentially a dead end for search engines. Needless to say, that negatively impacts performance.

The poor error management here also makes day-to-day site management much harder. Tools like Google Search Console, which report on erroneous URLs and offer suggestions, are rendered largely useless by the ambiguity around what constitutes a ‘real’ 404 error. That makes identifying, resolving and/or redirecting these kinds of URLs pretty much impossible. The site is constantly leaking, and accruing technical debt.

A ‘live’ adoption listing page

And it’s not just deleted or erroneous URLs; the site handles dynamic and retired content poorly. When a cat has been adopted, the page which used to have information about it now returns a 200 HTTP header status and a practically empty page.

An invalid ‘cid’ parameter in the URL returns an empty page, but a 200 HTTP header status

Every time Cats Protection list or unlist a cat for adoption, they create new issues and grow their technical debt.

Their careers subdomain has the opposite problem: pages are seemingly never removed, and just build up over time. This wastes crawling and indexing resources, and dilutes equity. Expired jobs should be elegantly retired, and their URLs redirected. Properly retiring old jobs (or, at least, their markup) is a requirement if Cats Protection want to take advantage of the enhanced search engine listings that result from implementing schema markup for jobs.

A job listing which expired in May 2018, but which is still accessible and indexed by Google

Issues like this crop up throughout their whole ecosystem. Value is constantly ‘leaking’ out of their site as a result. To prevent this, they need to ensure that invalid requests return a consistent, ‘friendly’ 404 page with an appropriate HTTP status. They also need processes which ensure that content expiry, movement, and deletion either redirect or return an appropriate HTTP status (something which the Redirect Manager in Yoast SEO Premium handles automatically).
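
The fix boils down to giving every request an honest HTTP status. Here’s a minimal sketch of that logic; the URLs and the in-memory lookup tables are made up for illustration, and a real site would drive this from its CMS and redirect manager:

```python
LIVE_PAGES = {"/adopt-a-cat", "/donate"}
REDIRECTS = {                          # moved or retired URLs -> new homes
    "/cats/1234": "/adopt-a-cat",      # adopted cat -> adoption hub
    "/jobs/vet-2018": "/jobs",         # expired vacancy -> jobs index
}

def resolve(path):
    """Return (status, location): an honest HTTP status for every request."""
    if path in LIVE_PAGES:
        return 200, None
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 404, None                   # consistent, friendly error page
```

The point is that nothing falls through to a soft 200: every retired cat listing or expired job either redirects somewhere useful or returns a genuine 404.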

Other areas

In my opinion, these issues represent some of the biggest (technical SEO) barriers to growth which the brand faces. These are just the tip of the iceberg, but until they address and resolve them, fixing a million tiny issues page-by-page isn’t going to move the needle. I could definitely dig deeper, into areas like site speed (why haven’t they adopted HTTP/2?), their .NET implementation (why are they still using ViewState blocks?) and the overall UX – but this is already a long post.

There is, however, just one last area I’d like to consider.

A light at the end of the tunnel?

In researching and evaluating the platform, I couldn’t help but notice a link in the footer to the agency who designed and built the website – MCN NET. Excitingly, their homepage contains the following block of content:

After a very tense and nerve-racking tender process we were thrilled to hear that we had once again being nominated as the preferred supplier for the redevelopment of the Cats Protection website. This complex and feature rich website is scheduled to make its debut in 2019, so be sure to keep an eye out for it. It’ll be Purrrfect.

Hopefully, that means that many of the problems I’ve identified have already been solved. Hopefully.

Except, that’s perhaps a little ambitious. I don’t know the folks at MCN, and I don’t know what the brief, budget or scope they received was like when they built the current site. As I touched on earlier, charities don’t have money to burn on building perfect websites, and maybe what they got was the right balance of cost and quality for them, at the time. I’m certainly not suggesting that it’s solely MCN’s fault that the Cats Protection website suffers from these issues.

However, my many years of experience in and around web development have given me a well-earned nervousness around .NET and Microsoft technologies, and a deep distrust of custom-built and proprietary content management systems.

That’s because, in my opinion, all of the issues I’ve pointed out in this article are basic. Arguably, they’re not even really SEO things – they’re just a “build a decent website which works reasonably well” level of standards. I recognize that, in part, that’s because the open source community – and WordPress, and Yoast, in particular – has made these the standards. And proprietary solutions and custom CMS platforms often struggle to keep up with the thousands of improvements which the open source community contributes every month. If the current website is reaching its end of life, it’s not surprising that it’s creaking at the seams.

With a new website coming, I hope that all of this feedback can be taken on board, and the problems resolved. I understand the commercial realities which mean that ‘best’ isn’t achievable, but they could achieve a lot just by going from ‘bad’ to ‘good’.

With that in mind, if I were working for or on behalf of Cats Protection, I’d want to be very clear about the scope of the new website. I’d want detailed planning and documentation around its functionality, and the baseline level of technical quality required. Of course, that will have cost and resource implications, but the new site will have to work a lot harder than the current one. Cats Protection are competing in an aggressive, crowded market, full of strong competitors. The stability of their foundations will make the difference between winning and losing.

I’d also want to have a very clear plan in place for the migration strategy from the current website to a new website. Migrations of this size and complexity, when handled poorly, have been known to kill businesses.

Every existing page, URL and asset will need to be redirected to a new home. Given the types of fragmentation issues, errors and orphaned assets we’ve seen, this is a huge job. Even mapping out and creating that plan – never mind executing it – feels like a mammoth task.

Hopefully, this is all planned out, and in good hands. I can’t wait to see what the new site looks like, and how it boosts their visibility.


Cats Protection is a strong brand, doing good work, crippled by the condition of its aging and fragmented website. Other charities and brands are eating into its market share and visibility with worse content and marketing. They’re able to do this, in part, because they have stronger technical platforms.

Some of the technical decisions Cats Protection has made around its content strategy have caused long-term harm. The fragmentation of local branches and the separation of the blog have seriously limited their performance.

A disclaimer

Neither the author nor Yoast BV is in any way related to, representing, commissioned by or acting on behalf of Cats Protection. All content is the opinion of the author. Our assessment aims to educate the public through assessing real-world scenarios.

All of the content, information, and media we have explored comes from publicly available tools and sources. It is in no way privileged, and was correct (to the best of our knowledge) at the time of writing. We acknowledge that we have only reviewed part of Cats Protection’s ecosystem and publicly accessible marketing activities (and have deliberately focused on identifying issues and faults), and that our findings aren’t necessarily exhaustive or representative of other/overall activity and performance.

The findings of this article should not be construed as professional advice, and we will be held in no way responsible for the outcome of actions taken (or inaction) based on the contents of this article.

In no way do we intend to judge, either positively or negatively, on the decisions, performance, or operations of Cats Protection, their website, partners, suppliers or personnel.

‘Fixing’ this could take years, even if the right foundations are put in place with the new site structure. Cats Protection will need to change the way it thinks about, produces and manages content, too.

The quality of their new website will have a significant impact on their future success. They must also carefully manage the process under which it is launched, otherwise they risk disaster.

This is a lot to tackle. And as a charity, they undoubtedly have limited budgets to achieve the necessary levels of quality, strategy, and functionality.

If you’ve read this far, perhaps you can help them better prepare for and invest in getting this right. Help them continue their mission of helping change the lives of cats and kittens – by donating to their cause.

Thanks for reading. Let me know if I’ve missed anything serious (or got anything wrong!) in the comments, or let me know which charity you’d like me to consider for my next review!

Read more: How to perform an SEO audit »

The post Jono reviews: Cats Protection appeared first on Yoast.

We’ve said it time and again: site speed is a crucial aspect of your SEO. That’s why we often write about site speed tools, speed optimization, and other things you need to know to make your site lightning fast. One factor in site speed is image optimization: on most sites, images will play a part in loading times. So, giving your image SEO some thought will pay off.


Besides resizing and compressing your images to improve loading times, there’s the option to implement ‘lazy loading’ on your site. Lazy loading means that an image or object on your site doesn’t load until it appears in your visitor’s browser. For example: if a page has 8 images, only those that appear ‘above the fold’ load right away, while the others load as the user scrolls down. This can significantly improve speed, especially on pages that contain a lot of images. There are several plugins you can use to add lazy loading to your WordPress site. But is there really no catch? Will Google still index all your images?
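
Under the hood, most lazy-load plugins rewrite the image markup so the real URL sits in a placeholder attribute, and a small front-end script restores it when the image scrolls into view. Here’s a simplified sketch of the server-side rewrite; the `data-src` attribute and the ‘keep the first image eager’ rule are illustrative assumptions, not how WP Rocket specifically works:

```python
import re

def lazify(html, skip_first=1):
    """Rewrite <img src=...> to <img data-src=...> for below-the-fold images.

    The first `skip_first` images keep a normal src, so above-the-fold
    content loads immediately; a front-end script restores the rest
    as the user scrolls.
    """
    count = 0
    def swap(match):
        nonlocal count
        count += 1
        if count <= skip_first:
            return match.group(0)   # leave above-the-fold images alone
        return match.group(0).replace("src=", "data-src=", 1)
    return re.sub(r'<img src="[^"]*"[^>]*>', swap, html)
```

Because the real URL is still present in the markup, a renderer that executes the page (as Googlebot does) can still discover and index the images.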

MaAnna emailed us, wondering exactly that:

I’m testing the lazy load image function in WP Rocket. In online testers like WebPage Test, the waterfall doesn’t show the images loading, but when I do a Fetch and Render in Google Search Console all images on a page are shown. Can Google deal with lazy load and still index our images, as Fetch and Render seems to indicate?

Watch the video or read the transcript further down the page for my answer!

Can Google deal with Lazy Load?

“Yes, it can. It renders the page, it waits a bit and it scrolls down the page a bit to generate all the events that it needs to generate to make sure that it has loaded the entire page.

So yes, it can deal with that. You’re very fine using something like that lazy load image function. Google actually has code itself as well, in which it promotes the lazy loading of images because it really enhances people’s experience because pages get faster using lazy load. So, by all means, do use it. Use it well. Good luck!”

Ask Yoast

In the series Ask Yoast, we answer SEO questions from our readers. Do you have an SEO-related question? A pressing SEO dilemma to which you can’t find the answer? Send an email to, and your question may be featured in one of our weekly Ask Yoast vlogs.

Note: you may want to check our blog and knowledge base first, the answer to your question could already be out there! For urgent questions, for example about the Yoast SEO plugin not working properly, please contact us through our support page.

Read more: Does site speed influence SEO? »

The post Ask Yoast: Can Google deal with Lazy Load? appeared first on Yoast.

If you have two very similar sites in two different languages, you may wonder whether you need to implement hreflang. Will Google recognize both sites as ‘stand-alone’ websites, and is that what you want? While translated content isn’t considered duplicate content, it may still be worth your while to actively point users to the right domain with hreflang.

For those that aren’t well versed in technical SEO, implementing hreflang will probably take a lot of time and something might even break. If that’s the case for you, should you still go to great lengths to implement hreflang? I’ll dive into that in this Ask Yoast!

Moria Gur sent us her question on using hreflang:

I have two sites with two different domains for coloring pages, one in Hebrew and one in English. The images and text are similar (but in a different language). Should I use hreflang in this case? Or will Google recognize both as ‘stand-alone’ websites?

Watch the video or read the transcript further down the page for my answer!

When to use hreflang

“Well, yes, Google will recognize both as stand-alone websites and there’s nothing wrong with them. Adding hreflang might give you a bit of an edge on both sites, but it’s also a lot of work. So, if you’re doing well with both sites right now, I would not do that, just because all the work involved is probably more work than it will return in terms of investment.

If you are not doing too well, or one is doing much better than the other, then maybe it’s worthwhile trying that. And you could just try that on a subset of the pages, and hreflang those properly to the other one. Good luck!”
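
For reference, hreflang is implemented as reciprocal `<link rel="alternate">` tags in the `<head>` of every language version – each page must list all versions, including itself, or the annotations are ignored. Here’s a small generator sketch (any domains shown are placeholders):

```python
def hreflang_tags(versions, x_default=None):
    """Build the reciprocal hreflang <link> tags every version must carry.

    versions: mapping of language code -> URL for that language version.
    """
    tags = [
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in sorted(versions.items())
    ]
    if x_default:  # optional fallback for unmatched languages
        tags.append(f'<link rel="alternate" hreflang="x-default" href="{x_default}" />')
    return "\n".join(tags)
```

The ‘subset of pages’ approach from the answer above just means calling this for your most important page pairs first, rather than annotating both sites wholesale.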

Ask Yoast

In the series Ask Yoast, we answer SEO questions from our readers. Do you have an SEO-related question? A pressing SEO dilemma you can’t find the answer to? Send an email to, and your question may be featured in one of our weekly Ask Yoast vlogs.

Note: please check our blog and knowledge base first, the answer to your question may already be out there! For urgent questions, for example about the Yoast SEO plugin not working properly, we’d like to refer you to our support page.

Read more: hreflang: The ultimate guide »

The post Ask Yoast: Hreflang for sites with different domains appeared first on Yoast.

If you’re serious about your WordPress website, you’ve probably run a page speed test at some point. There are many variations of these tests out there, some more convenient and true to your target audience than others. But they will all give you a pretty decent idea of where you can still improve your site.

Certain speed optimizations may come across as “technically challenging” to some of you. Luckily, you’ve set up a WordPress website, and one of the things that makes WordPress so awesome is the availability of plugins. Some free, some paid, but they all help to simplify difficult tasks. In this article, we’ll first show you a couple of page speed tests, so you can check your page speed yourself. After that, we’ll go into a number of speed optimization recommendations, and show you how to address them using just plugins.

Running a page speed test

Running a page speed test is as simple as inserting your website’s URL into a form on a website. That website then analyzes your website and comes up with recommendations. I’d like to mention two of those, but there are many more tests available.

  1. Pingdom provides a tool for speed testing. The nice thing is that you can test from different servers – for instance, from a server that is relatively close to you. Especially if you’re targeting a local audience, this is a nice way to see how fast your website is for them.
  2. Google Lighthouse is a performance tool that lives in your browser. Right-click on a page, choose Inspect, and check the Audits tab in the panel that opens. Here, you can test speed for mobile devices or desktop, and on different bandwidths, for example. The test result looks like this:
    Google Lighthouse test result
    Small remark: most sites appear slower in Lighthouse. This is because Lighthouse emulates a number of devices, for instance a slow mobile/3G connection (see the second bar in the screenshot above). With mobile-first, this is actually a good thing, right?

Before Lighthouse, Google PageSpeed Insights already showed us a lot of speed improvements. It even lets you download optimized images, CSS and JS files. As you’re working with WordPress, though, replacing your files with these optimized ones might be a hard task. Luckily, WordPress has plugins.

There are many, many more speed testing tools available online. These are just a few that I wanted to mention before going into WordPress solutions that will help you improve speed.

Optimizing your page speed using WordPress plugins

After running a page speed test, I’m pretty sure most website owners feel they should invest some time in optimizing their website’s speed. You’ll probably get a dozen recommendations. These range from things you can do yourself to things you might need technical help with.

Image optimization

Your speed test might return this recommendation:
Image optimization for speed
Images usually play a large part in speed optimization, especially if you use large header images, or if your site is image-heavy overall. It’s always a good idea to optimize these images, and it can be done with little quality loss these days. One of the things to look for is, like in the page speed test example above, images that are in fact larger than they are shown on your screen. If you have an image that covers your entire screen, and squeeze that into a 300 x 200 pixel spot on your website, you might be using an image of several MBs. Instead, you could change the dimensions of your image before uploading, and serve it in the right dimensions at a file size of a few KBs. By reducing the file size, you are speeding up your website.

Setting image dimensions in WordPress

WordPress comes with a handy default feature, where every image you upload is stored in several dimensions:
Settings > Media
So if you want all the images in your posts to be the same width, pick one of the predefined sizes or set your custom dimensions here. Images that you upload are scaled according to these dimensions, and the image in its original dimensions will also remain available.

If you load, for instance, the medium size image instead of the much larger original, this will serve an image in a smaller file size, and this will be faster.
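
The arithmetic behind these intermediate sizes is simple: scale the image to fit inside a bounding box while preserving its aspect ratio, and never upscale. A quick sketch (the function name is just for illustration):

```python
def fit_within(width, height, max_w, max_h):
    """Scale (width, height) to fit inside max_w x max_h, keeping aspect ratio."""
    scale = min(max_w / width, max_h / height, 1.0)  # 1.0 cap: never upscale
    return round(width * scale), round(height * scale)
```

Squeezing a 3000 × 2000 pixel original into a 300 × 300 box this way yields 300 × 200 – the same picture at a fraction of the file size.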

Image optimization plugins

There are also a number of image optimization plugins (paid and free) available for WordPress, like Smush or Imagify. These might, for instance, remove so-called Exif data from the image. That is data that is really interesting for a photographer, as it contains information about what settings the camera used to take that photo. Not really something you need for the image in your blog post – unless, perhaps, you are in fact a photographer. Depending on your settings, you could also have these plugins replace your image with one that is slightly lower in quality, for instance.

Some of these aforementioned plugins can also help you resize your images, by the way. Test these plugins for yourself and see which one is most convenient to work with and minifies your image files the best way. For further reading about image optimization, be sure to check this post about image SEO.

Browser cache

Another issue that comes up a lot in page speed tests is browser cache optimization.
Pingdom browser cache recommendation
Browser cache is about storing website files, like JS and CSS, in your local temporary internet files folder, so that they can be retrieved quickly on your next visit. Or, as Mozilla puts it:

The Firefox cache temporarily stores images, scripts, and other parts of websites you visit in order to speed up your browsing experience.

Caching in WP Super Cache

Most speed optimization plugins help you to optimize this caching. Sometimes as simple as this:
WP Super Cache
The Advanced tab of WP Super Cache offers a lot of more in-depth configuration, but starting out with a plugin’s defaults is usually a good idea. After that, start tweaking these advanced settings and see what they do.

Note that WP Super Cache has an option to disable caching for what it calls “known users”. These are logged-in users (and commenters), which allows for development (or commenting) without caching. That means that for every refresh of the website in the browser window, you’ll get the latest state of the website instead of a cached version. The cached version might be older because of the expiration time: if you set that expiration time to, say, 3600 seconds, a browser will only check for changes to the cached website after an hour. You can see how that would be annoying if you want to see design changes right away while developing, for instance.
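
The expiration behavior described above boils down to comparing the age of the cached copy against the configured lifetime. A tiny sketch of that check (names and timestamps are illustrative):

```python
def is_fresh(cached_at, now, max_age=3600):
    """Serve a cached page as long as it is younger than max_age seconds."""
    return (now - cached_at) < max_age
```

With `max_age=3600`, a page cached an hour ago is regenerated on the next request; anything younger is served straight from cache – which is exactly why a developer with caching enabled may stare at stale design changes.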

Other WordPress caching plugins

I mention WP Super Cache here because it’s free and easy to use for most users. But there are alternatives. WP Fastest Cache is popular as well, with over 600K active installs. It has similar features to optimize caching:
WP Fastest Cache
A paid plugin that I also like is WP Rocket. It’s so easy to configure that you’ll wonder if you’ve done things right. But your page speed test will tell you that it works pretty much immediately, straight out of the box. Let me explain something about compression and show you WP Rocket’s settings for it.


Regardless of whether your page speed test tool tells you to:

  • try to minify your CSS files,
  • minify the JS files of your site,
  • minify your HTML files, or
  • enable (GZIP) compression

These recommendations are all compression-related. It’s about making your files as small as possible before sending them to the browser. It’s like reducing the file size of your images, but for JavaScript or CSS files, or your HTML file itself. GZIP compression means sending a zipped file to the browser, which the browser can unzip and read. Recommendations may look like this:
Minify recommendation Lightspeed
In WP Rocket, the settings for compression look like this:
WP Rocket - Compression
Again, a lot is set to the right settings by default, as we do in Yoast SEO, but even more can be configured to your needs. How well compression works might depend on your server settings as well.

If the compression optimization done by any of the plugins mentioned above seems to fail, contact your hosting company and see if and how they can help you configure compression for your website. They will surely be able to help you out, especially if you’re using one of these WordPress hosting companies.
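
You can see the effect of GZIP on repetitive markup with a few lines of Python’s standard library (the sample HTML is made up for illustration):

```python
import gzip

# Repetitive markup, the kind a template engine churns out
html = ('<div class="card"><p>Sponsor a cat today.</p></div>\n' * 200).encode()
compressed = gzip.compress(html)

ratio = len(compressed) / len(html)  # fraction of the original size
```

Highly repetitive HTML like this typically compresses to a few percent of its original size, which is why enabling GZIP (or an equivalent) is almost always a quick win.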

Serving CSS and JS files

One more thing speed tests will tell you is to combine (external) CSS or JavaScript files, or to defer the parsing of scripts. These recommendations are about the way these files are served to the browser.

The combine option for these files is, like you can see in the WP Rocket screenshot above, not recommended for HTTP/2 websites. For these websites, multiple script files can be loaded at the same time. For non-HTTP/2 sites, combining these files will lower the number of server requests, which again makes your site faster.

Deferring scripts, or recommendations like “Eliminate render-blocking JavaScript and CSS in above-the-fold content”, are about the way these scripts are loaded in your template files. If all of these are served from the top section of your template, your browser will wait to show (certain elements of) your page until these files have fully loaded. Sometimes it pays to move less-relevant scripts to the footer of your template, so your browser shows your website first and adds the enhancements these JavaScript or CSS files provide later. A plugin that can help you with this is Scripts-to-Footer. Warning: test this carefully. If you change the way these files load, it can impact your website: things may suddenly stop working or look different.
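
Conceptually, what such a plugin does is lift the external script tags out of the head and re-insert them just before the closing `</body>` tag. Here’s a deliberately simplified sketch of that transformation (a naive regex, not how the plugin is actually implemented – which is exactly why the warning above about careful testing applies):

```python
import re

def scripts_to_footer(html):
    """Move external <script src=...> tags to just before </body>."""
    pattern = r'<script [^>]*src="[^"]*"[^>]*></script>'
    scripts = re.findall(pattern, html)          # collect external scripts
    stripped = re.sub(pattern, "", html)         # remove them in place
    return stripped.replace("</body>", "".join(scripts) + "</body>")
```

After the move, the browser can render the page content before it fetches and parses the scripts – but any inline code that depended on those scripts having loaded first may now break.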

We have to mention CDNs

A Content Delivery Network (CDN) caches static content: files like HTML, CSS, JavaScript and images. These files don’t change that often, so they can be served from a CDN – many servers located near your visitors – which gets them to your visitors super fast. It’s like traveling: the shorter the trip, the faster you reach your destination. Common sense, right? The same goes for these files: if the server serving the static file is located near your visitor (and servers are equally fast, obviously), the site will load faster for that visitor. Please read this post if you want to know more about CDNs.

There are many ways to optimize page speed in WordPress

Page speed tests will give you even more recommendations. Again, you might not be able to follow up on all of these yourself. In that case, be sure to ask an expert, like your web developer, your agency, or your hosting company. But in the end, it’s good that you are using WordPress: there are many decent plugins that can help you optimize the speed of your website after a page speed test!

Read more: Site speed: tools and suggestions »

The post How to optimize WordPress after running a page speed test appeared first on Yoast.

When you add a nofollow tag to a link on your site, you’re basically telling search engines that they shouldn’t count that link when ranking the page you link to. Doing this helps you avoid leaking link value to pages that may not be trustworthy, or, in the case of affiliate marketing, to your advertiser’s website.
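In the HTML, a nofollowed link is simply a regular link with a `rel="nofollow"` attribute (the URLs below are made-up examples):

```html
<!-- A normal link: passes link value to the target page -->
<a href="https://example.com/jeans-guide">Our guide to buying jeans</a>

<!-- A nofollowed affiliate link: search engines are told not to count it -->
<a href="https://example.com/product?affiliate=123" rel="nofollow">Buy these jeans</a>
```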

But what if you include several affiliate links in every blog post you publish? While you probably link to relevant products, all these links should still have a nofollow tag. This can easily add up to a large number of nofollowed links. Does this have any consequences for the link value of your pages?

Nikola was worried about this and emailed us his question:

I have a nofollow tag on my Amazon affiliate links, because Google’s Webmaster Guidelines say that ad links should have this tag. But I’m worried that this will cause a drop in the link value of my pages, as I add these links to almost all of my posts. What should I do? Do Amazon affiliate links hurt blog SEO?

Watch the video or read the transcript further down the page for my answer!

Adding a nofollow tag to Amazon affiliate links

“No, they don’t hurt blog SEO. Nofollow doesn’t hurt anything. If it’s an ad, it deserves a nofollow, so you’re doing it perfectly right and you shouldn’t change anything. Good luck.”


Read more: How to cloak your affiliate links »

The post Ask Yoast: Nofollow tags for Amazon affiliate links appeared first on Yoast.

My husband – Joost de Valk – and I often have discussions on how technology will change our day-to-day life. Joost is an early adopter, while I am much slower and more reluctant to embrace technological change. Our discussions can get pretty heated. So, what’s Joost’s opinion on the future of voice search? How dominant will voice search be? And how will search be affected by it? I interviewed my early-adopting, voice-addicted husband to shed some light and perspective on the matter of voice search, and I did some thinking myself as well. Here, I share our views on what the future of voice search could look like.

Voice queries make a lot of sense

Joost just likes voice. He likes talking to machines. Joost asks Siri to set a timer while he’s cooking dinner and gives orders to Google Home when he wants to listen to some music. So what is it that attracts him to voice search? ‘I like voice whenever I cannot type,’ Joost answers. ‘So, I use it while I am cooking, or when we are in the car together and have a discussion. Using a voice query is just as easy as typing in a keyword. And if you do not have access to a keyboard, voice search is especially useful.’

I think Joost is right about that: voice queries just make sense. Voice search is easy to use (as long as your voice is recognized properly). For most people, speaking to a machine is quicker than typing. And, you can use voice search everywhere, even when you’re doing other things.

Voice results do not (always) make sense

The results that voice gives us are always singular: Siri will set a timer, Google Home will play the song. Joost: ‘Voice results only make sense if you’re looking for a singular result. If you want to know something specific. If you want to end the discussion you’re having in the car and need to know exactly how many people live in France. Or if you search for a specific restaurant. But if you want to have dinner in a nice restaurant and you’re not sure which one it’ll be, you’ll probably prefer to see some options. And that is exactly where I think voice results, as they work now, stop making sense.’

I started thinking about that. Most search queries people use are not aimed at a singular result. People like to browse; people want to choose. That’s why physical stores offer a lot of options. People like to browse through different pairs of jeans before they choose which pair they’ll buy. Online, we’ll probably check out different sites, or at least different models, before we add a pair of jeans to our shopping cart.

If you’re searching for information that is longer than a few sentences, voice results are not very useful either. That’s because it is hard to digest information solely by listening. As a listener, you’re a very passive receiver of information. As a reader, you can scan a text, skip pieces of information, or read an important paragraph twice. You cannot do that as a listener. As a reader, you’re much more in control. So, if you’re searching for information about what to do in Barcelona, it makes much more sense to get that information from a book or a screen.

Search engines are growing towards singular results

Joost thinks that search engines are working towards singular results and are developing that type of functionality. ‘The answer boxes you see in the search results are an example of that,’ Joost explains. ‘Search engines are trying to give one single answer to a search query. But in a lot of cases, people aren’t searching for one answer. In many cases, people want to make a choice; they want to browse.’

So what will the future bring?

‘I think you’ll see different applications being connected to each other,’ Joost answers when I ask him what the future of voice search will look like. ‘Siri, for example, would then be connected to your Apple TV. Search results and information would appear on the screen closest to you that Apple controls. I think voice will become the dominant search query, but I think screens will continue to be important in presenting search results.’

Read more: How to prepare for voice search? »

The post Voice search: what will the future bring? appeared first on Yoast.