Playing with the X-Robots-Tag HTTP header

Traditionally, you will use a robots.txt file on your server to manage what pages, folders, subdomains or other content search engines will be allowed to crawl. But did you know that there’s also such a thing as the X-Robots-Tag HTTP header? In this post we’ll discuss what the possibilities are and how this might be a better option for your blog.

Quick recap: robots.txt

Before we continue, let’s take a look at what a robots.txt file does. In a nutshell, what it does is tell search engines to not crawl a particular page, file or directory of your website.

Using this helps both you and search engines such as Google: by not providing access to certain unimportant areas of your website, you can save on your crawl budget and reduce the load on your server.

Please note that using the robots.txt file to hide your entire website from search engines is definitely not recommended.

Say hello to X-Robots-Tag

Back in 2007, Google announced that they added support for the X-Robots-Tag. What this meant was that you could not only restrict access for search engines via a robots.txt file, but also programmatically set robots directives in the headers of an HTTP response. Now, you might be thinking "But can't I just use the robots meta tag instead?". The answer is yes. And no. If you plan on programmatically blocking a particular page that is written in HTML, then using the meta tag should suffice. But if you plan on blocking crawling of, let's say, an image, then you could use the HTTP response approach to do this in code. And obviously, you can always use the HTTP header method if you don't feel like adding additional HTML to your website.

X-Robots-Tag directives

As Sebastian explained in 2008, there are two different kinds of directives: crawler directives and indexer directives. I’ll briefly explain the difference below.

Crawler directives

The robots.txt file only contains the so-called 'crawler directives', which tell search engines where they are or aren't allowed to go. By using the Allow directive, you can specify where search engines are allowed to crawl; the Disallow directive does the exact opposite. Additionally, you can use the Sitemap directive to point search engines to your XML sitemap, helping them crawl your website even faster.

Note that it's also possible to fine-tune the directives for a specific search engine by using the User-agent directive in combination with the other directives.
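
Put together, a minimal robots.txt could look like this (the paths and sitemap URL are just placeholders for illustration):

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: https://example.com/sitemap_index.xml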

As Sebastian points out and explains thoroughly in another post, pages can still show up in search results if enough links point to them, even when they are explicitly excluded with the Disallow directive. This basically means that if you really want to hide something from the search engines, and thus from people using search, robots.txt won't suffice.

Indexer directives

Indexer directives are directives that are set on a per-page and/or per-element basis. Up until July 2007, there were two directives: the microformat rel="nofollow", which means that the link in question should not pass authority / PageRank, and the Meta Robots tag.

With the Meta Robots tag, you can really prevent search engines from showing pages you want to keep out of the search results. The same result can be achieved with the X-Robots-Tag HTTP header. As described earlier, the X-Robots-Tag gives you more flexibility by also allowing you to control how specific file(types) are indexed.
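
For reference, the Meta Robots variant is a single tag in the <head> of a page; a minimal example would be:

<meta name="robots" content="noindex, nofollow" />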

Example uses of the X-Robots-Tag

Theory is nice and all, but let’s see how you could use the X-Robots-Tag in the wild!

If you want to prevent search engines from showing files you've generated with PHP, you could add the following at the top of your header.php file, before any other output is sent:

header("X-Robots-Tag: noindex", true);

This would not prevent search engines from following the links on those pages. If you want to do that, then alter the previous example as follows:

header("X-Robots-Tag: noindex, nofollow", true);

Now, although using this method in PHP has its benefits, you’ll most likely end up wanting to block specific filetypes altogether. The more practical approach would be to add the X-Robots-Tag to your Apache server configuration or a .htaccess file.

Imagine you run a website which also has some .doc files, but you don’t want search engines to index that filetype for a particular reason. On Apache servers, you should add the following line to the configuration / a .htaccess file:

<FilesMatch ".doc$">
Header set X-Robots-Tag "index, noarchive, nosnippet"
</FilesMatch>

Or, if you’d want to do this for both .doc and .pdf files:

<FilesMatch ".(doc|pdf)$">
Header set X-Robots-Tag "index, noarchive, nosnippet"
</FilesMatch>

If you’re running Nginx instead of Apache, you can get a similar result by adding the following to the server configuration:

location ~* \.(doc|pdf)$ {
	add_header  X-Robots-Tag "index, noarchive, nosnippet";
}

There are cases in which the robots.txt file itself might show up in search results. By using an alteration of the previous method, you can prevent this from happening to your website:

<FilesMatch "robots.txt">
Header set X-Robots-Tag "noindex"
</FilesMatch>

And in Nginx:

location = /robots.txt {
	add_header  X-Robots-Tag "noindex";
}

Conclusion

As you can see based on the examples above, the X-Robots-Tag HTTP header is a very powerful tool. Use it wisely and with caution, as you won’t be the first to block your entire site by accident. Nevertheless, it’s a great addition to your toolset if you know how to use it.

Read more: ‘Meta robots tag: the ultimate guide’ »

How to use Yoast SEO’s content analysis tool

The content analysis in the Yoast SEO plugin assesses crucial aspects of the posts and pages you add to your website. In particular, it checks how SEO-friendly and easy to read your content is. Here, we'll go through the most important features of this tool. Check out our step-by-step guide on how to optimize your page and learn how to interpret the feedback you'll get.

The content analysis: readability and SEO

You’ll find our readability and SEO analysis in the Yoast SEO meta box and the Yoast SEO sidebar (the latter only if you use the WordPress Block editor). So, when you’re working on your post or page just scroll down to find them in the meta box below your post:

The Yoast SEO meta box you’ll find below your post

Or, in the Block Editor, click on the Yoast icon in the upper right corner of your screen to see the Yoast SEO sidebar:

The Yoast SEO sidebar

Here, we’ll be focusing on the SEO and readability analysis. But we’ll also have a quick look at the Social tab and the Cornerstone content section because it’s important to use those correctly when you’re optimizing your post or page.

The readability analysis

The readability analysis aims to help you write text that is easy to read and understand. First of all, this is crucial if you want readers to stay, read your text and understand what you're writing about. Or perhaps even take action, for instance, if you're explaining how to perform a certain task in your post. Secondly, search engines love readable copy! That's why you should invest some time in writing in plain language.

The readability analysis includes several checks based on the characteristics of a text that is easy to read and understand, such as the length of your sentences and paragraphs and the use of passive voice (more on these in step 4 below).

If you write a readable text, based on the criteria above, the plugin will reward you with a green bullet.

The SEO analysis

Readability is essential, but there’s more to focus on if you want to create a search-engine-friendly page on your website. That’s why we have an SEO analysis too!

The SEO analysis of Yoast SEO

To get the most out of the SEO analysis, you’ll have to enter a focus keyphrase first. The focus keyphrase is the phrase you’d like your post to rank for. You should determine what phrases you’d like to rank for by doing keyword research.

Once you've entered your focus keyphrase, the SEO analysis checks the presence of your focus keyphrase in key places, such as your title, headings and first paragraph (more on these in step 7 below).

The plugin calculates the number of words and the frequency of the focus keyphrase in your article. In addition to this, Yoast SEO Premium checks how you distributed the keywords over your page: your article should contain your focus keyphrase or its synonyms, spread evenly throughout the text.

And it checks whether you’re using the exact same focus keyphrase on other pages of your website so you don’t accidentally compete with yourself. If you were to optimize two different articles for the same focus keyphrase, you could have both posts turn up in the same search on Google. Read why you shouldn’t use your focus keyphrase more than once.

And, last but not least, Yoast SEO has several links and image checks for your article.

If you write a relatively SEO-friendly article – based on the above criteria – the plugin will reward you with a green bullet. If you follow the instructions and craft your pages and posts so they get green bullets, they have a better chance of ranking high in search.

On a special note: the cornerstone analysis

Because your cornerstone articles should be the best articles on your site, and you want them to rank high in the search engines, Yoast SEO provides a specific cornerstone content analysis. If you mark an article in Yoast SEO as cornerstone content in the cornerstone content section, the content analysis will be a little stricter – your article will need to be longer, for example. You can also check if these articles are linked to often enough elsewhere on your site with the text link counter in Yoast SEO.

8 steps to optimize your post or page

Enough with the theory, let's get to work! If you follow the steps below, you'll make your post or page both reader-friendly and SEO-friendly.

Step 1: Enter your focus keyphrase

First, enter your focus keyphrase – the keyword you want your post to rank for – in the Yoast SEO meta box or the sidebar. This should be a keyphrase which came from your keyword research and which you’ll keep in mind throughout the writing process.

Read more: How to choose the perfect focus keyphrase »

Yoast SEO Premium lets you optimize an article for more than just a single focus keyphrase – you can add synonyms and related keyphrases as well. Yoast SEO Premium recognizes these related keyphrases, synonyms and even different word forms in your text. Using synonyms and different word forms makes your text easier to read. Using related keyphrases provides context to your article, helping search engines to understand what it’s about.

Step 2: Put your text in the WordPress backend

WordPress has a distraction-free writing mode that enables you to write in the WordPress backend without being distracted by the menu, the toolbar, the categories box, etc.

The distraction-free writing mode

You can write your article directly in the post editor of WordPress or write in any text editor and copy and paste it into the WordPress backend. Either way is fine!

If you choose to paste your text into the WordPress backend, don't forget to check whether your headings transferred correctly (H1 should be your post title, H2 your subheadings, and so on).

Step 3: Check your readability scores

We believe your audience comes first, so readability comes before SEO! This means that, first and foremost, you have to write content people like to read. Put effort into writing an attractive, well-structured and original article. Then use our readability checks to make sure your text is nice and easy to read.

Click on the readability analysis tab and you’ll see your scores for the readability checks. The green bullets show which aspects of your content are good, while orange and red bullets indicate where you can make improvements.

The readability analysis in Yoast SEO

Clicking on the eye icon will highlight the sections where the analysis identified a problem:

Unnecessary use of passive voice found by the readability analysis

Now let’s solve this in step 4!

Step 4: Make readability adjustments

The readability analysis checks tell you where your article’s readability could be improved. If you tend to write long sentences, our analysis will show you which ones to rewrite. Split long sentences up into shorter sentences. If your paragraphs are too long, divide them up too. If you use a lot of passive voice, rephrase a few sentences. Here you can find all Yoast SEO’s readability assessments and instructions on how to get great scores.

Make sure that the overall bullet – the one on the upper right in the backend of your post – is green. The overall bullet will turn green if you have covered most of the readability aspects.

Step 5: Optimize your snippet

In Yoast SEO’s snippet preview you can see what your page might look like in the search results. If you click on edit snippet, you can adjust the SEO title, the URL and the meta description. By doing so, you can make your search result more enticing and convince people to click on your result.

The snippet preview in Yoast SEO

Keep reading: How to use the snippet preview of Yoast SEO »

Step 6: Check your SEO bullets

Go to the SEO analysis and check out those SEO bullets. Again, green tells you which aspects of your SEO are good, while orange and red indicates where you can improve your SEO strategy.

The SEO analysis in Yoast SEO

Do you see some red bullets on your post? Don't panic, we'll help you with that in step 7. But whatever you do, don't change your focus keyword now! Why? Read The temptation of the green bullet to find out.

Step 7: Make SEO adjustments

The checks in the SEO analysis will show you where you can improve your SEO. Look critically at your title, the headings, and subheadings of your article. Do they include your focus keyphrase? If not, can you edit them (without changing the structure or content of your article) to include your focus keyword? Don’t go crazy and put your focus keyphrase in every heading, because that’s too much!

Your article should also include the focus keyphrase a couple of times, and if it's not in your first paragraph, add it. As a general rule of thumb: try to use your search terms in about 1 to 2 percent of your text (in a 1,000-word article, that comes down to roughly 10 to 20 mentions). Moreover, try to use synonyms and related keywords, as they'll make your text easier to understand and can even enhance SEO.

Also, don’t forget to add links to related posts on your site and to check your images. Have you added an alt text? And are they the right size?

Remember, you don’t have to keep on optimizing until all of the bullets are green. Posts on Yoast.com often have a few orange bullets and sometimes even one or two red ones. The important thing is that the overall bullet should be green, which happens if most of your SEO aspects are covered.

Step 8: Fill out the social data

The final step is to add social data for your article. This means that you can instruct social media channels, like Facebook and Twitter, what to show when you share a page. You can add a social description, title, and an image with Yoast SEO. In Yoast SEO Premium you can even check what your page will look like when shared on social:

Social previews in Yoast SEO Premium

Time to publish!

Great, you’re ready to hit publish! Go ahead and go live. When it’s live, take a second to check what it looks like on your site. You might see things you haven’t noticed before. Also, don’t forget to monitor how your post or page is doing, because this will give you valuable input for improvement of the post at hand or future posts.

But what if you have all green bullets and still no rankings?

SEO copywriting with Yoast SEO

SEO copywriting is still hard work, but the content analysis tool in Yoast SEO makes the process of writing awesome, SEO-friendly articles much easier. For more practical tips, make sure you read SEO copywriting: the Ultimate Guide.


How is your site’s SEO doing?

One question we get quite often in our website reviews is whether we can help people recover from a drop they noticed in their rankings or traffic. A lot of the time, this is a legitimate drop and people are actually in a bit of trouble. However, more often than not there isn't anything wrong with either the traffic or the rankings.

So today I’ll be explaining where you should and should not be looking when checking whether your site is doing well or not.

My traffic dropped!

We’ve seen quite a few clients who claimed to see a drop in their traffic. On investigation, we could not find any such drop anywhere. When checking your traffic, there’s only one place you need to go: Google Analytics. I don’t really trust any other tool to give me the right analytics or data.

However, Google Analytics isn't that straightforward, so let me tell you where to look for a drop caused by bad SEO. When looking at overall traffic related to SEO, the most important place to check (read this post for more info on SEO in Google Analytics) is the Acquisition > All Traffic > Channels section:

The Channels section in Google Analytics

This will give you a line chart of your site’s total traffic as well as a complete overview of this traffic divided into different “channels”. A channel is basically a couple of sources (where your visitors come from) grouped into one. So any traffic from either Bing, Google, Yahoo, Yandex or any other search engine will be combined in the channel “Organic Search”. Any traffic from Twitter, Facebook, Pinterest, etc. will be combined in the channel “Social”.

Do you see a drop in the line chart this metric gives you? If not, you’re probably doing fine. If you are, don’t panic just yet. There are a few things you can do to make sure this drop is actually related to SEO.

Organic Search

Google Analytics gives you the ability to see just the traffic from search engines. There are two ways of doing this. The first is probably the easiest: you just click “Organic Search” in the list:

The Organic Search channel in Google Analytics

You’ll now get a line chart and statistics from just this channel. This can be useful, but it also makes it impossible to compare the traffic from this channel with traffic from other channels. So my personal preference is to select the checkbox before the traffic channel I want to view and then click that “Plot Rows” button at the top. This will give you a second line in the line chart, like this:

So the blue line is the total traffic, and the orange line is the traffic from search engines. As you can see, there are some spikes in our total traffic that have nothing to do with our traffic from search engines. So they don’t have anything to do with SEO efforts on our part. These spikes actually came from newsletters and social media.

And it works the same way the other way around: you might have a drop in traffic that has nothing to do with your SEO efforts. Something could’ve gone wrong in your social media, or maybe your newsletter wound up in everyone’s spam folder. So always check for the source of the drop, before blaming it on bad SEO.

Check the right timeframe and period

As you can see in the above screenshot, the graph is set to one point per day and covers a timeframe of about 1.5 months. This is fine if your traffic is steady. However, if your business (and thus your traffic) is more seasonal, this might not be the best timeframe. Set it to half a year or a year, so you can see whether your traffic is actually lower than usual around that time.

Also, if your timeframe includes the current day, week or month, please be aware that the last point in your line graph will always be lower. You can see the same thing happening in the last screenshot: the last point drops off. This is not because our traffic dropped; it's simply because the last point is today, and today isn't finished yet and will still accumulate more traffic. The same goes for weeks and months.

My ranking dropped!

This one is a bit harder to check, unfortunately. The thing is, Google has personalized search, so what shows up for you when you search a specific keyword won't necessarily show up for me. The results are based on your personal browsing behavior and a lot more, which I won't go into here.

We've had clients stating they were already ranking #1 for everything they wanted. This can actually happen if you google yourself a lot and only click through to your own site. Long story short, it's pretty hard to use Google itself to find out how your rankings are doing. Of course, using Google in a private browser session can give you some indication. In fact, most tools aren't much more than that: an indication.

PageRank

A lot of people still cling to the idea of Google PageRank. However, this metric doesn't mean anything anymore: Google deprecated the entire thing as far back as 2009. Google has always tried to encourage people to look at other tools, such as Google Analytics, instead. These simply give you far more insight into how your site is doing.

So simply put: a drop or rise in Google PageRank doesn’t mean your rankings have dropped or risen. It doesn’t really mean anything.

Ranking trackers

At Yoast, we're not fans of ranking trackers that give you only a very general idea of how your rankings are doing. It doesn't matter what number website you are in the world compared to all other websites; it matters how you're doing in your field of work.

Another issue these trackers have is they track all your rankings. This won’t be an issue for everyone, but for us it means that our rankings fluctuate a lot. The cause for this is that we rank well for the term “Google Analytics”. These rankings tend to fluctuate quite a bit and there’s a lot of searching going on for these terms. So every time we drop or rise a bit, the rankings in these trackers shoot up or down as well, even though the users weren’t looking for us.

These tools can give a general indication, but should not be used as anything other than that.

Google Webmaster Tools

Google Webmaster Tools (GWT) is one of the tools that I would recommend using to check how your site’s doing. We explained GWT in detail a few weeks ago, but I’ll just quickly show you where to find the most important metrics:

The Search Queries section in Google Webmaster Tools

The Search Queries section shows you how many impressions your site has had and how many clicks resulted from those impressions. An impression is counted every time your website shows up in the search results a user is looking at. If either of these metrics is going down, you'll know something is up. Obviously, play with the timeframe here as well, so you know for sure it's not just a temporary drop.

Next is the Index Status section:

The Index Status section in Google Webmaster Tools

This metric will show you how many pages are actually indexed in Google. This is obviously an important metric as well, so a steady or rising line is what you’re looking for. If this line is dropping (without you having disallowed pages, for example), this is definitely something to look into.

Conclusion

When you want to check your site’s SEO yourself, I recommend only paying attention to Google Analytics and Google Webmaster Tools. Other tools are fine for giving a general idea or indication, but nothing more than that. Be sure to check the things I’ve told you to check so you know for sure it’s an SEO problem. Of course, you could also order a Website Review and have us analyze your website for you ;)

Do you think I’ve missed anything here? Or do you just have some more questions? Let me know in the comments!


Bing Webmaster Tools: Security, Widgets and Messages

Compared to the first three sections, the last three are relatively small. In this final post in our series on Bing Webmaster Tools, we will go over Security, Widgets and Messages and share our findings.

Security

Making sure your website is secure is of course really important; we have already emphasized this on our page about WordPress security. This section is about the security alerts in Bing Webmaster Tools.

Malware

Malware is software that is intended to damage or disable computers and computer systems. For a website, this usually means malicious code that was placed in a file on your server without your knowledge. If you find an alert in Bing Webmaster Tools about malware on your website, you should clean your site as soon as possible.

Bing Webmaster Tools divides the malware into two categories: malware found on the page and malware reference found on the page. In the first case, there is an immediate issue to be solved on the page with the listed URL, in the second case, there is a resource linked on the page at that URL that has malware. The page itself isn’t infected in that last case.

Bing Webmaster Tools lists a couple of malware issues:

  • Malware Network Reference: a trace of a known malware distribution network is detected on your website.
  • Browser Exploit: a malicious browser exploit is detected, which may cause unsolicited execution of external code.
  • Malicious JavaScript: malicious JavaScript is detected in a page or in one of its attached scripts or frames (e.g. a spammy redirect).
  • Malicious ActiveX: ActiveX interactions seem to trigger malicious activity.
  • Heapspray: Bing detected a potential preparation for a browser exploit via a heapspray. Heapspraying is a technique used in exploits to facilitate arbitrary code execution.
  • Malware Found on Adjacent Pages: the URL is in a folder or subdomain containing malware.
  • Malware Reported by External Source: external sources reported malware on your website.

Sucuri is your friend here: hire these guys to clean up your site. After that, you'll want to address the vulnerability that allowed the malware to be installed in the first place. Simply download the Sucuri plugin and follow their step-by-step instructions and guidance on how to secure your website (see the Hardening and Post-Hack sections in that plugin).

And only after this, you should Request a Review in Bing Webmaster Tools to have them check again for the presence of malware.

Track Certificates beta

Yet another beta in Bing Webmaster Tools: Track Certificates. This page tells you all about the certificates that were requested by people visiting your site. The main purpose of that list is to let you spot unexpected or suspicious certificates, so you can report them to Microsoft using the Report link.

This will also include security certificates like the ones we have on our website:

Bing Webmaster Tools: Track Certificates

Widgets

In their competitive struggle with Google, Bing has released a number of additions for your website that should make your life better. Two of these are in Bing Webmaster Tools, and we'll discuss both briefly.

Knowledge Widget beta

Bing explains the knowledge widget like this:

For the first time, we empowered every webmaster to use the entity data from the Bing Knowledge repository. Since then, webmasters have added the embed code to thousands of pages to enhance their websites with the rich entity information from the Bing Knowledge system.

Yes, the Bing Knowledge information is similar to the Knowledge Graph in Google (Bing added it first, by the way). It's a separate block of content, but now on your website itself! It works quite simply: as the visitor goes over your website, the widget detects related entities in real time, marking them with a little Bing charm. Now, I do understand the social value, but I feel this is a bit like the pop-up ads on text links that were briefly popular a couple of years ago. But hey, I might be wrong. Bing tells us that this is improving engagement, time-on-site, and user satisfaction.

Adding the widget is simple: add your URL, copy the provided code and paste it, for instance, right before the </body> tag in your template. Options are available for displaying images, images and links, or just links. Besides that, there is an option to only activate the Bing Knowledge information when a user selects text. It looks like this (example from Bing itself):

Bing Webmaster Tools: Knowledge Widget

It’s a collapsible sidebar on your website.

At the moment of writing, this is only available for English entities.

Translator Widget

This Translator Widget could be useful, and is similar to the Google Translate option that is on a lot of websites. It only requires a simple copy/paste action and you are good to go. There is even a WordPress plugin to help you out.

You can set the language of your website, and tell Bing to automatically translate based upon the visitor’s browser language, or have the visitor pick the language himself.

I'm not a big fan of these widgets, Google's or Bing's. I understand that they are 'convenient', but I'd rather see people put some effort (or money) into a decent translation. Go read our post on hreflang. It's not that hard.
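
For reference, hreflang comes down to a few link elements in the <head> of each language version of a page; a minimal sketch with placeholder URLs:

<link rel="alternate" hreflang="en" href="https://example.com/" />
<link rel="alternate" hreflang="de" href="https://example.com/de/" />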

Messages

Preferably this section looks like this:

An empty Messages section in Bing Webmaster Tools

Bing sends five types of messages:

  • Administrator: if anything changes in the Bing Webmaster Tools service, the administrator will inform you about it.
  • Crawl errors: if an error occurs during the crawl of your website, Bing Webmaster Tools will automatically tell you about it.
  • Index issues: if Bingbot has any problem indexing your site, a message will be sent as well.
  • Malware: following the section on Malware above, you'll receive a message here if malware has been found.
  • Bing Ads: automatically generated messages about Bing Ads.

The Current and Archived sections at Messages are self-explanatory.

Bing Webmaster Tools: It’s a wrap!

With this post, we conclude our series on Bing Webmaster Tools. We have compared Bing Webmaster Tools to Google Webmaster Tools a couple of times during these four posts. Is that a fair comparison? For me it was like having Skippy sandwiches every day and then trying a jar of Peanut Butter & Co: it's different, but the same product. One could say variety is the spice of life, but in this case I tend to stick to Google Webmaster Tools for its larger scale and user base, although the Markup Validator tool in Bing Webmaster Tools will come in handy now and then for quick checks!

For more reading on Webmaster Tools, please find all our related articles here.


Bing Webmaster Tools: Diagnostics & Tools

Bing Webmaster Tools provides a number of tools to analyze your website. Somehow, they managed to squeeze these into one page, which serves as the dashboard for the Diagnostics & Tools section. Fortunately, all tools also have a separate page…
In this post, we will go over all the tools and tell you how to use them to your advantage.

This is already our third article on Bing Webmaster Tools; in case you have missed the first two, be sure to read those as well.

Keyword Research beta

Out of the five tools described in this article, Bing classifies most as ‘beta’. With all these tools, we also have to keep in mind that we are dealing with data from organic search at Bing. Let’s start with the Keyword Research tool:

Bing Webmaster Tools: Keyword Research

I have tested this tool using a number of WordPress SEO related keywords for our website. As you can see, you can set a country and language, as well as a time frame for your research. I chose the US, US English, and changed the default of the last 30 days to January 2015. Note that, as with Google Webmaster Tools, there is a gap between today and the end of your data; Bing Webmaster Tools goes up to 3 days ago.

The tiny 'Strict' option in there determines whether you want the research to be done for exact phrases or for phrases containing the keyword. As I'd like to investigate general keywords (WordPress, plugin, seo), I have left this option unchecked:

Bing Webmaster Tools: Keyword Planner

All results can be sorted, by the way; just click the (blue) title of a column. The first thing that comes to mind is that the Google Keyword Planner shows 550K searches for WordPress in January 2015, where Bing Webmaster Tools shows 60K. We've already mentioned that the search volume in Google's Keyword Planner isn't always useful, and the same must apply to these Bing numbers.

The $-icon behind the keyword is a hint: you can actually buy the keyword here directly. That's pretty straightforward :)

I still feel that if you properly optimize your website, you can get more clicks from organic results as well. Not trying to ruin your revenue model here, Bing. Paid results seem to work fine if you have narrowed down your niche and target these specific visitors. If the ad / paid result tells me what I need (to know), I'll click it without a doubt.

In the first table, you see search volume and trends. These are not just going up, by the way :) If you choose a longer period of time, it also won’t be just a straight line :) It will tell you what the rise or drop in search volume is as well.

You can click all the results, but doing so will just filter the results; it won't give you more details. By the way, just to be complete, you must have seen the Suggestions in the screenshots above as well. These are additional keywords, related to your original keyword by relevancy.

It’s a nice tool, but being used to the keyword tools we normally use, this isn’t one I’ll use very often. Perhaps the release version will have some features that will make it more attractive?

Link Explorer

This is Bing's alternative to MajesticSEO, OpenSiteExplorer and other backlink tools like that. It allows you to find all backlinks known to Bing for any site. It has a few filters:

  • Filter by site: only pages on your website linking to that URL.
  • Anchor text: only links that use this anchor text.
  • Additional query: only pages that link to the URL and also rank for the search keyword.
  • Scope: links to the URL entered only or all pages from that domain.
  • Source: only links from the site at hand (internal) or all links to that URL (external).

For that last one, you probably want to use the inbound link tool at Reports & Data instead. It’s more comprehensive and for instance provides you with the exact anchor text for each link.

The result of the Link Explorer tool is something like this:

Bing Webmaster Tools: Link Explorer

Clicking the Source URL will perform a link analysis in Bing Webmaster Tools for that domain, clicking the title will get you to the website itself.

What I am missing here are total search volumes, and some more details on the domain. MajesticSEO adds Trust and Citation Flow; Moz's OpenSiteExplorer adds both page and domain authority. Right now, the filters are what make this section of Bing Webmaster Tools interesting, not per se the (outcome of the) general exploration. If I wanted to check all the pages that link to our domain using "WordPress plugins" as anchor text, this tool works fine.

Fetch as Bingbot beta

Like Fetch as Google, this will tell you how the search engine bot sees your website. Simply enter your URL and click Fetch:

Bing Webmaster Tools: Fetch as Bingbot

After the fetch is done, status will change from Pending to Completed. Now you can click the Status and see how Bing sees your website (clicking the URL will simply get you to the page itself). Where Google Webmaster Tools actually renders your website as well, Bing displays the server response and your source code, as seen by Bingbot, of course. It will tell you if a page is redirected, or blocked by your robots.txt, for example.

Markup Validator beta

I really like this one, to be honest. As it is totally unrelated to search traffic volume, this tool is a nice substitute for Google's Rich Snippet Test tool. I don't know if it is just a matter of getting used to something new, but I liked the clean setup the Google tool had until a few weeks ago. Bing tells me just what I need to know: is schema.org implemented correctly?

Bing Webmaster Tools: Rich Snippets Test

It is. It will also tell you about RDFa, which you still want to use for breadcrumbs, for instance. A nice extra is that this Bing Webmaster tool also extracts the OpenGraph data.

And yes, I understand the new Google test tool explains what is wrong (according to Google, that is*), but if you just want to see what schema.org / RDFa / OpenGraph is on a page, this tool actually works just fine!

* The Rich Snippet Test tool keeps telling me to link the last element in my breadcrumbs, but why should I link to the page the visitor is already on? From a UX standpoint, that seems odd…
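
To give you an idea of what the Markup Validator extracts, OpenGraph data is just a handful of meta tags in the <head> of a page; a minimal sketch with placeholder values:

<meta property="og:title" content="Example article title" />
<meta property="og:type" content="article" />
<meta property="og:url" content="https://example.com/example-article/" />
<meta property="og:image" content="https://example.com/example-image.png" />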

SEO Analyzer beta

There we go: Bing Webmaster Tools gives you a free SEO analyzer covering all the basics (for a more in-depth analysis, check our site reviews).

You can simply insert a URL and see which basics can be improved. This is very much like the HTML Improvements in Google Webmaster Tools. There is also some overlap with the SEO Reports in Bing Webmaster Tools, but the SEO Analyzer is a single-page analysis tool.

Bing Webmaster Tools: SEO Analyzer

It will tell you if an H1 is missing, which images lack alt attributes and, for instance, whether a page is missing language information. I especially like the way this is displayed, with a direct reference to the 'error' on the Analyzed tab on the right:

Bing Webmaster Tools: Missing ALT tag

Two things to keep in mind when using the SEO Analyzer:

  • SEO Analyzer, unlike Bingbot, will ignore robots.txt directives, so you can basically check any page on your site.
  • SEO Analyzer follows any redirect and analyzes the page it ends up on. It will tell you that a redirect was detected and followed (right below the SEO Suggestions).

Verify Bingbot Tool

So why would you want to know which IP addresses belong to Bingbot? Well, Bingbot might be 'overcrawling' your website, unintentionally sending more than the usual server load your way. With this tool, you can verify whether the IP address that is causing this load is indeed Bingbot, which is needed for your support request. By the way, Bing does respect a crawl-delay line in your robots.txt. That might already help.
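
If you want to double-check an IP address manually instead, a reverse DNS lookup is a common approach: hostnames of genuine Bingbot IP addresses should end in search.msn.com. A quick sketch (the IP is just an example):

host 157.55.39.1

If the resulting hostname doesn't end in search.msn.com, the visitor is merely pretending to be Bingbot.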

It seems to me that this tool should be incorporated into a support workflow; the fact that it is displayed here as a separate section seems a bit too much honor for it.

Site Move

If you are moving your site to a new domain, be sure to let Bing Webmaster Tools know. It’s pretty easy:

Bing Webmaster Tools: Site Move

Just make sure both sites are verified. This can be done for entire domains, subdomains and directories. If you are moving a larger part of your website, like yoast.com/plugins to yoast.com/wp-plugins, this is also the section where you can see if all went well. Note that this is just a notification, it has nothing to do with actually moving a site or directory.

That’s section three of Bing Webmaster Tools

When comparing Bing Webmaster Tools to Google Webmaster Tools (and of course we do), this section holds some nice surprises. As mentioned, I like the simplicity of the Rich Snippet test, and the way the SEO Analyzer highlights the improvements on your page.

As mentioned in our earlier posts, I'd never replace Google Webmaster Tools entirely with Bing Webmaster Tools, but a combination of the two might have its benefits. In the next post in this series, we will cover the remaining three sections of Bing Webmaster Tools we'd like to discuss: Security, Widgets and Messages.


Bing Webmaster Tools: Reports and Data

In this second Bing Webmaster Tools post in our series on Webmaster Tools, the focus is on reporting. We'll go over a number of pages and try to explain what information is available and how it can help you improve your website, or get new insights about your audience, of course. This section is called Reports and Data.

Note that although we very much like the 'broader' setup of Bing Webmaster Tools, I personally tend to use Google Webmaster Tools first, for one simple reason: Google dominates the search engine market, so there is data from many more users in Google Webmaster Tools. Having said that, there are a lot of things in Bing Webmaster Tools that Google could use to further improve Google Webmaster Tools. Now let's go over the different chapters in this section of Bing Webmaster Tools.

Site activity

As you can see in the graph below (upper right corner), Bing Webmaster Tools combines data from Yahoo! and Bing for these stats. It also painfully highlights the difference in use of the webmaster tools: as you can see, our website gets around 80 clicks a day from Bing and Yahoo!, compared to about 9,000 clicks on average per day from Google.

Bing Webmaster Tools: Site Activity

In the chart, you can also choose to show a number of other things:

  • Clicks from Search
  • Appeared in Search
  • Pages Crawled
  • Crawl Errors
  • Total Pages indexed

It's nice to be able to align the number of pages crawled with the crawl errors, but as with most graphs like this, it's the trend that matters most to me here. If all of a sudden the graph flatlines at zero (except for the crawl errors), something must be wrong.

Page Traffic

Let's start with the main table here:

Bing Webmaster Tools: Page Traffic

There is a lot going on here, right? All the arrows and numbers tell you what happened with, for instance, the Clicks from Search, and how often you appeared in search. For your convenience, Bing has divided the clicks by the appearances, giving you the click-through rate (CTR): 50 clicks on 1,000 appearances, for example, means a CTR of 5%.

The average search click position tells you the position your result was in when it was clicked, on average. The trend is key here: if the average position gets closer to #1, you know you are doing well, as your page is increasing its ranking for that search term. This also highlights the main issue: you need to combine this number with the Search Keywords in the last column for it to be useful. Just click View, and Bing Webmaster Tools presents these keywords in a convenient popup:

Bing Webmaster Tools: Traffic Details

Who's 'youst', right? I can't get used to that downwards-pointing arrow with a number; it looks like we dropped 675 clicks. But that's not what it means. We had 675 clicks from Bing search result pages to our homepage where the search keyword was 'yoast', and that number dropped compared to the beginning of the selected time period.

This is also where you can check the performance per position in the search result pages. Just click the ‘+’ in front of the keyword:

Bing Webmaster Tools: Traffic Details, positions

Back to the main table for your Page Traffic. The last column we haven't mentioned yet is the Average Search Appearance Position. In most if not all cases, this number is higher than the average click position, simply because results that appear closer to #1 get clicked far more often.

Index Explorer

This is your Mac Finder or Windows Explorer for everything Bing found in your website's structure: it's how Bing sees your site. You can see which files Bing considers 301 redirects or 404 error pages, for instance. There are a number of filters here, listed as the blue links below. Next to that, you can use some custom options:

Bing Webmaster Tools: Index Explorer, robots.txt

The strange thing is that our robots.txt was actually updated a while ago. Where Google tends to find these updates within a day, sometimes within an hour, Bing still shows some old content here. We re-fetched the robots.txt during the writing of this article, just to make sure; the fetch shows the current content of our robots.txt. I'm sure Bing Webmaster Tools will find it eventually.

Search Keywords

Much like the Page Traffic table, this table shows you the number of clicks from search results in Bing, as well as all the other things explained there. The main difference is that this table lists keywords instead of URLs.

Bing Webmaster Tools: Search Keywords

As you can see, the last column is different: you can view the pages instead of the keywords (as at Page Traffic). So it's basically the same table, but the other way around :) Use whichever you prefer, so to speak.

SEO Reports

Why order a site review when you have Bing SEO Reports? Well, for one thing, we tell you much more about everything from speed to design, and from content to social. Bing SEO Reports is still in beta, but it does provide some nice insights. We need to work on our meta descriptions, according to this overview:

Bing Webmaster Tools: SEO Reports

I especially like the fact that Bing tells you how much they consider these suggestions to influence your SEO. Bing agrees with Google that meta descriptions are like invites to your website that should carry a clear message about the contents of the page, and flags the severity of missing meta descriptions as high. A missing meta description isn't bad* per se, but it's good to know where they are missing. You can easily click the suggestion title to find the pages where they should be added.
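
For reference, a meta description is a single tag in the <head> of a page; a minimal sketch with placeholder text:

<meta name="description" content="A short summary of the page that entices searchers to click." />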

Bing Webmaster Tools has taken the time to add a proper explanation to all these suggestions:

Bing Webmaster Tools: SEO Analysis Detail

If you click one of the links on that Detail page (it's limited to 50), you'll be taken to the SEO Analyzer (also in beta). More on that SEO Analyzer in our next post on Bing Webmaster Tools.

It seems our knowledge base – we're using HelpScout's awesome Docs – is missing meta language information. I can imagine that being the case because of their global user base; in that light, it makes sense that the language isn't part of the template.

All in all, this is a very useful section you can use for some basic SEO check of your website.

* In case you can't come up with a proper meta description, let Bing (or Google) decide. Both will create a meta description that in most cases is constructed from a text snippet containing the keyword used in the search. This can be beneficial for conversion, as you will understand.

Inbound Links

Again, the details make this a useful section. On the dashboard page of this section, the total number of inbound links (links to your website) doesn't say much, unless there is a break in the trend. Right below the graph, you'll find target pages. Clicking one of these gives you an overview of the pages linking to that specific page:

Bing Webmaster Tools: Link details

As you can see, this includes the anchor text. Up to 20,000 external pages are listed. As the popup isn't suitable for that number of links, the export option in the upper right of the popup comes in really handy for further analysis. Bing Webmaster Tools offers that export option on almost all pages, by the way.

Crawl Information

This is a quick overview of error codes (like 404, 502), redirects (301, 302), DNS failures, connection timeouts and robots.txt exclusions. Everything is grouped per category:

Bing Webmaster Tools: Crawl Information

Click a row marked with * in Bing Webmaster Tools to find all URLs that resulted in that error, are redirected, and so on. It's nice to see whether all pages you have excluded are indeed excluded, for instance. Clicking a link in this table doesn't provide more information, but takes you to the page at hand to check, for example, whether the error or redirect still exists. As mentioned right above the table, Bing refers you to the Index Explorer for more details. It would be nice if I could mark a 404 as fixed here, by the way (in case any of the Bing people read this).

Malware

Score.

Bing Webmaster Tools: Malware

In case Bing detects any malware on your website, it will list the URLs of potentially infected pages here. Malware (short for malicious software) is "any software used to disrupt computer operation, gather sensitive information, or gain access to private computer systems" (Wikipedia). Of course you don't want that on your website.

In the unfortunate case that you do have some URLs listed here, fix the malware or have our friends at Sucuri clean it up, and then Request a Review via this same section in Bing Webmaster Tools.

That concludes this section in Bing Webmaster Tools

To round things up, the Reports & Data section in Bing Webmaster Tools gives you a lot to work with. Be sure to add your site to Bing Webmaster Tools and check this section to learn more about what can be improved for your website.

If you have any additions or remarks, these are more than welcome. If you are an absolute Bing adept, please let us know what hidden gems you have found in the Reports & Data section in Bing Webmaster Tools!


Bing Webmaster Tools: Configure My Site

We recently did a series about Google Webmaster Tools, so we thought it was time to do a series about Bing Webmaster Tools as well. Bing isn't that big in the Netherlands, or Europe for that matter, but it still holds some ground in the US. The most important reason, however, is that we think Bing Webmaster Tools is pretty awesome. It offers some really nice tools and details you, unfortunately, won't find in Google Webmaster Tools (anymore).

In this post we’ll go into the first main menu item of Bing Webmaster Tools and its subitems. But first, we’ll explain how you can set it up for your own site!

Setting up Bing Webmaster Tools

Since Bing Webmaster Tools is probably less well known than its Google counterpart, we'll show you how to set it up! So let's start at the beginning.

You need a Microsoft Live account

Don’t worry, this doesn’t mean you’ll need an @outlook.com or @hotmail.com email address. You can actually use any email address you like. Go here to sign up for a Live account and just follow the steps.

Once you have a Live account, you can go to Bing Webmaster Tools and sign in with the account you just created.

Add your site

After logging in to your Bing Webmaster Tools, you obviously need to add your site:

Bing Webmaster Tools - Add your site

Fill in your website’s URL here and click ADD. This will take you to this screen:

Bing Webmaster Tools - Add your site and sitemap

If you already have a sitemap and know where it lives, you can fill in the sitemap here as well. We recommend just leaving the last option at its default for now. Now you can click the "ADD" button again. Doing this will take you to the following screen:

Bing Webmaster Tools - Verify ownership

And this is where the fun comes in. All of this might seem very technical and way over your head. However, if you're using our WordPress SEO plugin, this is all very easy. Simply copy the entire part in the grey bar and paste it into our plugin here:

WordPress SEO Setting up Bing Webmaster Tools

Hit "Save Changes" and the plugin will remove everything it doesn't need; just your 'key' will remain. After that, hit Verify back in the Bing Webmaster Tools setup. If you have some caching, this might not work instantaneously; clear your cache or wait for a bit until you can verify.
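
To give you an idea, the code in the grey bar is a verification meta tag along these lines (the content value below is a placeholder; yours will contain your own key, which is the part the plugin keeps):

<meta name="msvalidate.01" content="0123456789ABCDEF0123456789ABCDEF" />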

Once you’ve verified it, you’ll be able to access your Bing Webmaster Tools!

Configure My Site

When clicking the 'Configure My Site' menu item, you'll be taken to a dashboard overview of everything you can do in this menu item. You can even do most things right there in the dashboard, which makes it the go-to place once you know what it all does ;)

So let's get to explaining what you can play with in Bing Webmaster Tools.

Sitemaps

The sitemaps menu item is about just that: sitemaps. Here you can (re)submit, remove or export sitemaps. Adding a sitemap is simple: just copy the sitemap's URL, paste it in the appropriate bar and hit Submit. As with Google, if you have multiple sitemaps under a sitemap index (as our WordPress SEO plugin creates for you), you only have to add the sitemap index.

If you've added a sitemap index, just clicking that sitemap index's link will take you to all the underlying sitemaps. You can also see whether any errors occurred when crawling your sitemap, when it was last crawled and how many URLs were submitted through the sitemaps, for instance.
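
For the curious: a sitemap index is just a small XML file pointing to the underlying sitemaps. A minimal sketch with placeholder URLs:

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://example.com/post-sitemap.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://example.com/page-sitemap.xml</loc>
  </sitemap>
</sitemapindex>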

Submit URLs

In this section you can submit URLs of pages that you think are important directly to the Bing index. This feature is limited to 10 URLs a day and 50 URLs a month per domain. Whether this actually gets your content indexed more quickly than through your sitemaps, for instance, remains a bit unclear.

Ignore URL Parameters

URL parameters are anything after the normal URL, so usually anything after a question mark in your URL. If you want Bing to ignore any of these parameters, you can tell them here. Just to be clear: this only ignores the parameters; it literally removes them from the URL before adding the page to the index. So this does not mean that URLs with these parameters won't be indexed; if that's what you want, use the Block URLs section instead.

A URL with parameters is actually viewed as a different URL than the same one without parameters. So "https://yoast.com/?utm_source=" etc. would actually be considered a different URL than just "https://yoast.com/", even though they end up at the same place. Using this feature can therefore prevent duplicate content.

Crawl Control

This is one of the more nifty tools in the Configure My Site section of Bing Webmaster Tools. You can tell Bing when to crawl your site more and when to crawl it less:

Crawl Control Bing Webmaster Tools

They have some preset crawl rates, which are designed to prevent an overload of your server: when your site is busy, they'll crawl your site less, and when it's less busy, they'll crawl more. However, you can also set a custom crawl rate. Just click in the graph to increase or decrease the crawl rate.

Note: if you have a crawl rate specified in your robots.txt, that will take precedence over anything you set up here.
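
Such a robots.txt crawl rate is set with the Crawl-delay directive, which Bing supports; a minimal sketch that asks Bingbot to slow down to roughly one request every five seconds:

User-agent: bingbot
Crawl-delay: 5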

Deep Links

Deep links are the Bing equivalent of Google's Sitelinks. If you don't know what those are either, they're all the links in the pink block:

Bing Webmaster Tools - Deep Links

In the Deep Links section you can block pages from showing up there. What's nice about Bing here is that they allow you to remove the URL showing as a deep link for all URLs, for a specific URL, or even for a specific country or region.

What’s not so nice about Bing here is that they limited this block to 90 days. The block can be extended, but that means you have to revisit your webmaster tools every 3 months to extend the block you set.

Block URLs

The Block URLs section is very similar to the Deep Links section, in that it blocks URLs from showing up in the Bing search results. Adding a URL here will block the URL from showing up anywhere in the search results of Bing, though.

Annoyingly, they've again set the block to a maximum of 90 days, so you should only use this feature if you've just deleted a page from your site. Any pages you want to block permanently should be blocked in robots.txt.

Page Preview

The page preview is just that: a preview of your page. Bing doesn't show this page preview everywhere, but they do in, for example, Bing Smart Search in Windows. You can tell Bing to refresh or remove the preview of a specific URL here. Just fill in the URL and click Fetch. You'll then be given the choice to either block the preview or refresh it.

Disavow Links

Here you can disavow links you got from other websites. If you don’t know what disavowing a link means, read the post by Michiel on cleaning up bad backlinks. The best feature here by far is that you’re able to disavow not only a single page link, but also complete directories or domains. This can save you a lot of time when cleaning up your backlinks.

Geo-Targeting

In this section you can set geo-targeting for your domain, subdomain, directory or page. Geo-targeting means you tell Bing that the part of your website you selected is meant for a specific region or country. For instance, if we had a German version of yoast.com under yoast.com/de/, we'd set the geo-targeting for the yoast.com/de/ directory to Germany. This doesn't mean it won't show up in Bing results in other countries; it just makes it clearer to Bing which people should be seeing the content.

One note about this: although you can set geo-targeting for your domain, if your domain has a country code extension (.nl, .de or .co.uk, for instance), geo-targeting your entire domain won't work. Directories and pages can still be geo-targeted.

Verify ownership

If you need to verify your website again, you can do so here.

Connected Pages

Under Connected Pages you can tell Bing about any social media presences you have elsewhere. By connecting these pages to Bing Webmaster Tools, you’ll get quite some nice insights. For instance, we’re able to see how many times our Twitter account appeared in search and how many times people clicked those appearances. We can even see what keywords people used to make our Twitter account appear, how many backlinks our Twitter account (or its status updates) has and what anchor texts were used for those links.

App Linking

Here you can link any apps you’ve connected in the Connected Pages section to Bing search results on Windows and Windows Phone. As we don’t have any experience with apps, I’d love to hear your thoughts on what this does exactly! ;)

Users

Here you can manage the users that have access to Bing Webmaster Tools for the current site.

What do you think?

Although Bing is definitely used less than Google, their Webmaster Tools are actually pretty awesome. So if Bing gives you a relevant amount of traffic to your site, it’s definitely something you should check out!

Do you agree? Let me know in the comments!

The next post on the next main menu item in Bing Webmaster Tools, written by Michiel, will be published tomorrow.


Google Webmaster Tools: Crawl

The section in Google Webmaster Tools that works most closely with our WordPress SEO Premium plugin is the Crawl section. In our premium plugin, you’ll find a Webmaster Tools section that lists all pages Google somehow did not find on your website. You can easily import these into the plugin and redirect the ones that need redirecting, so the crawl error is resolved.

But there is more in that fourth ‘chapter’ of Google Webmaster Tools. Following our posts on Search Appearance, Search Traffic and Google Index, this article digs into, for instance, crawl errors and stats, finding out how Google sees your website, and your XML sitemaps.

Crawl Errors

This section lists two types of errors: site errors and URL errors. Site errors simply list whether your DNS works, whether the server can easily be reached (no timeouts, for instance) and whether Google can access your robots.txt file.

Google Webmaster Tools: Crawl / Site error

Google provides background information on the error (when did it occur, how often in the last 90 days). If things like this happen too much (as in more than once or twice a year without advance warning), be sure to contact your hosting company or switch hosts altogether.

The URL error section is divided into multiple sections and subsections. First, you can check for errors that Google gets when crawling as a number of different devices. On the desktop tab, for instance, we find the number of server errors, access denied errors and page not found errors.

Google Webmaster Tools: crawl / URL errors

Please monitor these errors, as too many errors could send a signal of low quality (bad maintenance) to Google. A 404 can be redirected, for instance in our WordPress SEO Premium plugin as mentioned, or with a bit more work straight in your .htaccess file. After that, check the checkbox in front of the URL and click Mark As Fixed (this will just make sure the list is cleaned up; it won’t do much besides that).
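
For the .htaccess route, a minimal sketch, assuming an Apache server with mod_alias enabled (both URLs are made up for the example):

    # Permanently redirect the dead URL to its closest replacement
    Redirect 301 /old-post/ https://example.com/new-post/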

We have found in our reviews that a lot of people either ignore the errors here or forget to mark errors as fixed. This will only lead to a very long list of errors, so clean that list up now and then. If you want to check whether any of these URLs are already fixed, click the link to find more information on the error and use Fetch as Google to see if the URL is now accessible for Google (or click Mark as fixed, of course).

Soft 404s

These tabs can also show soft 404s. A soft 404 occurs when a page as such exists, but has an empty ‘content area’. Let me elaborate on that. Google does a fine job of locating content on a page: it understands your header, sidebar, footer and content area. That also means that even with a fully packed sidebar and footer, Google will still return a soft 404 when the content area is empty. And by empty we also mean a category page with no posts, or any other page stating there is no content available. Or a page that just says 404 Not Found but returns a 200 OK server response anyway (which happens more often than you think). It also occurs when you link, for instance, an internal search page that doesn’t return any results: there is no real page, but an almost empty page is returned anyway.

Although the server returns a 200 OK message for such a page, Google will consider it a (soft) 404. You don’t want these pages on your website for a number of reasons. For one, these pages are obviously not very user-friendly. But besides that, Googlebot will have to go over all these pages for no reason at all (as they lack content), which prevents your site from being crawled efficiently. After all, you don’t want Googlebot to spend time trying to see what’s on all these non-existing pages. Either add content to these pages or noindex them completely. In case of an empty category or tag page, consider removing the category or tag. You’re not using it anyway.
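
If you go the noindex route and the near-empty section lives in a real directory on an Apache server with mod_headers, one approach is an .htaccess file inside that directory; a sketch, not a drop-in solution:

    # .htaccess in the empty section's directory:
    # ask search engines to keep these pages out of the index
    Header set X-Robots-Tag "noindex"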

Smartphone visits

For your smartphone visits, Google also tests for faulty redirects and blocked URLs. A blocked URL is a page that your robots.txt blocks for Googlebot-mobile for smartphones. Simply check whether these are intentionally blocked, and otherwise change your robots.txt to allow access.

Faulty redirects occur when a page redirects to “an irrelevant landing page instead of the equivalent, smartphone-optimized version of the desktop page”. That could, for instance, be a redirect to your homepage instead of a redirect to the exact same page on your mobile site. Check the question mark next to the labels for more information on the terminology used in Google Webmaster Tools, by the way. These explanations really come in handy sometimes :)

Lastly, the URL errors section can also contain a News tab, which shows crawl errors for Google News content. Of course that tab only appears for news sites. See the Google News requirements to check whether your website qualifies at all for Google News; otherwise don’t bother looking for this tab.

In the end, you’ll just want this to be the only text in the Crawl Error section:

Google Webmaster Tools: No crawl errors

Crawl Stats

This is your handy overview of Googlebot activity on your website. It shows you the number of pages crawled per day, the number of bytes downloaded per day and the time spent downloading a page. Depending on the number of pages on your website, this might be handy information.

Google Webmaster Tools: crawled per day

We think this is great for trend watching. If you have made changes to your site structure, robots.txt or for instance added XML sitemaps for the very first time, that should show in the provided graphs.

In case these stats show a drastically declining line, or even a flat line at zero, there’s probably something really wrong with either the website (robots.txt might be blocking Googlebot) or your server (it could be down, for instance). Again: monitor this.

Fetch as Google

As mentioned, in case of any crawl errors, you should look into what happened and why. One of the tools Google Webmaster Tools provides is a way to view your website as Google does: you can fetch a page as Google. You can either click the link at Crawl Errors and click the Fetch as Google link in the pop-up, or go to the Fetch as Google section in Google Webmaster Tools to enter a URL manually:

Google Webmaster Tools: Fetch as Google

Note that in the image above, all pages were rendered quite a while ago – perhaps they can already be fetched the right way by now – so visit them to check this and perhaps even do a refetch. In the image, we see three different kinds of statuses (these apply to both the Fetch and the Fetch and Render commands):

  • Partial: The page can be partially rendered, as some elements are probably not displayed as intended or not at all, for instance because you are blocking CSS or JS in your robots.txt file. If you click the line with the partial status in the overview, you’ll actually be taken to a snapshot of how Google rendered your page. On that page, Webmaster Tools will also tell you which resources it could not get, so you’ll be able to fix this.
  • Not Found: The page can’t be found. This might be because a redirect isn’t set yet after a URL / structure change, or perhaps you simply deleted the page (the server returns a 404 error).
  • Unreachable: Googlebot didn’t have the patience to wait for your website to load (make it faster), or your server simply replied that it could not allow the request for the URL.

Of course there are more. Other statuses you might find here are:

  • Complete: That’s the one you want. Google managed to crawl the entire page.
  • Redirected: Either the server or your website itself (HTML/JS) told Google to visit another URL.
  • Not authorized: Your server tells Google that URL access is restricted or has been blocked from crawling (server returns a 403 error).
  • DNS not found: Perhaps you entered the wrong URL? The domain name seems incorrect.
  • Blocked: Your robots.txt tells Google to bugger off.
  • Unreachable robots.txt: Google can’t reach your robots.txt at all. More on testing the robots.txt below.
  • Temporarily unreachable: Either the server took too long to reply or too many consecutive requests were made to the server for different URLs.
  • Error: An error occurred when trying to complete the fetch (contact Webmaster Tools product support in this case).

By clicking the URL, you can see the rendered page as seen by both Googlebot and a visitor, so you can make a judgment on the impact of the blocked file:

Google Webmaster Tools: Render

In this case, the impact is clearly low.

Robots.txt Tester

Last week, Joost explained a lot about the Partial status at Fetch as Google in his post WordPress robots.txt Example. You really need to make sure your robots.txt is in order.

By the way, you might be wondering if you really need that robots.txt file. Actually, you don’t. If you think Google should crawl all sections on your server, you could leave it out. The Robots.txt tester will return this message in that case:

Google Webmaster Tools: No robots.txt

In the Robots.txt tester, Google will show the robots.txt you are using and tell you any and all issues Google finds:

Google Webmaster Tools: robots.txt warning

That’s a warning: Googlebot ignores that crawl delay. If for some reason you do want to set a crawl delay, do so using the gear icon in the upper right of Google Webmaster Tools (at Site Settings > Crawl Rate – new crawl rates are valid for 90 days). Please note that this is not how often Google visits your site; it’s the speed of Googlebot’s requests during a crawl of your website.

Google Webmaster Tools: robots.txt error

Hm. And what do you want Google to do with /wordpress/wp-includes/? By the way, like in this example, we see a lot of webmasters adding a link to the sitemap to their robots.txt. No problem, but why not simply add that in Google Webmaster Tools instead? More on that later.
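
For completeness, such a sitemap reference is a single line in robots.txt (the URL is made up for the example):

    Sitemap: https://example.com/sitemap_index.xml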

Google Webmaster Tools: robots.txt wrong comment

This is another ‘syntax not understood’. Comments in robots.txt should be done with a hash (#) instead:

Google Webmaster Tools: robots.txt comment

Works like a charm.
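
For reference, crawlers ignore everything from the hash to the end of the line, so both of these comment styles are fine:

    # Keep crawlers out of the admin area
    User-agent: *
    Disallow: /wp-admin/  # an inline comment works too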

Google Webmaster Tools: robots.txt error

Ouch. The horror. Google could not find a user-agent in this robots.txt – but it’s there, right? It’s actually preceded by a space, and that immediately triggers a ‘syntax not understood’ in the robots.txt test.
It doesn’t mean all 20 restrictions in that robots.txt will be ignored, by the way. This seems to be a strict test; Google is very capable of filtering out that space, and it actually encourages whitespace for readability. But strictly speaking, the leading space shouldn’t be in there.

Visit Webmaster Tools for even more information on the robots.txt syntax.

Test allowed / disallowed

One more thing. When you want to test whether a page or directory on your site can or cannot be reached by Googlebot – or, for instance, Googlebot-News or Googlebot-Mobile – you can do so in this section as well, right below the robots.txt code.

If we take the last example above and test the /Tests/ part of it, you’ll see that it can indeed be crawled if we follow the strict rules of the Robots.txt tester:

Google Webmaster Tools: robots.txt allowed

Although the text ‘Allowed’ is green, that isn’t what actually happens: as mentioned, Google filters out the space, so the Disallow does keep Googlebot out of this directory, as the Google search result pages show:

Robots.txt tester isn't flawless

Feel free to add any insights on this particular issue in the comments below.

If you test a page or directory and find that it is blocked like this:

Google Webmaster Tools: robots.txt URL blocked

the test tool will also tell you what line in your robots.txt is causing this:

Google Webmaster Tools: robots.txt blocked by line

All in all, if you are actively using your robots.txt, make sure it does what you intended it to do. The Robots.txt tester will help you a lot with that.

Sitemaps

It doesn’t matter if you are setting up a new WordPress site, or have just installed our WordPress SEO plugin: activate XML sitemaps and remove any other plugin that does this as well. Don’t forget to redirect the old XML sitemap, probably at /sitemap.xml, to ours at /sitemap_index.xml. If you experience any issues, check our knowledge base.
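
On Apache, that redirect can be a one-liner in .htaccess; a minimal sketch, assuming mod_alias and the default sitemap locations:

    # Send requests for the old sitemap to the plugin's index sitemap
    Redirect 301 /sitemap.xml /sitemap_index.xml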

Having said that, if you have a proper XML sitemap, go to Google Webmaster Tools and test and add it at Sitemaps. Unfortunately Google doesn’t allow you to add a tested sitemap immediately after testing. Yes, this is my feature request, Google ;)

Google Webmaster Tools: Tabs on sitemaps

These sitemaps can be added manually, but perhaps Google already found some. These are listed on the All tab:

Google Webmaster Tools: sitemaps found by Google

We often get questions about image XML sitemaps. Images are actually already included by our plugin in, for instance, post-sitemap.xml and page-sitemap.xml:

Google XML Sitemaps

Back to Google Webmaster Tools. First you want to make sure /sitemap_index.xml contains all the content types you want Google to index. Please check the XML sitemap section in our plugin and see if you can exclude any post types or taxonomies from the XML sitemaps; this alone usually fixes a lot of warnings and errors. Especially on shops, where sitemaps can be created for shipping classes and clothing sizes, for instance, that would be my first advice.

Second, you add /sitemap_index.xml, which will be added immediately, along with any sitemaps listed in that index sitemap. If for some reason a sitemap still contains content you’d rather not have indexed, simply change that in our plugin and resubmit it in Google Webmaster Tools. Note that you can manually add and delete sitemaps, but the ones that are added automatically, for instance by an index sitemap, can only be removed by a resubmit.

I thought about adding a list of possible errors and warnings to this article as well. Seriously. But when I found that Webmaster Tools actually added internal navigation on their Sitemap Errors page, it seemed to make sense to simply link that page.

Common warnings are about Google not being able to reach a page due to a long response time, or, for instance, a URL that is included in an XML sitemap but excluded in robots.txt.

Errors vary from invalid date formats to sitemaps not being found (a 404 on your sitemap is never a good idea). A sitemap can also be empty, or a required tag can be missing.

Indexed versus submitted content

Another thing you might wonder about is the difference between indexed and submitted content types (pages, video, images). These are the red and blue bars in this section. The red bar (indexed types) is usually a bit lower, as Google isn’t crawling your entire site at once. Time is precious, so Google spiders a (large) number of pages at a time, but if your site structure goes a gazillion levels deep, chances are Googlebot isn’t getting to these deepest pages in a crawl. It’s not as if Google bookmarks where it ended up and starts from there the next time it crawls your website. This emphasizes the need for a good internal link structure, well-formatted sitemaps and things like that.

URL Parameters

Let’s start with Google’s warning here:

Use this feature only if you’re sure how parameters work. Incorrectly excluding URLs could result in many pages disappearing from search.

If you are using URL parameters, like the default s for WordPress search, please check this section. When discussing this post with Joost, he told me from experience that, for instance in the case of a site migration, things might go terribly wrong if this isn’t done properly.

In this section, you can tell Google how to handle your parameters. When clicking Add a Parameter, you’ll get a pop-up with these options:

Google Webmaster Tools: URL parameters

I entered the ‘s’ for search, and have to decide whether that parameter affects page content or just changes the way content is displayed on the page. Google calls these active and passive URL parameters respectively. Active parameters can, for instance, be used for sorting, pagination and sometimes even translations or categorization. Passive parameters are usually used for tracking or referrals, like Magento’s SID (session ID) and Google’s own utm_source.

Now if you feel the parameter “changes, reorders or narrows page content”, and pick Yes in the above select box, you’ll be presented with four more options. You can set here how you want Google to handle the parameter:

  1. Let Googlebot decide: a general option if you’re not sure what to choose here.
  2. Every URL: Every URL using this parameter is an entirely new page or product.
  3. Only URLs with specified value: This tells Google to only crawl URLs with a specific value for this parameter and to ignore the rest, for instance to avoid duplicate content caused by sorting options.
  4. No URLs: Don’t crawl pages with this parameter at all. Avoiding duplicate content is a reason to use this one as well.

Note that instead of using URL parameters for options 3 and 4, you could also set the right canonical on all of these pages.
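
Such a canonical is a single link element in the head of every parameterized variant, pointing at the preferred URL; a sketch with a made-up URL – a page like /shoes/?sort=price would then carry:

    <link rel="canonical" href="https://example.com/shoes/" />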

TL;DR?

Sorry to disappoint you. There is no Too Long; Didn’t Read in this. In the previous posts on Google Webmaster Tools, we have already emphasized the importance of checking your site now and then, or monitoring it actively. Google Webmaster Tools helps a lot with that, and this is one of the longest (and in my opinion most interesting) sections in Google Webmaster Tools.

Feel free to drop any addition or question related to this section in the comments. Looking forward to it!


Google Webmaster Tools: Google Index

This is already the third post in our Google Webmaster Tools series. Last week we wrote about the Search Appearance section and the Search Traffic section of Google Webmaster Tools. So if you jumped in here and want to start at the beginning, please read those posts first.

Today we’ll be going into the Google Index section, which obviously gives you some insight into how your website is being indexed in Google.

Index Status

The Index Status shows you how many URLs of your website have been found and added to Google’s index:

Webmaster Tools Index Status

This can give you a good idea of how your site is doing in the index. If you see this line dropping, for instance, you’d know there’s an issue. Basically any major and unexpected change to this graph should be something you look into.

Actually, the “Advanced” tab gives you just a bit more insight into how all your indexed pages are divided:

Webmaster Tools Index Status https yoast com

As you can see, this shows you how many of your pages are being blocked by your robots.txt as well. And you can also see how many of your pages have been removed from the index, but more on that in the next chapter.

There’s something else this graph makes clear. As of March 9th of last year (at the “update” line) Google Webmaster Tools shows the data separately for both HTTP and HTTPS websites. This means that if you moved your site from HTTP to HTTPS since then, you’ll need to add your site again, using the red “Add a site” button. Then, fill in the entire URL, including the HTTP or HTTPS part:

Webmaster Tools Home

Interpretation of the Index Status

There are a few things you should always look for when checking out your Index Status:

  • Your indexed pages should be a steadily increasing number. This tells you two things: Google can index your site and you keep your site ‘alive’ by adding content;
  • Sudden drops in the graph. This means Google is having trouble accessing (all of) your website. Something is blocking Google out, whether it’s robots.txt changes or a server that’s down: you need to look into it! This could also have to do with the separate HTTP and HTTPS tracking I mentioned above;
  • Sudden (and unexpected) spikes in the graph. This could be an issue with duplicate content (such as both www and non-www, wrong canonicals, etc.), automatically generated pages, or even hacks.

Content Keywords

The Content Keywords area gives you a pretty good idea of what keywords are important for your website. When you click on the Content Keywords menu item, it’ll give you a nice list of keywords right away:

Webmaster Tools Content Keywords

These keywords are the ones Google found on your website. This does not mean you’re ranking for these keywords; it just means they’re the most relevant keywords for your site according to Google. You can also extend this list to 200 items, so it’ll give you a pretty broad idea.

This actually tells us a few things about your site. It shows you what Google thinks is most important for your website. Does this align with your own ideas of what your website’s about? For instance, if you find any keywords here that you didn’t expect, such as “Viagra” or “payday loan”, this could mean that your site has been hacked. And if you expected any keywords that you can’t find in this list, there are a few things you can check:

  • Your robots.txt might be blocking the page(s) that contain the keyword(s) you’re expecting;
  • The page containing the keyword might be too new for Google to have crawled it yet;
  • Google excludes keywords they consider boilerplate or common words from this list. What they’d consider boilerplate or common differs per site.

Blocked Resources

A new addition (March 11, 2015) to the Google Index section is Blocked Resources. This report shows you how many of your resources are blocked for Google:

Google Webmaster Tools: Blocked Resources

The report will show where these resources are hosted. Clicking the host will take you to a second page showing all the blocked files per host. Again, that report will show how many pages are affected per file. These files could be images used on your site, but also CSS or JavaScript files. In that detailed report, you can click on a file and see all the pages it is listed on, as well as the last date Google detected these.

After you have found the blocked files, you can use the Fetch and Render option to view the page as Google and decide whether you need to fix this right away or not, depending on the impact of the blocked file. Google Webmaster Tools will guide you in fixing the robots.txt block for files hosted on your own site. For blocked resources with a larger impact that are hosted on a third-party site, you’ll have to contact the owner of that host / website and ask them to remove the block. Always ask yourself if you really need that file first!
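
On your own site, the fix usually comes down to adding Allow rules (or removing Disallow rules) for the blocked files. A sketch – Googlebot understands the * and $ pattern matching used here, and for Googlebot the most specific (longest) matching rule wins:

    User-agent: Googlebot
    Allow: /*.css$
    Allow: /*.js$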

Remove URLs

This section of Google Webmaster Tools gives you the power you’d expect: to remove URLs from Google’s search results altogether. You can actually also block those pages from coming up in the search results by disallowing them in your robots.txt. Or you can make them password-protected pages, if that suits you better.

However, you can also do this from Google Webmaster Tools quite easily:

Webmaster Tools Remove URLs https yoast com

Just type in your URL and hit “Continue”. The next window will give you three options for removing a URL from the search results.

Webmaster Tools - Remove URLs

The first option will completely remove the URL you entered from the search results, along with the cache. You can find the cached version of your website here in Google:

yoast com Google Search

So the first option removes both that cached version and the entire result. The second option only removes the cached version from the Google search results. The third option (Remove directory) does both of these things not just for that page, but for every subpage as well. So removing the directory yoast.com/test/ would also remove yoast.com/test/test1/ and so on.

Be sure to only remove pages you don’t want showing up in the Google search results anymore. Don’t use this for crawl errors, 404s or anything like that; those things should be fixed differently. Also, if the page is meant to stay out of the Google search results, be sure to remove the page from your site (404 or 410 error) or disallow Google from crawling it in your robots.txt. Be sure to do this within 90 days of the removal request! Otherwise, your page might get reindexed.
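
Serving that 404 or 410 can again be done in .htaccess on an Apache server; a minimal sketch with a made-up path:

    # Tell clients (and Google) this page is gone for good: a 410 response
    Redirect gone /removed-page/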

Conclusion

The Google Index section of your Google Webmaster Tools is a great section for monitoring how Google’s handling your site. Whether Google has suddenly stopped indexing your website, or has a different idea of what your site’s about, this is the section to find that out.

So be sure to keep an eye on this! And also keep an eye out for the next post on Google Webmaster Tools, which will be about the Crawl section. That post will go a long way toward pinpointing where the issues you found in the Google Index section came from.

That’s it! Please let us know what you think in the comments!


Google Webmaster Tools: Search Traffic

Following Thijs’ article on Search Appearance in Google Webmaster Tools, I want to talk to you today about the second section: Search Traffic. Although the common thread is search traffic, the subsections deal with a lot of different topics, like search queries and links.

In this article, we will explain all that can be found in these subsections.

Search Queries

Ever since Google started using SSL back in 2011, we webmasters find that annoying ‘keyword not provided’ in our Google Analytics stats. Google created an ‘alternative’ in an “aggregated list of the top 1,000 search queries that drove traffic to their site for each of the past 30 days through Google Webmaster Tools.” I’m under the impression this isn’t limited to the top 1,000 anymore (see the screenshot below, which shows 9,142 queries), and it goes up to 90 days, if I am not mistaken.

Google Webmaster Tools: Search Queries

The interesting thing is that if we click the ‘With Change’ button right above the bottom table, we can actually see how traffic changed, and perhaps more importantly, how clicks from Google to the pages listed at ‘Query’ changed over time. Depending on the time span you select in the upper right corner, you could actually use this to test meta descriptions and titles.

Note that a number of these things can also be found in Google Analytics, but here the results can be filtered by Web search, Image search, Mobile search and Video search – and also by location, which will come in handy for people targeting domestic or specific markets abroad. The mobile filter, with more and more traffic coming from mobile devices, is of course very interesting to keep a keen eye on – what are these specific people looking for? And are those pages optimized for mobile?

Right below the title in that screenshot (Search Queries), we find a second tab, “Top Pages”. It is similar, but instead of the search keywords, it shows the URLs of your most visited pages. Always ask yourself whether the pages topping that list are also the pages you want to rank with. If not, it might be necessary to go back to the drawing board and create a new site structure around these pages. That way you can leverage the rankings of these pages for the rest of your website. As mentioned, there is overlap with Google Analytics; the main difference is the absence of ‘keyword not provided’ ;-)

Links to Your Site

This section in Google Webmaster Tools provides what the title says: information on links to your website. It’s divided into three sections:

Google Webmaster Tools: Links to Your Site

It’s a logical threesome: which websites link to our content, what content is linked to the most and what anchor / link texts are used the most.

Interesting to see Pinterest bringing in the most links, right? For our regular readers: yes, we used this example in our social marketing post as well. But now Tumblr has even gone above Google. In this case, social is the new Google.

For this site, it’s clear that the domain name is used most often as anchor text, which is quite common. The second is a generic ‘visit site’, but 17 of the top 25 anchor texts were along the lines of ‘the 50 best …’, ’15 healthy …’ and lists like that. For this specific site, that really seems to work, and of course we would encourage them to create more lists like that. See for yourself which anchors are used most often; it will probably give you a general idea of what kind of posts work for / should be written for your website.

Of course this is also emphasized by the most linked content.

Google Webmaster Tools: Linked pages

This is also interesting: 30K+ links from 43 domains probably means some ‘site wide’ links (links on every page of a website). In that case, it might pay to do some further investigating in tools like MajesticSEO or OpenLinkProfiler (of course there are more tools like that). Find the site wide links and see if you can improve that backlink, for instance by being linked in articles as well, instead of just in a sidebar. That will improve the quality of the link (not necessarily the traffic).

Internal Links

On a lot of (WordPress) sites, most internal links will go to tag and (product) category pages. That seems to make sense. This section tells you if I am right about this. In one of the projects we worked on, we found this:

Google Webmaster Tools: Internal Links

Everything about it is odd :) Why is that one tag page getting excessive internal links? This might happen when you have just started tagging your products and this is the first tag you have used. In that case, this list should look a lot different in a few weeks.

The second odd thing is that Google tells us a .htm page is linked from 76 pages. Luckily, Google Webmaster Tools allows you to find that page using the search bar at the top of this page. It will tell you what pages link to it (you can also click the blue link in the table, by the way):

Google Webmaster Tools: Links to a specific page

Somewhere on that site, there seems to be a remnant of an old site (current pages don’t have that .htm extension). The page at hand actually returns a 404, unfortunately, so this is something that should be looked into. Another reason to check Google Webmaster Tools on a regular basis.

Manual Actions

Let’s hope this section is totally empty for you. The best message here is “No manual webspam actions found.” Google uses this section in Google Webmaster Tools to tell you what ‘penalties’ your website has received from the friendly people at Google. That isn’t a bad thing. In our ongoing quest for better websites (did I mention our site reviews in this post already?), the quality of a website is very important. The goal of a website should always be to serve the visitor the best information or products in the best possible way, and preferably Google should be one of these visitors. The visitor comes first. If you find a manual webspam action here, Google found something that doesn’t serve your visitor. These actions come in a number of flavors:

  • Unnatural links
    Make sure links from and to your site are valuable, and not there just for SEO. Preferably your links come from, and point to, related content that is valuable for your readers. Another unnatural link is a link from a detected link network.
  • Hacked
    A message stating your site has probably been hacked by a third party. Google might label your site as compromised or lower your rankings.
  • Hidden redirects
    Links to, for instance, affiliate sites that you have hidden using a redirect (e.g. cloaking). Cloaking and ‘sneaky’ redirects are a violation of Google’s Webmaster Guidelines.
  • Thin content
    If your website is flooded with low-quality content, for instance duplicate content or pages with little to no text, Google will value your website lower.
  • Hidden text
    Back in the day this worked very well: white text on a white background, stuffed with the keywords you wanted to rank for. Well, Google got smarter and will find that text. Again: write for your visitor, not for Google.
  • Plain Spam
    Again, you’re not following Google’s guidelines. Automatically generated content, scraped content and aggressive cloaking could be the reason Google considers your website pure spam.
  • Spammy freehosts
    If the majority of the sites on the same server as yours are considered spammy, the entire server might be blacklisted. And yes, your site might unintentionally suffer from this as well. Make sure to choose the right hosting company.
  • Spammy structured markup
    If you use rich snippets for too many irrelevant elements on a page, or mark up content that is hidden from the visitor, that might be considered spammy. Mark up what’s necessary, and not everything is necessary.

All these things are unnatural optimization or a sign of low quality. Luckily, Google provides information on recommended actions via Webmaster Tools. These might be lengthy processes, however, and take some hard work on your site. But hey, you were impatient and wanted that quick win.

In conclusion: prevent any messages from turning up in this Manual Actions section.

International Targeting

If you are running an international company, chances are your website is available in multiple languages. Although there is more than one way to do this, the best way is to set up different sites per top-level domain and link these websites using hreflang tags. Alternatives are telling Google about this via your sitemap or your HTTP headers. In this section, Google Webmaster Tools tells you whether the implementation is correct.
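
The hreflang implementation itself comes down to a set of link elements in the head of each language version, with every version listing all alternates, including itself; a sketch with made-up domains:

    <link rel="alternate" hreflang="en" href="https://example.com/" />
    <link rel="alternate" hreflang="de" href="https://example.de/" />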

Besides that, and often overlooked, you can actually select a geographical target audience here, on the second tab Country:

Google Webmaster Tools: International Targeting

If your business solely focuses on one country, why not tell Google that, right?

Mobile Usability

I could go on and on about Mobile Usability / User Experience. Mobile traffic is really important for a lot of websites, and Google does a nice job of emphasizing that to us webmasters. Google PageSpeed Insights has a mobile UX section, and so does Google Webmaster Tools:

Google Webmaster Tools: Mobile Usability

Webmaster Tools won’t only tell you what is wrong, but also how many pages these errors occur on, and which ones (just click the double arrow on the right next to the error).

In my opinion, most mobile errors that are highlighted here can be fixed with just a bit of CSS knowledge.
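
One of the most common errors, a missing viewport, takes just a single meta tag in the head of your pages; a sketch:

    <!-- Fixes a missing viewport: render at device width instead of a zoomed-out desktop view -->
    <meta name="viewport" content="width=device-width, initial-scale=1">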

Want to know more about Google Webmaster Tools?

As mentioned, we’ll be going over Webmaster Tools in a series of articles. This is only article number two, so please stay tuned and visit our website frequently to find out all there is to know.

If you have any additions to this section of Google Webmaster Tools, like alternative ways to look at this data, or combine it with other reports in Google Webmaster Tools or Google Analytics, we are very much looking forward to your thoughts. Any insightful additions might be used in future posts, of course! Thanks in advance.
