Branding is an essential aspect of your marketing strategy. If you develop an awesome branding strategy, people will get to know your brand, your company, and your website. In this post, I will first explain what branding is and why it can help you with your SEO. After that, I’ll give 5 practical tips you can use to improve your own branding strategy.

5 tips on branding

What is branding?

Branding is the process of creating a clear, unique image of your product or your company. Your audience should be able to recognize your brand. Whether it is a post on Facebook, your newsletter or the product section on your website, the image of your brand should be similar.

Branding is hard. In order to set up a successful branding strategy, you should first have a clear vision in mind of what your brand is about: What is your mission? What values are important? Which style fits your brand (formal or informal)? What does your preferred audience look like?

Why is branding important for SEO?

If you are able to set up a high-quality branding strategy, optimizing your site for the search engines will become much easier. The chances increase that your preferred audience will get to know your brand name. Your brand name could then become an incentive to click on your link in the search results (even if you’re not in the top three!). And, if you do your branding really well, people will start searching for your brand as well. It’ll be far easier to rank for your brand name than for a lot of other search terms.

In order to help you to set up a successful branding strategy yourself, I will share 5 practical tips:

Tip 1: Stay consistent

The most important thing in branding is to stay consistent. Develop a certain style and stick with it! Design a logo, and stick with it! Phrase your mission and stick with it! If you are consistent in the way you present your brand to your audience, people will eventually start to remember and recognize your brand.

Tip 2: Phrase a tagline and make it visible

Your tagline phrases the most important message about your brand or your product in a single sentence. Make sure it stands out on your website. You can, for instance, place your tagline below your brand name. The tagline of Yoast is: the art & science of website optimization.

If possible, try to write your taglines in an action-oriented way. You can do this by using verbs and sentences that imply an action for the visitor. For instance, we could have a tagline saying: ‘Keep your site optimized with the Yoast SEO Premium plugin!’. This shows people one of the core values of the plugin, and making it active will motivate a lot more people to actually try it.

Tip 3: Use images

Images are a very important aspect of your branding strategy. You can use pictures and illustrations on your website, in your newsletter, on Facebook or in (printed) advertisements. Of course, you should make sure your images fit your brand. If you sell ballet shoes, you should probably not use pictures of wild animals in the jungle. You would want to use pictures that express elegance and grace.

If you consistently pick illustrations and photos that fit your brand, your audience will eventually recognize and remember your brand from simply looking at your pictures. At Yoast, we work with two illustrators in order to make unique illustrations that will give the Yoast feeling to our audience.

If you use your own photos, you could try to develop some sort of consistent style. You can for instance make sure all your pictures have the same dimensions, use a similar way of editing or use similar pictures. On Facebook, we always put a text bar on our images. We include the title of our post and the Yoast logo in that text bar. That text bar ensures consistency within all of our Facebook posts.

Tip 4: Use your brand name

Make sure your brand name will become familiar to your audience. That means you should use that brand name! Perhaps you can use your brand name in one of your products like we do in Yoast SEO. Make sure to use your brand name in your newsletter and in your (Facebook) posts. People should hear and read your brand name regularly!

Tip 5: Use your logo

Your logo is of great importance to your branding strategy. Branding is more than designing an awesome logo though (that’s why this is the final tip and not the first one I share). Ideally, your logo should stand out; it should be something people recognize without any context. Designing a logo doesn’t have to be too expensive. Go check out 99designs for instance!

The colors you choose for your logo are of great importance as well. Make sure to use these colors elsewhere: in your newsletter, on your website, in images. If you use the same colors everywhere, these colors will become part of your brand. People will recognize your brand just by looking at the colors in your newsletter or in your Facebook post.

Once you have a kickass logo, make sure to use it! Present it to your audience: on your website, in your newsletter, on Facebook – everywhere!

Conclusion

If you develop a successful branding strategy, people will remember and recognize your brand. In the long run, your logo or brand name will be something that immediately evokes emotions. As people get more familiar with your brand, your SEO will get easier as well. Therefore, combining your SEO strategy with an awesome branding strategy is the way to go!

Read more: ‘Positioning your shop in the online market’ »

The robots.txt file is one of the primary ways of telling a search engine where it can and can’t go on your website. All major search engines support the basic functionality it offers, and there are some extra rules used by a few search engines that can be useful too. This guide covers all the uses of robots.txt for your website. While it looks deceptively simple, making a mistake in your robots.txt can seriously harm your site, so make sure to read and understand this.

What is a robots.txt file?

humans.txt

A couple of developers sat down and realized that they were, in fact, not robots. They were (and are) humans. So they created the humans.txt standard as a way of highlighting which people work on a site, amongst other things.

A robots.txt file is a text file that follows a strict syntax. It’s going to be read by search engine spiders. These spiders are also called robots, hence the name. The syntax is strict simply because it has to be computer readable. There’s no reading between the lines here: something is either 1 or 0.

Also called the “Robots Exclusion Protocol”, the robots.txt file is the result of a consensus among early search engine spider developers. It’s not an official standard set by any standards organization, but all major search engines do adhere to it.

What does the robots.txt file do?

Crawl directives

The robots.txt file is one of a few crawl directives. We have guides on all of them; you can find them here:

Crawl directives guides by Yoast »

Search engines index the web by spidering pages. They follow links to go from site A to site B to site C, and so on. Before a search engine spiders any page on a domain it hasn’t encountered before, it will open that domain’s robots.txt file. The robots.txt file tells the search engine which URLs on that site it’s allowed to crawl.

A search engine will cache the robots.txt contents, but will usually refresh it multiple times a day. So changes will be reflected fairly quickly.


Where should I put my robots.txt file?

The robots.txt file should always be at the root of your domain. So if your domain is www.example.com, it should be found at http://www.example.com/robots.txt. Do be aware: if your domain responds without www. too, make sure it has the same robots.txt file! The same is true for http and https. When a search engine wants to spider the URL http://example.com/test, it will grab http://example.com/robots.txt. When it wants to spider that same URL but over https, it will grab the robots.txt from your https site too, so https://example.com/robots.txt.

It’s also very important that your robots.txt file is really called robots.txt. The name is case sensitive. Don’t make any mistakes in the file name, or it just won’t work.

Pros and cons of using robots.txt

Pro: crawl budget

Each site has an “allowance” for how many pages a search engine spider will crawl on that site; SEOs call this the crawl budget. By blocking sections of your site from the search engine spider, you allow your crawl budget to be used for other sections. Especially on sites where a lot of SEO clean-up has to be done, it can be very beneficial to first quickly block the search engines from crawling a few sections.

Blocking query parameters

One situation where crawl budget is especially important is when your site uses a lot of query string parameters to filter and sort. Let’s say you have 10 different query parameters, each with different values, that can be used in any combination. This leads to hundreds if not thousands of possible URLs. Blocking all query parameters from being crawled helps make sure the search engine only spiders your site’s main URLs and won’t go into the enormous trap that you’d otherwise create.

This line would block all URLs on your site with a query string on it:

Disallow: /*?*

Con: not removing a page from search results

Using the robots.txt file, you can tell a spider where it cannot go on your site. You cannot tell a search engine which URLs it may not show in the search results. This means that not allowing a search engine to crawl a URL – called “blocking” it – does not mean that URL won’t show up in the search results. If the search engine finds enough links to that URL, it will include it; it just won’t know what’s on that page.

Screenshot of a result for a blocked URL in the Google search results

If you want to reliably block a page from showing up in the search results, you need to use a meta robots noindex tag. That means the search engine has to be able to crawl that page and find the noindex tag, so the page should not be blocked by robots.txt.
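For reference, the meta robots noindex tag is a single line in the page’s head section; here’s an example with a follow value, so link value can still flow through the page:

<meta name="robots" content="noindex,follow" />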

Because the search engine can’t crawl a page that is blocked by robots.txt, it can’t distribute the link value of links pointing to that page across the links on the page. If it could crawl, but not index, the page, it could still spread that link value across the links it finds on the page. When a page is blocked with robots.txt, the link value is lost.

robots.txt syntax

WordPress robots.txt

We have a complete article on how to best set up your robots.txt for WordPress. Note that you can edit your site’s robots.txt file in the Yoast SEO Tools → File editor section.

A robots.txt file consists of one or more blocks of directives, each started by a user-agent line. The “user-agent” is the name of the specific spider it addresses. You can either have one block for all search engines, using a wildcard for the user-agent, or specific blocks for specific search engines. A search engine spider will always pick the most specific block that matches its name.

These blocks look like this (don’t be scared, we’ll explain below):

User-agent: *
Disallow: /

User-agent: Googlebot
Disallow:

User-agent: bingbot
Disallow: /not-for-bing/

Directives like Allow and Disallow are not case sensitive, so whether you write them lowercase or capitalize them is up to you. The values, however, are case sensitive: /photo/ is not the same as /Photo/. We like to capitalize directives for the sake of readability in the file.

User-agent directive

The first bit of every block of directives is the user-agent. A user-agent identifies a specific spider. The user-agent field is matched against that specific spider’s (usually longer) user-agent. For instance, the most common spider from Google has the following user-agent:

Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)

A relatively simple User-agent: Googlebot line will do the trick if you want to tell this spider what to do.

Note that most search engines have multiple spiders. They will use specific spiders for their normal index, for their ad programs, for images, for videos, etc.

Search engines will always choose the most specific block of directives they can find. Say you have 3 sets of directives: one for *, one for Googlebot and one for Googlebot-News. If a bot comes by whose user-agent is Googlebot-Video, it would follow the Googlebot restrictions. A bot with the user-agent Googlebot-News would use the more specific Googlebot-News directives.
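To illustrate with a hypothetical robots.txt (the paths below are just examples): a bot identifying itself as Googlebot-Video would follow the Googlebot block, while Googlebot-News would follow its own, more specific block:

User-agent: *
Disallow: /not-for-any-bot/

User-agent: Googlebot
Disallow: /not-for-google/

User-agent: Googlebot-News
Disallow: /not-for-google-news/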

The most common user agents for search engine spiders

Below is a list of the user-agents you can use in your robots.txt file to match the most commonly used search engines:

Search engine | Field | User-agent
Baidu | General | baiduspider
Baidu | Images | baiduspider-image
Baidu | Mobile | baiduspider-mobile
Baidu | News | baiduspider-news
Baidu | Video | baiduspider-video
Bing | General | bingbot
Bing | General | msnbot
Bing | Images & Video | msnbot-media
Bing | Ads | adidxbot
Google | General | Googlebot
Google | Images | Googlebot-Image
Google | Mobile | Googlebot-Mobile
Google | News | Googlebot-News
Google | Video | Googlebot-Video
Google | AdSense | Mediapartners-Google
Google | AdWords | AdsBot-Google
Yahoo! | General | slurp
Yandex | General | yandex

Disallow directive

The second line in any block of directives is the Disallow line. You can have one or more of these lines, specifying parts of the site the specified spider can’t access. An empty Disallow line means you’re not disallowing anything, so basically it means that spider can access all sections of your site.

User-agent: *
Disallow: /

The example above would block all search engines that “listen” to robots.txt from crawling your site.

User-agent: *
Disallow:

The example above would, with only one character less, allow all search engines to crawl your entire site.

User-agent: googlebot
Disallow: /Photo

The example above would block Google from crawling the Photo directory on your site and everything in it. This means all the subdirectories of the /Photo directory would also not be spidered. It would not block Google from crawling the photo directory, as these lines are case sensitive.

How to use wildcards / regular expressions

“Officially”, the robots.txt standard doesn’t support regular expressions or wildcards. However, all major search engines do understand them. This means you can use lines like this to block groups of files:

Disallow: /*.php
Disallow: /copyrighted-images/*.jpg

In the example above, * is expanded to whatever filename it matches. Note that the rest of the line is still case sensitive, so the second line above will not block a file called /copyrighted-images/example.JPG from being crawled.

Some search engines, like Google, allow for more complicated regular expressions, but be aware that other search engines might not understand this logic. The most useful feature this adds is the $, which indicates the end of a URL. In the following example you can see what this does:

Disallow: /*.php$

This means /index.php could not be crawled, but /index.php?p=1 could be, because that URL doesn’t end in .php. Of course, this is only useful in very specific circumstances and also pretty dangerous: it’s easy to unblock things you didn’t actually want to unblock.

Non-standard robots.txt crawl directives

On top of the Disallow and User-agent directives, there are a couple of other crawl directives you can use. These directives are not supported by all search engine crawlers, so make sure you’re aware of their limitations.

Allow directive

While not in the original “specification”, there was talk of an allow directive very early on. Most search engines seem to understand it, and it allows for simple and very readable directives like this:

Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

The only other way of achieving the same result without an allow directive would have been to specifically disallow every single file in the wp-admin folder.

noindex directive

The noindex directive is one of the lesser known directives; Google actually (unofficially) supports it. We think relying on it is a very dangerous thing. If you want to keep a page out of the search results, you usually have a good reason for that, and using a blocking method that only keeps the page out of Google leaves it open for other search engines. It could be very useful in a specific Googlebot user-agent section of your robots.txt though, if you’re working on improving your crawl budget. Note that noindex isn’t officially supported by Google, so while it works now, it might not at some point.
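As far as we can tell, Google reads it much like a Disallow line; a hedged sketch with a made-up path (again: this is unofficial and might stop working at any time):

User-agent: Googlebot
Noindex: /internal-search-results/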

host directive

Supported by Yandex (and not by Google, even though some posts say it is), this directive lets you decide whether you want the search engine to show example.com or www.example.com. Simply specifying it as follows does the trick:

host: example.com

Because only Yandex supports the host directive, we wouldn’t advise you to rely on it. Especially as it doesn’t allow you to define a scheme (http or https) either. A better solution that works for all search engines would be to 301 redirect the hostnames that you don’t want in the index to the version that you do want. In our case, we redirect www.yoast.com to yoast.com.

crawl-delay directive

Supported by Yahoo!, Bing and Yandex, the crawl-delay directive can be very useful to slow down these three, sometimes fairly crawl-hungry, search engines. These search engines have slightly different ways of reading the directive, but the end result is basically the same.

A line like the one below would lead to Yahoo! and Bing waiting 10 seconds after a crawl action. Yandex would only access your site once in every 10-second timeframe. A semantic difference, but interesting to know. Here’s the example crawl-delay line:

crawl-delay: 10

Do take care when using the crawl-delay directive. By setting a crawl delay of 10 seconds, you’re only allowing these search engines to crawl 8,640 pages a day (86,400 seconds in a day, divided by 10). This might seem plenty for a small site, but on large sites it isn’t all that much. On the other hand, if you get little to no traffic from these search engines, it’s a good way to save some bandwidth.

sitemap directive for XML Sitemaps

Using the sitemap directive you can tell search engines – specifically Bing, Yandex and Google – the location of your XML sitemap. You can, of course, also submit your XML sitemaps to each search engine using their respective webmaster tools solutions. We, in fact, highly recommend that you do, because search engines’ webmaster tools programs will give you very valuable information about your site. If you don’t want to do that, adding a sitemap line to your robots.txt is a good quick option.
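The directive itself is a single line containing the full URL of your sitemap (or sitemap index); for example, with a placeholder domain:

Sitemap: https://www.example.com/sitemap_index.xml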

Read more: ‘several articles about Webmaster Tools’ »

Validate your robots.txt

There are various tools out there that can help you validate your robots.txt, but when it comes to validating crawl directives, we like to go to the source. Google has a robots.txt testing tool in its Google Search Console (under the Crawl menu) and we’d highly suggest using that:

robots.txt tester

Be sure to test your changes thoroughly before you put them live! You wouldn’t be the first to accidentally robots.txt-block your entire site into search engine oblivion.

Keep reading: ‘WordPress robots.txt example for great SEO’ »

We understand that, at times, you might be tempted to buy links. Who doesn’t want to start ranking quickly? And, would it really harm your site, even if it’s from a “trustworthy” link selling site?

Watch this video and know why you should never buy links!

Why you should never buy links

Can’t watch the video? Here’s a transcript!

“Well, first of all, because buying links is cheating and we don’t cheat!

Second of all, you might think a site looks trustworthy, but if you can buy links from them, who else can buy links from them? And, who else will they be selling links to, so that your link might not really be that worthwhile anymore? So, how do you know who they’ve already sold links to, and how that knowledge might get to Google?

If you’re buying links, then at some point, you’re going to get caught! If that happens, you’re going to lose everything you have, just because you thought you could speed yourself up to the top of the rankings. It doesn’t work, I’ve seen it a thousand times, it’s not worth it. Don’t buy links!”

In the series Ask Yoast we answer your question, on video! So send your SEO question to ask@yoast.com and finally get that answer!

Welcome to another Ask Yoast! This time we’ll take a question from Gary Cannon (Home España). He asks:

“What’s your opinion on SEO and using page tabs? Should each tab have an SEO text or just the first tab?”

Should you really use tabs? Check out our video or read the answer below!

Ask Yoast transcript

Well, in fairness, Google says, and testing shows, that Google really only counts the bit that a user can see. So if the first tab is the one that’s active, then the text on that is the text that Google counts in the ranking and everything else is basically a moot point.

That’s why a lot of people switch to a vertical page design now, where you don’t have tabs as much anymore, but really make people scroll down the page for all the different bits and pieces. This is what we prefer too. I hope this gives you some insight. Good luck!

Take this opportunity to ask Yoast your question. We try to answer all questions and you might even get a personal answer on video!

Read more: ‘How to optimize your real estate site’ »

Do you encounter any difficulties configuring your Yoast SEO plugin? Want to know more about all features and settings of the Yoast SEO plugin? We have great news for you! Joost has made a series of Yoast SEO plugin tutorials for every tab on every page of the Yoast SEO configuration pages. A playlist of a total of 33 screencasts is available for free for all of our users!

In these video tutorials, Joost de Valk will explain all the settings of the Yoast SEO plugin in detail. He’ll talk you through all the possibilities the plugin has and explain why you should configure your plugin in a certain way. With every update of Yoast SEO, new screencasts will be released. Check out our extensive playlist! And, don’t forget to subscribe to our YouTube channel so you won’t miss any updates!

Yoast SEO for WordPress training

For those of you who want to dive into all the possibilities our plugin has, we have developed a Yoast SEO plugin training. This training will be released on Wednesday the 30th of March. The training consists of these same 33 Yoast SEO plugin tutorials and 8 in-depth instruction videos in which Joost de Valk gives extra information about all the settings. On top of that, the course comes with a lot of challenging questions, to test whether or not you truly understand the Yoast SEO plugin.

Once you have completed the Yoast SEO plugin training, you’ll receive a badge and a certificate to put on your site. The badge and certificate are a Yoast seal of approval. If you have completed our course, you’re qualified and able to set up and properly configure our plugin.

Want to become a Yoast SEO plugin expert? The plugin training will be available next week! The Yoast SEO for WordPress training will cost $129 for the first year and $69 in the following years. The first week you can even get it for $99! The videos and the questions will be updated after every major release of Yoast SEO, making sure you’ll remain a real Yoast SEO plugin expert!

Read more: ‘Next week: Yoast SEO for WordPress training’ »


In Ask Yoast we handle your SEO question! This time we received a question from Stefan Wohlert from Venlo, just around the corner here in the Netherlands. He asks:

“How important is the Google PageSpeed score for SEO?”

In this video we explain what this score means for your site’s SEO:

Focus on a fast website

Can’t watch the video? Here’s the transcript:

The Google PageSpeed score itself is not important for SEO at all, because Google doesn’t – as far as we know – factor that score into the ranking. What they check is how fast your website loads for people across the planet. They just look at how fast your website loads for users, so you don’t have to obsess over that specific score. You have to make sure your website is as fast as you can get it. So, if the difference in speed between doing the fix to get the perfect score and not doing it is negligible, by all means don’t do it!

What you have to focus on is making the fastest website possible. Not only for Google, but because every other metric on your site will do better when your site is fast as well. So don’t obsess over getting 99% or 100% in Google PageSpeed, but obsess over making your website as fast as you can. And if the changes you have to make just for the Google PageSpeed score are too hard, don’t do them; focus on something else. Good luck!

Read more: ‘Site speed: tools and suggestions’ »

Let us help you out with your SEO question! In the series Ask Yoast we take questions from YOU! So don’t hesitate and send your question to ask@yoast.com.

There is a significant difference between a real estate site and a ‘regular’ website. Real estate sites have temporary content: when a property is up for sale, there is a page for it online, but when it’s sold, that page tends to disappear from the internet. In this post, I’ll tell you how to deal with that.

how to optimize your real estate site

First things first

Besides the ever-changing real estate pages, your website needs some more static content as well.

About

Even though you’re basically selling bricks, your bricks are quite expensive. It helps when you make your website a bit more personal. Introduce your team and add images of them. Add a short story about how selling real estate became a passion of yours, and a bit of history. All these things together make your website a lot more personal. A real estate agency that understands how to do this is Gottesman Residential.

Area

We have seen our share of real estate sites in our website audits, and I have to say that especially US-based real estate agencies know how to promote their specific area. Try to create levels in this. First, address your entire service area. If you’re serving the entire state, add content about what’s great about that state. Why should people move there? Why is buying a house there so interesting for your visitor? Second, see if you can find districts of that main area, like Central Texas and Northeast Texas. Find the metropolitan areas and create pages for specific cities. Obviously, the number of levels will vary per agency.

Update these area-based pages regularly, for instance with an event calendar and things like that. Your real estate site should become the Wikipedia of local things. RealtyAustin doesn’t just tell you why Austin is nice; it also provides things like a list of schools, a relocation guide, and neighborhood videos.

Contact and location

Personal contact is important for a lot of real estate buyers and sellers. That means that you’ll have to list your contact details in a prominent spot on your website. Make sure your telephone number is listed in a sidebar or header and add a contact page with contact details, a contact form and a map with the location of your office. Our Local SEO plugin will help you a lot in optimizing these details by adding schema.org markup to your address details. It also provides an easy option to add that map and even an option for directions.
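The plugin’s exact output will differ, but to give you an idea of what schema.org address markup looks like, here’s a generic JSON-LD sketch (all values are placeholders):

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "RealEstateAgent",
  "name": "Example Realty",
  "telephone": "+1-512-555-0100",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example Street",
    "addressLocality": "Austin",
    "addressRegion": "TX",
    "postalCode": "78701"
  }
}
</script>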

One more thing about contact forms: if you’re looking at a certain house, and like what you see, you simply want to contact the realtor. If the website has a contact form in the sidebar next to the estate details, that will make things easier.

IDX and MLS

IDX and MLS are ways of integrating listings, from yourself or other brokers, into your own website. WordPress offers plugins for that. An Internet Data Exchange (IDX) listing tends to be less detailed than a realtor’s Multiple Listing Service (MLS) listing. Both are based upon the same principle: add your listing to a central website and allow other realtors to share your listings via their websites. Both buyers and realtors benefit: every house that’s for sale is served to the potential buyer, and the potential buyer will be able to make a better selection before contacting the real estate agent.

I have seen a lot of real estate sites adding an iframe with IDX listings to their website. Let me emphasize again that content in an iframe isn’t on your website and won’t help your website rank as such. If you’re using IDX/MLS, please make sure to import the content to your own website and serve the listings in your own design. This does mean duplicate content, as the listings are available on multiple websites. The best SEO practice for your real estate site is to create unique listings for your own website. Note that this might interfere with creating the largest reach for the property you’re selling if your website doesn’t have that many visitors (yet). A way to use both could be to serve unique content on your site and link ‘similar properties’ via IDX/MLS services below that.

The ever changing content of your real estate site

All real estate sites have one thing in common: your real estate listings come and go. Of course, you’ve added a great description to your unique listing of the property. This description, your images and the address of your property will help you rank. If your listing appears in the search result pages, it should feel like a waste to delete it from your website right after the sale. So simply don’t.

First of all, if your real estate is sold, it will pay off to keep that listing online for, say, three months. Clearly state that the property is sold (perhaps even add ‘within 16 days’ to show your agency gets the job done for sellers as well). Also clearly list similar properties on that page, to point people who searched for a property in that street or district to other options.

Step two is an actual redirect. To optimize this properly, we first need to divide the location into a number of levels. In almost every case, it will pay off to create a hierarchical custom taxonomy for that location, going from state > city > district and as many more levels as your service area has. Doing so will make sure you’ll always have some kind of category to link to.

301 or 302 Redirect

If you redirect a listing and are confident that you can reuse the URL soon, you might consider using a 302 Redirect. That is a temporary redirect. If you are pretty sure the redirect is permanent, a 301 Redirect is the one to pick. That will also tell Google not to expect that page to return at all.

If after three months that property is still sold, redirect the page to a collection of properties in the same district. Preferably, you’d want these properties to have some similarities to the real estate you’ve just sold, like the same number of rooms or a location near schools; you probably know what the majority of your customers value most. The page you are redirecting to could be a taxonomy page or even the search result page for properties in that district. If you can optimize that search page with a proper title and description, it would work perfectly well as a substitute for a category page.

If no such page for the same district is available, go up one level and redirect the page to a similar page with results from within the same city. Broaden your location bit by bit. Doing so will keep the temporary URL of the initial property valuable for your website for a longer period of time.

If the property is up for sale again within a few months (which sometimes happens), remove the redirect and reuse the initial URL. After six months to a year, feel free to remove the initial redirect, as Google will understand by now that the property is gone from your catalog.

Should I add my listings to sites like Zillow or Realestate.com.au?

I would add your listings to the larger real estate search platforms as well. Most of the searches for real estate are probably not done on brokers’ sites, but on sites like Zillow. These sites provide demographics on the neighborhood you want to move to and even things like crime rates. They probably have a larger team than you working on these facts and figures, so their information might be a bit more accurate. All of this makes these sites very attractive for people who are looking for a new house. If you’re not on these websites, or haven’t added a similarly great description and the same number of images to your listings there, you’re missing out on a lot of potential buyers.

Real estate sites like Zillow and Realestate.com.au should be considered an essential part of your marketing mix, like social media marketing most probably already is. Use them to show off all the great real estate you’re selling to a larger audience.

I trust this article has given you something to think about. If you have something to add to it or want to share your experience (or real estate website), feel free to leave a comment below!

Read more: ‘Local SEO: setting up landing pages’ »

Optimizing your images is one thing, but optimizing a page filled with images really is another. Of course, there is a certain overlap in how to optimize a single image and a gallery. In this post, I will go over the things you need to take into account when using photo galleries on your website.

Optimize your WordPress gallery

Social sharing of galleries

I have been testing social sharing of galleries on Twitter and Facebook, but found that all the good stuff like Twitter Gallery Cards has disappeared.

Twitter and multiple images

The ‘Summary Card with Large Image’ replaces the deprecated Gallery Card, which allowed you to share photo galleries via Twitter. Currently, that isn’t possible anymore. If you want to show multiple images in a Tweet (or Facebook post for that matter), you’ll have to upload these manually.
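For reference, the Summary Card with Large Image is defined by a handful of meta tags in the head of the shared page; a minimal sketch with placeholder values:

<meta name="twitter:card" content="summary_large_image" />
<meta name="twitter:title" content="Our spring garden photo gallery" />
<meta name="twitter:description" content="Thirty photos of the garden in full bloom." />
<meta name="twitter:image" content="https://example.com/images/garden-gallery-cover.jpg" />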

My recommendation when sharing a page that contains a photo gallery as its primary content is to upload three or four images from that gallery to Twitter, so the tweet shows those images alongside the link.

The fact that you have to upload these images to Twitter seems to fit the ‘all your content belongs to us’ trend in social media ;)

Facebook photo albums

Facebook allows you to add photo albums. As Google and Facebook have little overlap, you might want to consider creating a Facebook album for your photo gallery as well. Promote it separately. I totally understand that you primarily want your photo gallery on your website and your website only. From my personal experience, I can tell you that a proper Facebook album also works well. Be sure to add the right title and description. Your Facebook page might have a different audience than your website. Be sure to test this for your own content.

The title of your photo gallery

The one thing everyone can control is the title of your gallery. It’s the title of the page, and it’s the title that will most probably show up in Google’s search result pages as well. In our WordPress SEO article, we already mentioned that you should make sure the title is topical and contains the keyword you want to optimize that gallery for.

In that article, and in our plugin’s content analysis, we recommend adding that keyword early in the title, which Google seems to like. Besides that, scanning the search results becomes easier with the desired keyword as one of the early words.

Introductory content for your gallery

Every time we discuss taxonomies, categories or things like that, we mention introductory content. Like your category pages, a photo gallery is a collection of things. It doesn’t matter if these collections contain posts or images, you want to tell your visitor (and Google) what the common ground for your collection is.

Explain why you set up the gallery:

  • Are these photos from a certain event?
  • Are these photos of a certain product?

My gut feeling tells me these are the two main reasons to set up a photo gallery (feel free to add yours). Describe the event or product and by all means link to other pages that contain more in-depth information about the topic. Your introductory content should be the glue that connects all the separate items together, but shouldn’t have to be the main content you want to rank with for a certain topic. If it is, your gallery is probably illustrative and not the main part of your page’s content.

The obvious: alt tags, captions, and file names

There are similarities in optimizing images and photo galleries. You have to make sure your alt tags are descriptive. For better scanning and a text to accompany the images in your gallery, you want to add captions. For more information on how to do this, I’d like to point you to my article on image SEO.
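As a quick reminder of what that boils down to in the HTML of your gallery, here’s a simple example (the file name and texts are made up):

<figure>
  <img src="dandelions-in-the-garden.jpg"
       alt="Close-up of dandelions in a spring garden" />
  <figcaption>Dandelions taking over the back garden in April.</figcaption>
</figure>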

If you want to go all the way, every single image in your photo gallery needs to have a unique, descriptive file name. In practice, this will probably almost never be the case. It’s a lot of work, and depending on the subject of your gallery, a lot of images will cover the same subject. Most galleries will have file names like garden-flowers-01.jpg to garden-flowers-10.jpg, where dandelions.jpg to sunflowers.jpg would work better in the end. In reality, you’ll understand that this might be a bit too much hassle to be workable.

AMP and its carousel markup for galleries

We have been talking about AMP a lot lately. AMP has a way to deal with galleries as well. I have been testing this with a default WordPress gallery and the AMP plugin by Automattic. It creates a carousel of your gallery, which actually works pretty nicely. This is done by adding the amp-carousel tag, which is for “displaying multiple similar pieces of content along a horizontal axis; meant to be highly flexible and performant.” Examples can be found here.

The code for that would look something like this:

<amp-carousel width=300 height=400>
  <amp-img src="my-img1.png" width=300 height=400></amp-img>
  <amp-img src="my-img2.png" width=300 height=400></amp-img>
  <amp-img src="my-img3.png" width=300 height=400></amp-img>
</amp-carousel>

Example taken from the ampproject.org website, by the way. The WordPress plugin creates just that for you.

The only thing that seems to fail sometimes is scaling. I don’t think this is an odd issue, as AMP prefers set dimensions, and I’m sure it will be fixed over time. Overall, this is a really nice solution for displaying galleries in accelerated mobile pages.

When I was looking into similar plugins for Joomla! and Drupal, I found this piece of information telling me it’s coming to Lullabot’s Drupal AMP module later on. Weeblr released a Joomla! plugin called WbAMP, but I can’t locate any information on how it deals with images or, more specifically, with photo galleries. Anyone tried this one already?

Wrapping things up

I understand that there are things in this post you can’t ‘just’ implement like that. Changing every single file name to match the subject of the image is a lot of work. I would recommend against doing that for your existing photo galleries, but it might be something to test in future ones. Always add introductions to your galleries, and make sure your captions and alt texts are correct. Just remember that it’s the entire page you want to rank, not just that single image!

Read more: ‘Optimizing images for SEO’ »

My previous post about AMP led to a ton of questions, and rightfully so. We’ve been testing, developing and working hard in general on understanding what needs to be done to get AMP working without too many errors. This post is an update on where we stand right now; it introduces an updated Yoast SEO AMP Glue plugin with new features and gives some more background on the why and what of it all.

The need for multiple plugins

The base AMP functionality is provided by the WordPress AMP plugin. In my previous post I recommended Pagefrog to add styling and tracking to your AMP pages. While it is a nice plugin, it caused more issues for us than it solved. The plugin adds a preview to every post edit screen; this preview is unneeded, there is no way to disable it, and it literally caused browser crashes in our backend.

The issues we had with Pagefrog made me decide to put in some time and create a set of design settings in our Yoast SEO AMP Glue plugin. When you update to version 0.3 of that plugin, you can safely disable Pagefrog and configure the styling on the SEO → AMP design settings tab:

Extra styling options

The Yoast SEO AMP Glue plugin also lets you put in manual CSS and some extra tags in the head section. This allows us, for instance, to have our preferred weight of our Open Sans font available and make the styling fit our brand a bit more.

You can also enable AMP for custom post types on the post types tab. The only post type that doesn’t work yet is pages, as support for that is being added to the main AMP plugin.

WordPress AMP design settings

Errors & testing AMP

We were getting quite a few errors in our Google Search Console AMP report for yoast.com. You can see our indexation and error graph here:

Screenshot of our AMP indexation and errors graph in Google Search Console

AMP debug mode

You can put any AMP URL into “debug mode” by adding #development=1 to the end of the AMP URL. If you then look in your browser’s console – you might have to reload the page – you’ll see the AMP validation warnings. These are the exact same warnings that Google shows in Google Search Console. There are quite a few different types of errors, and the Google Search Console report groups them for you.

I realize the error line in the graph above is not exactly convincing of our quality yet, but the drop in errors we saw made clear that we were doing some things right. We have about a thousand posts on this blog, and almost a hundred on our development blog, so it’s clear that not all of our content is indexed as AMP yet, and not all of our AMP content is working nicely.

Missing featured images

The biggest source of our issues was Schema.org article errors. These were caused by one simple issue: a lot of our posts, especially the older ones, didn’t have a featured image. The WordPress AMP plugin then simply outputs schema.org JSON+LD tags without that image, causing those errors. The fix is simple: we now have a “default image” field in the design tab of our Yoast SEO AMP Glue plugin’s settings. It’s used when a post has no featured image. This solved half of our errors.
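To give you an idea of what’s involved: the schema.org output for a post is a block of JSON+LD in the AMP page, roughly like the simplified sketch below (not the plugin’s literal output; all values are placeholders). The image property is the part that was missing for posts without a featured image.

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "BlogPosting",
  "headline": "Example post title",
  "datePublished": "2016-04-01T09:00:00+00:00",
  "image": {
    "@type": "ImageObject",
    "url": "https://example.com/images/default-amp-image.jpg",
    "width": 696,
    "height": 392
  }
}
</script>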

Testing Schema.org errors

To test whether you will be getting Schema.org errors, run your AMP URLs through the Google Structured Data Testing Tool. The output from that tool tells you which data is missing.

Missing site logo

The JSON+LD output also requires a site logo. While this is not an error we ourselves had, many reported this issue. The AMP plugin uses the logo set as your site icon in the Customizer, and omits it if you don’t have one set. We now let you upload a logo on the design tab of the Yoast SEO AMP Glue plugin too, if you want to use a different one.

Retrofitting AMP onto existing content

Part of what we’re doing with the AMP WordPress plugin and the Yoast SEO AMP Glue plugin is “fixing” content that exists in your database to work with AMP. The posts on your site are stored as HTML in your database, and the HTML of those posts does not necessarily conform to what AMP HTML requires. For this purpose, the AMP plugin has a set of so-called “sanitizers”. These are filters, run over your content, that remove tags and attributes on tags that aren’t allowed. They even remove some attributes when their values aren’t allowed.

We’ve added an extra sanitizer class in our own plugin to remove some more invalid attributes. Once we’re certain that these work, we’ll actually contribute these changes “upstream” to the AMP plugin. These changes have fixed the remainder of the issues we had.

Analytics integration

The only thing we lacked after Pagefrog was removed was tracking: Pagefrog took care of Google Analytics tracking for us. Luckily, adding tracking to AMP pages isn’t hard, so we coded a simple connection to our Google Analytics by Yoast plugin. If you have that enabled and configured, the plugin will automatically grab the account code from it and enable tracking for your AMP pages. You can, however, also choose to use a custom tracking code. If you do this, the plugin no longer integrates with Google Analytics by Yoast.
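For the curious: tracking on AMP pages goes through the amp-analytics component. Here’s a minimal sketch of the kind of markup that ends up in an AMP page (the UA code is a placeholder):

<amp-analytics type="googleanalytics">
  <script type="application/json">
  {
    "vars": { "account": "UA-XXXXXX-1" },
    "triggers": {
      "trackPageview": {
        "on": "visible",
        "request": "pageview"
      }
    }
  }
  </script>
</amp-analytics>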

Facebook Instant Articles

Another thing Pagefrog takes care of is Facebook Instant Articles. There’s now a plugin from Automattic for that purpose, which we’re working on integrating Yoast SEO with. So you won’t need Pagefrog for Facebook Instant Articles either.

Conclusion

With all these changes, getting AMP to work on a WordPress site running Yoast SEO has become slightly easier and a lot less error-prone. We’ve updated our Setting up WordPress for AMP post with these changes. Good luck, and do let us know of errors in the comments!

If your website is all about news, there’s quite a lot that can be done to optimize your news site for search engines (and your users in the process). In this post, I’ll address a number of things you need to consider for your news website.

Optimizing your news site

First of all, if you have a construction business and a news section on your website, that isn’t a news website. It will be a lot harder to get into Google News and your news might not be indexed as quickly as news on a website that is focused on news alone, like the Huffington Post or The Guardian. Google says this about it: “Google News is not a marketing service.” Send your press releases elsewhere :)

WordPress? News SEO!

If you’re using WordPress, we have a plugin that takes care of a lot of the things mentioned in this article. The News SEO plugin for WordPress “creates XML News Sitemaps, editors’ picks RSS feeds and allows for use of the standout tag and the meta news_keywords tag as well as helping you optimize some of the more advanced XML News sitemap options like stock tickers.” Be sure to install that plugin if you’re serious about your news website.

Accelerated Mobile Pages (AMP)

This goes for every website, but with the huge number of people checking news on their mobile devices, you want to make sure this is done right on your news site. AMP’s goal is to show articles instantly on a mobile device, instead of loading the full-blown desktop site or all the fancy things we created in our responsive website. AMP strips away all the design and fancy stuff and focuses on delivering the main content ASAP.


There’s a WordPress plugin by Automattic to create Accelerated Mobile Pages, and this plugin by PageFrog is a nice extra for that. Be sure to read our latest recaps and definitely read this post Joost did about Accelerated Mobile Pages (AMP) for WordPress if you want more information about the subject. It will also tell you about our plugin Glue for Yoast SEO and AMP, which makes sure AMP works with our SEO plugin.

Crawl speed

You want your news indexed and indexed fast by Google. There are a number of things that can help with this. You obviously optimize site speed, but also make sure that XML sitemaps are available. Note that a news sitemap isn’t like your regular XML sitemap. Google has some guidelines for that:

  • Your news sitemap can only contain news from the last two days (and articles will remain in the news index for 30 days).
  • Update your news sitemap continually with fresh articles as they’re published.
  • The limit for URLs per news sitemap is 1,000 URLs.
  • Don’t create a new sitemap per new article, but rather update your existing sitemap.
  • Don’t use the Google Sitemap Generator to create a news sitemap, but feel free to use our News SEO plugin for that.

More information about Google News sitemaps here.
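To give you an idea of the format, a single entry in a Google News sitemap looks roughly like this (URLs and names are placeholders; our News SEO plugin generates a sitemap like this for you):

<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
  <url>
    <loc>https://example.com/2016/04/big-news-story/</loc>
    <news:news>
      <news:publication>
        <news:name>Example News</news:name>
        <news:language>en</news:language>
      </news:publication>
      <news:publication_date>2016-04-01T09:00:00+00:00</news:publication_date>
      <news:title>Big news story</news:title>
    </news:news>
  </url>
</urlset>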

Crawl speed will also improve if you consistently post new articles on your news website. If Googlebot finds new stuff on your website with every crawl, it’ll come to your website more often. This also means you need to have a solid hosting server, so your website is up every time Googlebot visits your site. That very Googlebot might actually cause downtime if it visits your site too often and your server is crappy. Read more about fixing this here.

One more thing. If you serve excessive, unneeded content to that Googlebot, it’ll waste valuable crawl budget on less valuable pages. Optimize site structure, be sure to block unwanted pages via robots.txt or robots meta, and obviously avoid duplicate content at all times.

Site structure of your news site

We’ve done a post on site structure for blogs that works pretty similarly for news websites. On your news website, you’ll have articles on one side, and taxonomies like categories and tags on the other. Be aware of that second structure, as it might be more important than your articles that come and go. Optimize these taxonomy pages to your very best effort.

Marieke told you about these steps:

  1. Evaluate your categories
  2. Add sub-categories and tags
  3. Add pagination
  4. And again: Get rid of outdated content

Read more: ‘The site structure of a growing blog’ »

Standout and news_keyword tags

There are a few extra tags you can add to your news articles that will help you get indexed properly by Google: the standout tag and the news_keywords tag.

Standout tag

The standout tag can be used to highlight your original reporting. It’s for news that you, as a news organization, have created yourself. This might lead to Google listing it as “featured” in the news results. More about it in the documentation of our News SEO plugin.

News_keywords tag

We said goodbye to meta keywords a while back, but Google still uses a variation of that tag: meta news_keywords. Just for Google News, by the way. It’s not something magic that’ll make you rank better, but it helps Google News determine what your article is about. There’s no use in adding this tag if you’re not in Google News already. Again, more about it in the documentation of our News SEO plugin.
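For reference, both tags end up as a single line in the head of the article; here are hypothetical examples with placeholder values:

<link rel="standout" href="https://example.com/2016/04/big-news-story/" />
<meta name="news_keywords" content="example, google news, seo" />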

Update your article as news comes in

You want to publish news as soon as it finds you. Be sure to do so, and update your article during the day to make sure you’re bringing the right, current news. You don’t need to add a new article per update – please don’t. Google News wants you to keep a permanent link for your news article. What you should do is update the title of your article, as that will tell Google your article has been updated. This allows you to grow a news article about an event during the event itself, for instance, so the article that’s already in Google News always contains the latest news.

Let’s ask the expert

I asked Martijn Scheijbeler, Director of Marketing at The Next Web, what he considered the most important thing when it comes to running a decent news website. This is what he told me:

What makes SEO different?

“You ever wondered what makes SEO different for publishers than it is to, for example eCommerce stores? We have the same issues as they have, increasing search traffic, making sure pages don’t have duplicate content. Although our lives are a little bit easier as we don’t have to worry about content marketing with a 15-people strong editorial team.
In our case, we have to worry even more about the speed of our users with new supporting technologies coming up, like Google AMP, Facebook Instant Articles and Apple News which all have their ways to support content on another platform.”

These are interesting times. I totally agree that it’s not just speed or structure that matters. You should always be aware that your audience uses a wide variety of apps, on multiple devices, to read your news. Martijn already mentioned a number of these, but what about Flipboard, smartwatches and Nuzzle? You have decisions to make. Do you want to serve the full article on all these apps and devices, or would you rather have people click through to your website? Do you have sufficient social reach to leverage the ever-growing number of apps that recommend news based on your social followers and influencers? You just have to keep up with developments and find out what your target audience wants you to optimize for.

In conclusion

There are a lot of things to consider when you’re running a news website and want to use, for instance, Google News to the fullest. In this article we’ve told you about AMP, crawl speed, updating your titles, site structure and our News SEO plugin.

Technological developments go fast. I am sure that as soon as we’ve all fixed AMP for our websites, something new will come along. I think Joost rightfully compared AMP to WAP. It pushes us to think about our websites, and news sites in particular, as (speaking for myself) these are the sites we visit most on mobile devices. In a variety of apps, let me emphasize that again. It pushes us to make our websites adjust to new screen sizes, fit into new apps and make news sites blazing fast.

I hope this article gives you something to think about, something to work with. As always, I am eager to learn from your own experiences. Drop me a note on Twitter or leave a comment below!

Keep reading: ‘Social Media strategy’ »