5 ways to boost your Core Web Vitals

“If only I could simply wave my wand and have a super fast website!” This has probably crossed your mind as well, right? Optimizing site speed and user experience is a lot of work and gets technical — and complicated — really fast. Most site owners or managers quickly need to talk to their developers to get stuff done. Now, the new Core Web Vitals metrics give you more insights and pointers to what to fix. Let’s go over five things you can do to boost your Core Web Vitals score.


First, a disclaimer

Look, there’s not one thing that’s guaranteed to fix one specific issue. You have to take a broader view of optimizing your site. A lot of little fixes make up big results. So, while I’ll give you five things you can work on here, this is nowhere near definitive. Even Google says many elements work together to come up with scores, so it’s hard to pinpoint that if you do this, then that specific score will go up.

What Google does give you is insight into what’s slowing stuff down or what’s hurting the user experience. Many tools also give advice on how to fix stuff. Web.dev/measure, for instance, doesn’t give in-depth results, but it does give you an idea of what the impact of a particular fix is.

Google’s Web.dev/measure tool gives you an idea of the impact a fix can have

Google’s upcoming page experience update

We’ve published a couple of articles about Google’s page experience update — coming sometime in 2021 — so you can start there if you need more background information.

Five things you or your developer can do

Over the years, there’s been constant talk about the importance of site speed and user experience. But while there’s a ton of material out there on how to optimize your site, putting that knowledge into practice is hard. These past few months, Google once again put speed front and center with the page experience update happening next year. To help you get ready for that, it developed tools to give you insights and a lot of documentation to read.

For a lot of issues, the advice hasn’t really changed that much. It all boils down to getting the main content to your users as quickly as possible. Run through the test to see how your site performs, try to prioritize the fixes and get started! Below you’ll find a mix of old and new ways of enhancing your site.

Optimize your images

I’ll start off this list with a golden oldie: optimizing images. One of the most important things you can do for your site is properly optimize your images. Yes, we said that a million times but we’re going to say it again: do it. That one big unoptimized image on your homepage or landing page might hurt you. Large images are often the largest contentful paint (LCP) for any given site. Make sure you give your visitors a proper welcome by making that load quick!

We have a popular article on image SEO describing what you can do to get that image to load quickly. But in short, make sure you serve it in the size needed and compress it well. There are loads of tools to help you do this. Personally, I love the results I get with squoosh.app. Don’t think you need to keep that massive resolution for that image to be sharp on the most common screens.

Also try to adopt modern formats like WebP. These formats can deliver high-quality images at a smaller file size. WebP is well-supported and even Apple has jumped on board! The upcoming Safari 14 release — on both macOS and iOS — will support WebP. Yes, the new Chromium-powered Microsoft Edge browser also supports WebP.

Your CMS is also a tool that’ll help you improve the loading of images. Due in August, WordPress 5.5 will support lazy loading of images. This means the browser only loads the images that appear on screen, leaving the rest to load when the user scrolls down to them. In other words, it tells the browser to load large images only when they are needed.
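
If you manage your own templates, you can use native lazy loading yourself; it’s a single attribute on the image tag. A minimal sketch (the file name is just an example):

<img src="product-photo.jpg" width="640" height="360" loading="lazy" alt="Product photo">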

Another piece of evergreen site speed advice is the use of a CDN, but did you know you can also use a CDN specifically for images? An image CDN gives you more control over how you serve images and how you want them to appear. An image pushed by an image CDN gets a string of properties in its URL that tells the CDN how the image should be resized, compressed and served.
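
The exact parameters differ per provider, but such a URL often looks something like this (a hypothetical endpoint, purely for illustration):

<img src="https://images.example-cdn.com/mountain.jpg?width=640&format=webp&quality=75" alt="Mountain underneath a cloudy sky">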

Stabilize loading by specifying room for images and the like

One of the new metrics is cumulative layout shift, or CLS for short. An example of this is when a mobile page looks ready and, just when you want to hit a button, the content shifts and a slow-loading ad appears in that place. This happens often and is one of the main causes of frustration for users. Now, while optimizing your CLS won’t necessarily make your page faster, it sure makes it feel faster.

CLS is often caused by images without dimensions. It can also be caused by ads and embeds without dimensions, or by dynamically injected content. When not properly given dimensions, these elements tend to jump a bit during the loading process, making it appear jerky and unstable. It might also be due to new content being inserted above existing content. Don’t do that, except maybe after an explicit interaction by the user.

One of the ways you can prevent CLS is by adding width and height attributes to your images. This way, the browser will reserve space for that image, which will probably appear later than the text. Now, the jerkiness will disappear because the browser knows that something will be added in due time. You could think about adding some sort of low-resolution placeholder if you want something to appear quickly.

So, simply make sure that your images have proper width and height attributes set. Of course, you can also do this with regular responsive images. Just make sure that you are using the same aspect ratio for all sizes.

<img src="mountain.jpg" width="640" height="360" alt="Mountain underneath a cloudy sky">
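
For responsive images, the same idea applies: as long as every size in the srcset shares the same aspect ratio, the browser can reserve the right amount of space. A sketch, with placeholder file names (both sizes are 16:9):

<img src="mountain-640.jpg" srcset="mountain-640.jpg 640w, mountain-1280.jpg 1280w" sizes="(max-width: 640px) 100vw, 640px" width="640" height="360" alt="Mountain underneath a cloudy sky">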

To cope with jumping ads or injected content, please reserve space for these as well. In the end, your CLS might just come down a bit.
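
Reserving space for an ad slot can be as simple as giving its container a minimum height in your CSS, so the ad fills a hole that was already there instead of pushing content down. A minimal sketch, assuming a 300x250 ad unit:

<style>
  /* reserve space for a 300x250 ad before it loads */
  .ad-slot { min-width: 300px; min-height: 250px; }
</style>
<div class="ad-slot">
  <!-- the ad script injects its creative here; the space is already reserved -->
</div>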

Speed up your server to get that loading time down

The faster your server responds to requests, the better. Getting that server to respond quicker directly improves a lot of site speed metrics. On complex sites, the server is kept busy handling requests and serving files and scripts, so it’s best to optimize those processes.

Optimizing your server consists of several parts. First, look at your hosting plan. Don’t skimp on hosting; pick one that offers good performance at a fair price. Also, there’s the business of how the server was set up — use a recent version of PHP! — and what hardware you picked. Maybe you should upgrade the hardware if you find that lacking. Also, you need to research how your databases work and see if you can make improvements. Use tools like the Query Monitor WordPress plugin to analyze the queries on your site.

You can also look into how your server pushes files to clients. There are several ways to enhance that process, with link rel=preload for instance, or HTTP/2 server push. These are more advanced solutions that let you fine-tune how your server responds to requests. Here, again, a CDN can do wonders.
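
As a sketch, a preload hint is a single line in the <head> that tells the browser to fetch a critical asset early (the file paths here are examples):

<link rel="preload" href="/css/main.css" as="style">
<link rel="preload" href="/fonts/heading.woff2" as="font" type="font/woff2" crossorigin>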

Look into critical CSS to load above the fold content quicker

When the browser loads a page, it has to get the HTML, render it, get the CSS, render it, get the JavaScript, render it, et cetera, et cetera. The more files you need to load your site and the bigger these are, the slower your site will load. Often, while the browser is busy doing stuff, it can’t load things in the background. Certain elements block the process. So-called render-blocking JavaScript and CSS influence everything.

Since the CSS loads late, it can often take a while for something to appear on screen. By taking the critical bits of your design — the part that appears above the fold — out of the main CSS file and inlining it in your code, you can get something on screen much faster. Fixing this, once again, doesn’t make your site faster, but it makes it appear faster. All for that ace user experience.
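
In practice, that means putting a small <style> block with just the above-the-fold rules in the <head>, while the full stylesheet loads without blocking rendering. A rough sketch, using a common pattern for deferring non-critical CSS:

<head>
  <style>
    /* critical, above-the-fold rules only */
    header { background: #fff; height: 80px; }
    h1 { font-size: 2rem; }
  </style>
  <!-- load the full stylesheet without blocking the first render -->
  <link rel="stylesheet" href="/css/main.css" media="print" onload="this.media='all'">
</head>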

To get a set of critical CSS, you can choose from a number of tools or you can do it by hand. In addition, you can use WordPress caching plugins like WP Rocket. WP Rocket has a simple button called Optimize CSS delivery. Activating this helps eliminate render-blocking CSS and enhance the loading of your site. Of course, WP Rocket also does other cool stuff like minifying CSS and JavaScript and deferring the loading of JavaScript.

Improve loading of third-party scripts

For many sites, slowness also comes from outside. If your site relies on ad scripts, for instance, you are basically in the hands of the ad provider. You can only hope that they make their ads performant. If their ads load really slow, well, maybe it’s time to find another provider.

If you find that third-party scripts slow down your site, you should look into this. Ask yourself, do I really need that particular ad? What’s the value of these scripts? There might be a different option out there that’s a bit more optimized and less stressful for your server. Maybe try that?

If possible, you can experiment with hosting the script yourself. This way, you’re a bit more in control of the loading process. If you can’t do that, see if you can preload the script so the browser requests it earlier.

At the least, make sure to load the scripts asynchronously or defer them till the last moment. This way, the browser can build the page first before getting and running the external scripts. Use async if the script you’re loading is important, like an analytics script. You can use defer for less critical resources. There’s loads of documentation on optimizing third-party scripts.
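
In HTML, that is a single attribute on the script tag. A sketch with placeholder URLs:

<!-- runs as soon as it has downloaded, without blocking the HTML parser -->
<script async src="https://example.com/analytics.js"></script>
<!-- waits to run until the document has been parsed -->
<script defer src="https://example.com/chat-widget.js"></script>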

Boost Core Web Vitals: All small improvements count

With the upcoming page experience update, Google put site speed and user experience front and center again. We’ve always looked at SEO holistically — there are many moving parts and you should work on all of them to build the best site out there. Although the tips mentioned above can help you improve those scores, you really should be doing this to offer your visitors a better experience.


A linguistic approach to creating content that ranks

Search engines are trying to understand language: they want to understand what users are searching for, and they want to be able to provide them with the best results. Before I started working at Yoast, I studied linguistics: the scientific study of language. During my years at Yoast, I’ve noticed that linguistics and SEO have a lot of overlap. In this article, I want to give you some SEO insights from a linguistic perspective. Let’s dive in!

Different aspects of language

Before we can go into the linguistic approach to SEO, we first have to understand what language is. Language consists of many different aspects. Think about it: we make speech sounds or write letters, which together form words. We put these words in a specific order, so they form sentences and phrases. And these sentences mean something to us.

Sometimes we also want to achieve something with language. For example, when we say “it’s cold in here,” we might not only want to express we’re cold, but we could mean it as a request to close the window. To study all of these aspects, we distinguish different levels of language in the field of linguistics.

Linguistic levels of language

The most basic level is the level of sounds and letters, which we call phonology (when it comes to speech) and graphology (when we talk about writing). Then, there’s the morphological level, which studies how these sounds and letters together make words and different word forms. For example, the word “house” can be combined with “tree” to make “treehouse” and with “dog” to make “doghouse,” but we can’t really combine it with “banana.”

The next level, syntax, describes the rules we have for creating sentences. There are a million words we can choose from that we could use to form an infinite number of possible sentences. But these syntactic rules allow us only a small number of ways in which these words can be combined.

The level of semantics studies the meaning of different elements of language. What do we mean when we say something, and how do we understand others? Finally, pragmatics looks at meaning within a context. For instance, someone could say: “I’m getting hot, will you crack open the door?” Semantically, “crack” would mean “to break,” but pragmatically, we know that they don’t actually want us to break the door; they want us to open the door to let in some fresh air.

The levels of language and the corresponding fields of linguistics:

  • Sounds and letters: phonology (speech) & graphology (writing)
  • Words and word forms: morphology
  • Sentences and rules: syntax
  • Meaning: semantics
  • Context and language use: pragmatics
Sources: Crystal (1987), Hickey (2005)

Which levels of language can Google understand?

Okay, but what does this have to do with search engines? Well, search engines are trying to understand language the way humans do. And they’re getting better and better at it. A couple of years ago, search engines could only understand basic elements of language: they could recognize keywords in your content. Because of that, it was common practice to optimize just for keywords.

But times have changed. Search engines are becoming smarter and smarter, and they are getting better at understanding more levels of language. Google is now trying to understand language at the level of syntax, morphology, semantics, and even pragmatics. How? Let’s find out.

Understanding what characterizes high-quality content

With every update, Google tries to get closer to understanding language like the human brain does. The Panda update (2011) addressed thin content and keyword stuffing. People could no longer rank high with low-quality pages filled with keywords. Since this update, Google has been trying to understand language at the semantic and pragmatic levels. They want to know what people deem high-quality content: content that genuinely offers information about the search term they used.

Read more: Google Panda »

Understanding the meaning of phrases

A few years later, with the Hummingbird update (2013), Google took a deeper dive into semantics. This update focused on identifying relations between search queries. It made Google pay more attention to each word in a search query, ensuring that the whole search phrase is taken into account, rather than just particular words. They wanted to be capable of understanding what you mean when you type in a search query.

Google took that even further. Since they rolled out the RankBrain algorithm in 2015, they can interpret neologisms (words that have not yet been fully accepted into mainstream language, like “coronacation”), colloquialisms (casual communication, like “ain’t” and “gonna”), and they can process dialogues.

Read more: A brief history of Google’s algorithm updates »

Understanding different word forms

Google also has become a lot better at understanding different forms of a word or phrase. You no longer have to stuff your articles with the same keyword over and over again. If you’re writing an article about [reading books], Google will recognize various forms of these words, like [read], [reads], and [book]. What’s more, Google also understands synonyms. Write about [novel], [chronicle], and [volume], and Google will still rank you for [book]. Using some variations in your wording makes your texts nicer to read, and that’s what Google finds important, too.

Read more: What is keyword stemming? »

Understanding pragmatics

But Google is not just trying to understand content by analyzing text. To identify which results are useful for people, they also use user signals, like the bounce rate, click-through rate, and the time people spend on a website. They are even researching users’ emotions to adapt their search results based on, for example, the choice of wording for a search query.

Understanding context

You might have heard about the most recent big update, BERT (2019). With their latest innovation, Google is again coming closer to understanding language at a human level. BERT is a Natural Language Processing (NLP) model that uses the context and relations of all the words in a sentence, rather than processing them one by one in order. With this update, Google can figure out the full context of a word by looking at the words that come before and after it. This helps them provide their users with even more meaningful and fitting results.

Read more: Google BERT: A better understanding of queries »

A linguistic approach to SEO

So, what does this mean for how you should optimize your content? Google is trying to understand language like we do. And with every update, they are getting closer to understanding language at a human level. They want to provide their users with high-quality search results that fit their goals.

Simply put, this means you should write for your audience, and not for search engines. Research your audience, try to get to know them, and provide them with the information and solutions they are looking for!

Write naturally and mix things up

Moreover, try to write naturally. Don’t just stuff your text with the keyphrase you’re trying to rank for. That’s not only unpleasant to read for your visitors, but also bad for your rankings. Google can understand synonyms, different word forms, and the context of words, so make use of that! If you’re trying to rank for [cat], don’t just use [cat] over and over in your text. Use synonyms, like [kitty] or [puss]. Mix things up and use the plural form, [cats], and related phrases, like [litter box] or [cat food].

Yoast SEO Premium can help you with this. Our plugin also recognizes synonyms, different word forms, and related phrases, and helps you optimize your content for these words. This allows you to write in a more natural way, so you can satisfy your users and rank high in the search results!


Learn about the three Core Web Vitals: LCP, FID & CLS

Some time ago, Google caused quite a stir by announcing a new ranking factor for 2021: page experience. User experience has always been an essential part of building the best site out there, but now, it will play an even bigger role in helping you build awesome sites for your customers. All this is powered by new metrics, with the Core Web Vitals at the centre. Time to meet LCP, FID and CLS!


The Google page experience update powered by Web Vitals

We’ve talked about this page experience update before, but in this post, we’d like to take another look at those Core Web Vitals. In general, site speed metrics tend to be hard to understand and confusing. Plus, they tend to change somewhat each time you test your site. You don’t always get the same scores. So, it’s tempting to just glance at a couple of metrics and hope they turn green.

Of all the possible metrics, Google now identifies three so-called Core Web Vitals. These are the focal point for Google in the coming year. Google might add or change these metrics every year, as it evaluates them over a longer period of time.

Core Web Vitals are the subset of Web Vitals that apply to all web pages, should be measured by all site owners, and will be surfaced across all Google tools. Each of the Core Web Vitals represents a distinct facet of the user experience, is measurable in the field, and reflects the real-world experience of a critical user-centric outcome.

web.dev/vitals/

The three pillars of page experience

For now, the three pillars of page experience are:

  • Loading performance (how fast does stuff appear on screen?)
  • Responsiveness (how fast does the page react to user input?)
  • Visual stability (does stuff move around on screen while loading?)

To measure these essential aspects of user experience, Google chose three corresponding metrics — aka the Core Web Vitals:

  • LCP, Largest Contentful Paint: This measures how long it takes for the largest piece of content to appear on the screen. This could be an image or a block of text. A good grade gives users the feeling that the site loads fast. A slow site can lead to frustration.
  • FID, or First Input Delay: This measures how long it takes for the site to react to the first interaction by a user. This could be a tap on a button, for instance. A good grade here gives the user a sense that a site is quick to react to input and, therefore, responsive. Slow, again, leads to frustration.
  • CLS, or Cumulative Layout Shift: This measures the visual stability of your site. In other words, does stuff move around on screen while it is loading — and how often does that happen? Nothing is more frustrating than trying to click a button when a slow-loading ad appears in that spot.

Different tools use different metrics

Every page experience tool uses a number of Web Vitals, gathered from a variety of sources. As every tool has a different purpose, the metrics used differ per tool. The common denominator, however, are the Core Web Vitals as Google uses these in every page experience tool it has.

But what do all these numbers mean? What do you have to look for on your site? And when is your site fast enough? When do you have a good grade? There are a million questions you could ask about these metrics. And while Google is trying to close the gap between understanding and improving, this continues to be a complex topic. Measuring site speed and user experience is hard — there are so many things to factor in.

What are these Core Web Vitals?

The Core Web Vitals don’t work in isolation, as there are a whole lot of other metrics. Some are based on controlled lab tests, while others only work with field data. After doing a lot of research, Google determined a new set called Web Vitals. These are a combination of metrics we already know, plus a set of new ones. The three Core Web Vitals are the most important ones, and Google is specifically asking site owners to keep an eye on these scores and improve them where possible.

LCP: Largest Contentful Paint

Largest Contentful Paint measures the point at which the largest content element appears on the screen. Keep in mind that it doesn’t measure the time it takes for your page to fully load, but it simply looks at when the most important part loads.

If you have a simple web page with just a piece of text and a large image, that large image will be considered the LCP. Since this is the largest piece of content to load in the browser, it’s destined to make an impression. By getting that to load quicker, your site can appear much faster. So, sometimes, it might just be as simple as optimizing that image itself. 

In the past, there were metrics like First Meaningful Paint, which measured the time at which the first piece of content that meant something to the user appeared on screen. But, unlike the name suggests, the FMP metric often couldn’t figure out what the most meaningful thing appearing on screen actually was. Complex metrics lead to useless data.

Largest Contentful Paint is easy to understand: it is simply the time it takes for the largest element to appear on the screen. These elements might include images, videos or other types of content. 

What you need to know

Now you know what the LCP is, you can start optimizing for it. According to Google, you should aim for the LCP to happen within the first 2.5 seconds of the page loading. Anything between 2.5 and 4 seconds needs improvement, and you can consider everything over 4 seconds as performing poorly.

An overview of the scoring for LCP

The LCP is also dynamic, as the first thing that loads might not immediately be that large image. The LCP shifts to that large image once it appears on screen.

Here’s an image from Google that explains how this works:

This image from Google gives you a good idea of how the LCP is measured

On the left, you first see the logo and ‘Visual stories’ line appear. In the second screen, the large headline appears as a candidate for LCP. In the last screen, however, you see that the big image overtakes the header as the LCP. If you have just one big piece of content, that might be the LCP the whole time.

If you look at the loading process in the image, you can easily see why this is such a handy metric. You can easily spot what the LCP is and optimize the loading of that element. 

Google offers several tools to help you find all these elements. PageSpeed Insights, for instance, offers a wealth of data on Web Vitals, plus a whole lot of advice to improve your page. If we run yoast.com through PageSpeed Insights, we get a number of scores and, below those, advice. In our case, the LCP was average and that’s due to it being a large image. In the screenshot below, you can see that PageSpeed Insights correctly identified that element. Now you know what to improve!

PageSpeed Insights identifies the large header image as the LCP on our site

According to Google, the LCP is affected by a number of factors: 

  • slow server response times: so optimize your server, use a CDN, cache assets, et cetera.
  • render-blocking JavaScript and CSS: so minify your CSS, defer non-critical CSS and inline critical CSS.
  • slow-loading resources: so optimize your images, preload important resources (see the sketch below this list), compress text files, et cetera.
  • client-side rendering issues: so minimize critical JavaScript, use server-side rendering and pre-rendering.
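
For instance, if your LCP element is a large header image, a preload hint can get it loading sooner. A sketch (the image path is a placeholder):

<link rel="preload" href="/images/hero.jpg" as="image">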

Google has more documentation on the background of LCP and how to optimize for it.

FID: First Input Delay

The First Input Delay measures the time it takes for the browser to respond to the first interaction by the user. The faster the browser reacts, the more responsive the page will appear. If you are looking to offer your users a positive experience — who isn’t? — then you should work on the responsiveness of your pages.

Delays happen when the browser is still doing other work in the background. So, it has loaded the page and everything looks dandy. But when you tap that button, nothing happens! That’s a bad experience and it leads to frustration. Even if there’s just a small delay it might make your site feel sluggish and unresponsive.

A browser has to do a lot of work and sometimes it needs to park certain requests, only to come back later to them. It can’t do everything all at once. As we’re building ever more complex sites — often powered by JavaScript — we’re asking a lot from browsers. To speed up the process between getting content on screen and making it interactive, we need to focus on the FID. 

The FID measures all interactions that happen during the loading of the page. These are input actions like taps, clicks and keypresses, but not interactions like zooming and scrolling. Google’s new metrics call for an FID of less than 100ms for a page to appear responsive. Anything between 100ms and 300ms needs improvement, and you can view anything above that as performing poorly.

After testing the FID you get a score and you can work from there

What you need to know

One of the things you need to remember is that you cannot measure the FID if there is no user interaction. This means that Google can’t simply predict the FID based on the data they have from the lab — they need data from real users, so-called field data. This also means that this data is less controllable than lab data, as it is collected from users with all kinds of devices, who use them in different ways and in different environments. This is one of the reasons why you sometimes see the data change.

If you are looking to improve your scores, you will often find JavaScript to be the culprit of bad grades. JavaScript helps us build awesome interactions, but it can also lead to slow websites with complex code. Often, the browser cannot respond to input while it is executing JavaScript. If you work on improving your JavaScript code and the handling of it, you are automatically working on improving your page experience scores.

This is the hardest part, though. Most sites can gain a lot by reducing the time it takes to execute JavaScript, breaking up complex tasks or removing unused JavaScript.
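
As an illustration of what breaking up tasks can look like: instead of handling a big batch of work in one long task that blocks input, you can process it in small chunks and yield back to the browser in between. A simplified sketch (the function names are made up for this example):

<script>
  // Process a long list in small chunks, yielding between chunks
  // so the browser can respond to user input in the meantime.
  function processInChunks(items, processItem, chunkSize) {
    var i = 0;
    function doChunk() {
      var end = Math.min(i + chunkSize, items.length);
      for (; i < end; i++) {
        processItem(items[i]);
      }
      if (i < items.length) {
        setTimeout(doChunk, 0); // yield to the event loop
      }
    }
    doChunk();
  }
</script>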

For instance, yoast.com has a pretty good score but it’s not perfect. There are still processes that prohibit us from getting perfect scores. Some of these are complicated to fix or we simply need this code for our site to function properly. You should look at your scores and determine what you can do. Try to find the improvements that are easiest to do or result in the biggest performance jumps.

There are always improvements to make, but you have to decide if that’s worth it — or even possible

Read Google’s documentation on FID and how to optimize FID.

CLS: Cumulative Layout Shift

The third Core Web Vital is a brand-new one: Cumulative Layout Shift. This metric tries to determine how ‘stable’ stuff loads onto your screen. It looks at how often stuff jumps around while loading and by how much. You can imagine that sometimes a button loads on the screen, inviting users to click it. In the background, however, there’s still a large content area being loaded. The result? When the content finally fully loads, the button gets pushed down a bit — just as you want to hit it. Again, frustration mounts!

These layout shifts happen a lot with ads. Now, ads are a lifeline for many sites, but they are often loaded so poorly that they frustrate users. In addition, many complex sites have so much going on that they are heavy to load, and content gets loaded whenever it’s ready. This can also result in content or CTAs that jump around on screen, making room for slower-loading content.

Take CNN.com, for instance. News websites are typically very complex and slow to load, and CNN is no exception. It scores really badly on a PageSpeed Insights test. If you look at the issues and the corresponding tips further down the page, you’ll notice that no less than five moving elements were found at the time of writing. When loading this page, it leads to a lot of elements jumping around, and it takes a while for it to stabilize and be useful. And because users aren’t always that patient, they try to click a button at the moment it appears on screen — only to miss it because a big ad appears in that spot.

CNN.com doesn’t score too well in PageSpeed Insights. You can see it found five moving elements that contribute to the CLS

What you need to know

The Cumulative Layout Shift compares frames to determine the movement of elements. It takes all the points at which layout shifts happen and calculates the severity of those movements. Google considers anything below 0.1 good, while anything from 0.1 to 0.25 needs work. You can consider everything above 0.25 as poor. 

The scores for CLS

Of course, the score only looks at unexpected shifts. If a user clicks the menu button and a fold-out menu appears, that doesn’t count as a layout shift. But if that button does trigger a big change in the design, you should make sure that’s clear to the user.

I’ve already mentioned that ads are one of the main culprits of this. They are often built in JavaScript and not well-optimized, plus they are served from an external server. Slowness is added at every step, and you have to work hard to get your ads to appear in the right spot at a moment’s notice. But there’s another element that’s responsible for large layout shifts: images.

Developers don’t always specify the width and height of an image in the code, leaving it up to the browser to figure out how the image should appear on screen. On a page with some images and text, the text will appear on screen first, followed by the images. If the developer hasn’t reserved space for these images, the top part of the loading page will be filled with text, prompting the user to start reading. The images, however, load later and appear in the spot where the text was first. This pushes the text down, getting the user agitated. So, always specify the width and height of your images to reserve a spot for them to load.

Google has a lot of background documentation on CLS, plus on how to optimize for CLS.

Tools to measure Web Vitals

There are loads of tools to help you monitor Web Vitals and improve the performance of your site. I’ve mentioned a lot of them in the first Page experience post I wrote some time ago. You can see them listed there. Here, I’d like to highlight the most important ones:

  • PageSpeed Insights: PageSpeed Insights has turned into a full-service measuring tool with both field and lab data. In addition, you get advice on what to improve.
  • Lighthouse: Google built Lighthouse as a tool to audit PWAs, but now it’s a great tool to monitor performance. It has several audits that PageSpeed Insights doesn’t have and it even has some SEO checks.
  • Search Console Core Web Vitals report: You can now get insights from your site straight from Search Console! Great to get a feel for how your site is performing.
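
If you want to collect these numbers from your own visitors as well, Google publishes a small JavaScript library called web-vitals. A minimal sketch that logs the three Core Web Vitals to the console (in a real setup you’d send them to your analytics instead):

<script type="module">
  import {getCLS, getFID, getLCP} from 'https://unpkg.com/web-vitals?module';
  getCLS(console.log);
  getFID(console.log);
  getLCP(console.log);
</script>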

These are the Core Web Vitals

Sometime in 2021, Google will update their algorithms to incorporate a new ranking factor: page experience. To measure page experience, Google developed a new set of metrics called the Web Vitals. Within these Web Vitals, you can find three core metrics: Largest Contentful Paint, First Input Delay and Cumulative Layout Shift. These stand for loading performance, responsiveness and visual stability — the three pillars of Google’s page experience update.

Keep focusing on your users and build an awesome site!


Page experience: a new Google ranking factor

A couple of weeks ago, Google announced Web Vitals — a new set of metrics to measure the speed and user experience of websites. Last week, Google announced that these metrics will make their way into a core algorithm update as new ways of judging and ranking sites based on the page experience they offer. This update is due to arrive some time in 2021.

UX matters, for real now

In 2010, Google announced that it would take site speed into account while determining rankings. In 2018, Google followed up with the page speed ranking factor in the mobile search results. Now, Google has announced a new update that looks at a variety of new or updated metrics, combined with other user experience factors, to form the page experience update.

Page experience, you say? In an ideal world, you’d click a link in the search results and the corresponding page would appear instantly. But we all know that’s a pipe dream. Over the years, pages have only increased in size, and the popularity of JavaScript has made them ever more complex and harder to load. Even with lightning-fast internet connections and powerful devices, loading a web page can be a drag. For users, waiting for pages to load can be stressful as well. Not to mention the maddening on-site performance of some websites, which leads to missed clicks and the like.

For years, optimizing the performance of websites mostly meant optimizing for speed. But loading times are only part of the equation, and the other part is harder to define and measure. This is about how a user experiences all those optimizations. The site might be fast according to the metrics, but does it feel fast? Thus, it’s high time to take a serious look at page experience.

According to Google, “Great page experiences enable people to get more done and engage more deeply; in contrast, a bad page experience could stand in the way of a person being able to find the valuable information on a page.”

Enter Web Vitals

Early May 2020, Google announced Web Vitals — a thoroughly researched set of metrics to help anyone determine opportunities to improve the experience of their sites. Within those new metrics, there is a subset of metrics every site owner should focus on, the so-called Core Web Vitals. According to Google, “Core Web Vitals are a set of real-world, user-centered metrics that quantify key aspects of the user experience.”

Each Core Web Vital looks at a specific piece of the page experience puzzle and together they help both Google and yourself make sense of the perceived experience of a site. Core Web Vitals are available in all Google tools that measure the page experience.

The Core Web Vitals will evolve over time and new ones might be added in due time. For 2020, Google identified three specific focal points:

  • Loading,
  • Interactivity,
  • Visual stability.

These focal points correspond with three new metrics:

  • LCP, or Largest Contentful Paint: This metric tells how long it takes for the largest content element you see in the viewport to load.
  • FID, or First Input Delay: The FID looks at how long it takes for a browser to respond to an interaction first triggered by the user (clicking a button, for instance)
  • CLS, or Cumulative Layout Shift: This new metric measures the percentage of the screen affected by movement — i.e. does stuff jump around on screen?

The new Core Web Vitals are aimed at helping you improve the page experience of your site (image Google)

As you see, these core metrics don’t simply look at how fast something loads. They also look at how long it takes for elements to become ready to use. The Cumulative Layout Shift is the most forward-thinking of the bunch. This has nothing to do with speed, but everything to do with preventing a bad user experience — like hitting a wrong button because an ad loaded at the final moment. Think about how you feel when that happens. Pretty infuriating, right?

Combining new metrics with existing ranking factors

The launch of Web Vitals was noteworthy on its own, but Google took it up a notch this week. Google is going to use these new metrics — combined with existing experience ranking factors — to help rank pages. Keep in mind, Google uses an unknown number of factors to judge sites and rank them. Some factors weigh a lot, but most have a smaller impact. Combined, however, they tell the story of a website.

The new Web Vitals join several existing factors to make up the page experience ranking factors:

  • Mobile-friendliness: is your site optimized for mobile?
  • HTTPS: is your site using a secure connection?
  • Interstitial use: does your site stay away from nasty pop-ups?
  • Safe browsing: is your site harmless for visitors?

These are now joined by real-world, user-centred metrics, like the LCP, FID and CLS mentioned earlier. Combined, these factors take into account everything a user experiences on a website to try to come up with a holistic picture of the performance of your site, as Google likes to say.

The Core Web Vitals are combined with existing ranking factors to form the page experience factors (image Google)

Of course, this is just another way for Google to get a sense of how good your site is and it might be easy to overstate the importance of this particular update. It’s still going to be impossible to rank a site with a great user experience but crappy content.

While the quality of your content still reigns supreme in getting good rankings, the performance and the perceived experience of users now also come into play. With these metrics, Google has found a way to get a whole lot of insights that look at your site from all angles.

Our own Jono Alderson and Joost de Valk talked about the recent news in the latest instalment of SEO News, part of the premium content in our Yoast SEO academy subscription. Sign up and be sure to check that out.

Google page experience update in 2021

Google has often been accused of not communicating with SEOs and site owners. In the past, we have seen many core algorithm updates happen without a word from a Googler. Today, however, Google appears more upfront than ever. In the case of the page experience update, Google warns us twice: once with the announcement of the page experience ranking factors, and once more six months before rolling out the update in 2021.

By announcing this way ahead of time, Google gives site owners, SEOs and developers ample time to prepare for this update. There are loads of new tools to come to grips with how these metrics function and how you can improve your site using these insights. There’s a lot of new documentation to sift through. And you can start right now. Sometime next year, Google will give you a heads up that the update will be rolling out in six months time.

No more AMP requirements for Top Stories

There’s another interesting tidbit regarding the page experience update: Google will no longer require AMP for getting your news pages in the Top Stories section. Now, any well-built, Google News-validated site can aim for that top spot. Page experience will become a ranking factor for Top Stories, so your site better be good.

New page experience tools? You got it!

Google went all out to get every site owner to adapt to the page experience changes. New or updated tools help you get the insights you need. They also help you make sense of what it all means.

Start testing, start improving!

In the past, optimizing your site for user experience and speed was a bit like flying blind — you never had truly good insights into what makes a site fast and what makes one feel fast. Over the years, Google saw the need for good metrics and heard the cries of users in need of usable, safe and fast sites. By announcing these metrics — and by announcing them as ranking factors —, Google makes page experience measurable and deems it helpful enough to judge sites by.

Remember, the update won’t roll out until sometime in 2021, but the tools are there, so you can start testing and improving. Good luck!


Why and how to use the search results to create intent-based content

Search intent is becoming more and more important. Google is getting better and better at guessing exactly what searchers are looking for when they type in their – sometimes cryptic – search terms. That’s why you need to focus on it as well! What is search intent, again? How do search engines approach user intent? And how can you assess if you target the right type of intent with your content? This post is all about that!

Search intent?

Let’s start with a quick refresher on the term ‘search intent’. You’ll recognize from your own online behavior that each search term is entered with a particular intent in mind. Sometimes, you want to find information. Other times, you’re looking to research or buy a certain product. And don’t forget all those times you enter a brand name because you don’t want to type out the site’s entire URL. We generally distinguish four types of searcher intent: informational, commercial, transactional and navigational. If this is new to you, head over to our SEO basics article on search intent, that’ll make understanding this post a bit easier.

Search engines try to predict user intent

Of course, for each of these four categories of user intent, there can still be a lot of variation in what exactly a user is looking for. Search engines use data to interpret what the dominant intent of a query is. They want to present results that match user intent exactly. Before we can use the search results to create our intent-based content, we need to understand how search intent works for different queries.

Search terms with dominant intent

Sometimes, a search term has one dominant interpretation. Those terms can be very straightforward, like [buy King Louie Betty dress] or [symptoms of diabetes]. For the first term, results will mainly show pages offering that particular model of dress for sale, or similar dresses by that brand. For the second, results are filled with answer boxes and websites offering medical information. 

Google also understands the intent behind terms that aren’t as literal. For example, whenever people all over the world enter [white house] as a search term, they’re not looking for information on painting their house white. They want to know something about the residence of the president of the United States, and search engines show results accordingly. 

Search terms referring to several entities 

In many cases, the same term can be used to look for very different things. Let’s take the search term [Mercury]. Some people will be looking for the planet, others for the element, even others for the Roman god of commerce, and a few might actually be looking for the lead singer of the band Queen. The reason for that is that this one word can be used to describe several distinctly different things – also described as entities. The context makes clear which entity a word is referring to. It’s important to be aware of how this works in search engines, so read up on the topic in Edwin’s post about entities and semantics.

All these searches probably have informational intent, but they’re not looking for the same thing. While it’s difficult – especially from one word, like in this example – search engines still try to figure out what their users really want when typing in their search term. So if, for example, fewer people click on the ‘mercury element’ results than on the ‘mercury planet’ results, they’ll deduce that more people want information about the planet Mercury, and alter the results pages accordingly. If we take a look at the search results for the term [Mercury], we’ll indeed see that most results relate to the planet. From that, we can conclude that it’s the dominant intent: most people who type in this term are looking for the planet.

Google tries to satisfy multiple intents for the search term [mercury]

Search terms without dominant intent

Some search terms don’t have one clear-cut intent, which leaves search engines guessing at what to show. You can recognize these searches when the results pages show many different results. Take the query [tree house], for example. Depending on your exact location, the search results show images of tree houses, information and videos on how to build one, advertisements for buying one, and businesses called ‘Tree house’, including a brewery, restaurants, holiday homes, and a code learning web platform. This variety means that Google has most types of intent behind this query covered. But it may make ranking more difficult.

Why should you use the search results to create intent-based content?

Simply put: because the search results give direct insight into what people are looking for when they’re typing in your keywords. You can easily lose sight of the SERPs (Search Engine Results Pages) as your most direct source of information. But if you focus on your site alone, or only look at things through the eyes of tools, you miss invaluable information about what your audience is looking for. Search results pages not only show you how you and your competitors are doing, but also where new opportunities are and whether you need to adjust your SEO strategy. For instance, if you see a lot of images in the results pages for your keyphrase, and you don’t have any, that’s an opportunity! 

To get the most objective idea of the search results for a query, make sure you use a private browser window. A local SERP-checker, such as https://valentin.app/ can also help you get even more objective results, and find out what the results look like in other cities. 

Creating intent-based content yourself

There are a few steps you can take to attune your content better to user search intent and work towards creating intent-based content.

  1. Choose keywords from your keyword research and enter them in a private browser window or SERP-checker.
  2. Analyze what you see on the results pages. Which type of intent is most common? (informational, commercial, etc?) Is there one dominant interpretation and if so, what is it? Do you see videos? Images? Related searches?
  3. Evaluate whether the content you have – or plan to publish – is in line with the things you found on the results pages. Do the types of intent match? Is your content in the right form?
    1. If yes, great! Perhaps you can find ranking ideas in related searches. And, have a quick peek at the competition, to see what you’re up against.
    2. Do you notice that things don’t match up, and the SERPs show intent that doesn’t align with what you offer in your content? Depending on what you find in the SERPs, you might still be able to rank. For example, perhaps there isn’t a dominant type of intent, in which case your content could still make the cut. However, if you find intent that unanimously doesn’t match what you have to offer at all, ranking will be difficult, unless you have a high-authority site. In that case, consider whether it’s worth the effort to create your content, or if you need to adjust your strategy a little. A solution could be to target a slightly different related keyword with better matching user intent, or to adjust your content.

This all still sounds a bit abstract. So, let’s look at a few examples to give you a better idea of this process in practice!

Examples of research to create intent-based content

Using the right terminology in informational content

Results page for the search term [website maintenance]

Here at Yoast, we write about all aspects of SEO. One of those aspects is keeping your website in good shape. We had an article planned on this topic, and one of the most important terms we used in this article was ‘Website maintenance’. Our article was about keeping your site content fresh and your site’s structure well-maintained in the process. However, when we started looking at the SERPs, we noticed that wasn’t what people were looking for at all when they used that term. The content in the answer box wasn’t really related, and the other results almost exclusively consisted of companies offering services to work on technical site maintenance and hosting, with some results stressing the importance of this.

So, from analyzing the SERPs, we got two important insights. Firstly, many people using the search term [website maintenance] have commercial intent, rather than informational. Secondly, they were actually looking for something completely different. So, while we could write an article about website maintenance, it needed to be a completely different article in order to rank. It should discuss things like hosting, technical site performance, etc., as that’s what searchers are looking for.

We realized we had to make changes to the article, adapt our strategy and target a keyword with better matching intent. We changed some of the wording in this article (and related ones as well) from ‘maintenance’ to ‘keep old content’, ‘update or delete’ and ‘cannibalization’. Of course, you could argue that we didn’t pick the right words here in the first place. The reason for that might’ve been that we got a bit stuck in our own content bubble, and forgot about the user. Looking at the SERPs helped change perspective in this case.

Business case: selling recycled jewelry

Let’s look at another example, one that small business owners might relate to. Say, you run an online shop that sells jewelry made with recycled materials. One of your product groups is jewelry made of recycled Nespresso pods. So, you’re thinking of trying to rank for [recycled nespresso pods] with a product page or category page. Is that a good idea? Time to look at the search results pages!

Of course, it somewhat depends on location, but prominent on the results page for [recycled nespresso pods] are results about how the pods are recycled. A few are from Nespresso itself, and you could also find some of their videos on recycling. Other results cover the process of recycling and how consumers can get their pods recycled. There is nothing on using recycled pods as a crafting material. So, now you know that this phrase will be difficult to rank for, as it’s not what users are looking for. 

Results page for [recycled nespresso pods jewelry].

What about [recycled nespresso pods jewelry], then? As can be expected, the results align a lot better with what you have to offer. However, most results are geared towards informational intent. While the top result is Etsy – which would be difficult to compete with – other results show lists of ideas and tutorials. This means, again, it would be hard to get your commercial or transactional product (category) pages on this list. However, you could still rank if you changed your strategy and wrote a tutorial on crafting a basic piece of Nespresso jewelry. In such a tutorial, you could easily refer people to your products if they’re looking for more intricate pieces. It might even be worthwhile to make a video tutorial, as there are video results on the results pages.

Conclusion

This post covers a lot of different aspects of search intent: which types there are, whether or not there is a dominant interpretation, and how to look at the SERPs to gauge a query’s intent. The exact steps to take will differ from case to case. A good takeaway, in any case, is to always take a good look at the results pages for keyphrases you’re targeting. Analyze what you see, as it’s valuable, first-hand information. Be realistic and be prepared to put in the work if you find you need to change your strategy. We’ve said it time and again: SEO is a lot of work, and you need to work hard to be the best result – for the right query.

Read more: On-SERP SEO can help you battle zero-click results »


A brief history of Google’s algorithm updates

These days, the way we do SEO is somewhat different from how things were done ca. 10 years ago. There’s one important reason for that: search engines have been continuously improving their algorithms to give searchers the best possible results. Over the last decade, Google, as the leading search engine, introduced several major updates, and each of them has had a major impact on best practices for SEO. Here’s a — by no means exhaustive — list of Google’s important algorithm updates so far, as well as some of their implications for search and SEO.

2011 – Panda

Obviously, Google was around long before 2011. We’re starting with the Panda update because it was the first major update in the ‘modern SEO’ era. Google’s Panda update tried to deal with websites that were purely created to rank in the search engines, and mostly focused on on-page factors. In other words, it determined whether a website genuinely offered information about the search term visitors used. 

Two types of sites were hit especially hard by the Panda update:

  1. Affiliate sites (sites which mainly exist to link to other pages).
  2. Sites with very thin content.

Google periodically re-ran the Panda algorithm after its first release, and included it in the core algorithm in 2016. The Panda update has permanently affected how we do SEO, as site owners could no longer get away with building a site full of low-quality pages.

2012 – Venice

Venice was a noteworthy update, as it showed that Google understood that searchers are sometimes looking for results that are local to them. After Venice, Google’s search results included pages based on the location you set, or your IP address.

2012 – Penguin

Google’s Penguin update looked at the links websites got from other sites. It analyzed whether backlinks to a site were genuine, or if they’d been bought to trick the search engines. In the past, lots of people paid for links as a shortcut to boosting their rankings. Google’s Penguin update tried to discourage buying, exchanging or otherwise artificially creating links. If it found artificial links, Google assigned a negative value to the site concerned, rather than the positive link value it would have previously received. The Penguin update has run several times since it first appeared, and Google added it to the core algorithm in 2016.

As you can imagine, websites with a lot of artificial links were hit hard by this update. They disappeared from the search results, as the low-quality links suddenly had a negative, rather than positive impact on their rankings. Penguin has permanently changed link building: it no longer suffices to get low-effort, paid backlinks. Instead, you have to work on building a successful link building strategy to get relevant links from valued sources.

2012 – Pirate

The Pirate update was introduced to combat illegal spreading of copyrighted content. It considered (many) DMCA (Digital Millennium Copyright Act) takedown requests for a website as a negative ranking factor for the first time.

2013 – Hummingbird

The Hummingbird update saw Google lay down the groundwork for voice-search, which was (and still is) becoming more and more important as more devices (Google Home, Alexa) use it. Hummingbird pays more attention to each word in a query, ensuring that the whole search phrase is taken into account, rather than just particular words. Why? To understand a user’s query better and to be able to give them the answer, instead of just a list of results.

The impact of the Hummingbird update wasn’t immediately clear, as it wasn’t directly intended to punish bad practice. In the end, it mostly reinforced the view that SEO copy should be readable, use natural language, and shouldn’t be over-optimized for the same few words; using synonyms is a better approach.

2014 – Pigeon

Another bird-related Google update followed in 2014 with Google Pigeon, which focused on local SEO. The Pigeon update affected both the results pages and Google Maps. It led to more accurate localization, giving preference to results near the user’s location. It also aimed to make local results more relevant and higher quality, taking organic ranking factors into account. 

2014 – HTTPS/SSL

To underline the importance of security, Google decided to give a small ranking boost to sites that correctly implemented HTTPS to make the connection between website and user secure. At the time, HTTPS was introduced as a lightweight ranking signal. But Google had already hinted at the possibility of making encryption more important, once webmasters had had the time to implement it. 

2015 – Mobile Update

This update was dubbed ‘Mobilegeddon’ by the SEO industry as it was thought that it would totally shake up the search results. By 2015, more than 50% of Google’s search queries were already coming from mobile devices, which probably led to this update. The Mobile Update gave mobile-friendly sites a ranking advantage in Google’s mobile search results. In spite of its dramatic nickname, the mobile update didn’t instantly mess up most people’s rankings. Nevertheless, it was an important shift that heralded the ever-increasing importance of mobile.

2015 – RankBrain

RankBrain is a state-of-the-art Google algorithm that employs machine learning to handle queries. It can make guesses about words it doesn’t know, find words with similar meanings, and then offer relevant results. RankBrain improves itself by analyzing past searches and determining which results worked best.

Its release marked another big step in Google’s effort to decipher the meaning behind searches and serve the best-matching results. In March 2016, Google revealed that RankBrain was one of its three most important ranking signals. Unlike other ranking factors, you can’t really optimize for RankBrain in the traditional sense, other than by writing quality content. Nevertheless, its impact on the results pages is undeniable.

2016 – Possum 

In September 2016, it was time for another local update. The Possum update applied several changes to Google’s local ranking filter to further improve local search. After Possum, local results became more varied, depending more on the physical location of the searcher and the phrasing of the query. Some businesses that had not been doing well in organic search found it easier to rank locally after this update, which suggests that Possum made local search more independent of the organic results.

Read more: Near me searches: Is that a Possum near me? »

2018 – (Mobile) Speed Update

Acknowledging users’ need for fast delivery of information, Google implemented this update that made page speed a ranking factor for mobile searches, as was already the case for desktop searches. The update mostly affected sites with a particularly slow mobile version.

2018 – Medic

This broad core algorithm update caused quite a stir for those affected, leading to some shifts in ranking. While a relatively high number of medical sites were hit with lower rankings, the update wasn’t solely aimed at them and it’s unclear what its exact purpose was. It may have been an attempt to better match results to searchers’ intent, or perhaps it aimed to protect users’ wellbeing from (what Google decided was) disreputable information.

Keep reading: Google’s Medic update »

2019 – BERT

Google’s BERT update was announced as the “biggest change of the last five years”, one that would “impact one in ten searches.” It’s a machine learning algorithm: a neural network-based technique for natural language processing (NLP). The name BERT is short for Bidirectional Encoder Representations from Transformers.

BERT can figure out the full context of a word by looking at the words that come before and after it. In other words, it uses the context and relations of all the words in a sentence, rather than processing them one by one, in order. That means a big improvement in interpreting a search query and the intent behind it.

Read on: Google BERT: A better understanding of complex queries »

Expectations for future Google updates

As you can see, Google has become increasingly advanced since the early 2010s. Its major updates early in the decade focused on battling spammy results and sites trying to cheat the system. As time progressed, updates increasingly focused on giving desktop, mobile and local searchers exactly what they’re looking for. While the algorithm was advanced to begin with, the additions over the years, including machine learning and NLP, have made it truly state of the art.

With the recent focus on intent, it seems likely that Google Search will continue to perfect its interpretation of search queries and style the results pages accordingly. That seems to be its current focus as it works towards its mission “to organize the world’s information and make it universally accessible and useful.” But whatever direction it takes, being the best result and working on an excellent site will always be the way to go!

Keep on reading: Should I follow every change Google makes? »

Feeling a bit overwhelmed by all the different names and years? Don’t worry! We made a handy infographic that shows when each Google update happened and briefly describes what the purpose was.

Google's algorithm updates 2011-2020


Does social media influence your SEO?

Handling your social media is a necessary part of any marketing strategy, but it’s also a vital part of any good SEO strategy. The popularity of social media has risen and will probably keep doing so. That means Google and other search engines can’t ignore it, and you probably shouldn’t either. You even see recent tweets popping up in search results now! So let’s discuss: how does social media influence SEO?

Customers are looking for you

SEO is about being found, so let’s start with the basics. If people are looking for you, make sure they find you! Customers that have heard about your brand might look for you on social media, or even through Google. As a professional company or brand, they expect you to be there. You don’t want them to come up empty, or worse: stumble on another business with a similar name while thinking it’s you. For that reason, it’s a good idea to claim your profiles. Even if you’re not planning on using the platform right now, you might want to in the future.

If you do register on a social platform and are not planning to actively maintain the profile, let your visitors know. Platforms like Twitter and Facebook have the option to pin a tweet or post right at the top of your profile. In that post, explain that while you are not actively present there, they did find the right brand and yes, they can reach you. Point them to other means of communication, like email, make it easy for them! Another plus of claiming profiles: If in the future you do decide to start using a platform, you’re ready to go. Social media is ever-changing, so you never know if you might.

Setting up your social media accounts

When you register with social media platforms, do so seriously. Use a high-quality logo and fill out all the fields offered. If you have physical stores: add them, and their opening hours. A Google My Business account is especially valuable for this! In general, make sure your profiles look professional and up to date. And: be consistent. Use the same brand name across platforms, so people (and search engines!) know it’s you.

Social accounts showing in search results

Did you know that your social accounts can show up when people search for your brand name in search engines? In Google’s Knowledge Panel, for example, your social profiles can appear as buttons right next to your other brand information.

That looks professional, right? It adds trust as well because users have no doubt that if they click the profile buttons there, they’ll end up on your social account. Learn more: How to add social profiles to Google’s Knowledge Graph.
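Speaking of which: if you want to give search engines an explicit hint about which profiles belong to your brand, structured data can help. Below is a minimal sketch of Organization markup with sameAs links in JSON-LD; the name and URLs are placeholders you’d swap for your own.

  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Example Company",
    "url": "https://www.example.com/",
    "sameAs": [
      "https://www.facebook.com/examplecompany",
      "https://twitter.com/examplecompany",
      "https://www.instagram.com/examplecompany"
    ]
  }
  </script>

Place it on a page that represents your organization, such as your homepage, so search engines can tie those profiles to your brand.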

Latest tweets in search results

What we think is really cool is how Google regularly shows an account’s latest tweets right up there between the other search results, in a so-called tweet carousel. You can see one, for instance, when you search Google for [yoast].

This is a great way to showcase your business and what you’re all about, while enticing people to visit your profile and follow you for more. 

Claim your space in the search results

What’s also important: content like tweet carousels takes up (way) more space in the search results! The same goes for your actual social accounts: they show up too. The more space you claim in the top search results, the more you push down other results. It definitely increases the chances of people clicking through to one of your places on the web!

More traffic to your website

Now that you understand the actual benefits of claiming (and preferably regularly updating, but more on that in a later blog post) your social media accounts, how does that tie into your SEO strategy? The idea is quite simple: if people are talking about you, online or offline, you’re relevant. As you might know by now, that’s what search engines are looking for: Google wants to present users with the most relevant results. They love serving up search results that they know others find interesting.

So if you offer awesome content on your website, why not spread it even further by referring to it in other places, like on Facebook? It’s your content so it’s yours to share. Help people discover you! By convincing people to click to find out more, read on, etc., your social media posts could seriously increase traffic to your website. 

Brand awareness through social media

Having success on social media also increases brand awareness. If you’re sharing great content, people will connect that positive experience to your brand name. They might share your content or even just ‘like’ it. Either way, they’re helping you reach a new audience. Social media algorithms favor content that other people like, so they’ll help spread it further. Now, if these new people see and enjoy your content, they might start following you! They’ll get to know you and your products, services, or whatever it is you want them to know about.

Social media and SEO

Wrapping up: we wouldn’t say that merely being on social media has a direct impact on rankings. But, as with many ranking factors, it can help indirectly. Increased brand awareness, more traffic, people enjoying your content: all of those can contribute to your website’s success.

So you know that it can pay off to use social media for your brand. But one of the hardest, if not the hardest, things about social media is: what do you actually ‘do’ on there? What kind of content are you supposed to share? How do you make a social media plan for your business or blog? How much time will it take? Is it worth it? We’ll cover setting up a social media strategy in an upcoming blog post, so stay tuned!

Read more: Social media optimization with Yoast SEO »


Google BERT: A better understanding of complex queries

By announcing it as the “biggest change of the last five years” and “one that will impact one in ten searches”, Google sure turned some heads with an update carrying an inconspicuous name: BERT. BERT is a Natural Language Processing (NLP) model that helps Google understand language better in order to serve more relevant results. There are a million and one articles online about this news, but we wanted to update you on it nonetheless. In this article, we’ll take a quick look at what BERT is and point you to several resources that’ll give you a broader understanding of what it does.

To start, the most important thing to keep in mind is that Google’s advice never changes when it rolls out these updates to its algorithm. Keep producing quality content that fits your users’ goals and make your site as good as possible. So, we’re not going to present a silver bullet for optimizing for the BERT algorithm, because there is none.

What is BERT?

BERT is a neural network-based technique for natural language processing (NLP) pre-training. The full acronym reads Bidirectional Encoder Representations from Transformers. That’s quite the mouthful. It’s a machine-learning algorithm that should lead to a better understanding of queries and content. 

The most important thing you need to remember is that BERT uses the context and relations of all the words in a sentence, rather than one-by-one in order. So BERT can figure out the full context of a word by looking at the words that come before and after it. The bi-directional part of it makes BERT unique.

By applying this, Google can better understand the full gist of a query. Google published several example queries in the launch blog post. I won’t repeat them all, but I want to highlight one to give you an idea of how it works in search. For humans, the query [2019 brazil traveler to usa need a visa] is obviously about whether a traveler from Brazil needs a visa for the USA in 2019. Computers have a hard time with that. Previously, Google would omit the word ‘to’ from the query, turning the meaning around. BERT takes everything in the sentence into account and thus figures out the true meaning.

As you can see from the example, BERT works best on more complex queries. It is not something that kicks in when you search for head terms, but rather for queries in the long tail. Still, Google says it will impact one in ten searches. And even then, Google says that BERT will sometimes get it wrong. It’s not the be-all and end-all of language understanding.

Where does Google apply BERT?

For ranking content, BERT is currently rolled out in the USA for the English language. Google will use the learnings of BERT to improve search in other languages as well. Today, BERT is used for featured snippets in all markets where these rich results appear. According to Google, this leads to much better results in those markets.

Useful resources

We’re not going into detail about what BERT does, its impact on NLP, and how it’s now being incorporated into search, because we’re taking a different approach: if you want to understand how this works, you should read up on the research itself. Luckily, there are plenty of readable articles to be found on the subject.

Reading up on those should give you a solid understanding of what is going on in the rapidly developing world of language understanding.

Google’s latest update: BERT

The most important takeaway from this BERT news is that Google has yet again come closer to understanding language on a human level. For rankings, it means Google will present results that are a better fit for the query, and that can only be a good thing.

There’s no optimizing for BERT other than the work you are already doing: produce relevant content of excellent quality. Need help writing awesome content? We have an in-depth SEO copywriting training that shows you the ropes.


15 questions about ranking factors – Yoast webinar recap

People are always talking about ranking factors. You know, the secret ingredients to Google’s magic algorithmic formula. If you know them and find a way to please these factors, you’re well on your way to that coveted number one spot — or so people seem to think. In general, chasing all these individual ranking factors is not a good tactic. Focusing on building the best site is. We thought it’d be a cool idea to play a game of “is-a-ranking-factor” in our latest webinar. Here are the results!

Haven’t watched the webinar?

If you haven’t watched the ranking factors webinar, please do. Jono Alderson gives an incredible introduction to ranking factors: why people are talking about them, and what we should be talking about instead. After that, Jono and Joost pick cards with questions about possible ranking factors. Their answers are very insightful! You can find the webinar on YouTube.

The ranking factor FAQ

To guide you through this minefield, we’ve collected some of the ranking factor questions we discussed on the show in this FAQ. Let’s kick things off with an answer to the most basic question: what are ranking factors?

What are ranking factors?

Ranking factors are all the elements that search engines take into account to rank a specific page in the search results. They concern technical considerations, content quality, site structure, links, user signals, user experience, reputation and many, many other elements. The exact number of factors that search engines take into account is unknown, but it runs into the hundreds, maybe thousands.

Is user experience a ranking factor?

Joost de Valk: User experience is a ranking factor. User experience, however, is not something you can rate on a 0 to 10 scale. The problematic thing with a lot of these factors is that they’re all both direct and indirect ranking factors. If your user experience is horrible, no one will ever link to you. If your user experience is excellent, probably more people are willing to recommend you to their friends, search for you again and go back to your website. All these things tie in together.

If UX is a ranking factor how does Google determine that?

Jono Alderson: This is interesting, because they’re not on your site measuring your site, are they? There are a lot of conspiracy theories that they might read your Google Analytics or insights from Chrome, but that’s probably not true. What they are looking at when they visit your site is content, structure, speed, layout, color schemes et cetera. Not only that, but they’re also looking for those critical short clicks, bounce-backs and pogo-sticking. They do check if people visit five other websites when they visit this one. They’re analyzing their own search results. But it’s hard for them to quantify UX, because they’re not there. They’re trying to work it out from the outside in.

Is word count a ranking factor?

Jono: There’s no one true answer to this. The point is, you need the right amount of content to answer the question the user has. There’s no fixed number of words a post needs. There’s no obvious maximum, and more isn’t necessarily better, but ‘more than enough’ is a good answer. If you can write 500 words on a topic and that feels right, then definitely don’t stop at 200. But in some cases, a short answer is exactly what you want.

Joost: At some point, I chose to put a minimum word count into Yoast SEO for a reason. I think most algorithms still need a bit of content to be able to determine a topic. If you don’t have enough content, then determining a topic becomes very hard. So don’t get too hung up about an absolute amount.

Is the weather a ranking factor?

Joost: If you think about this you’d say no, of course not. The weather doesn’t impact rankings. That’s true, not directly. But if you sell air conditioning, people search differently during a heat wave than in regular weather conditions. Suddenly, they’re looking for “ships today” or “delivered by tomorrow.” So it’s an outside factor: the weather influences the way people search and click. It changes their behavior, and that click behavior can dramatically impact rankings quite quickly, because of how Google works with these things. So the weather can influence rankings, but the question is whether you can play into it in a good way. That’s probably a lot harder, although not impossible.

Is bold text a ranking factor?

Jono: I think once upon a time somebody thought it was. People thought it was a good idea to put the keywords they wanted to rank for in bold, because Google would “recognize” those and deem them important. I don’t think it ever worked like that. Somehow, there are still people doing it. Maybe it even correlates with being a _bad_ ranking factor: if you’re bolding your keywords instead of thinking about how to make your text good and readable, you’re probably making things worse.

Is bounce rate a ranking factor?

Joost: I think that bounce rate is the result of a lot of things happening on your site. It’s a very measurable thing, and it’s one of the outcomes of good user experience. Bounce rate is often misunderstood. There are a couple of different things at play here. People search, click on your website, then go back to the search results and click on the next result. They didn’t find a result they liked, so they bounced back to the SERPs. This is called pogo-sticking, and I think that is an important thing to look at.

It’s also about bounce rate in general, because there might be a certain number of people who come to your site and immediately click away because of whatever it is you have on your site, whether that is a pop-up or you have a horrible design. Fixing your bounce rate by genuinely improving your site is helpful and it will help you regardless of whether your rankings get better. 

Jono: Obviously, there are scenarios where bounce rate is fine. If you have a great article that answers the question the user has, they come to read it and go away. That’s not a bad experience, because that’s what we want to happen. Plus, there’s something worth dwelling on here: the mental model we all have, where somebody searches something and then clicks on a result, isn’t how people behave. They search, then change their search, they search again, they click on five different results, and they see all these different brands and all these different pages. It’s that whole experience that decides whether they bounce and how they feel about it. That’s how we need to be thinking about search and optimizing. It’s not just ‘why did they bounce from my site’, but ‘what was their experience and what role did I play in it’.

Is site speed a ranking factor?

Jono: Yes, site speed is a ranking factor. Google has confirmed in various publications that site speed affects the ranking position of your site. Now, they do say that’s only the case when you are very slow, so it only affects a tiny percentage of sites. But site speed is a huge part of user experience. All research says that people prefer fast websites. So even if site speed isn’t a huge ranking factor in itself, the experience users have on your site is. A fast site means they’re more likely to read, less likely to bounce, more likely to link, et cetera.

Is having a meta description a ranking factor?

Joost: The question is, does having a meta description by itself make you rank better? I don’t know whether we can answer that with a yes or no. If you’re lucky, your meta description shows up in the search results underneath the title of your page. If you’re lucky, because in a lot of cases Google will show something else. So changing it might not directly impact what’s shown there. But if it is shown and it’s good, it might influence the click-through rate from the SERPs to your website. That means it might influence the number of people reaching your site, and therefore it might help your rankings overall, et cetera.
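To make this concrete: a meta description is a single tag in the head of your page. A minimal sketch, with made-up content:

  <head>
    <title>Handmade oak dining tables | Example Store</title>
    <meta name="description" content="Browse our handmade oak dining tables, built to order and shipped within two weeks. Free returns on every order.">
  </head>

Whether Google shows it depends on how well it matches the query, but a snippet that answers the searcher’s question is more likely to earn the click.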

Is having a progressive web app (PWA) a ranking factor?

Jono: Regarding progressive web apps: if you do it well and you take advantage of the technology, maybe that will affect your rankings, but is it a ranking factor? You might become eligible for rich results or use functionality that’s integrated into the search results. You might get the ability to let people book your restaurant directly from the search results, which might mean more people have a good experience, which gets you more good reviews, which might make you rank higher. It’s a technological platform; it’s not a thing that ranks you better by itself, but it certainly unlocks capabilities.

Can Google understand text and evaluate the quality of a text?

Marieke: I do think that Google knows what quality text is. They employ linguists. They know about language. They know that people can only have twenty words in their short-term memory, so longer sentences will be hard to read.

Joost: One of the things our linguistic team learned while doing research is that it’s hard to get the topic out of a text if the text is poorly written. So even if a text is more eloquent and uses fancier words, it might actually be harder to figure out what the text is about. I think that a good, readable and understandable text has a higher chance of getting Google to understand what it’s about.

Does CSS styling or the visual layout of the page influence ranking?

Jono: Google tries to understand pages like humans do. They have a famous patent called The Reasonable Surfer, where they look at the layout of the page and try to assess what’s what. They know that a link in a photo is probably less relevant than a link in the header. And they go further than that: we know they render the page, we know they process and parse all the CSS, and we know that broken layouts and hidden elements impact things. So yes, they are looking at the design. How that manifests in the system: who knows. Your CSS might impact your rankings. So if you have an ugly shade of pink as the background of your page, or all your stuff is moving, or half of it is invisible, that’s an issue.

Is having multiple languages a ranking factor if you offer products in more than one language?

Joost: I don’t think it’s necessarily a ranking factor. I do think that if you do all the technical stuff around multilingual SEO well, and you have a page ranking well in English plus a page in Spanish, then the fact that the English page hreflangs correctly to that Spanish page might be helping the Spanish page. In that case, it’s not the fact that you have multiple languages, but the fact that you have multiple places in which you can rank and gather links and whatnot. Having a translated version of your website can certainly be beneficial.
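For reference, hreflang is implemented with link tags in the head of each language version, with every version listing all the alternates. A minimal sketch with placeholder URLs:

  <!-- In the head of both the English and the Spanish version -->
  <link rel="alternate" hreflang="en" href="https://www.example.com/product/" />
  <link rel="alternate" hreflang="es" href="https://www.example.com/es/producto/" />
  <!-- Optional fallback for searchers in all other languages -->
  <link rel="alternate" hreflang="x-default" href="https://www.example.com/product/" />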

All this talk about ranking factors and no mention of links?

Joost: I still feel that links are the result of other stuff you do. If you do PR well, if you do your marketing well, you get links as a result. It is important to remember that the time of getting links artificially is over, at least for the English-speaking market and maybe a few other languages. Unfortunately, in other languages, like Dutch, getting a ton of spammy links still works when the competing sites aren’t very good. Against strong competition, though, it becomes impossible to rank that way.

A final note on ranking factors

When Google was much simpler, it was easy to spot the specific tactics or patterns you could use to get ahead of the competition. You could tweak your page titles, get some more links, and whatnot. But that’s not how it works anymore: Google is too sophisticated. The secret is to focus less on all these individual tactics and more on becoming the best result for your users.

Google doesn’t want site owners trying to reverse-engineer how it ranks sites. It simply wants better sites, and better results for its users, and that makes it harder to know what will have an impact and what won’t. It also means that you’ll almost always benefit from improving your site. Understand your audience and solve their problems.

We don’t want to say that ranking factors don’t exist. They do exist. They’re real. But if you’re focusing on which individual ranking factors to optimize for, you’re probably missing the big picture. You need to work on the overall quality of your website. Every one of your pages has to be awesome, and there’s no faking that. You have to be the best result for each phrase you want to be found for. Getting all of that right requires a lot of hard work and a holistic approach to SEO.


Google says it doesn’t use rel=prev/next for pagination

Sometimes you wonder if Google even knows how Google works. Search is getting more complex by the day, and there comes a point where it’s anybody’s guess. Yesterday, Google ‘announced’ that its search engine doesn’t use the pagination markup rel=prev/next at all, and hasn’t for years. That’s curious, because it had been advocating its use until very recently.

So what are we talking about here?

The web standard rel=prev/next was introduced many, many years ago to help determine the relations between component URLs or different pages. In 2011, Google started using those links as a strong hint to discover pages that were related. Almost every site now uses these links to provide those hints; Yoast SEO adds them automatically for our users. Now, it turns out Googlebot is deemed so ‘smart’ by Google that it doesn’t need the help anymore.
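As a refresher, here’s what that markup looks like in the head of a paginated series. A minimal sketch with placeholder URLs, as seen from page 2:

  <!-- On page 2 of a paginated blog archive -->
  <link rel="prev" href="https://www.example.com/blog/" />
  <link rel="next" href="https://www.example.com/blog/page/3/" />

The first page of the series only carries a rel=next link, and the last page only a rel=prev link.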

Some smart SEOs noticed that the official help docs on using these links for pagination had disappeared. Yesterday, Google was pressured into making an official announcement, confirming that it no longer uses rel=prev/next for indexing and hasn’t for years.

Nice and easy, right? Although it is unclear what this change means for huge e-commerce sites, for instance: good luck trying to cram 10,000 products onto a single view-all page.

The old advice

Google’s advice on its now-deleted Webmasters Help page gave the following three options for handling paginated content:

Quote:

  • Do nothing. Paginated content is very common, and Google does a good job returning the most relevant results to users, regardless of whether content is divided into multiple pages.
  • Implement a View All page. Searchers commonly prefer to view a whole article or category on a single page. Therefore, if we think this is what the searcher is looking for, we try to show the View All page in search results. You can also add a rel="canonical" link to the component pages to tell Google that the View All version is the version you want to appear in search results.
  • Use rel="next" and rel="prev" links or headers to indicate the relationship between component URLs. This markup provides a strong hint to Google that you would like us to treat these pages as a logical sequence, thus consolidating their linking properties and usually sending searchers to the first page.

Unquote.
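To illustrate the second option from that old advice: pointing component pages at a View All version boils down to a single canonical tag. A minimal sketch with a placeholder URL:

  <!-- On every component page (page 1, 2, 3, and so on) of the series -->
  <link rel="canonical" href="https://www.example.com/category/view-all/" />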

That page was available up until early this week, and it’s not good practice to simply delete such a page. It would have made much more sense to update the article or show a notice that something had changed. Just deleting it, without even redirecting it to something else useful, feels off. For now, the original blog post announcing Google’s use of rel=prev/next is still available — with a new notice at the top.

What’s Google saying now?

Google’s current stance is that Googlebot is smart enough to discover the next page by analyzing the links on a page and, therefore, a strong signal like rel=prev/next isn’t necessary anymore.

That, however, doesn’t mean you should go and delete all those rel=prev/next links you’ve worked so hard to implement.

It’s important to remember that this is a web standard and that there are other search engines besides Google. Bing’s Frédéric Dubut already said they’re using rel=prev/next as hints for discovering pages and understanding site structure, but not to group pages or rank them.

Now what?

While we wait for the dust to settle, and maybe see whether Google details a new way of handling pagination, there are a couple of things to keep in mind.

For the moment, keeping everything as it is seems like the most sensible option. As this is a W3C standard and not just something Google dreamed up, it’s best to stick to it. It is a good time to take a long, hard look at your site structure, though!

And what does this mean for Yoast SEO?

Yoast SEO has handled pagination for WordPress sites for ages. As I said, we automatically add everything search engines need to understand how things fit together, like rel=prev/next and a self-referencing canonical.

Not too long ago, we changed the way we handle the indexing of paginated content. Initially, we offered the option of noindexing archive pages, but Google mentioned several times that a long-term noindex eventually leads to it no longer following the links on those pages. That makes adding noindex to page 2 and beyond of paginated archives a bad idea, as it might leave your articles without the internal links they need.
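For context, that discouraged setup boils down to a robots meta tag like the sketch below on page 2 and beyond (the exact output varied by configuration):

  <!-- On /blog/page/2/ and beyond: the approach we moved away from -->
  <meta name="robots" content="noindex, follow" />

The ‘follow’ part was meant to keep link value flowing, but as Google indicated, a long-term noindex effectively turns into nofollow over time, which is exactly why this is a bad idea.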

As it stands now, we are talking about how to best go about handling pagination. The need for proper pagination is still there, but it might just turn out that Google has indeed become much smarter at figuring out how everything fits together — and what to show in search or not.

For now, we think that it makes sense to keep everything working the way it does at the moment. Pagination tags can still be useful to other systems — and, if paginated pages are just ‘normal pages’ now, then it makes it even more important not to noindex them.

Stay tuned!
