How to use Meta Tags, Snippets and the X-Robots-Tag to control how search engines show your content

Controlling your content means controlling the way it's shown in the search results, so that you can get the best click-through rate possible. So make sure you know how to use the tools Google provides for this.

Some aspects of SEO have been around since the beginning and although the fabled 'meta keywords' tag doesn't get much use anymore, meta tags still play an important role in SEO. They allow us to tell Google which content it may or may not use and how we want that content represented in the search results.

Knowing how to implement these tags, and the range of different tags available, gives you the power to dictate how searchers see your content in the SERPs.

In this guide we'll go through the different types of (content-related) tags, how Google uses them and how you can implement them.

Meta Tags

A meta tag is an instruction that is given a specific format so that it can be universally recognised and understood, and is placed within the <head> tags of the page code. Here's an example:

<meta name="robots" content="noindex"/>

Breaking this down, there are two core parts: the User Agent and the Directive. The User Agent is the 'name' value:

meta name="robots"

This tells whatever is looking at the tag (like a website crawler for a search engine) whether it needs to pay attention to the tag or not. For instance, this tag currently applies to all search engine crawlers. However, you could make it apply only to Google by using a more specific name, such as:

<meta name="googlebot" content="noindex" />

The 'content' section is the instruction, or directive. In this instance it's telling the crawler not to add the page to its index, so it is not shown in the search results. Like the 'name', it can be changed:

<meta name="robots" content="nofollow" />

The instruction would now be informing all crawlers that they should not follow the links on the page.

Remember that the instructions given in Meta Tags may be completely ignored by bad actors. You are relying on the crawlers' discretion to follow the instructions you provide.
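To make the name/content split concrete, here is a minimal sketch of how a crawler might read robots meta tags out of a page's head. The class name is our own invention for illustration; real crawlers are far more involved.

```python
from html.parser import HTMLParser


class MetaRobotsParser(HTMLParser):
    """Collects robots directives aimed at a given user agent name."""

    def __init__(self, agent="googlebot"):
        super().__init__()
        self.agent = agent
        self.directives = set()

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        name = (attrs.get("name") or "").lower()
        # "robots" addresses every crawler; a specific name addresses one.
        if name in ("robots", self.agent):
            content = attrs.get("content") or ""
            self.directives.update(
                d.strip().lower() for d in content.split(",") if d.strip()
            )


html = """<head>
<meta name="robots" content="noarchive"/>
<meta name="googlebot" content="noindex, nofollow"/>
</head>"""

parser = MetaRobotsParser(agent="googlebot")
parser.feed(html)
print(sorted(parser.directives))  # → ['noarchive', 'nofollow', 'noindex']
```

Note how the Googlebot-only directives and the all-crawler directives combine: a crawler that identifies as Googlebot would honour both tags.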

User Agents

Different search engine crawlers can be addressed by their different names as follows:

Google : Googlebot

Bing : Bingbot

Yahoo! : Slurp

DuckDuckGo : DuckDuckBot

Baidu : Baiduspider

Yandex : YandexBot

Directives - Indexing

There are many different directives. In this guide we're specifically looking at those which govern how content is indexed by Google and then displayed.

The following all relate to the indexing of your content:

all : The default; no restrictions on indexing or serving.

noindex : Do not show this page in search results.

nofollow : Do not follow the links on this page.

none : Equivalent to noindex, nofollow.

noarchive : Do not show a cached link in search results.
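These directives can be combined in a single tag (e.g. content="noindex, nofollow") and boil down to two decisions: may the page be indexed, and may its links be followed. The sketch below shows one way to interpret a content string, with 'none' as shorthand for both restrictions; the function name is illustrative, not a real API.

```python
def robots_flags(content):
    """Interpret a robots 'content' string as (may_index, may_follow)."""
    directives = {d.strip().lower() for d in content.split(",")}
    may_index = not ({"noindex", "none"} & directives)
    may_follow = not ({"nofollow", "none"} & directives)
    return may_index, may_follow


print(robots_flags("all"))                # (True, True)
print(robots_flags("noindex"))            # (False, True)
print(robots_flags("noindex, nofollow"))  # (False, False)
print(robots_flags("none"))               # (False, False)
```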

Directives - SERPs Display Settings

These are rules for how Google goes on to display your content once it has indexed it. It's worth being familiar with what these are. Webmasters now have a far better level of control over this than they did previously (thanks to the EU, more on that later).

max-image-preview:[setting] : Controls the size of the image preview shown for the page in the SERPs

  • none: No image preview is to be shown.
  • standard: A default image preview may be shown.
  • large: A larger image preview, up to the width of the viewport, may be shown.

Example: <meta name="robots" content="max-image-preview:standard">

max-video-preview:[n] : The maximum number of seconds [n] allowed for a video snippet, for video content.

  • 0: At most, a static image may be used, in accordance with the max-image-preview setting.
  • -1: There is no limit.

Example: <meta name="robots" content="max-video-preview:-1">

notranslate : Do not offer to translate the page.

noimageindex : Do not index images on this page.

unavailable_after: [date/time] : Don't show the page in the SERPs after the specified date/time.

Example: <meta name="robots" content="unavailable_after: Sunday, 01-Sep-24 01:00:00 PDT">


Snippets

Instead of showing the meta description you have written, Google may choose to show a snippet. This means that it will pick out a chunk of text from the page which is usually more relevant to the individual search and display that instead.

Snippets are automatically created from page content. Snippets are designed to emphasize and preview the page content that best relates to a user's specific search: this means that a page might show different snippets for different searches.

This used to be completely at Google's discretion. However, after France transposed the new European Copyright Directive into national law last year, Google was forced to give webmasters more control over how their content is used in the SERPs. That control comes in the form of adding snippet-specific meta tags and HTML attributes to your content:

Snippet-Specific Meta Tags

nosnippet : Self explanatory, do not show a snippet for this page.

max-snippet:[n] : The maximum number of characters [n] allowed for the snippet.

You can also use these values in the max-snippet tag:

  • 0: Do not show a snippet.
  • -1: There is no limit to the length.

Example: <meta name="robots" content="max-snippet:20">
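The effect of max-snippet is easy to picture: a crawler honouring it must not display more than [n] characters of page text. A hedged sketch, with an illustrative function name of our own:

```python
def apply_max_snippet(text, n):
    """Limit snippet text per max-snippet:[n] semantics."""
    if n == 0:
        return ""      # 0: no snippet at all
    if n == -1:
        return text    # -1: no limit on length
    return text[:n]    # otherwise: at most n characters


snippet = "Meta tags still play an important role in SEO."
print(apply_max_snippet(snippet, 20))  # → 'Meta tags still play'
print(apply_max_snippet(snippet, -1))  # the full text, unchanged
```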

Snippet-Specific HTML Attributes

It is also possible to mark out specific parts of the text that you do not want to be used in a snippet. This is done with the data-nosnippet attribute, which can be added to your normal span, div and section elements. Here's an example of it in action:

<p>This text can be shown in a snippet <span data-nosnippet>and this part would not be shown</span></p>

You can find more in-depth examples in this section of the Google Guide.
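To see what a snippet-builder honouring data-nosnippet would be left with, here is a small sketch that collects page text while skipping anything inside an element carrying the attribute. It assumes well-formed, non-void markup; the class name is our own.

```python
from html.parser import HTMLParser


class SnippetText(HTMLParser):
    """Collects page text, skipping content inside data-nosnippet."""

    def __init__(self):
        super().__init__()
        self.skip_depth = 0  # how many nested skipped elements we're in
        self.parts = []

    def handle_starttag(self, tag, attrs):
        # Start skipping at a data-nosnippet element, and keep counting
        # depth so nested tags inside it are skipped too.
        if self.skip_depth or "data-nosnippet" in dict(attrs):
            self.skip_depth += 1

    def handle_endtag(self, tag):
        if self.skip_depth:
            self.skip_depth -= 1

    def handle_data(self, data):
        if not self.skip_depth:
            self.parts.append(data)


collector = SnippetText()
collector.feed('<p>This text can be shown in a snippet '
               '<span data-nosnippet>and this part would not be shown'
               '</span></p>')
print("".join(collector.parts).strip())
# → This text can be shown in a snippet
```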

The X-Robots-Tag

Google doesn't just show web pages any more. It also includes media such as PDFs in the search results. You can't add meta tags to these the way you can to a web page, so the X-Robots-Tag is a way of providing the same information where you can't place it in the document itself. Instead, it is served in the HTTP response headers from the server. You can check the headers for any URL using online tools such as a header checker.
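As a rough illustration, here is how X-Robots-Tag directives could be read out of a raw HTTP response, the way a crawler would for a PDF it cannot embed meta tags in. The function name is our own, and this simple form ignores the optional user-agent prefix that X-Robots-Tag values can also carry:

```python
def x_robots_directives(raw_headers):
    """Extract X-Robots-Tag directives from raw HTTP response headers."""
    directives = set()
    for line in raw_headers.splitlines():
        name, _, value = line.partition(":")
        if name.strip().lower() == "x-robots-tag":
            directives.update(
                d.strip().lower() for d in value.split(",") if d.strip()
            )
    return directives


response = """HTTP/1.1 200 OK
Content-Type: application/pdf
X-Robots-Tag: noindex, noarchive"""

print(sorted(x_robots_directives(response)))  # → ['noarchive', 'noindex']
```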

This is a server-level response and would need to be added to your configuration files. You might need help from your hosting company with this if you are not comfortable doing it yourself. For most SEOs and webmasters it's enough to understand what these headers are and how they work, without needing to know the technical implementation. However, Google does have more information available if needed.