Neither users nor search engines like to be kept waiting, so keep page load times to a minimum.
Search engines will only give your site so much 'crawl' time (time spent finding and fetching your site's pages). For them (as for you) time is money.
Google takes page load speed so seriously that it’s now a ranking factor. In other words, if your pages don't load quickly enough, your site may slip lower down on Google's results pages.
Users, of course, have even less patience than search engines. Keep users waiting too long and they will leave and never come back.
Check your page load speed on Google Webmaster Tools
You can check your page load speed on Google Webmaster Tools' (GWT) page speed report. You'll find this at: Labs > Site performance.
Site performance reports in GWT give the average time it takes for your site’s visitors to download everything into their browser (text, images, and scripts).
You’re given a written summary and a graph showing trends. Following is an example of a text report:
“On average, pages in your site take 4.6 seconds to load (updated on Jul 16, 2011). This is slower than 71% of sites. These estimates are of high accuracy (more than 1,000 data points). The chart below shows how your site’s average page load time has changed over the last few months. For your reference, it also shows the 20th percentile value across all sites, separating slow and fast load times.”
And here is the accompanying trend graph:
If your site is slow ...
If your site is slower than most (like the one used in the report above) then it’s time to speed it up. Try the following …
Optimize your images.
Speak to your site developers and ask them to check everything on the site as well as your hosting configuration.
Check your hosting location with a site like Domain Tools http://whois.domaintools.com/hosting-location.com
Perhaps most of your users are in the USA and your site is hosted in Europe?
If you get a lot of international traffic then perhaps you need to consider a content delivery network (CDN), which hosts copies of your site in more than one place. This speeds your site up because users access it from a 'local' server.
Google's PageSpeed tools
Google offers a range of ‘PageSpeed’ services to help you speed up your pages, including:
- PageSpeed Insights Browser Extension: runs browser-based tests on a page’s speed and gives recommendations on how to speed it up.
- PageSpeed Insights API: allows developers to integrate PageSpeed tests into website and tool development.
- mod_pagespeed: an Apache module that will automatically rewrite web pages. If you think that sounds either incredibly useful or slightly creepy then check this one out ...
- PageSpeed Service: your site is served up via Google’s servers and Google rewrites your code. Time savings of 25% to 60% are claimed.
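To give a flavour of mod_pagespeed, enabling it is mostly a matter of loading the module and switching on rewriting filters in your Apache configuration. A minimal sketch (the module path varies by install, and the two filters shown are just conservative examples; check the mod_pagespeed documentation for your setup):

```apache
# Load the mod_pagespeed module (path varies by installation)
LoadModule pagespeed_module /usr/lib/apache2/modules/mod_pagespeed.so

# Turn rewriting on
ModPagespeed on

# Enable a couple of conservative filters as a starting point
ModPagespeedEnableFilters collapse_whitespace,remove_comments
```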
You'll find more about Google’s PageSpeed services here
As a last resort use the ‘noscript’ tag
The noscript tag provides alternative content for users who have either disabled scripts in their browser or use a browser that doesn’t support scripting. But don’t rely on noscript to get content indexed: search engines are suspicious of it because it’s been abused by spammers.
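For example, a page that loads its content with JavaScript might offer a plain fallback like this (a generic sketch — the file names and link are invented for illustration):

```html
<script src="gallery.js"></script>
<noscript>
  <!-- Shown only when scripting is disabled or unsupported -->
  <p>Our photo gallery requires JavaScript.
     You can <a href="/gallery.html">view the photos as a plain page</a> instead.</p>
</noscript>
```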
If your site uses AJAX to deliver different ‘pages’ that you would like indexed then you have a problem, because search engines can’t index them.
Here’s how an AJAX URL might look (everything after the # is handled in the browser, so crawlers never see it): http://www.example.com/stock#top10
We’ll list two less-than-perfect solutions below. But this is a tough issue to fix. So ideally, for SEO, either:
- Keep content you want indexed out of pages delivered with # URLs.
- Repeat that content elsewhere (search engines aren’t seeing the # URL version so you are not duplicating the content).
If that’s not doable then here are those two solutions ...
Google’s AJAX crawling scheme
This requires you to create ‘hidden’ copies of the AJAX pages (called HTML snapshots) that are called via an ‘ugly’ URL and triggered via a ‘pretty’ URL with a ! after the #.
Google crawls the snapshots but only ever shows users the pretty URLs.
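In practice the scheme works like this: you publish a ‘pretty’ URL containing #!, and Google requests a corresponding ‘ugly’ URL with the _escaped_fragment_ parameter, which your server answers with the HTML snapshot (example.com is a placeholder):

```text
Pretty URL (shown to users):    http://www.example.com/stock#!page=top10
Ugly URL (fetched by Google):   http://www.example.com/stock?_escaped_fragment_=page=top10
```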
If it sounds complicated that’s because it is. And there are other drawbacks, too. Enjoy some detailed reading on Google's AJAX crawling scheme here:
pushState for AJAX URLs
history.pushState() is an HTML5 method that can change the URL shown in the browser without a page reload, removing the # part and giving search engines a real URL to crawl and index.
This won’t work in all browsers, but if you implement it well, users will still be able to navigate to the content you want them to see.
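A minimal sketch of the idea (the function names, element and URLs are made up for illustration): load the content via AJAX as before, then update the address bar with pushState so each piece of content gets a real, crawlable URL.

```html
<script>
// When a user opens the 'reviews' tab, fetch its content via AJAX
// and update the address bar to a real URL.
function showReviews() {
  loadReviewsViaAjax();                       // your existing AJAX call (hypothetical)
  if (window.history && history.pushState) {  // feature-detect older browsers
    history.pushState({tab: 'reviews'}, '', '/product/reviews');
  }
}

// Handle back/forward navigation by restoring the content
// that matches the current URL.
window.onpopstate = function (event) {
  // e.g. inspect location.pathname and reload the matching content
};
</script>
```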
There's not much good documentation around, but here are some resources that may help:
SEOs really don’t like Flash. It doesn’t matter how much Google says it can crawl Flash, we don’t trust that it can do it well. And if it can, Flash content is harder to optimize.
Google doesn’t like Flash much either. Right after saying it can index any text a user can see on Flash files, Google’s help pages recommend you use Flash sparingly and HTML for "content and navigation".
Google also suggests using sIFR (Scalable Inman Flash Replacement), with which content and navigation are kept as text in your code (easy for search engines to read) but then rendered by Flash.
You'll find more help from Google with Flash and other ‘rich media’ here
Some SEOs dislike ASP almost as much as Flash.
Sites built in ASP always seem to have URL problems, and it can be hard to get changes made to ASP sites. Although I should say I’ve seen exceptions, including many years working with a great team of developers on The Great Hotels of The World.
And then there’s ViewState …
ViewState is a variable that saves information about where a site user has been and what they’ve done during their visit. This sounds fine, but by default the variable is stored in the page code and it can get very big. Over 100K big.
If that big chunk of code slows down your page then it affects site usability and your SEO.
Maybe it affects your SEO in other ways too, like reducing the amount of time Google’s spiders spend on your site or how they rank your page. I doubt it. I think Google is smarter than that and knows to ignore a useless ViewState variable.
But play safe with ViewState and …
- ideally turn it off, or
- move the code to the server (you’ll have to get your developer to look this one up)
If ViewState must stay then:
- Move ViewState to the bottom of the page’s code
- Optimize ViewState so it is only saving needed information
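In ASP.NET Web Forms, ViewState can be switched off per page or per control with the EnableViewState attribute. A sketch (the control name is invented for illustration; ask your developer before changing a live site):

```html
<%@ Page Language="C#" EnableViewState="false" %>

<!-- Or keep ViewState only where it's needed, control by control -->
<asp:Label ID="PriceLabel" runat="server" EnableViewState="false" />
```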
Frames can be used to make content from another page appear as if it’s on the page being viewed.
If frames are used to display significant parts of the content that you want indexed, things get difficult for SEO because your page’s content actually lives at another URL, often on another site. It’s not your page’s content. And, as we know, this is a 'bad thing'.
The simplest thing to do with frames is not to have them in the first place. There are workarounds that can make it possible to use frames and maintain some level of search engine friendliness. But you’re probably better off using the effort to rebuild your site without the frames.
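If you must keep frames for now, the classic stopgap is a noframes element carrying real, indexable text and links. A simplified sketch (file names and copy are invented for illustration):

```html
<frameset cols="20%,80%">
  <frame src="menu.html">
  <frame src="content.html">
  <noframes>
    <!-- Crawlable fallback: describe the page and link to the framed content -->
    <p>Browse our <a href="menu.html">product menu</a> or go straight to the
       <a href="content.html">product details</a>.</p>
  </noframes>
</frameset>
```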
The World Wide Web Consortium (W3C) defines and develops standards for the world wide web. W3C standards decide whether or not your site’s code is valid.
A question often asked of SEOs (and often hotly discussed amongst them) is: does Google care if your code is valid?
The answer is sometimes and maybe. The question has a subtext: do I really need to make my code W3C valid?
The first answer is again: sometimes and maybe. But a smarter answer is: making your code valid certainly won’t do you any harm, while not doing so might.
So test if your site’s code is W3C valid and fix appropriately.
The snippet is the small sample of text that search engines display with each result.
Snippets are usually generated crudely, by matching the searcher’s query with the first occurrences of the query’s words in the page’s code.
A rich snippet instead uses content found in specific markup code that describes the page’s content.
The markup code makes it clear exactly what a piece of information is about. So, for example, a summary of product reviews and their average ratings can be displayed as such. See example below:
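Here’s what schema.org microdata for a product with review ratings might look like (the product name and figures are invented for illustration):

```html
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Acme Espresso Machine</span>
  <div itemprop="aggregateRating" itemscope
       itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.5</span>/5
    based on <span itemprop="reviewCount">89</span> customer reviews
  </div>
</div>
```

The itemprop attributes tell search engines exactly which piece of text is the rating value and which is the review count, so they can be displayed as such in the results.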
Rich snippets help searchers find what they are looking for. And lots of studies have shown that pages displaying rich snippets in the search results get higher clickthrough rates (CTR) than those without.
Higher clickthrough rates equal more traffic and response. So you’re interested in rich snippets, right?
Here's a real-life example from a search for site audit on Google. Amongst the results are some blog posts about Wordtracker's site audit tool. See how the results that include a photo draw attention to the listing? That's the result of rich snippet markup.
Markup code used to convey rich snippet information can be in microdata, microformats or RDFa formats. But (and it's a big but) Google, Yahoo and Microsoft have all committed to using microdata in the future, and they’ve created a joint vehicle, schema.org, to teach us all how they want it done.
So get ready for microdata delivered with schema.org. At the time of writing, you can use it to give rich snippet information to Google from pages covering a wide range of topics including:
Get more help from Google on schema.org
Google's rich snippet testing tool
Google also provides a rich snippet testing tool to test if your rich snippet code is working properly.
Adding rich snippet markup to your pages requires a little knowledge. But snippets done well can give you a terrific advantage over your competitors, so it's worth investing the time to learn how to mark up your pages. As always, run a few small tests to see how much your conversion rates change before completely redeveloping your site!
If you have questions about any other technical aspects of SEO, please let us know.
Get a free 7-day trial
A subscription to Wordtracker's premium Keywords tool will help you to:
- Generate thousands of relevant keywords to improve your organic and PPC search campaigns.
- Optimize your website content by using the most popular keywords for your product and services.
- Research online markets, find niche opportunities and exploit them before your competitors.
Take a free 7-day trial of Wordtracker’s Keywords tool