15 Steps to a Search Engine Friendly Website – 12 Post SEO Guide #4

In this section of the SEO Guide, we’ll look at some site setup issues that affect your search engine rankings. 

Site owners don’t usually understand much about the technology used in their pages.  Those who do understand the technical end of things don’t usually understand, or care much, about SEO.  Somewhere in the middle lies the average web designer, who does some things for design reasons, some things for technical reasons, and knows a little about SEO.  The result is that bad SEO practices get built into almost every site that goes up on the web. 

This article, part 4 of our 12 Post SEO Guide, is a 15 point checklist to help you end up with a search engine friendly website.

1. www vs non-www redirects and SEO

Typing in

www.mywebsite.com

and

mywebsite.com

should both lead you to your website in most setups. 

How this is achieved, however, makes a difference.  The first instinct of server people seems to be to set one address up as an alias of the other.  This solves the user problem (visitors reach the site either way) but creates a problem for the search engines: the same content is now served at two addresses.  It creates duplicate content. 

A principle to follow when structuring and setting up your website is this:

Each unique piece of content should exist at one and only one URL. 

In this case, the solution is to pick one version of your site (either with the www or without) and redirect the other to it.  This needs to be done with a permanent (301) redirect.
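On an Apache server with mod_rewrite enabled, a sketch of the fix might look like this in your .htaccess file (swap in your own domain; the rule below is illustrative, not specific to any host):

```apache
# Permanently (301) redirect the non-www hostname to the www version.
# Assumes Apache with mod_rewrite; "mywebsite.com" stands in for your domain.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^mywebsite\.com$ [NC]
RewriteRule ^(.*)$ http://www.mywebsite.com/$1 [R=301,L]
```

If you prefer the non-www version, simply swap the hostnames around.  The important part is the R=301 flag: a temporary (302) redirect does not consolidate the two addresses in the same way.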

2. Other URL Canonicalisation Issues

Canonicalisation is a long-winded way of saying that each piece of content should have one and only one URL (address).  There are many situations in which this principle is breached, sometimes for good reasons and sometimes not.  One common situation arises when your CMS links to the homepage of your site like this

/index.php

or

www.mysite.com/index.php. 

Most of the external links to your homepage will go to www.mysite.com.  These are two different addresses serving the same content.  Look out for canonicalisation issues wherever the same data is presented in varying ways.  These can include sorting or scripted sections of your site where a return URL or some other data is appended to the page’s address.  Identify and deal with all duplicate content on your site.
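For duplicates you can’t eliminate, the major search engines support a canonical link element that tells them which address is the real home of the content.  As a sketch, using the example addresses above, the page served at /index.php could carry this in its head section:

```html
<!-- Placed in the <head> of /index.php: points search engines at the
     one canonical URL for this content. -->
<link rel="canonical" href="http://www.mysite.com/" />
```

This is a hint rather than a directive, so fixing the links themselves is still the better option where you can.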

3. Frames and Your SEO

These days there is pretty much no excuse for having your website in frames.  It seems to have fallen out of fashion anyway so I won’t go into too much detail about why it fails to be search engine friendly.  If your website is in frames the best option is almost always to do away with the frames.

4. Flash Animations

Flash is used a lot and can do a really good job of spicing up website designs.  Flash can be used to play videos on your site or add slideshows and is often the tool of choice to add that touch of flair.  Flash content as an addition to your pages is fine, but Flash is not very search engine friendly. 

You should keep in mind that search engines can’t read images and can’t watch videos, so a lot of the content contained in your fancy Flash animation is not being read by the search engines.  Flash sites can be an SEO disaster.  The main content that you are trying to present on any given page should be presented as marked up text.  That way the search engine can read it and can make assessments as to what is important and what the page is about.

5. AJAX

Watch out for situations where you are clicking on the page and loading different content but the URL does not change.  Remember, each piece of content should have one and only one URL.  If you have content that isn’t accessed by a URL then it can’t be linked to directly.  This causes SEO problems.  With social media and sharing so important these days, the problem gets worse.  If separate content doesn’t have separate URLs then it can’t be shared.

AJAX can be used to greatly improve the user experience of your page and can even be used to benefit SEO.  If AJAX is being used on your site, make sure that you are aware of exactly what is being presented to the search engines.  Often there is more content on the page than appears to human visitors, and other times there is less.  Usually this has not been thought through, and unfortunately the results are harmful more often than they are positive.  Make sure that content of key importance can be linked to directly. 

6. Multiple Query Strings in URLs

What we are talking about here is web pages with an address like

www.mywebsite.com/index.php?func=display&cont=234&order=alpha&cn=usa&st=UT

You don’t see them as much as you used to, but if this is what the URLs look like on your site or the sites you develop, then it is time to think again.  Multiple query strings can be a barrier to getting your content indexed in the search engines.  You are better off sticking to URLs with only one or possibly two parameters.  Even better, use URL rewriting to achieve addresses like

www.mydomain.com/products/the-name-of-my-product. 

These are often called Search Engine Friendly (SEF) URLs.  In fact you should also think of them as human friendly URLs.

You get a URL that the search engine will happily index and one that makes sense to humans. 
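On Apache, this kind of friendly address is typically achieved with mod_rewrite.  A minimal sketch, where the pattern, script name and parameter names are all illustrative rather than taken from any real CMS:

```apache
RewriteEngine On
# Visitors and search engines see /products/the-name-of-my-product;
# internally the server runs the query-string version of the page.
RewriteRule ^products/([a-z0-9-]+)/?$ index.php?func=display&product=$1 [L,QSA]
```

Most CMSs and shopping carts have a built-in SEF URL option that does this for you; turning it on is usually the easiest route.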

7. Unique Titles, Descriptions

Title and Meta-Description are two HTML elements which appear in the head section of your web pages.  The Title tag is an important signal to the search engines, helping them determine what the page is about.  It is also displayed as the headline of your search engine result when you do get rankings.  The description is not currently of much use in getting you better rankings, but again it is often used as the text describing your page when you appear in the results.

Why is this in the technical barriers section of the SEO Guide?  Well, if you have a CMS or a shopping cart or any backend system that allows you to manage the content of your site, this becomes a technical question.  In a search engine friendly website, each page needs a unique Title and possibly description.  You need to be able to create and change these, and in some circumstances you may want rules that automatically generate default Titles and/or descriptions for sets of pages.

There are other elements you need to have flexibility to control including heading tags and content elements.  Does your system allow you to do this?
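As a sketch, the head of a single product page might look like this, with a Title and description that belong to that page and no other (the product and site names are invented examples):

```html
<head>
  <!-- Unique to this page - no other page on the site shares this Title
       or description. Product and site names are made up for illustration. -->
  <title>Blue Widget 3000 | Widgets | MySite.com</title>
  <meta name="description"
        content="The Blue Widget 3000 is a lightweight widget for home and office use. In stock and ready to ship." />
</head>
```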

8. Validation and SEO

I am not really that interested in whether your HTML validates.  It’s not like you are going to get a boost in rankings for valid code or for standards compliance.  There are ways in which bad code can make a web page harder for search engines to parse, however, and valid code should avoid these.  Sometimes HTML validators, and even accessibility checkers, can show up errors that might just cause you problems.  Making content accessible for humans (for example, to the blind), aside from being a good thing to do, will often do a good job of making it accessible for search engines too.  No harm.

A large part of making your website search engine friendly is making your content and pages accessible to the search engine spiders.

9. Robots.txt

Make sure that you know and understand the instructions contained in your robots.txt file.  Things to look out for include accidentally banning one or all search engines from major sections of your site.  Believe me, it happens.
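To see how small the gap is between a harmless robots.txt and a disastrous one, consider this two-line file:

```text
# This file bans every compliant crawler from the entire site.
User-agent: *
Disallow: /
```

The difference can be a single path: `Disallow: /admin/` blocks one section only, while an empty `Disallow:` line blocks nothing at all.  If you ever find that dangerous version on a live site, it usually got there because it was used to keep search engines out during development and nobody removed it at launch.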

10. Password Protected Content

If your visitors have to log in to read your content then remember this:  Google doesn’t have a login, so they can’t read your content.  Won’t index it either. 

11. Missing Pages Return 404

If a page doesn’t exist then it should return a 404 – Page Not Found error.  Some database-driven systems will accidentally serve a custom error page to the visitor, but with a 200 OK status code – a so-called soft 404.  I’d rather not bore everyone with the details.  This is a bad thing.  Get it fixed.
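A minimal sketch of the difference, using a plain Python function in place of a real web framework (the page lookup is a stand-in for a CMS database query):

```python
# Sketch of a "soft 404" versus a real 404. PAGES stands in for a CMS
# database lookup; the handlers return (status_code, body) pairs.
PAGES = {"/": "Homepage content", "/about": "About us"}

def bad_handler(path):
    # Soft 404: serves a friendly error page but claims 200 OK, so
    # search engines may index the error page as if it were real content.
    body = PAGES.get(path, "Sorry, we couldn't find that page!")
    return 200, body

def good_handler(path):
    # Correct: a missing page returns a 404 status code along with
    # whatever friendly error page you like.
    if path in PAGES:
        return 200, PAGES[path]
    return 404, "Sorry, we couldn't find that page!"

print(bad_handler("/no-such-page")[0])   # 200 - looks fine to a crawler, but isn't
print(good_handler("/no-such-page")[0])  # 404 - tells crawlers the page doesn't exist
```

Both handlers show the visitor the same error page; only the status code differs, which is exactly why the problem is easy to miss by eye.  Check with a tool that shows you the HTTP headers.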

12. Hosting Issues and Volatility

Your site should be available and reliable.  When individual pieces of content or your whole site regularly times out or goes down entirely, this is not good.  Human visitors will not be reassured as to the quality of your site and Google will take pretty much the same view.

13. Host Location & Search Location

I like to host my sites in their primary market.  Geo-location as a ranking factor for searches is an area that continues to undergo change.  Host location is just one of the possible signals that search engines can use and how they use it varies from engine to engine.  Better safe than sorry is what I say.  If you want to target your site at the USA, go host it there.  If your market is in the United Kingdom or Ireland, find a good, reliable local host.

14. Server Response Time

Your site should not be painfully slow.  Without getting into the page speed propaganda that Google sparked by announcing it as a ranking factor, speeding up your pages is always an improvement that will help you.  Don’t expect it to massively impact your search engine positions, but don’t risk being painfully slow either.  Regardless of SEO, a faster page will mean a better user experience.

15. Broken Links

Broken links happen in any site of any size that makes any effort to create and change its content.  Check for them regularly and aim to get them down as close to zero as possible.  Just to be clear, I am not suggesting this is a ranking factor for Google.  There is no calculation that says FACTOR1 + FACTOR2 x FACTOR3 – NUMBER OF BROKEN LINKS = FINAL SCORE.  That’s not how it works (at least I hope not, otherwise it would be pretty silly).   We check for them anyway.  If I come across broken links on a site, it’s not a sign that this is a high quality resource.  It doesn’t inspire trust.  I want humans and search engines to view my site as a trustworthy and high quality resource.  No broken links.
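There are plenty of tools that will crawl a site for broken links, but the first half of the job is simple enough to sketch with Python’s standard library: pull every link out of a page, then check each URL’s status code (with urllib, say).  The page snippet below is made up for illustration:

```python
# Sketch of the first half of a broken-link check: extracting every
# link from a page with the standard-library HTML parser.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Collect the href of every anchor tag encountered.
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

page = '<p><a href="/about">About</a> and <a href="/old-page">old</a></p>'
extractor = LinkExtractor()
extractor.feed(page)
print(extractor.links)  # ['/about', '/old-page']
```

Feed each extracted URL back through an HTTP request and flag anything that comes back 404, and you have a basic link checker to run on a schedule.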

This post is part of our 12 Post SEO Guide, an ongoing series.  If you would like to keep up to date you can follow the blog through RSS or follow us on Twitter.


Responses to “15 Steps to a Search Engine Friendly Website – 12 Post SEO Guide #4”

  1. ks says:

    I can’t decide where I should 301 redirect to: www or non-www.

  2. SiteStream SEO says:

    If your site has been up for a while, take a look at which version has more links pointing to it.  Also take a look at your business cards, your email and so on, and see which way you tend to quote your own site.  I’d tend to redirect to the one with the most links.

    All other things being equal, I usually pick the www version, but there has been a bit of a fashion to go with the non-www version in the last few years.  No idea why.

    If your site hasn’t been around for a while and doesn’t have many pre-existing links, then it won’t matter much either way.  Just make a choice and be consistent.

  3. I am thinking about starting a web business, and I have spoken to a few designers and coders.  How would you know if someone has a site that is search engine friendly?  I am taking bids on a site design that copies features of another site.  One of the designers told me he had looked at the other designer’s site and that it was search engine friendly, that it had friendly URLs.  Thanks.

  4. SiteStream SEO says:

    If you want to make sure you get a search engine friendly site, take a look at the different areas above and make sure you hire a designer / coder who understands them.  Ask them some questions and see if you get blank looks, or answers like “of course we do search engine friendly URLs” or “don’t worry about that, we add all your keywords in there for you”.