So, what ranking factors will be important for Google in 2011? What should your SEO be thinking about to future-proof their work? We’ve got 4 suggestions below.
What is the challenge for Google?
When Google was created it solved one key problem. By using links as a measure of quality and anchor text as a signal of relevance, Google bypassed the spam. If someone puts up a doorway page for the search engines or stuffs keywords in white text on a white background (1998 spam) and we miss it, no big deal. If the page doesn’t have quality content, no one will link to it.
Today, in 2011, the challenge is essentially the same: surface quality, relevant content and avoid spam. Google works a lot on speed and UI. These changes are intended to get users to what they are looking for, quickly. The important phrase being “what they are looking for”. Google’s job is to match a query to the most relevant, best quality resources in its index.
In 1998, when Page and Brin were working on the technology that would become Google, they could provide better results than the available search engines with nothing more than PageRank and title matches. The problem of the last 10 years has been that whatever factor you pick to hang your rankings on, that is where webmasters will focus their manipulation.
PageRank is important? OK, so webmasters will build links. Anchor text is important? So they’ll build links with “keyword1 keyword2” as the anchor text. As soon as ranking factors are known, they are compromised. The site with more links is no longer the best site (if it ever was); it’s the one with the best link building.
If you are Google, your signals are constantly being compromised. You spend a certain amount of time firefighting and protecting your signals with anti-spam measures, but you also look ahead. You look for new ranking factors that are clean. Here is a quote from the introduction to an SEOmoz post on The Next Generation of Ranking Signals:
“Every 3-4 years, there’s a big shift or addition to the key metrics Google (and, to a lesser extent MSN/Bing and Yahoo!) uses to order competitive search results.
1996-1999: On-page keyword usage + meta data
1999 – 2002: PageRank + On-page
2002 – 2005: Anchor text + Domain name + PageRank + On-Page
2005 – 2009: Domain authority + Diversity of linking domains + Topic modeling + Anchor text + Domain name + PageRank + On-Page”
Take a glance at Rand’s assessment of the development of major ranking factors. What I want to point out here is summed up in one word: DIVERSITY. Google has been doing this long enough to understand that today’s clean signal is not going to solve the problem for good. It might get them out in front for a while, but it will get spammed too. What can help is diversifying your signals and cross-checking them against each other.
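To make the diversity idea concrete, here is a toy sketch (not Google’s actual algorithm; the signal names, values and equal weights are all made up for illustration) of why combining several independent signals limits the payoff of spamming any single one:

```python
# Toy illustration: a page that spams one signal (links) but scores
# poorly on the others ends up below a genuinely good page once the
# signals are combined.

def combined_score(signals, weights):
    """Weighted average of normalised signal values (each in 0..1)."""
    total = sum(weights.values())
    return sum(signals[name] * w for name, w in weights.items()) / total

# Hypothetical pages: one genuinely good, one with spammed links only.
good_page = {"links": 0.7, "on_page": 0.8, "user_data": 0.9, "social": 0.6}
spam_page = {"links": 1.0, "on_page": 0.2, "user_data": 0.1, "social": 0.1}

# Equal weights, purely for the sake of the example.
weights = {"links": 1.0, "on_page": 1.0, "user_data": 1.0, "social": 1.0}

print(round(combined_score(good_page, weights), 2))  # 0.75
print(round(combined_score(spam_page, weights), 2))  # 0.35
```

The spam page “wins” on links alone, but cross-checking against the other signals pulls its combined score well below the good page’s.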
1. User Data
User data has several advantages as a ranking factor. By tracking how users react to the search results, Google gets direct and accurate feedback on the quality of those results. There are literally thousands of ways this could be used. Google has data available from the use of its own site, from its Toolbar and from Analytics. Plenty to be getting on with.
Google has released a Chrome extension allowing users to block sites from their search results, but that’s just one new piece of data it can use. I don’t have space to go into all of them, but here’s an interesting quote from an seo.com article on changes in Google’s SERPs:
“During Pubcon last November, Matt Cutts asked in his keynote how many SEOs were focusing on the snippet found in the search results and doing Click-Through-Rate optimization. Not too many people raised their hand. Matt grinned and said something to the effect that CTR optimization might be worth looking at.”
For those of you who don’t know, Matt Cutts is the head of Google’s web spam team and Google’s main link to the SEO community. Good SEOs have always gone beyond rankings, so they have paid attention to click-through rates for years, because a better click-through rate increases the value of a ranking. But does Matt’s grin imply that there might be ranking benefits to better click-through rates? Many SEOs have thought so for a while.
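As a small illustration of what CTR optimization starts from, here is a toy sketch (the URLs, numbers and the 10% threshold are all hypothetical) of computing click-through rates and flagging results whose snippets may be underperforming:

```python
# Toy sketch: hypothetical impression/click data for three ranking URLs.
results = [
    {"url": "/a", "impressions": 1000, "clicks": 300},
    {"url": "/b", "impressions": 1000, "clicks": 40},
    {"url": "/c", "impressions": 500,  "clicks": 100},
]

def ctr(result):
    """Click-through rate: clicks divided by impressions."""
    return result["clicks"] / result["impressions"]

# Flag anything below a (made-up) 10% CTR as a snippet-rewrite candidate.
candidates = [r["url"] for r in results if ctr(r) < 0.10]
print(candidates)  # ['/b']
```

The point is simply that a ranking that nobody clicks is worth far less than its position suggests, so low-CTR snippets are the first place to look.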
Google’s recent spat with Microsoft over Bing “stealing” Google rankings has finally led it to admit using Toolbar data as a ranking signal. Here’s a Search Engine Watch post that looks at statements from recent key hires at Google, giving an indication of just how important this user data could be: http://blog.searchenginewatch.com/110216-161109
“In this same Pubcon session, Matt Cutts said that SEOs who try to stay “ahead” of Google will be the most successful.”
Google doesn’t often give out good SEO advice but this one is a peach. The job of an SEO is to be in front of Google and the other engines, not behind.
2. Social Media
We already spoke about Twitter and Social Media as ranking signals for Google. This is definitely one to watch for the coming year. If your SEO strategy was halfway reasonable for 2010, it should already have been taking social media into account. Regardless, it is clear that the search engines are diving into the social graph and believe that social signals can improve the quality of results. There is much to explore here, and it’s not yet clear how it will shake out, but there is enough to guide the future-proofing of your SEO.
3. On-Page
Would you believe that my prediction is that on-page is becoming more important? What’s more, I believe that on-page is already, and always has been, considerably more important than is commonly held among SEOs. There is a whole generation of SEOs who have learnt their trade believing that because links and anchor text work, they are the only things that work. They are wrong.
Consider the vast amount of research going into understanding the use of language, understanding user intent and so on. Relevance is a big issue. You have to determine relevance somehow, and link text is an extremely poor signal. Google isn’t going to use keyword density or other 1990s metrics to do that, but it doesn’t need to.
In 1998, assessing the page didn’t work for the search engines. Webmasters were stuffing keywords into the page, and the search engines’ assessment was so weak that they were fooled more often than not. But links don’t work either, because we spammed them too. And social mentions won’t work, because that factor will be the focus of this year’s spamming.
Search engines understand a lot more about language now than they did then, and their assessment of page content and the relationships between pages goes way beyond keyword density. The key issue here is DIVERSITY: considering this factor together with the others to weed out spam and maximise quality and relevance.
4. Where do local and mobile come in?
This isn’t so much a ranking factor as a trend. If you read the Online Marketing Digest here, you’ll notice that local and mobile just keep coming up. The increase in smartphone penetration will continue. The use of those smartphones to look for nearby services will continue. Google will continue to drive development and UI changes to serve those needs.
In the main search, Google doesn’t seem quite settled yet on how Google Places will integrate into the web results. Integration of Google Places, Google Products, News, Video and so on, along with UI changes to blend in vertical search, could radically change your SEO strategy depending on your keyword space.
So, those are our 4 top tips for Google ranking factors to watch in 2011. It’s important to think of this with diversity in mind. You don’t formulate this year’s strategy by replacing link work with spamming Twitter. Links will continue to be important for the foreseeable future but I think we can all agree with Matt Cutts that SEOs who try to stay “ahead” of Google will be the most successful.