Google rolls out ‘Over Optimization’ update
The official Google Search blog calls it ‘Another step to reward high-quality sites’, understandably putting a positive spin on another algorithmic update aimed at combating webspam. This is the change that was flagged in Google’s announcements on ‘over-optimization’ a few weeks ago. Matt Cutts told Search Engine Land:
“I think ‘over-optimization’ wasn’t the best description, because it blurred the distinction between white hat SEO and webspam. This change is targeted at webspam, not SEO, and we tried to make that fact more clear in the blog post.”
Under the spotlight are techniques like keyword stuffing and link schemes, and Google reckons 3.1% of English-language queries will be affected in a way that would be noticeable to the user. There is much discussion about whether or not this and other updates are penalising the right sites, and doubtless there will be much more. Seemingly aware of a common SEO frustration in the face of the effectiveness of unapproved methods, Matt Cutts ends his post with:
“We want people doing white hat search engine optimization (or even no search engine optimization at all) to be free to focus on creating amazing, compelling web sites. As always, we’ll keep our ears open for feedback on ways to iterate and improve our ranking algorithms toward that goal.”
Panda updated (quietly) on April 19th
During all the confusion caused by Google’s announcement of over-optimization penalties, many site owners and researchers have been attributing ranking drops to possible over-optimization. This could be a little hasty, as Google also quietly slipped in its latest Panda update on the 19th of this month. If your site has seen ranking losses in the last week or so and you’re blaming over-optimization, the chances are you should be looking at Panda instead.
April 19th Panda Update: Winners and Losers
Danny Sullivan at Search Engine Land has published data on winners and losers from the latest Panda update. It makes for interesting reading, and there are some significant winners.
Searchmetrics points to the following factors in relation to the main losers:
- Sites using databases to aggregate information
- Press portals and aggregators
- Heavily-templated web sites
As ever with this type of research, you can make no assumptions. If your traffic reports show you have lost out, you can only use these findings as starting points to test the many possible causes.
3 features are being removed from Google Webmaster Tools
The Webmaster Central blog announced on Tuesday that the Subscriber stats feature, the Create robots.txt tool, and the Site performance feature will be removed in the next two weeks. If you use these features, do not worry: there are other ways. Site performance can be monitored using the Site Speed feature in Google Analytics, and subscriber stats can be found in FeedBurner. There are many tools available online to help you create a robots.txt file if you need one.
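In practice, most sites do not need a generator at all: a robots.txt file is a short plain-text file placed at the root of your domain. As a minimal sketch (the `/admin/` path and sitemap URL here are examples, not recommendations for your site):

```
# Apply the rules below to all crawlers
User-agent: *
# Keep crawlers out of an example private area
Disallow: /admin/

# Optionally point crawlers at your XML sitemap (example URL)
Sitemap: http://www.example.com/sitemap.xml
```

An empty `Disallow:` line (or no robots.txt at all) lets crawlers access everything, so only add rules for paths you genuinely want kept out of the index.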
How does a Search Engine work?
Not a question that can be answered in a 7-minute video, but Matt Cutts’ YouTube posting on Monday does give some useful basic insights into the workings of the Google engine.
It’s almost all about Google
Search engine Blekko is reporting a huge spike in traffic since January of this year. Its stats come from Hitwise and are backed up by comScore. Bing, DuckDuckGo and Blekko are preventing it from being all about Google in the US, but only just.