Whether it’s Google or social bookmarking sites like Tumblr and Digg, they are all on a simple mission: to create the best possible experience for their users by helping them find sites that meet their information needs. Good content on the web is more important than it has ever been. Search engines, particularly Google, now penalize sites with poor-quality, over-optimized articles by dropping them down the rankings. It’s the web equivalent of being sent to Siberia: the content languishes in a remote and inaccessible corner of the Internet, unless users are prepared to trudge through hundreds of pages of results.
Meet Farmer, Panda and Penguin
Google’s algorithms and associated updates go by names such as Farmer, Panda and Penguin. They may sound cute and cuddly, but if your sites contain low-quality material they are your worst nightmare. They punish low-quality and duplicate content, as well as spam sites with an excessive number of inbound links. ‘Farmer’ was so called because it was created to demote content farms: websites that produce large amounts of (typically poor-quality) content to attract traffic and use the page views to generate advertising revenue. When Panda was unleashed, many sites that had previously done well were sent far down the rankings. SISTRIX highlighted the biggest losers and how their visibility suffered. The company crunched through a dataset of one million keywords to compile its SISTRIX Visibility Index, which is calculated from keyword traffic, ranking position and the click-through rate at each position. The table below shows the number of keywords from the one-million-keyword dataset found for each domain before and after the algorithm change. It is a clear demonstration of just how seriously sites can be affected by publishing low-quality content.

Table by SISTRIX

Read more on the effects of Google updates on high-profile websites at searchengineland.
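SISTRIX’s actual formula is proprietary, but the ingredients described above (keyword traffic, ranking position, and per-position click-through rate) can be combined in a simple illustrative model. The weighting below is an assumption for demonstration only, not SISTRIX’s real calculation:

```python
# Illustrative sketch only: SISTRIX's real Visibility Index formula is
# proprietary. This assumes a simple model that sums expected traffic
# (search volume x estimated click-through rate for the ranking position).

# Hypothetical CTR estimates by ranking position (values are assumptions).
CTR_BY_POSITION = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def visibility_score(keywords):
    """Estimate a domain's visibility from its keyword rankings.

    `keywords` is a list of (monthly_search_volume, ranking_position) pairs.
    """
    score = 0.0
    for volume, position in keywords:
        ctr = CTR_BY_POSITION.get(position, 0.01)  # long-tail fallback
        score += volume * ctr
    return score

# A domain ranking #1 for a 1,000-search keyword and #4 for a 500-search one:
print(visibility_score([(1000, 1), (500, 4)]))  # 300 + 35 = 335.0
```

Under a model like this, losing positions across many keywords at once, as happened to Panda’s biggest losers, would collapse the score even if the site still ranks somewhere for every keyword.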
Explaining the Changes
In a blog post accompanying some of its algorithm changes, Google explained the philosophy guiding its actions: “Search results, like warm cookies right out of the oven or cool refreshing fruit on a hot summer’s day, are best when they’re fresh. Even if you don’t specify it in your search, you probably want search results that are relevant and recent.” The algorithms are constantly tweaked to return higher-quality results for users. Sites with good content and a high degree of relevance to the search terms people use stand a better chance of ranking above articles that have poorer content and are loaded with keywords. Google also punishes high bounce rates. A bounce occurs when a user clicks through to a webpage and then quickly returns to the search results; the bounce rate is the proportion of visits that end this way. A high rate is a sign that visitors didn’t find the site informative or helpful.
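The bounce-rate idea is easy to express concretely. This is a minimal sketch, assuming session data where we know how many pages each visit viewed; the field names are illustrative, not any real analytics API:

```python
# Minimal sketch of computing a bounce rate from session records.
# Assumes each session is summarized by its page-view count; a session
# that viewed only one page before leaving counts as a bounce.

def bounce_rate(page_views_per_session):
    """Return the fraction of sessions that bounced (viewed one page)."""
    if not page_views_per_session:
        return 0.0
    bounces = sum(1 for views in page_views_per_session if views == 1)
    return bounces / len(page_views_per_session)

# Five visits: three left after a single page.
print(bounce_rate([1, 1, 3, 1, 5]))  # 0.6
```

Real analytics tools typically also factor in how quickly the visitor returned to the results page, but the single-page-visit definition captures the core signal.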
Google’s algorithms also look at the layout of a page: if there are too many adverts at the top, the site is penalized and downgraded. This can result in the loss of as much as 95% of a site’s traffic. For some online companies, that could mean going out of business unless they make drastic changes. Google explained its reasoning in a blog post: “We’ve heard complaints from users that if they click on a result and it’s difficult to find the actual content, they aren’t happy with the experience. Rather than scrolling down the page past a slew of ads, users want to see content right away. So sites that don’t have much content “above-the-fold” can be affected by this change.” In January 2013 the search engine behemoth rolled out another batch of quality controls. These reward sites for how frequently they produce new content. It was great news for companies that regularly update their blogs, social media sites and web content, but not so good for static websites that do nothing. Google likes accurate, relevant and informative content. The simplest way to ensure that you can provide this is to ask yourself, “How will my content help people?” Websites stand a better chance of improving their visibility when they publish such material regularly.
Serving up Better Results
Today’s web content simply has to be informative, useful and compelling to readers. There is no other legitimate way to appear high in the natural search engine rankings. It is in the interests of search engines to keep updating their algorithms to ensure that only quality content is returned. They are in fierce competition with each other for users, and the winners will be those that deliver the most informative content in response to searches. In the midst of all this, low-quality sites won’t even be able to get a foothold.