SEO is one of the most important things for a website. It can be responsible for a huge part, if not all, of your monthly visits. It’s a complex process, and one that is constantly evolving. You probably know that good SEO is strongly tied to having great content. But where did SEO and SEO content writing come from?
We take a look at the history of search engines, and the way they have created and shaped the demand for web content.
The first search engine
The first search engine was Archie (the name a shortening of ‘archive’). Developed at McGill University in Montreal before the World Wide Web existed, Archie was used to search the publicly available files on the servers it could reach. Archie kept an index of all the available files, so users could search for relevant information and download the files onto their own computers. Because of limited space, Archie could only point you to the relevant file, not display its contents as search engines do today with web pages. Although it predated the web, Archie set the template for the search engines that would follow in the next few years.
The dawn of SEO
The first glimpse of SEO can be traced back to 1994, when Brian Pinkerton created WebCrawler, the first web crawler that could index entire web pages. Originally a desktop application, WebCrawler went live with a database of around 4,000 web pages. Before web crawlers became commonly used, search engines were powered by humans – researchers would collect data on websites and catalogue it in a database. Yahoo! combined both approaches to begin with – suggestions from its robot crawlers would only appear if the search term didn’t match anything catalogued in the database by its researchers.
In 1998 Google launched with PageRank, a way of assessing the incoming links to a web page to determine its importance. It was this move that began the long relationship between SEO and links. In 2000 the company added a PageRank meter to its toolbar, so SEOs could check how well a website was performing and work out which web pages would be best for getting links from. Unfortunately, this also led to the rise of ‘Google bombing’, whereby large numbers of links with a chosen anchor text push a page into the results of searches irrelevant to it. SearchKing was perhaps the first example of Google penalising a site: it offered to broker deals between websites for buying and selling text links, which Google didn’t like.
The next big development was the ‘nofollow’ tag, which Google introduced in 2005 to combat blog comment spam – bots were (and still are) used to scatter spam comments containing links across the internet. In the same year, the company launched Google Analytics, which allowed SEOs to measure performance accurately. Meanwhile, the ‘nofollow’ tag became useful to SEOs, who could use it to sculpt the way Google’s ‘link juice’ was distributed among a site’s pages. That practice ended in 2009, when Google announced that using a ‘nofollow’ tag on a web page would no longer pass extra benefit to the other pages on the site.
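For the curious, the tag itself is just an attribute on an ordinary HTML link (the URL and link text below are placeholders for illustration):

```html
<!-- Illustrative example: a user-submitted comment link marked 'nofollow',
     so crawlers that honour the attribute do not count it as an
     endorsement of the target page. The URL is a placeholder. -->
<a href="https://example.com" rel="nofollow">Visit my site</a>
```

To a reader the link behaves exactly as normal; only crawlers treat it differently.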
How has content writing changed with SEO?
Rather like the internet itself, content writing has evolved from something ugly and clumsy into a sophisticated and invaluable tool. To begin with, content writing was all about giving search engine crawlers what they needed in order to identify your web page as relevant and important. This meant including as many keywords as possible, as many times as possible. Google attempted to put a stop to this popular practice in 2003 with its Florida update, which penalised sites for keyword stuffing and over-optimising their anchor text.
By the mid-noughties, the focus of copywriting had changed somewhat. The practice became more focussed on giving the reader something of value as well as the search engine. This was still a tricky area, however, and some companies got into trouble trying to find a way around it. One of these companies, BMW, was banned for ‘cloaking’ – displaying different content to the user than to the search engine, so while the user saw a page of products, the search engine found a spider-friendly page of text stuffed with keywords. However neatly it tried to satisfy both robot and reader, cloaking was too easy to abuse, and so Google had to come down hard.
Hummingbird – Google’s latest update
Google’s latest update, Hummingbird, came into force in 2013, and is the first major rewrite of its kind for the company since 2001. It greatly increases Google’s ability to analyse searches and understand the intent behind them. Previously, Google would analyse each word in a search individually. With the introduction of Hummingbird, the search engine also takes into account the context of each word in relation to the other search terms. It also recognises synonyms – so searching for the ‘advantages’ of product X will also return pages on the ‘benefits’ of product X.
What does Hummingbird mean for copywriting?
With every update, Google has moved further away from the robotic search methods of the past, towards a more ‘human’ search. Each update has shifted the focus from quantity (number of links, density of keywords, and so on) to the quality of the content. Pages are increasingly ranked on their value to the searcher, not to the search engine.
This means it is vital that content is written to educate, entertain or inform its readers. It needs to be crafted with an understanding of what search engines look for, but without compromising the quality of the information delivered to the reader.
Some SEOs are already suggesting that the future of search could be a step back towards human-indexed databases of websites, combined with the speed and efficiency of a search engine spider. If this truly is the future of SEO, then the need to write your content for a human reader, rather than a robot, is only going to increase.