SEO is often one of the most important aspects of creating a successful website. It can be responsible for a large part of your monthly visits. It’s a complex process, one that constantly evolves, and usually involves writing great content.
In this article we take a look at the history of search engines, and the way they have created and shaped the demand for web content.
The first search engine
The first search engine was Archie (named after ‘archive’). Developed at McGill University in Montreal before the World Wide Web existed, it was used to search the publicly available files held on servers connected to the early internet. Archie kept an index of all the available files so users could search for relevant information and download the files onto their own computers. Because of limited storage space, Archie could only point you to the relevant file, rather than display its contents as we do today with web pages. Although it predated the web, Archie served as the template for the search engines that would follow over the next few years.
The dawn of SEO
The first glimpse of SEO can be traced back to 1994, when Brian Pinkerton invented the first web crawler that could index entire web pages. Originally a desktop application, WebCrawler went live with a database of around 4,000 web pages. Before web crawlers became commonly used, search engines were powered by humans – researchers would collect data on websites and catalogue them in a database. Yahoo! combined both approaches to begin with – suggestions from its robot crawlers would only appear if the search term didn’t match anything catalogued in the database by its researchers.
In 1998, Google launched with PageRank, a way of assessing the incoming links to a web page to determine that page’s importance. It was this move that started the long relationship between on-page search engine optimisation and link building. In 2000, the company released a toolbar that displayed PageRank scores, so website owners could check how well their pages were performing. It also helped them to work out which websites and pages would be the best sources of a backlink. Unfortunately, this also led to the rise of ‘Google bombing’, whereby webpages used unrelated backlinks to appear in the results for search terms not related to their content.
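To make the idea concrete, here is a minimal, illustrative sketch of link-based scoring in the spirit of PageRank – not Google’s actual implementation. Each page shares its score out along its outgoing links, and the scores are recomputed until they settle, so pages with many incoming links from well-scored pages rise to the top. The example graph and the iteration count are purely illustrative.

```python
DAMPING = 0.85  # damping factor from the original PageRank paper

def page_rank(links, iterations=20):
    """Simplified PageRank. `links` maps each page to the pages it links to."""
    pages = list(links)
    rank = {page: 1.0 / len(pages) for page in pages}

    for _ in range(iterations):
        # Every page keeps a small base score, then receives shares from linkers.
        new_rank = {page: (1.0 - DAMPING) / len(pages) for page in pages}
        for page, outgoing in links.items():
            if not outgoing:
                continue  # simplification: dangling pages pass nothing on
            share = DAMPING * rank[page] / len(outgoing)
            for target in outgoing:
                new_rank[target] += share
        rank = new_rank

    return rank

# Page C is linked to by both A and B, so it ends up with the highest score.
graph = {"A": ["C"], "B": ["C"], "C": ["A"]}
print(page_rank(graph))
```

Even this toy version shows why backlinks became so valuable to SEOs: a page’s score depends almost entirely on who links to it.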
The next big development was the ‘nofollow’ attribute (rel="nofollow"), which Google introduced in 2005 to combat blog comment spam. Bots were (and still are) used to distribute spam comments containing links across the internet. The ‘nofollow’ attribute changed the way Google’s ‘link juice’ was distributed among web pages, because links carrying it passed on no ranking value. In the same year, Google launched Google Analytics, which allowed SEOs to measure performance accurately.
How has content writing changed with SEO?
Rather like the internet itself, content writing has evolved from something ugly and clumsy into a sophisticated and invaluable tool. To begin with, content writing was all about providing what the search engine crawlers needed in order to identify your web page as relevant and important. For many unscrupulous webmasters, this usually meant including as many keywords as possible, as many times as possible, on a webpage. Google attempted to put a stop to this popular practice in 2003 with its Florida update, which penalised sites for keyword-stuffing and over-optimising their text and meta tags.
By the mid-noughties, the focus of website content writing had changed somewhat. The practice became more about giving the reader something of value, not just placating the search engines. This was still a tricky area, however, and some companies got into trouble for trying to find a way around it. One of them, BMW, was banned for ‘cloaking’ – showing users different content from what was served to the search engine. So while the user saw a page of products, the search engine found a spider-friendly page of text filled with keywords. While a good attempt at satisfying both robot and reader, cloaking allowed marketers to manipulate the search engines, and so Google had to come down hard.
Hummingbird
Google’s Hummingbird update came into force in 2013, and was the first major overhaul of its kind for the company since 2001. It greatly increased Google’s ability to analyse searches and understand the intent behind the user’s query. Before, Google would analyse each word in a search individually, but with Hummingbird the search engine could also take into account the context of each word in relation to the other search terms. It also makes use of synonyms – so searching for the ‘advantages’ of product X will also return pages on the ‘benefits’ of product X.
What does Hummingbird mean for content writing, or blog writing?
With every update, Google is trying to move away from the robotic search methods of the past and create a more ‘human’ search. Each update has shifted the focus from quantity (number of links, density of keywords, etc.) to the quality of the content. Pages are increasingly ranked on their value to the searcher, not to the search engine.
This means it is now vital that content is written to educate, entertain, or inform its readers. It needs to be crafted with an understanding of what search engines need to see, but without compromising on the quality of information delivered to the reader.
Google’s search algorithm is improving every day. It takes a myriad of different factors into account, and the need to write your content for a human reader, rather than a robot, is only going to increase.