A lot can be said (and has been said!) about Search Engine Optimisation, so instead of offering yet another analysis of recent data, this time we take a look at the past to see what there is to learn for the future.
Three main considerations for SEO
Before we go back in time, why even bother with history? Well, when working with search engines, three main factors have changed very little since the beginning:
How search engines work.
What your target audience is looking for (which keywords/phrases they use).
Which search engines they prefer.
Within those factors, however, things have changed a lot. Behind the changes in the first of those, namely how search engines work, there are two driving forces:
Search engines want results to be as relevant as possible because the quality of search results is what keeps users coming back.
Brands (like Zooma) that want as many (relevant) visitors as possible to visit their online touchpoints.
In the continuous struggle for power between these two, search engines are continually evolving to deliver more accurate results. At the same time, brands try to find and implement optimisation methods to win a share of organic search traffic from the competition.
How SEO-optimised is your website? Try our website grader to get the answers.
The historical timeline of search engines and SEO
Below is a summary of how search engines and search engine optimisation have evolved over the past 20+ years.
When it all began (back in 1994)
If you wanted to find something online, you went to Yahoo.
To get into their index of websites, you sent a request to Yahoo so that they could manually enter your domain.
Automatic indexing of new websites, and SEO as we know it today, had not yet been born. It would take another year, until 1995, for that technology to emerge.
Early days of SEO (1995–1997)
This was the time when a lot of new search engines appeared for the first time, among them Lycos and AltaVista (which I used a lot at the time, and which later became known as Evreka in Sweden).
This meant the start of automatic indexing or ‘crawling’ of websites, marking the beginning of the rise of SEO. All of a sudden, words became important for your placement in search results and how pages on your site appeared to a search engine’s algorithm.
Website owners used this insight to experiment with simple methods to affect search results, for example by filling a page with (often irrelevant but highly searched for) words that were invisible to the naked eye but there for search engines to find. A classic example was white text on white background.
The problem with these methods was that they did not make the websites more relevant for visitors. Instead, they lowered the quality of search engine results.
Then came Google (1998–1999)
Google launched as a response to the need for more relevant search results.
Google had developed what they called the ‘PageRank’ algorithm, which determined the ranking of a particular page by looking at how many other pages linked to it and how influential those linking pages were.
The idea was that if a website has many inbound links, it is probably more popular, and thus also of higher quality, than sites with fewer links.
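To make the idea concrete, here is a minimal sketch of the PageRank principle in Python. This is a simplified illustration, not Google’s actual implementation: each page starts with an equal score, and every iteration it passes a share of its score along its outgoing links, so pages with many (and influential) inbound links accumulate a higher score.

```python
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    # Start with an equal rank for every page.
    rank = {page: 1.0 / n for page in pages}
    for _ in range(iterations):
        # Base score every page keeps regardless of links.
        new_rank = {page: (1.0 - damping) / n for page in pages}
        for page, outgoing in links.items():
            if not outgoing:
                continue
            # A page shares its rank equally among the pages it links to.
            share = rank[page] / len(outgoing)
            for target in outgoing:
                if target in new_rank:
                    new_rank[target] += damping * share
        rank = new_rank
    return rank

# Tiny example web: both 'a' and 'c' link to 'b',
# so 'b' ends up with the highest score.
web = {"a": ["b"], "b": ["c"], "c": ["b"]}
ranks = pagerank(web)
```

In the example, page ‘b’ ranks highest because two pages link to it, while ‘a’, with no inbound links at all, ranks lowest, which is exactly the intuition described above.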
As a result (and virtually overnight) Google became the dominant destination for search, changing the priority of SEO away from the hidden, irrelevant content tactics of the past.
The rise of links (2000–2001)
In the early 2000s, Google launched a tool for browsers called ‘Google Toolbar’, which also showed the PageRank of an individual page.
This led to websites trading links with each other, preferably with other highly ranked sites. It eventually got to the point where marketplaces for trading links appeared, and the quality of search results started to deteriorate once again.
Keywords are the next big thing (2002–2003)
To clamp down on link buying, Google decided to punish any websites that took part in the practice.
Following that action by Google, other shady methods once again took centre stage—for example flooding pages with keywords repeated over and over again. Just like with invisible text, this did not lead to pages being more relevant for users; it only served to trick search engines.
As a result, we got the ‘Florida’ update that punished websites for using these tactics and affected such a large part of the search results that it put the whole SEO industry on the spot.
The age of doorways (2004–2005)
With significant changes implemented, something new (of course!) had to be invented to allow websites to gain the upper hand. Consequently, in 2004 a technique called ‘doorway pages’ (known as ‘öppningssidor’ in Swedish) took centre stage.
These pages were often poorly designed from a visitor perspective, and their only purpose was to rank highly for specific search queries. Visitors would land on the doorway page, and from there they were sent on to the ‘real’ destination.
Of course, this new way of optimising traffic from search engines did not last long either, and the game of cat and mouse continued.
The rise of social media (2008–2009)
By 2008 it was apparent that social media was becoming a significant online activity. Facebook surpassed the once-mighty Myspace in popularity with some 140 million active users, and Apple, which had launched the iPhone one year earlier, was well on its way to changing the online landscape forever.
The interest in social media had to be reflected in search results. So both Bing (launched in 2009) and Google included social media entries in search results, including tweets from Twitter.
Google does 350 to 550 adjustments per year to its search algorithm
In 2010 Google confirmed that it had made between 350 and 550 adjustments to its search algorithm in 2009, meaning at least one change per day, which shows the rapid pace of change required to stay on top of the game of SEO at the highest level.
Quality content matters, a.k.a. the Panda and the Penguin (2011–2012)
By 2010 a tactic had appeared where websites were created from large amounts of low-quality textual content, frequently updated and specifically designed to lure search engine algorithms. These sites were linked together, forming so-called ‘content farms’ whose only purpose was to drive search engine traffic and, just like doorway pages before them, send the incoming traffic to the final destination.
As a response, Google decided that the quality of the content should matter more for search engine rankings, and in 2011 launched its ‘Panda’ update, which effectively killed the practice. This was later followed by ‘Penguin’, which focused on websites that contained irrelevant links, sneakily added to content that was otherwise relevant to the visitor.
The move to mobile (2014–2015)
Then the focus moved towards mobile. First in 2014, with the launch of app indexing, which made apps appear alongside websites in search results, and then in 2015 with the so-called ‘Mobilegeddon’ update, which made website mobile-friendliness a ranking signal in searches.
In addition to mobile, another clear trend is that we are using longer and longer (more detailed) search queries, such as ‘near me’, ‘for me’ or ‘should I’ phrases, and search engines are working hard at understanding such user intent. Today, 50% of search queries are four words or longer, and this is only set to increase.

As we move to voice-controlled search on mobile and home speakers through our digital personal assistants like Amazon’s Alexa, Apple’s Siri, Google’s Assistant and Microsoft’s Cortana, search will increasingly be everyday speech. Powered by the increase in computing power and advancements in artificial intelligence, Comscore predicts that by 2020 50% of all searches will be by voice.

Ultimately, the increased involvement of artificial intelligence will have a significant impact not only on search engines but on marketing as a whole: how do you market to a machine? Will we even have websites or apps? However, that prediction is not for this post.
By looking at the past, what can we learn about the future?
Since search engines continuously evolve as we humans change our behaviour and adapt to new technology, search engine optimisation needs to evolve with them: what worked well yesterday will not necessarily work tomorrow.
In the short term, it has been possible to ‘trick’ a search engine into believing your website is more relevant than it is, but long-term that has never been a winning tactic.
Successful search engines have always focused on the user experience and the quality of search results. This benefits websites that focus on providing content that is useful and brings value to their target audience. Hence this should also be the primary focus of marketers (less talking about ‘us’ and ‘our’ stuff, more about topics that are of interest and value to ‘them’, the visitors).
Understanding this makes it easy to see why search engine optimisation and inbound marketing are naturally linked to each other.
Online Strategist at Zooma since 2012. 15+ years of experience as a manager, business developer and specialist within online and e-commerce. Has a perpetual drive for knowledge, and knows what to do with it. Find him on LinkedIn and Twitter.