The rise and fall of SEO

29 May 2014 | technology

This article tries to clarify why the bubble around SEO was created and exploited for years by so-called SEO companies, why it is almost impossible to work miracles when it comes to search engines, and how to approach the process of optimization.


A bit of history

People have been looking for ways to direct users to their websites without them knowing the URL ever since the first websites went online. The earliest attempts were lists of websites resembling phone books: they contained basic information about each site and its address, were usually sorted into categories, and were compiled manually. Whoever launched a new website could register it to be included in the book. Rather than reflecting the actual content of a website, these lists featured descriptions written by the site owners. As a result, the books quickly became outdated, directing users to sites that no longer existed or whose content differed significantly from the description.

Then came the first indexing search engines, which began to browse websites automatically using their own scripts, the so-called search bots. These crawled websites and saved their content in a database, eliminating the need to maintain the lists manually. If the bots visited a website often enough, the database stayed up to date and users could search the actual content of the pages. Results were ranked simply by how often the keywords appeared in the content. This was still not Google as we know it today, although some think it still works this way :-) It was significant progress in searching, but the rising popularity of the Internet and the enormous growth in websites and their content meant that it quickly became dated.
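To illustrate how simple (and how easy to game) that kind of ranking was, here is a minimal Python sketch; the page texts and names are made up for illustration, not taken from any real engine:

```python
# A minimal sketch of early keyword-frequency ranking (illustrative only;
# the pages and texts below are invented examples).
def keyword_rank(pages, query):
    """Rank pages by how many times the query terms occur in their text."""
    terms = query.lower().split()
    scores = {}
    for url, text in pages.items():
        words = text.lower().split()
        scores[url] = sum(words.count(term) for term in terms)
    # Highest keyword count wins -- trivially gamed by repeating keywords.
    return sorted(scores, key=scores.get, reverse=True)

pages = {
    "a.example": "cheap flights cheap flights cheap flights",
    "b.example": "a practical guide to booking affordable flights",
}
print(keyword_rank(pages, "cheap flights"))  # a.example wins by sheer repetition
```

Notice that the page stuffed with repeated keywords beats the genuinely informative one, which is exactly the weakness that later spawned an industry.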

Then, in 1998, Google was born as an indexing search engine. Like other engines, it browses websites automatically and autonomously, which is why a newly launched website, or even freshly updated content, is not immediately searchable. We simply need to wait until the bot finds the page and updates its record. What set Google and similar engines apart was the way they determined the popularity (importance) of sites and ranked results accordingly. Google used one of the oldest features of the web: links between websites. The principle was simple: the more websites linked to a particular page, the more popular that page was considered to be. Growing importance meant a higher position in search results. This popularity index, the so-called PageRank, was a revolutionary approach to searching, as it pushed the most useful information for the user to the top of the results. Gradually, Google became the most widely used search engine in the world.
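To make the principle concrete, here is a minimal Python sketch of the PageRank idea: a simplified power iteration over a tiny made-up link graph. The damping value and the graph are illustrative assumptions; Google's real implementation is far more sophisticated.

```python
# A simplified PageRank iteration (illustrative; real PageRank handles
# dangling pages, personalization, and web-scale graphs very differently).
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            for target in outgoing:
                # Each page passes its rank on, split evenly among its links.
                new_rank[target] += damping * rank[page] / len(outgoing)
        rank = new_rank
    return rank

# Three pages: A and C both link to B, so B ends up the most "popular".
links = {"A": ["B"], "B": ["C"], "C": ["A", "B"]}
print(pagerank(links))
```

The key property is visible even in this toy graph: a page's importance comes from who links to it, not from what it says about itself.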

Why did SEO (search engine optimization) companies arise?

As we already know, indexing search engines browse the Internet and the content of its pages to bring users the information they search for, ranked by relevance. That is their task, and they keep improving at it, which is why we like them so much ☺ How, then, did an entire industry arise that focuses on the exact opposite: optimizing websites for search engines rather than for users? The reason is the desire to claim top search positions with as little effort as possible and appear among the first few results, even when the content of a page and its popularity do not warrant it. Not that long ago, Google was not the giant with near-limitless research resources that it is today. Its search algorithm had certain flaws, and those flaws gave birth to SEO. Often, the abbreviation concealed various shady practices, and in the early days of indexing search engines, applying one or more of them really could earn your page a better position in the results. These tactics included:

  • hidden content detectable only by search engines, so that bots and human visitors saw significantly different versions of a page
  • nonsensical texts and page headlines stuffed with keywords (as well as the pointless insertion of random keywords into meta tags, which search engines now ignore for exactly this reason)
  • link exchanges between unrelated websites, which gave birth to specialized link exchange networks
  • content duplication

and many other tactics...

Google strikes back

As soon as tactics to circumvent search algorithms arose, search engine developers began working on ways to detect them. Meanwhile, Google gained vast research resources, and improved algorithms with cute names such as "Panda" and "Penguin" were born. These not only detected most of the aforementioned tactics; on top of the popularity index, Google also started to impose penalties. Even large companies did not escape penalization, which proved fatal for many of them. That was the beginning of the fight against fraudsters. Many sites faced a significant drop in search results, and some did not even have a clue that the recommendations or actions of their SEO companies had entangled them in a fraudulent link exchange network or other black-hat SEO tactics. To use an analogy from sports: SEO used to be a form of illegal doping. It temporarily made you perform better than your competitors, but a positive test meant you were banned for life.

The positives

The concept of SEO is, however, not entirely useless; some of its tactics are not only allowed but advisable for every website nowadays. Today, the development and life of any website should focus largely on:

  • quality content / copywriting
  • content created for users, not for search engines
  • engaging and clear interface / UX design
  • code in compliance with standards

When developing a website, we do our best to stick to all of these positive tactics. Thus, if you let us build your site, SEO (if you still wish to call it that) is an integral part of it. That also means its search position and popularity depend entirely on its content. As the owner of a website, you can improve it with the following tips:

  • create engaging content that naturally features keywords; people will share it and link to it, boosting its popularity and attracting more visitors, who may share it further, and so on
  • give each page a true and meaningful title, address (if your CMS allows it), and description
  • include image captions (without them, search engines are currently unable to tell what a picture shows)
  • update the content regularly and communicate with your users
  • link content to social media
  • set up and track goals in your usage statistics

Conclusion

Search engines like Google already evaluate search results based on hundreds of parameters, including your location, country, previous searches, use of other Internet services, gender, and everything else there is to know about you (the majority remains a well-kept secret ☺). Search results are thus becoming more and more personalized: if you and the colleague sitting opposite you enter identical keywords, your results may be listed differently. So if someone promises you a specific position in search results, they are clearly deceiving you. It is far more useful to focus on new trends in search, such as microdata, which display extra information directly in search results. You can learn more about them on our page.
