All Internet users rely on search engines to find information online. Every day, billions of files are put online or modified by website managers. Search engines are online services (such as Google, Yahoo, or Bing) that sort through all this information and make it available to users based on the keywords they type into the search bar.
The inner workings of a search engine
A search engine uses robots (often called “spiders” or “crawlers”) to browse content on the Internet and store it in huge databases. The content is then analyzed and sorted before being made available to users. Since it is impossible to browse every file on the Internet in a single day, each search engine has its own frequency and method for updating its data.
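To make this crawl-and-store loop concrete, here is a minimal sketch of how a crawler might work. It is purely illustrative: real crawlers fetch pages over HTTP at massive scale, while this toy version looks pages up in an in-memory dictionary (`TOY_WEB`, an invented example) so it runs without network access.

```python
from collections import deque

# A toy "web": URL -> (page content, outbound links). In a real crawler the
# fetch step would be an HTTP request; here we look pages up in a dict.
TOY_WEB = {
    "a.com": ("home page", ["a.com/blog", "b.com"]),
    "a.com/blog": ("blog", ["a.com"]),
    "b.com": ("other site", ["a.com"]),
}

def crawl(seed, web):
    """Breadth-first crawl from a seed URL, storing each page exactly once."""
    index = {}                # the crawler's "database" of stored content
    frontier = deque([seed])  # URLs waiting to be fetched
    seen = {seed}             # avoid re-fetching the same URL
    while frontier:
        url = frontier.popleft()
        content, links = web.get(url, ("", []))
        index[url] = content  # store the page for later analysis and ranking
        for link in links:    # queue newly discovered links
            if link not in seen:
                seen.add(link)
                frontier.append(link)
    return index

index = crawl("a.com", TOY_WEB)
print(sorted(index))  # every page reachable from the seed has been stored
```

The stored index is what the engine later analyzes and ranks; the "frequency of updating data" mentioned above corresponds to how often this loop is re-run over already-seen URLs.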
All search engines rely on complex algorithms that identify the nature of the documents put online (music, photos, videos, text, software, etc.), the websites that publish them, and of course the quality of those sites. The goal of a search engine is to answer its users' searches as quickly as possible. To do this, it must not only interpret the terms its users type but also display the best results instantly. If the results do not satisfy users, they will likely stop using that search engine.
Search engine algorithms are therefore updated very regularly to constantly improve the relevance of results. With the number of files put online growing each day, the quality criteria of the various search engines are becoming stricter. Website owners (companies, bloggers, associations, etc.) must comply with these requirements to ensure a good position in search results. This effort to optimize a website so that it is considered a quality site is called SEO (“Search Engine Optimization”). For more information on SEO, we invite you to read our article “What is SEO?“.
How is the quality of a website defined?
Search engines use robots to analyze websites, and these robots apply specific criteria to assign a quality score (such as Google's “PageRank“) to each site. The main criterion is the website's popularity, based chiefly on its number of visitors and the number of inbound links (links published on other sites) pointing to it. In short, the more popular a site is and the more inbound links it attracts, the better search engines rank it.
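The inbound-link idea can be sketched with a simplified version of the original PageRank calculation: each page's score is split among the pages it links to, and the process is repeated until the scores settle. This is only a teaching sketch of the published concept, not Google's current algorithm, and the three-page link graph below is invented for illustration.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Iteratively score pages in a link graph.

    `links` maps each page to the list of pages it links to. A page's
    score is split evenly among its outbound links; the damping factor
    models a surfer who sometimes jumps to a random page instead.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start from a uniform distribution
    for _ in range(iterations):
        # every page keeps a small baseline score from random jumps
        new = {p: (1 - damping) / n for p in pages}
        for page, outbound in links.items():
            if outbound:
                share = rank[page] / len(outbound)  # split score among links
                for target in outbound:
                    new[target] += damping * share
        rank = new
    return rank

# "hub" receives inbound links from both other pages, so it scores highest;
# "b" receives none, so it keeps only the baseline score.
graph = {"hub": ["a"], "a": ["hub"], "b": ["hub"]}
scores = pagerank(graph)
```

Running this gives `hub` the highest score and `b` the lowest, which is exactly the behavior SEO specialists exploited by buying inbound links, as described next.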
The popularity criterion based on inbound links was quickly discovered by SEO specialists, who concentrated their efforts on acquiring these infamous links, sometimes even paying other sites to publish a link to theirs.
This paid “link building” strategy distorted the quality criterion, because links were no longer added naturally. Any website with the means to pay for these links could therefore pass as a “quality” site. Here is a typical message that we regularly receive on one of our websites, My Little Big Trip.com:
“Hello, I work for an advertising agency whose clients are mainly in the travel sector, and I am in charge of advertising. I am looking for articles of 200 to 300 words or more on various travel destinations, with a link inserted on a specific keyword (e.g., travel to Spain …) pointing to the site of one of our clients. What interests us is the insertion of a link in the body text of the article. If the idea interests you, do not hesitate to write back. Thank you.”
This SEO technique of manipulating search engines is increasingly penalized by moderators (people in charge of monitoring these kinds of practices), who do not hesitate to downgrade a site or even ban it from search results. Websites that use this type of tactic to improve their positioning should therefore be careful, because controls and investigations are becoming more and more frequent.
Other criteria are now taken into account to evaluate the quality of websites, and they are growing in importance. For example, Google pays close attention to how web page content is organized (explicit titles, sections, subsections, images, etc.), the loading time required to display content, internal links, and whether a site adapts to all screen resolutions (a technique called “responsive design“). The goal is always the same: to promote well-organized sites that offer a quality user experience.
Search engines actively work to provide content that is as relevant as possible to their users' searches. The quality of web content is of major importance, and search engines have taken on the role of educator, and sometimes even enforcer, toward website managers. The battle between those who respect the rules and the SEO experts who seek to exploit the slightest flaw to improve their positioning has turned into a game of cat and mouse. The consequences can be serious for newcomers who seek advice to improve their SEO performance, unaware that the professional they hire may be using risky practices.
If you enjoyed this article, you might like this one: What is negative SEO and how to protect against it?