The basics at a glance
What are the key factors that determine which results you get?
What are Google algorithm updates and how often do they occur?
Google algorithm and search engine optimization
Short excursus: Why are there different search results on desktop and mobile?
Conclusion: This is why the Google algorithm makes us happy
In order to understand the principles of Google’s universe, there are some basic terms you should know.
A (web) crawler is essentially a synonym for spider, robot, bot, etc. It analyzes and indexes data from the internet: it looks at your website, reads the content there, and then stores information about it. It also re-examines pages that have already been indexed for updates and new content. Through this process, information is displayed to users in a more targeted manner, so that they only receive relevant content in response to their search term. The crawler is therefore the basis of every search engine and indispensable. For this reason, a bot, spider, etc. becomes important whenever you want to optimize your website for search engines.
However, it is not always just one bot that searches the internet. Google uses several, for example one dedicated solely to images. The bots therefore have a clear thematic assignment, according to which they are subdivided.
In principle, you can imagine the crawler like a spider – that’s why spider is one of its countless names. Similar to a spider, a bot “crawls” along the individual threads of the web. Of course, threads are only a metaphor here. In principle, however, the Internet works similarly: internal and external links connect individual websites like a large web.
The spider follows the individual links on web pages and thereby keeps discovering further websites and their subpages. In addition, webmasters can submit sitemaps to Google Search Console, which the crawler also examines closely; they help it recognize what should end up in the index. Through these measures, more and more content is indexed and combined into a kind of continuously growing content library. All this content is then made findable for searchers on the internet. The search engine algorithm searches this index when you enter a search term in order to provide you with the appropriate answer.
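To make this principle tangible, here is a minimal sketch of such a crawl in Python: a toy breadth-first crawler that fetches pages, extracts their links, and follows them. This illustrates the idea only, not how Googlebot actually works, and the start URL is a placeholder:

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkParser(HTMLParser):
    """Collects the href targets of all <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, max_pages=10):
    """Breadth-first crawl: fetch a page, extract its links, enqueue new URLs."""
    queue, seen, fetched = deque([start_url]), {start_url}, 0
    while queue and fetched < max_pages:
        url = queue.popleft()
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", errors="ignore")
        except Exception:
            continue  # skip unreachable pages
        fetched += 1
        parser = LinkParser()
        parser.feed(html)
        print(f"{url}: {len(parser.links)} links found")
        for href in parser.links:
            absolute = urljoin(url, href)  # resolve relative links
            if absolute.startswith("http") and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)

crawl("https://example.com")
```

A real crawler would additionally respect robots.txt, throttle its requests, and prioritize URLs, but the core loop of fetch, parse, follow is the same.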
You can help the crawler find websites. Internal links on pages and from the website navigation to various subpages are particularly relevant for this: they make those pages visible to the crawler, as the bot can easily follow the references. Submitting a sitemap is also useful, as it lets you communicate updates to your content or new content to Google more reliably. Regular publications on your website are an important signal to Google and convey that your site should be crawled at continuous intervals.
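For illustration, a minimal sitemap follows the XML format of the sitemaps.org protocol; the URLs and dates here are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/new-article</loc>
    <lastmod>2024-02-01</lastmod>
  </url>
</urlset>
```

The optional lastmod entries tell the crawler when a URL last changed, which supports exactly the update signals described above.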
Of course, you can also tell the Googlebot which content it does not need to look at or index. To exclude pages from the index, you can add a noindex directive to the pages in question, via a robots meta tag or the X-Robots-Tag HTTP header (noindex belongs on the page itself, not on hyperlinks pointing to it). If you want to refine the whole thing, you can work with a robots.txt file. This is stored in the root directory of a domain and defines which URLs may be visited by the crawler. Here, you can specifically exclude the crawler from certain page areas via a disallow statement.
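In concrete terms, the two mechanisms look like this; the paths are placeholder examples:

```html
<!-- In the <head> of a page that may be crawled but should not be indexed -->
<meta name="robots" content="noindex">
```

```
# robots.txt in the root directory of the domain
User-agent: *
Disallow: /internal/
Disallow: /checkout/
```

Note the difference: robots.txt controls which URLs may be crawled, while noindex controls whether an already crawlable page appears in the index.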
Of course, there are also moments when web crawlers reach their limits. This is the case, for example, in the deep or dark web. Content here is placed in such a way that it cannot be found easily and should remain invisible to the average Internet user.
After the crawler has searched and found various pages of the Internet, the algorithms come into play. These are used for the weighting and evaluation of the individual websites. They thus do the pre-sorting, so to speak, so that in the end you also receive the appropriate and relevant answer to your search query. The various ranking systems of a search engine search through several billion websites that are already in the index thanks to the crawler. So you get the answer to your question in seconds.
The algorithm was born in the late 1990s under the name PageRank, invented by Google founders Larry Page and Sergey Brin, and it became the basis for the search engine's success. The goal was to search the internet for websites in such a way that very specific pages were displayed in response to search queries. At the beginning, however, the weighting was based only on the linking structure: the more (and the more important) the incoming links pointing to a page, the more important the page and the higher its rating. Based on this weighting, the ranking in the SERPs was determined.
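The core idea of PageRank can be sketched in a few lines of Python: every page distributes its score across its outgoing links, and the calculation is repeated until the values stabilize. This is a simplified toy version with an invented three-page link graph, not Google's actual implementation:

```python
# PR(p) = (1 - d) / N + d * sum(PR(q) / outlinks(q)) over all pages q linking to p
links = {            # toy link graph: page -> pages it links to
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
}
d = 0.85             # damping factor: probability of following a link
n = len(links)
rank = {page: 1 / n for page in links}   # start with a uniform score

for _ in range(50):  # power iteration until the scores stabilize
    rank = {
        p: (1 - d) / n
           + d * sum(rank[q] / len(links[q]) for q in links if p in links[q])
        for p in links
    }

print(rank)          # pages with more (and better-ranked) inbound links score higher
```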
In the meantime, there are of course numerous factors that influence the ranking of a page; links are only one of many. The problem: for a search query, there are often several million possible pages on the web that provide a suitable answer. The algorithm must therefore evaluate each individual page in order to select the one that is exactly right for the user, an extremely complex process. There are several hundred of these so-called ranking factors that are used to analyze a website, and most of them are not even officially known. However, the individual factors can be roughly divided into different evaluation areas, in which, in turn, different criteria are considered.
There are very different areas that the algorithm analyzes in order to offer you the best possible result, ranging from the meaning of your query and the relevance and quality of the content to the usability of a page and your personal context.
These criteria are part of the process that a search algorithm goes through to evaluate the relevance and quality of different content. Of course, for each individual category there are countless other subcategories that are also analyzed in detail. In order to be able to adapt your page accordingly, there are publicly visible guidelines from Google, which you can use to orient yourself.
According to Google, there are various factors that influence which results are displayed to you in the SERPs. Especially important are the following five categories, which build on each other step by step:

1. Word analysis: understanding the meaning of the search query
2. Matching the search query: finding relevant content in the index
3. Ranking useful pages: weighting content by ranking signals
4. Returning the best results: checking the usability of the content
5. Taking context into account: personalizing the results
Semantic context has become particularly important when it comes to search queries. The search engine algorithm analyzes the context and meaning of the words entered in order to return the best possible results. For this purpose, there are various language models to help decipher the contexts. This word analysis is then used to find the perfect and most relevant hit in the index.
Word analysis, however, is not only about understanding the context. Rather, it is also about classifying search queries thematically in order to make the corresponding answer even more accurate. If a search contains keywords such as “opening hours”, the search engine knows directly that the user does not want to receive results on products or the history of the company, but wants to know specifically when the store opens and closes. If the search query contains the word “images,” the searcher is taken directly to Google Images. In addition, the algorithm analyzes the content for topicality in order not to show you any outdated and thus possibly no longer relevant content.
If the meaning of the word or phrase is clear, the search engine looks for pages that match the query by searching the index for suitable websites. Special attention is paid to keywords that match the search query, which is why various parts of the text, such as titles, headings, and even entire passages, are scanned for them. Once this keyword match check is done, previous interaction data is examined to analyze how a result was received by users so far and whether it was helpful. This allows queries to be answered with even more relevant results.
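As a deliberately naive illustration of such a keyword match check (not Google's actual method), one could weight hits in the title more heavily than hits in headings or body text; the page data and weights here are invented:

```python
def naive_score(query: str, page: dict) -> float:
    """Toy relevance score: matches in the title count more than in the body."""
    score = 0.0
    for term in query.lower().split():
        score += 3.0 * page["title"].lower().count(term)     # strongest signal
        score += 2.0 * page["headings"].lower().count(term)  # medium signal
        score += 1.0 * page["body"].lower().count(term)      # weakest signal
    return score

page = {
    "title": "Opening hours of our store",
    "headings": "Opening hours on weekdays",
    "body": "Our store opens at 9 am and closes at 6 pm.",
}
print(naive_score("opening hours", page))  # prints 10.0
```

Real search engines combine hundreds of such signals, including the interaction data mentioned above, rather than relying on a single hand-tuned score.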
Since the algorithm is a machine-learning system, it can draw conclusions from previous data and signals and apply them to future queries. In doing so, the algorithm looks at all search queries objectively, without reading anything into them; Google therefore does not consider subjective attitudes, intentions, or points of view when matching the search term.
This key factor covers the ranking factors, which have a clear influence on which results are considered good and subsequently receive a better position in the search results. In addition to the usability of a page, an optimized loading speed, and the quality of the content, links, for example, are also an important signal for the search engine.
However, caution is advised here! Not every link has a positive effect. Make sure that your page is not linked to from spam websites, which often aim to deceive or mislead the search engine. Spam and the schemes associated with it are something Google does not tolerate. For this reason, in such a case you can expect penalties and negative consequences rather than an improvement of your ranking. To stay on the right path and avoid negative consequences for your website, you should follow Google's guidelines for webmasters.
The next step is to analyze whether the information really matches the search query. To be clear: the complete process so far runs within seconds in the background, so you still don't have a result on your screen.
Often there is an almost infinite number of results that match a search query. So which are the ones that will be displayed to the user in the end? For this purpose, a wide variety of information and content from the index is analyzed and the most suitable of it is displayed to you. This works because the individual ranking factors constantly adapt to the continuously evolving web, allowing ever better search results to be delivered. For example, it is examined whether a page appears in the same language as your search query, whether it loads quickly, and whether it displays well on your device.
In the last step, before you are finally presented with the search result, the personal context is examined. For this, things that Google knows about you so far are included, for example your location, your search settings, and your previous search history.
Above all, the location is an important factor, which sometimes leads to completely different results being displayed for one and the same keyword.
In some cases, your previous search history also becomes relevant for a search query. Here, Google infers interests from past searches and uses this knowledge to present you with results that are even better tailored to you in the future. However, if you do not want Google to use this, you can define via your Google account which data may be used to optimize the search results.
It all starts when you type a search query into the Google search box. Immediately after that, the invisible process starts, which searches for the perfect match for your query. Within a very short time (often less than a second), the algorithm searches the search index, which is a kind of library with all the previously crawled content of the Internet. This “library” has already been sorted and ordered long before your search, which makes the algorithm’s work much easier. After it has understood the semantic context of your query, matched search terms, and looked at the rankings of the most relevant pages, it finds you the right result and then includes important personal background information that puts the query in the right context. Once this entire process is complete, you get the visible result with the most helpful and relevant answers.
According to speculation, the Google algorithm is changed or optimized around 500 to 600 times per year. In most cases, this happens silently and without announcement, often because continuous improvements are being made and only small things change. Nevertheless, there are cases in which a Google update is deliberately communicated officially. This is usually the case when many webmasters are affected, or when huge changes to the algorithm can greatly affect rankings. This was the case, for example, with the Core Web Vitals, which were introduced as a new ranking factor.
In addition, there are always adjustments to technical innovations, such as smartphones, voice search, etc., which are incorporated into the Google algorithm. Continuous improvement is important here in order to always be able to achieve the best possible search results and to meet the user’s requests.
SEOs all over the world are regularly kept on their toes by these updates. In addition to the already numerous ranking factors that have to be considered in the search engine optimization process, new ones are constantly being added. Even worse, adjustments that have already been made can suddenly become irrelevant.
The goal of SEO is to understand the algorithm in such a way that adjustments can be made to websites that will then lead to top rankings in the search results. For this purpose, OnPage and OffPage optimizations are carried out. Both involve countless variables that impact the performance and visibility of a website. Additionally, it is important to focus on the right keyword when creating and updating website content – a keyword analysis will help you with this. If you manage to adjust your website and its content exactly to the algorithm through search engine optimization, you will achieve a better ranking.
Sometimes it happens: you get different results for one and the same search query on Google. Often this is due to the personal adjustments that Google makes for each search query, but there can be various other causes as well.
There is also the fact that mobile and desktop searches can differ significantly. This is because Google now works mobile-first: with the so-called Mobile First Index, Google primarily collects and uses information on the mobile versions of websites, and this is what it draws on when a search query is made on a mobile device. In this way, as part of the "mobile revolution," the search engine giant is responding to the fact that smartphones and tablets are increasingly used more than desktops.
Of course, the Google algorithm is not alone: every search engine has a similar system to answer users' search queries as well as possible. Yahoo's crawler, for example, is called Slurp. All these systems are complex processes that regularly turn the SEO world upside down, especially through updates. Nevertheless, search engines have long been an integral part of our lives, and we can no longer imagine life without them. "Googling" something has long been established in our vocabulary, and rarely has knowledge been obtained faster and more easily than via search engines. Google and Co. make our lives and the acquisition of knowledge easier, all thanks to crawlers and algorithms.
You want to learn more about how to optimize for search engines and their algorithms?