CAPTCHA describes a spam-protection technique whose goal is to distinguish automatically generated input from human input and filter it out accordingly. CAPTCHA is an acronym standing for "Completely Automated Public Turing test to tell Computers and Humans Apart". One of the best-known variants is reCAPTCHA from the US company Google, which is widely used on the web.
Cloaking
What is cloaking?
Cloaking is a technique in which a website provides different content for search engines and human visitors. The aim is to deceive the search engines in order to achieve a better ranking in the search results.
The fine line between optimisation and manipulation makes cloaking a fascinating aspect of search engine optimisation.
How is cloaking used?
Cloaking encompasses various methods and practices that aim to deceive search engines. Common methods include:
- User-agent cloaking: the server inspects the User-Agent header to recognise which browser or crawler is visiting the website, and then delivers a different version of the page for that specific user agent.
- IP-based cloaking: IP-based cloaking uses the IP address of the visitor to determine which content or version of a website should be displayed. It is often used in the context of geo-targeting, i.e. to present specific content for users who are in a certain geographical location. This includes, for example, the language version or regional offers.
- Keyword stuffing: “Stuffing” the content to the point of illegibility with relevant keywords for search engines, while real visitors are shown appealing content.
- Hiding affiliate links: Affiliate links are hidden from search engines to avoid penalties.
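The user-agent method above can be sketched in a few lines. This is a hypothetical, simplified Python handler (the crawler shortlist and the page texts are invented for illustration) that shows the pattern search engines penalise: crawlers get keyword-stuffed markup, humans get the normal page.

```python
# Illustrative only: this is the manipulative pattern that violates
# search engine guidelines, shown to make the mechanism concrete.
CRAWLER_TOKENS = ("Googlebot", "Bingbot", "DuckDuckBot")  # hypothetical shortlist

def render_page(user_agent: str) -> str:
    """Return different HTML depending on who appears to be asking."""
    is_crawler = any(token in user_agent for token in CRAWLER_TOKENS)
    if is_crawler:
        # The crawler sees unreadable keyword stuffing ...
        return "<p>cheap shoes cheap shoes buy cheap shoes online shop</p>"
    # ... while human visitors see the normal, readable page.
    return "<p>Welcome to our shoe shop - browse the new collection.</p>"

crawler_view = render_page("Mozilla/5.0 (compatible; Googlebot/2.1)")
human_view = render_page("Mozilla/5.0 (Windows NT 10.0) Chrome/120.0")
```

Because the two responses differ in substance, not just in layout, a quality check that fetches the page with both user agents would immediately flag this site.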
What consequences can cloaking have?
- Short-term ranking improvement: By making targeted adjustments for search engine crawlers, cloaking can temporarily lead to higher rankings in search results. In the long term, however, an SEO strategy without manipulation and tricks is more promising.
- Penalisation of the website/manual action: Search engines such as Google regard cloaking as a violation of their guidelines. Websites that use this technique risk being penalised, which can affect their visibility. Search engines can also remove affected websites from the index in whole or in part.
Avoid unintentional cloaking: Tips for website operators
You can avoid unintentional cloaking with these steps:
Check the robots.txt file: Make sure that your robots.txt file is configured correctly. This file gives instructions to search engine crawlers as to which pages or content may be crawled and which may not. Incorrect settings can lead to unintentional cloaking.
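Such a check can be automated with Python's standard library. The sketch below parses a hypothetical robots.txt (in practice you would fetch your site's real file) and verifies that public pages are crawlable while internal paths stay blocked:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; in practice, fetch it from
# https://yourdomain.example/robots.txt and parse that instead.
robots_txt = """\
User-agent: *
Disallow: /internal/
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Public pages should be crawlable, internal ones should not be.
public_ok = parser.can_fetch("Googlebot", "https://example.com/products")
internal_blocked = not parser.can_fetch("Googlebot", "https://example.com/internal/admin")
```

Running this against your live robots.txt after every deployment catches accidental Disallow rules before crawlers do.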
Consistent content: Make sure that the content that search engine crawlers see matches the content that human visitors see. Avoid delivering different versions of the same page.
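One simple way to verify this consistency is to fetch the same page as a crawler and as a regular visitor and compare normalised fingerprints of the responses. The HTML strings below are placeholders; in a real check you would request the page twice with different User-Agent headers:

```python
import hashlib
import re

def content_fingerprint(html: str) -> str:
    """Normalise whitespace and hash, so trivial formatting noise is ignored."""
    normalised = re.sub(r"\s+", " ", html).strip().lower()
    return hashlib.sha256(normalised.encode("utf-8")).hexdigest()

# Placeholder responses - substitute real fetches with a crawler
# User-Agent and a browser User-Agent for the same URL.
crawler_html = "<h1>Shoe Shop</h1>\n<p>New collection available.</p>"
visitor_html = "<h1>Shoe Shop</h1> <p>New collection available.</p>"

consistent = content_fingerprint(crawler_html) == content_fingerprint(visitor_html)
```

Identical fingerprints mean both audiences receive the same substance; a mismatch is a signal to investigate before a search engine does.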
User agent check: If you provide different content for different devices or browsers, use the correct user agent checks. Make sure you don’t accidentally deliver the wrong content.
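A legitimate user-agent check differs from cloaking in one key respect: it selects a layout, not different content, and crawlers are not special-cased. A minimal sketch under that assumption:

```python
def choose_template(user_agent: str) -> str:
    """Pick a layout by device type; the underlying content stays identical."""
    ua = user_agent.lower()
    if "mobile" in ua or "android" in ua or "iphone" in ua:
        return "mobile"
    return "desktop"

# Crawlers get no special treatment: Googlebot's smartphone crawler
# simply receives the same mobile template any phone would receive.
googlebot_mobile_ua = "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X) Googlebot/2.1 Mobile"
```

Because the decision depends only on the device type and never on whether the visitor is a crawler, this kind of branching stays on the safe side of the line.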
Testing and monitoring: Regular testing and monitoring of your website is important. Use tools like Google Search Console to detect and fix potential cloaking issues; suspected cloaking appears there as a manual action.
[Screenshot: Google Search Console notice when cloaking is suspected]
Transparent practices: If you use personalised content, ensure transparency for users and inform them about the reasons for personalisation. There should of course be an option to customise this.
Keep your hands off cloaking!
This method may seem tempting at first glance, but the risks often outweigh the potential benefits. Website operators should rely on transparent and ethical SEO practices to achieve long-term success.