Backlink screening – Link Detox for a clean link profile

Backlinks are an essential part of the internet. Without links you would not get far: websites would look lifeless, interaction would be missing in many places, you could not point users to further information, and internal linking would not exist either. What happens on website A is under webmaster A's control: he can make adjustments and monitor the status. If something is no longer correct or needs updating, it can be fixed onpage.

However, other websites also link to website A, and the webmaster usually has no influence on that. Many offpage links are placed by other webmasters, for example to point to a store's offers or to tell their community about a product. Besides these harmless links, however, there are also harmful ones that you should not underestimate, because such backlinks can damage a website. If a toxic website links to a non-toxic one, the non-toxic website may well lose rankings or visibility. It is also conceivable that Google takes manual action or that a Google update has a negative impact. It is therefore important to keep an eye on your own backlink profile and check it regularly to prevent negative effects. As so often, the saying "prevention is better than cure" applies here.

To do this, you first need a comprehensive list of external backlinks, which you can obtain with certain tools. Once you have this list, you can start the Link Detox and manually evaluate and screen the links. Harmful results are finally uploaded to Google Search Console (GSC) via Google's Disavow tool so that they no longer influence the ranking and rating of the website.

Preparation – Create the master list

In order to examine all external links, you should use several data sources. The best way is to export the links from various tools such as SISTRIX, LRT, and others. In the next step, this data is combined in an Excel file and deduplicated.
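The merging step can be sketched in a few lines of Python. The file names and column headings below are placeholders; real exports from SISTRIX, LRT, or other tools use their own column names, so adjust the mapping accordingly.

```python
import csv

# Placeholder exports: map each file to the column that holds the linking URL.
# Real tool exports (SISTRIX, LRT, Semrush, ...) use their own column names.
EXPORTS = {
    "sistrix_links.csv": "Source URL",
    "lrt_links.csv": "From URL",
}

def merge_exports(exports):
    """Combine several backlink exports into one deduplicated master list."""
    seen = set()
    master = []
    for path, column in exports.items():
        with open(path, newline="", encoding="utf-8") as fh:
            for row in csv.DictReader(fh):
                url = row[column].strip().rstrip("/")
                if url and url not in seen:  # skip duplicates across tools
                    seen.add(url)
                    master.append(url)
    return master
```

Calling `merge_exports(EXPORTS)` returns a single list that you can paste back into Excel or feed into the detox tool.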

Afterwards, you have the whole list crawled by a suitable tool. Link Detox from LRT (Link Research Tools) or the Backlink Audit from Semrush are good places to start. The tools first crawl based on their own data pools; then you upload your exported links. This way you get a portfolio of external links that is as comprehensive as possible.

The tools provide an initial assessment of which links are harmful or harmless. However, since you cannot rely on this 100%, you need to manually examine all domains or URLs (depending on the procedure).

The tools' assessments provide a first point of reference that you can use as a guide. A distribution like the one in figure 1 should be examined more closely, because the tool has identified many harmful backlinks. If you already have a disavow file, you can upload it so that the tool takes those links into account; this alone can change the toxic score of a tool-side analysis.

Apart from the tools, it is advisable to perform such an analysis regularly (for example once a year) to find harmful links before they have a negative impact on your website. You can also incorporate new links into your monthly or weekly SEO process and react directly this way.

To examine the link profile for harmful links, you need to be clear about what characterizes such a link. A small comparison between "good" and "bad" should shed some light:

Characteristics of harmful websites/links:
- Security messages from Google (harmful programs, phishing, etc.)
- Websites of spam networks
- Scrapers and data grabbers
- Duplicate content and masses of money keywords
- Links on completely off-topic websites (e.g. a bank linked on a website with sexual content)
- Casino and gambling websites full of ads and spam
- Pure link lists
- Websites that encourage users to install seemingly legitimate software
- Websites posing as other vendors (e.g. a spam site posing as Microsoft or PayPal)

Characteristics of harmless websites/links:
- Clean imprint
- Real contributions
- Images are correctly embedded
- Clean linking, sensible integration
- Appropriate keywords and anchor texts
- Tagging and link attributes match
- Keywords are not overused
- Topic relevance
- The reader/user is the focus

Table 1: Characteristics of toxic and harmless backlinks

Note that a website is not bad per se just because it lacks a valid HTTPS certificate. While this is a quality feature you can use to rate a website positively, it is not a criterion for a backlink screening followed by an upload to the Disavow tool.

Furthermore, it is not relevant for a Link Detox whether a website is still publishing new posts. While you pay attention to freshness when searching for backlink partners, backlink screening is exclusively about finding harmful and potentially harmful websites and reporting them to Google in the next step. Thanks to the disavow file, these links are no longer counted toward your website's backlink profile.

Incidents during the evaluation

Some websites you recognize directly as malicious; others you have to examine more closely. An incident is a website where it is worth contacting the external webmaster. This would make sense if…

These manual contacts are, in the last two cases, more worthwhile for stronger pages that actually receive visitors. Incorrectly marked posts, however, should always be tackled: it is best to flag them during the manual evaluation and contact the webmasters afterwards. If that does not work, it is advisable to add these URLs to the disavow list as well, so as not to violate the Google Webmaster Guidelines; a violation can cost your website visibility or keywords, or in the worst case lead to a penalty. Since Google rolls out updates several times a year, you should not only consider internal factors.

Posts that are marked accordingly, such as paid or sponsored content, always need an appropriate link attribute:

<a href="/beispiel.xy" rel="nofollow">Anchor Text</a>

<a href="https://shop.xy" rel="sponsored">Anchor Text</a>

As a webmaster, you cannot control who on the internet links to your site. Often this happens with good intentions, for example when someone links a store or a service and reports about it in forums or a blog. But it can also happen with bad intent: so-called scrapers or data grabbers harvest data and publish it on other sites. This often affects images, but can also extend to complete texts. In this way, duplicate content is quickly created for which you are not responsible.

It is also conceivable that a competitor practices black hat SEO and deliberately places your website on harmful pages or sets many links with one and the same money keyword. You can counteract this by regularly checking for new external backlinks.

There are also various spam networks circulating online that you should not underestimate. Although Google claims to recognize these pages better and better and to exclude them from the ranking, it is better if you keep track yourself and decide manually what should not count. Loosely based on the motto: "Trust is good, control is better."

Figure 1: Spam pages that consist only of link lists should be placed on the disavow list

Once the master list has been created and you know the criteria by which you want to evaluate the pages, you should come up with a few meaningful comment labels. These allow quick categorization and offer more potential for analysis down the road. Such comments could look as follows:

If needed, these comments can be extended, but you should keep them as short as possible. It is also advisable to add a justification column: often you no longer know afterwards why a page was rated "Disavow". Common reasons would be spam, phishing, porn, or casino.

Another preliminary consideration is whether you want to look at all links on URL level or whether one link per domain is enough for the analysis. If you only have a few links and want to know everything, you can look at every URL, but in most cases this is not useful: there are often a great many backlinks, and you would be busy for a very long time. One link per domain is usually enough to tell how the domain should be rated.
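If you opt for the domain-level approach, reducing the URL list to one representative link per domain is easy to script. A minimal sketch, assuming the master list is a plain list of URLs:

```python
from urllib.parse import urlsplit

def one_per_domain(urls):
    """Keep only the first URL seen for each host ("www." is ignored)."""
    seen = set()
    out = []
    for url in urls:
        # str.removeprefix requires Python 3.9+
        host = (urlsplit(url).hostname or "").removeprefix("www.")
        if host and host not in seen:
            seen.add(host)
            out.append(url)
    return out
```

For example, `one_per_domain(["http://a.xy/1", "https://www.a.xy/2"])` keeps only the first entry, since both URLs belong to the same host.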

Since some tools such as Semrush do not output the current status code of a website, you should crawl the pages to be analyzed with a tool like Screaming Frog. This creates new filtering options and speeds up the manual analysis.

To make fast progress, it is essential to filter repeatedly. You can filter by keywords, topics, anchor texts, or other terms; this saves work and speeds up the analysis. Often certain pages can be grouped, for example forums, press releases, or opening-hours listings.
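Such grouping can be automated with simple substring rules. The group names and patterns below are made-up examples; tune them to the patterns you actually see in your master list.

```python
# Hypothetical grouping rules: group label -> tell-tale substrings.
GROUPS = {
    "forum": ("forum.", "/thread", "/topic"),
    "press": ("press-release", "/pressemitteilung"),
    "casino": ("casino", "poker", "slots"),
}

def group_link(url, anchor=""):
    """Return the first matching group label, or 'review' for a manual check."""
    haystack = f"{url} {anchor}".lower()
    for label, needles in GROUPS.items():
        if any(n in haystack for n in needles):
            return label
    return "review"
```

Anything that falls into a known group can then be handled in bulk, while the remaining "review" entries get the full manual treatment.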

Use the Disavow tool

At the end of the analysis, you filter out the backlinks that belong on the Disavow list and paste them into a text file (.txt). This file is then uploaded to Search Console, and Google will no longer consider the listed links for the selected domain.

You can invalidate links either at domain or URL level. Most of the time it makes more sense to disavow the whole domain directly. Domain entries require the prefix "domain:"; URL entries do not.


Disavow at a domain: domain:example.xy

Disavow at a URL: http://beispiel.xy/blog/mode-der-50er-jahre.html
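Both entry types can be combined in a single file, with one entry per line; lines starting with "#" are treated as comments. The domain names below are hypothetical; a disavow.txt might look like this:

```text
# backlink screening, reviewed manually
domain:spam-network.xy
domain:link-list.xy
http://beispiel.xy/blog/mode-der-50er-jahre.html
```

The file must be a plain-text .txt file (UTF-8 or 7-bit ASCII) to be accepted by the Disavow tool.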

Google has compiled more info on the support page about “Disavowing links to your website“.

Webmasters never have full control over the backlinks pointing to their site, so you should know your backlink profile and always keep an eye on which new links appear, naturally or unnaturally. It is advisable to perform an extensive screening and to monitor new backlinks with suitable tools. This way you always see which new links are created and can act if necessary. A Link Detox is an important step toward a clean link profile, which in turn is an important building block of a successful and healthy website. Onpage you can control a lot, but what happens offpage is rarely in your hands.

