
Disallowed URLs with Backlinks

URLs which were found in your backlink list, but were disallowed by your robots.txt.

Priority: Critical

Impact: Negative

What issues it may cause

Although disallowed pages can still be indexed by search engines, they cannot be crawled. As a result, search engines cannot see any of the internal links on these pages, and the PageRank flowing in from their backlinks cannot be passed on to other pages on the site.

How do you fix it

Consider removing the rule in your robots.txt that disallows these pages.
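
For example, if the affected URLs lived under a hypothetical /landing/ path, the blocking rule might look like the one below. Deleting it, or narrowing it so that the pages with backlinks no longer match, makes those pages crawlable again. The path is purely illustrative; check the robotsTxtRuleMatch field in the report to see which rule actually matched.

User-agent: *
# Hypothetical rule blocking the affected section. Removing or
# narrowing it allows search engines to crawl these pages again.
Disallow: /landing/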

What is the positive impact

The rankings of other pages may improve, because PageRank from the backlinks can now flow through these URLs and be distributed throughout the site.

How to fetch the data for this report template

You will need to run a crawl for the report template in order to generate the report. Once the report has been generated and you have the crawl ID, you can fetch the data for the report using the following query:

query GetReportForCrawl($crawlId: ObjectID!, $reportTemplateCode: String!) {
  getCrawl(id: $crawlId) {
    reportsByCode(
      input: {
        reportTypeCodes: Basic
        reportTemplateCodes: [$reportTemplateCode]
      }
    ) {
      rows {
        nodes {
          ... on CrawlUrls {
            pageTitle
            url
            description
            foundAtUrl
            deeprank
            level
            robotsTxtRuleMatch
            redirectsInCount
            linksOutCount
            linksInCount
            foundInGoogleAnalytics
            foundInGoogleSearchConsole
            foundInBacklinks
            foundInList
            foundInLogSummary
            foundInWebCrawl
            foundInSitemap
          }
        }
      }
    }
  }
}
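
The query is transport-agnostic: like any GraphQL operation, it can be sent as an HTTP POST whose JSON body carries the query string and its variables. The sketch below shows this in Python with the requests library. The endpoint URL, the X-Auth-Token header name, the crawl ID, and the report template code are illustrative assumptions, not confirmed API details; substitute the values for your own account and crawl.

import requests

# All of the following values are illustrative placeholders. Check your
# account's API documentation for the real endpoint, auth header name,
# and report template code.
API_URL = "https://api.lumar.io/graphql"  # assumed endpoint
API_TOKEN = "your-session-token"          # assumed auth scheme
CRAWL_ID = "your-crawl-id"
REPORT_TEMPLATE_CODE = "disallowed_urls_with_backlinks"  # assumed code

# A trimmed version of the query above; any subset of the listed
# CrawlUrls fields can be requested.
QUERY = """
query GetReportForCrawl($crawlId: ObjectID!, $reportTemplateCode: String!) {
  getCrawl(id: $crawlId) {
    reportsByCode(
      input: {
        reportTypeCodes: Basic
        reportTemplateCodes: [$reportTemplateCode]
      }
    ) {
      rows {
        nodes {
          ... on CrawlUrls {
            url
            robotsTxtRuleMatch
            foundInBacklinks
          }
        }
      }
    }
  }
}
"""

response = requests.post(
    API_URL,
    json={
        "query": QUERY,
        "variables": {
            "crawlId": CRAWL_ID,
            "reportTemplateCode": REPORT_TEMPLATE_CODE,
        },
    },
    headers={"X-Auth-Token": API_TOKEN},  # assumed header name
)
response.raise_for_status()

# The report rows sit under rows.nodes in the response payload.
print(response.json())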
