
Disallowed URLs with Backlinks

URLs which were found in your backlink list, but were disallowed by your robots.txt.

Priority: Critical

Impact: Negative

What issues it may cause

Although disallowed pages can still be indexed by search engines, they cannot be crawled. This means search engines cannot see any of their internal links to other pages, so the PageRank these URLs earn from their backlinks cannot be passed on to the rest of the site.

How do you fix it

Consider removing the robots.txt disallow rule that matches these pages.
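
For example, if the affected URLs sit under a path such as /blog/, the rule to remove or narrow would look like the excerpt below. This is a hypothetical illustration only; the path and rules in your own robots.txt will differ.

# Hypothetical robots.txt excerpt — /blog/ is an illustrative path only.
User-agent: *
Disallow: /blog/   # removing or narrowing this rule lets the backlinked URLs be crawled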

What is the positive impact

The rankings of other pages may improve, as PageRank from these URLs' backlinks can now be passed on and distributed throughout the site.

How to fetch the data for this report template

You will need to run a crawl for this report template to generate the report. Once the report has been generated and you have the crawl ID, you can fetch the report's data using the following query:

query GetReportStatForCrawl(
  $crawlId: ObjectID!
  $reportTemplateCode: String!
  $after: String
) {
  getReportStat(
    input: {crawlId: $crawlId, reportTemplateCode: $reportTemplateCode}
  ) {
    crawlUrls(after: $after, reportType: Basic) {
      nodes {
        pageTitle
        url
        description
        foundAtUrl
        deeprank
        level
        robotsTxtRuleMatch
        redirectsInCount
        linksOutCount
        linksInCount
        foundInGoogleAnalytics
        foundInGoogleSearchConsole
        foundInBacklinks
        foundInList
        foundInLogSummary
        foundInWebCrawl
        foundInSitemap
      }
      totalCount
      pageInfo {
        endCursor
        hasNextPage
      }
    }
  }
}
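
Below is a minimal sketch of how you might send this query over HTTP and page through the results, feeding pageInfo.endCursor back in as the after variable until hasNextPage is false. It assumes a generic GraphQL-over-HTTP setup: the endpoint URL and authentication header shown are placeholders, not confirmed API values, so substitute them from your own account and the API documentation.

// Minimal TypeScript sketch, not an official client. The endpoint and auth
// scheme are placeholder assumptions — replace them with your real values.
const ENDPOINT = "https://example.com/graphql"; // placeholder GraphQL endpoint
const QUERY = `...`; // the GetReportStatForCrawl query shown above

async function fetchDisallowedUrlsWithBacklinks(
  crawlId: string,
  reportTemplateCode: string,
  token: string,
): Promise<unknown[]> {
  const nodes: unknown[] = [];
  let after: string | null = null;

  // Request pages until hasNextPage is false, passing endCursor back as `after`.
  while (true) {
    const response = await fetch(ENDPOINT, {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${token}`, // placeholder auth scheme
      },
      body: JSON.stringify({
        query: QUERY,
        variables: { crawlId, reportTemplateCode, after },
      }),
    });
    const { data } = await response.json();
    const page = data.getReportStat.crawlUrls;

    nodes.push(...page.nodes);
    if (!page.pageInfo.hasNextPage) break;
    after = page.pageInfo.endCursor;
  }
  return nodes;
}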
