
Disallowed Pages

All URLs included in the crawl that were disallowed by the robots.txt file on the live site, or by a custom robots.txt file supplied in Advanced Settings.

These URLs were crawled by Lumar because the 'Check disallowed links' setting in Advanced Settings > Scope > Link Validation was enabled. If the setting is disabled, they will not be crawled and will not appear in the Disallowed URLs report.

Priority: None

Impact: Neutral

How to fetch the data for this report template

You will need to run a crawl for this report template to generate the report. Once the report has been generated and you have the crawl ID, you can fetch the report data using the following query:

query GetReportForCrawl($crawlId: ObjectID!, $reportTemplateCode: String!) {
  getCrawl(id: $crawlId) {
    reportsByCode(
      input: {
        reportTypeCodes: Basic
        reportTemplateCodes: [$reportTemplateCode]
      }
    ) {
      rows {
        nodes {
          ... on CrawlUrls {
            pageTitle
            url
            description
            foundAtUrl
            deeprank
            level
            disallowedPage
            robotsTxtRuleMatch
            duplicatePage
            noindex
            nofollowedPage
            foundInGoogleAnalytics
            foundInGoogleSearchConsole
            foundInBacklinks
            foundInList
            foundInLogSummary
            foundInWebCrawl
            foundInSitemap
          }
        }
      }
    }
  }
}
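
As a rough sketch of how the query above might be executed outside the explorer, the Python snippet below posts it to a GraphQL endpoint with the two variables supplied. The endpoint URL, the Bearer-token header, the placeholder crawl ID, and the report template code ("disallowed_pages") are assumptions for illustration only; substitute the values for your own account, crawl, and report template. The field selection is trimmed to a few columns to keep the example short.

import requests

# Assumed values for illustration; replace with your own API URL, token, and crawl ID.
API_URL = "https://api.lumar.io/graphql"
API_TOKEN = "YOUR_API_TOKEN"

QUERY = """
query GetReportForCrawl($crawlId: ObjectID!, $reportTemplateCode: String!) {
  getCrawl(id: $crawlId) {
    reportsByCode(
      input: {
        reportTypeCodes: Basic
        reportTemplateCodes: [$reportTemplateCode]
      }
    ) {
      rows {
        nodes {
          ... on CrawlUrls {
            url
            foundAtUrl
            robotsTxtRuleMatch
          }
        }
      }
    }
  }
}
"""

variables = {
    "crawlId": "YOUR_CRAWL_ID",                 # ID of the finished crawl
    "reportTemplateCode": "disallowed_pages",   # assumed code for this report template
}

# Send the query and variables as a standard GraphQL POST request.
response = requests.post(
    API_URL,
    json={"query": QUERY, "variables": variables},
    headers={"Authorization": f"Bearer {API_TOKEN}"},
)
response.raise_for_status()
print(response.json())

The same query and variables can equally be pasted into the API explorer; the snippet only shows the shape of a programmatic request.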
