Disallowed URLs with Traffic

URLs which had Google Search Console clicks or Analytics visits during the report period, but were disallowed by your robots.txt.

Priority: Critical

Impact: Negative

How to fetch the data for this report template

You will need to run a crawl for this report template in order to generate the report. Once the report has been generated and you have the crawl ID, you can fetch the data for the report using the following query:

Example variables:

{"crawlId": "TjAwNUNyYXdsNDAwMA", "reportTemplateCode": "disallowed_pages_with_traffic"}

GraphQL query:
query GetReportStatForCrawl(
  $crawlId: ObjectID!
  $reportTemplateCode: String!
  $after: String
) {
  getReportStat(
    input: {crawlId: $crawlId, reportTemplateCode: $reportTemplateCode}
  ) {
    crawlUrls(after: $after, reportType: Basic) {
      nodes {
        pageTitle
        url
        description
        foundAtUrl
        deeprank
        level
        disallowedPage
        gaVisits
        searchConsoleTotalClicks
        searchConsoleTotalImpressions
        searchConsoleTotalCtr
        searchConsoleTotalPosition
        robotsTxtRuleMatch
        foundInGoogleAnalytics
        foundInGoogleSearchConsole
        foundInBacklinks
        foundInList
        foundInLogSummary
        foundInWebCrawl
        foundInSitemap
      }
      totalCount
      pageInfo {
        endCursor
        hasNextPage
      }
    }
  }
}
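
The crawlUrls connection is cursor-paginated: each page returns pageInfo.endCursor, which you pass back as the $after variable until hasNextPage is false. The sketch below shows one way to drive that loop from a client. It is illustrative only; the endpoint URL, the Authorization header format, and the API token handling shown here are assumptions, so substitute the values used by your own account.

import requests

API_URL = "https://api.lumar.io/graphql"  # assumed endpoint; replace with your API URL
API_TOKEN = "YOUR_API_TOKEN"              # assumed bearer-token auth; replace with a real token

QUERY = """
query GetReportStatForCrawl($crawlId: ObjectID!, $reportTemplateCode: String!, $after: String) {
  getReportStat(input: {crawlId: $crawlId, reportTemplateCode: $reportTemplateCode}) {
    crawlUrls(after: $after, reportType: Basic) {
      nodes { url gaVisits searchConsoleTotalClicks robotsTxtRuleMatch }
      totalCount
      pageInfo { endCursor hasNextPage }
    }
  }
}
"""

def fetch_all_rows(crawl_id: str) -> list[dict]:
    """Collect every node by following endCursor until hasNextPage is false."""
    rows, after = [], None
    while True:
        response = requests.post(
            API_URL,
            json={
                "query": QUERY,
                "variables": {
                    "crawlId": crawl_id,
                    "reportTemplateCode": "disallowed_pages_with_traffic",
                    "after": after,
                },
            },
            headers={"Authorization": f"Bearer {API_TOKEN}"},  # assumed header format
            timeout=30,
        )
        response.raise_for_status()
        crawl_urls = response.json()["data"]["getReportStat"]["crawlUrls"]
        rows.extend(crawl_urls["nodes"])
        if not crawl_urls["pageInfo"]["hasNextPage"]:
            return rows
        after = crawl_urls["pageInfo"]["endCursor"]

# Example usage with the crawl ID from the variables above:
# print(len(fetch_all_rows("TjAwNUNyYXdsNDAwMA")))

The node selection in the sketch is trimmed to a few fields for brevity; request the full field list from the query above if you need every column of the report.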