Disallowed CSS (Uncrawled)

CSS files which are disallowed in robots.txt

Priority: Critical

Impact: Neutral

What issues it may cause

If CSS resources are disallowed in robots.txt, search engines may be unable to render the pages correctly.

How to fix it

Review the disallowed CSS resources using tools such as the Search Console Live URL Test or Mobile-Friendly Test to determine whether the resources are required for the page to render correctly. If they are, update robots.txt to allow the URLs to be fetched.
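As an illustration, a broad Disallow rule can be narrowed with a more specific Allow rule so that stylesheets remain fetchable. The paths below are hypothetical; substitute the directories from your own site:

User-agent: *
# Hypothetical example: block the assets directory as a whole,
# but re-allow the CSS subdirectory so pages can still be rendered.
Disallow: /assets/
Allow: /assets/css/

Google honours the most specific matching rule, so the Allow directive takes precedence for URLs under /assets/css/.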

What is the positive impact

Search engines will be able to render the pages correctly and process all of the content and metadata, so the pages can be indexed as expected.

How to fetch the data for this report template

You will need to run a crawl for the report template to generate the report. Once the report has been generated and you have the crawl ID, you can fetch the data for the report using the following query:

query GetReportStatForCrawl(
  $crawlId: ObjectID!
  $reportTemplateCode: String!
  $after: String
) {
  getReportStat(
    input: {crawlId: $crawlId, reportTemplateCode: $reportTemplateCode}
  ) {
    crawlUncrawledUrls(after: $after, reportType: Basic) {
      nodes {
        url
        foundAtUrl
        foundAtSitemap
        level
        restrictedReason
        robotsTxtRuleMatch
        foundInWebCrawl
        foundInGoogleAnalytics
        foundInGoogleSearchConsole
        foundInBacklinks
        foundInList
        foundInLogSummary
        foundInSitemap
      }
      totalCount
      pageInfo {
        endCursor
        hasNextPage
      }
    }
  }
}
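Because the connection is cursor-paginated, a client needs to loop, passing `pageInfo.endCursor` back in as `$after` until `hasNextPage` is false. A minimal Python sketch of that loop is below; it abstracts the HTTP transport behind an `execute` callable (the endpoint URL and authentication are not specified here, so consult the API documentation for those), and the `nodes` selection requests only a subset of the fields from the full query above:

```python
# Sketch of paginating the getReportStat query. The transport (endpoint URL,
# auth headers) is an assumption left to the caller via `execute`.

QUERY = """
query GetReportStatForCrawl(
  $crawlId: ObjectID!
  $reportTemplateCode: String!
  $after: String
) {
  getReportStat(
    input: {crawlId: $crawlId, reportTemplateCode: $reportTemplateCode}
  ) {
    crawlUncrawledUrls(after: $after, reportType: Basic) {
      nodes { url restrictedReason robotsTxtRuleMatch }
      totalCount
      pageInfo { endCursor hasNextPage }
    }
  }
}
"""


def fetch_all_uncrawled(execute, crawl_id, report_template_code):
    """Collect every node from crawlUncrawledUrls, following the cursor.

    `execute(query, variables)` is any callable that posts the query to the
    GraphQL endpoint and returns the parsed JSON `data` object.
    """
    nodes, after = [], None
    while True:
        data = execute(QUERY, {
            "crawlId": crawl_id,
            "reportTemplateCode": report_template_code,
            "after": after,  # None on the first page
        })
        page = data["getReportStat"]["crawlUncrawledUrls"]
        nodes.extend(page["nodes"])
        info = page["pageInfo"]
        if not info["hasNextPage"]:
            return nodes
        after = info["endCursor"]
```

In practice `execute` would wrap an HTTP POST of `{"query": ..., "variables": ...}` to the GraphQL endpoint; keeping it as a parameter makes the pagination logic easy to test against canned responses.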