# Disallowed CSS

CSS files which are disallowed in robots.txt

**Priority**: Critical

**Impact**: Neutral

## What issues it may cause

If CSS resources are disallowed in robots.txt, search engines may be unable to render the pages correctly.

## How do you fix it

Review the disallowed CSS resources using tools such as the Search Console Live URL Test or the [Mobile-Friendly Test](https://search.google.com/test/mobile-friendly) to determine whether they are required for the page to render correctly. If so, update robots.txt to allow those URLs to be fetched.

## What is the positive impact

Search engines will be able to render the pages correctly and process all of the content and metadata, so the pages can be indexed as expected.

## How to fetch the data for this report template

You will need to run a crawl for the report template to generate the report. Once the report has been generated and you have a crawl ID, you can fetch the report data using the following query:

```graphql
query GetReportStatForCrawl(
  $crawlId: ObjectID!
  $reportTemplateCode: String!
  $after: String
) {
  getReportStat(
    input: {crawlId: $crawlId, reportTemplateCode: $reportTemplateCode}
  ) {
    crawlUrls(after: $after, reportType: Basic) {
      nodes {
        url
        foundAtUrl
        level
        httpStatusCode
        disallowedPage
        failedReason
        css
        robotsTxtRuleMatch
        foundInGoogleAnalytics
        foundInGoogleSearchConsole
        foundInBacklinks
        foundInList
        foundInLogSummary
        foundInWebCrawl
        foundInSitemap
      }
      totalCount
      pageInfo {
        endCursor
        hasNextPage
      }
    }
  }
}
```

**Variables:**

```json
{
  "crawlId": "TjAwNUNyYXdsNDAwMA",
  "reportTemplateCode": "disallowed_css"
}
```
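Since `crawlUrls` is a cursor-paginated connection (`endCursor` / `hasNextPage`), a client needs to loop until all pages are consumed. The sketch below shows one way to do that in Python, using only the standard library. The endpoint URL and bearer-token header are assumptions for illustration (substitute your actual GraphQL endpoint and credentials), and the query is trimmed to the `url` field plus the pagination info for brevity:

```python
import json
from urllib import request

# Trimmed query: fetches only the URL of each disallowed CSS file
# plus the cursor info needed for pagination.
QUERY = """
query GetReportStatForCrawl($crawlId: ObjectID!, $reportTemplateCode: String!, $after: String) {
  getReportStat(input: {crawlId: $crawlId, reportTemplateCode: $reportTemplateCode}) {
    crawlUrls(after: $after, reportType: Basic) {
      nodes { url }
      pageInfo { endCursor hasNextPage }
    }
  }
}
"""

# Hypothetical endpoint and token -- replace with your real values.
API_URL = "https://api.example.com/graphql"


def fetch_page(crawl_id, after=None, api_url=API_URL, token="YOUR_TOKEN"):
    """POST one page of the query; return the crawlUrls connection."""
    payload = {
        "query": QUERY,
        "variables": {
            "crawlId": crawl_id,
            "reportTemplateCode": "disallowed_css",
            "after": after,
        },
    }
    req = request.Request(
        api_url,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
    )
    with request.urlopen(req) as resp:
        return json.load(resp)["data"]["getReportStat"]["crawlUrls"]


def all_disallowed_css(crawl_id, fetch=fetch_page):
    """Walk the connection with the endCursor until hasNextPage is false."""
    urls, after = [], None
    while True:
        page = fetch(crawl_id, after)
        urls.extend(node["url"] for node in page["nodes"])
        info = page["pageInfo"]
        if not info["hasNextPage"]:
            return urls
        after = info["endCursor"]
```

The page-fetching function is injectable so the pagination loop can be exercised without network access; in real use you would call `all_disallowed_css("TjAwNUNyYXdsNDAwMA")` with valid credentials.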