# Error Pages in Sitemaps

**Priority**: Low
**Impact**: Negative

## What issues it may cause

Sitemaps that list broken pages encourage search engines to crawl them, which wastes crawl budget and incurs additional server costs.

## How do you fix it

Remove the broken pages from the sitemap.

## What is the positive impact

Fewer non-200 status code pages will be crawled, which improves crawl budget usage and reduces server costs.

## How to fetch the data for this report template

You will need to run a crawl for the report template to generate the report. Once the report has been generated and you have a crawl ID, you can fetch the report data using the following query:

```graphql
query GetReportStatForCrawl(
  $crawlId: ObjectID!
  $reportTemplateCode: String!
  $after: String
) {
  getReportStat(
    input: {crawlId: $crawlId, reportTemplateCode: $reportTemplateCode}
  ) {
    crawlUrls(after: $after, reportType: Basic) {
      nodes {
        pageTitle
        url
        foundAtUrl
        foundAtSitemap
        deeprank
        level
        sitemapsInCount
        httpStatusCode
        indexable
        duplicatePage
        foundInGoogleAnalytics
        foundInGoogleSearchConsole
        foundInBacklinks
        foundInList
        foundInLogSummary
        foundInWebCrawl
        foundInSitemap
      }
      totalCount
      pageInfo {
        endCursor
        hasNextPage
      }
    }
  }
}
```

**Variables:**

```json
{
  "crawlId": "TjAwNUNyYXdsNDAwMA",
  "reportTemplateCode": "error_pages_in_sitemaps"
}
```
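The query above returns a paginated connection, so a client needs to follow `pageInfo.endCursor` until `hasNextPage` is false, then filter the collected nodes down to the broken sitemap entries. Here is a minimal Python sketch of that flow; `run_query` is a hypothetical callable (e.g. a thin wrapper around an HTTP POST to your GraphQL endpoint) that returns the parsed `getReportStat` payload, and the demo data is illustrative, not real API output:

```python
def fetch_all_nodes(run_query, variables):
    """Collect every node from the paginated crawlUrls connection,
    following pageInfo.endCursor until hasNextPage is false.
    `run_query` is a hypothetical callable that executes the GraphQL
    query above and returns the `getReportStat` payload as a dict."""
    nodes, after = [], None
    while True:
        page = run_query({**variables, "after": after})["crawlUrls"]
        nodes.extend(page["nodes"])
        info = page["pageInfo"]
        if not info["hasNextPage"]:
            return nodes
        after = info["endCursor"]


def broken_sitemap_urls(nodes):
    """URLs that appear in a sitemap but do not return HTTP 200 --
    the pages this report suggests removing from the sitemap."""
    return [
        n["url"]
        for n in nodes
        if n.get("foundInSitemap") and n.get("httpStatusCode") != 200
    ]


# Demo with a stubbed run_query returning two pages of illustrative data.
_pages = [
    {"crawlUrls": {
        "nodes": [{"url": "https://example.com/ok",
                   "httpStatusCode": 200, "foundInSitemap": True}],
        "pageInfo": {"endCursor": "c1", "hasNextPage": True},
    }},
    {"crawlUrls": {
        "nodes": [{"url": "https://example.com/gone",
                   "httpStatusCode": 404, "foundInSitemap": True}],
        "pageInfo": {"endCursor": "c2", "hasNextPage": False},
    }},
]
_calls = iter(_pages)
nodes = fetch_all_nodes(lambda variables: next(_calls),
                        {"crawlId": "TjAwNUNyYXdsNDAwMA",
                         "reportTemplateCode": "error_pages_in_sitemaps"})
print(broken_sitemap_urls(nodes))  # ['https://example.com/gone']
```

The resulting URL list is what you would remove from (or exclude when regenerating) the sitemap to resolve this report.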