
Failed URLs

URLs that did not respond within the project's rendering timeout setting (which defaults to 15 seconds).

Priority: Critical

Impact: Negative
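As an illustration, a URL's response time can be checked against a 15-second timeout as below. This is a minimal sketch using only the Python standard library: it measures the raw HTTP response, not full page rendering, and the example URL is hypothetical.

```python
import socket
import urllib.error
import urllib.request

RENDER_TIMEOUT = 15  # seconds, matching the project's default rendering timeout

def responds_in_time(url: str, timeout: int = RENDER_TIMEOUT) -> bool:
    """Return True if the URL responds before the timeout, False otherwise.

    Note: this only times the HTTP response, not client-side rendering,
    so it is a rough proxy for the crawler's rendering timeout.
    """
    try:
        with urllib.request.urlopen(url, timeout=timeout):
            return True
    except (urllib.error.URLError, socket.timeout, TimeoutError):
        return False
```

A URL that fails this check would be expected to appear in this report.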

What issues it may cause

Search engines will only make a limited number of concurrent requests, so slow-loading pages may reduce the overall number of pages they can crawl.

If the pages are persistently slow, the chances of hitting search engines' timeouts increase, which could eventually result in the pages being removed from the index.

The rankings of the pages could be lowered by search engines, as they will be assumed to provide a poor user experience relative to other websites.

Any updates made to the pages are less likely to be discovered by search engines.

How do you fix it

If the same URLs are appearing in subsequent crawls, it suggests the issues are with specific pages.

If a different set of URLs appears in each crawl, or the quantity of failed pages varies over time, it suggests the problem is with the web server's performance.

Improving page speed may require developers and engineers to resolve, as slow responses can be caused by inefficient code, sub-optimal database performance, or under-powered servers.
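The diagnostic rule above (persistent URLs point to page-level problems; a churning set points to server performance) can be sketched as a simple set comparison between two crawls. The overlap thresholds here are illustrative assumptions, not values from the tool:

```python
def diagnose_failed_urls(previous: set[str], current: set[str]) -> str:
    """Classify failed URLs from two consecutive crawls.

    A high overlap means the same URLs keep failing (page-specific issue);
    a low overlap means different URLs fail each crawl (server issue).
    Thresholds (80% / 20%) are illustrative assumptions.
    """
    if not previous or not current:
        return "not enough data"
    overlap_ratio = len(previous & current) / len(current)
    if overlap_ratio >= 0.8:
        return "page-specific issue"
    if overlap_ratio <= 0.2:
        return "server performance issue"
    return "mixed"
```

For example, if both crawls report the same three URLs, the function returns "page-specific issue"; if the sets are disjoint, it returns "server performance issue".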

What is the positive impact

Crawl budget is saved, so other pages may be crawled more frequently, and server costs may be reduced.

Pages are more likely to be kept in the search engine indexes, and rankings improved resulting in more traffic and conversions.

Updates to pages are reflected in search results quicker and crawl rate increases because crawlers are using their time more efficiently.

How to fetch the data for this report template

You will need to run a crawl for the report template to generate the report. Once the report has been generated and you have the crawl ID, you can fetch the data for the report using the following query:

query GetReportForCrawl($crawlId: ObjectID!, $reportTemplateCode: String!) {
  getCrawl(id: $crawlId) {
    reportsByCode(
      input: {
        reportTypeCodes: Basic
        reportTemplateCodes: [$reportTemplateCode]
      }
    ) {
      rows {
        nodes {
          ... on CrawlUrls {
            url
          }
        }
      }
    }
  }
}
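The query can be executed as a standard GraphQL HTTP POST. The sketch below uses only the Python standard library; the endpoint URL and the auth header name are placeholders, not the tool's documented values, so substitute your account's real endpoint and token.

```python
import json
import urllib.request

# Placeholder endpoint; replace with your API's real GraphQL URL.
API_ENDPOINT = "https://api.example.com/graphql"

QUERY = """
query GetReportForCrawl($crawlId: ObjectID!, $reportTemplateCode: String!) {
  getCrawl(id: $crawlId) {
    reportsByCode(
      input: {
        reportTypeCodes: Basic
        reportTemplateCodes: [$reportTemplateCode]
      }
    ) {
      rows { nodes { ... on CrawlUrls { url } } }
    }
  }
}
"""

def build_payload(crawl_id: str, report_template_code: str) -> dict:
    """Build the JSON body for a GraphQL POST request."""
    return {
        "query": QUERY,
        "variables": {
            "crawlId": crawl_id,
            "reportTemplateCode": report_template_code,
        },
    }

def fetch_report(crawl_id: str, report_template_code: str, token: str) -> dict:
    """POST the query and return the decoded JSON response."""
    request = urllib.request.Request(
        API_ENDPOINT,
        data=json.dumps(build_payload(crawl_id, report_template_code)).encode(),
        headers={
            "Content-Type": "application/json",
            "x-auth-token": token,  # header name is an assumption
        },
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)
```

For example, `fetch_report("12345", "failed_urls", token)` would return the report rows for that crawl, assuming a valid endpoint and token.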