HTTP Pages
All pages using the non-secure HTTP scheme.
Priority: Low
Impact: Negative
What issues it may cause
Users referred from search engines may perceive the HTTP page as insecure, resulting in a poor user experience, a higher bounce/exit rate, and reduced conversions.
If the pages are served on both HTTP and HTTPS, search engines may spend additional crawl budget crawling both versions.
How do you fix it
These URLs should be 301 redirected to their HTTPS versions, if available; a minimal redirect sketch follows these steps.
Any internal links to the HTTP URLs should be updated to use the HTTPS protocol.
If the pages are included in Sitemaps, the Sitemap links should be updated to use the HTTPS protocol.
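Where the application itself handles the redirect, the sketch below shows one way to do it in Python/Flask, assuming the app runs behind a proxy or load balancer that sets the X-Forwarded-Proto header (an assumption; in practice the same rule is often configured directly in the web server or CDN).

# A minimal sketch, not the only approach: issue 301 redirects to HTTPS from a
# Flask app behind a proxy that sets X-Forwarded-Proto (assumed setup).
from flask import Flask, redirect, request

app = Flask(__name__)

@app.before_request
def force_https():
    # The proxy reports the original scheme in X-Forwarded-Proto.
    if request.headers.get("X-Forwarded-Proto", "http") == "http":
        # Rebuild the requested URL with the HTTPS scheme and 301 redirect to it.
        return redirect(request.url.replace("http://", "https://", 1), code=301)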
What is the positive impact
Users will not see the pages as insecure, resulting in an improved user experience and a lower rate of bounce back to search results. This may positively impact rankings and increase traffic.
Crawl budget spent on the HTTP versions of pages may be reduced, freeing it up for more important pages or reducing server costs.
How to fetch the data for this report template
You will need to run a crawl for this report template to generate the report. Once the report has been generated and you have the crawl ID, you can fetch the data for the report using the following query:
Query
query GetReportStatForCrawl(
  $crawlId: ObjectID!
  $reportTemplateCode: String!
  $after: String
) {
  getReportStat(
    input: {crawlId: $crawlId, reportTemplateCode: $reportTemplateCode}
  ) {
    crawlUrls(after: $after, reportType: Basic) {
      nodes {
        pageTitle
        url
        description
        foundAtUrl
        deeprank
        level
        https
        foundInGoogleAnalytics
        foundInGoogleSearchConsole
        foundInBacklinks
        foundInList
        foundInLogSummary
        foundInWebCrawl
        foundInSitemap
      }
      totalCount
      pageInfo {
        endCursor
        hasNextPage
      }
    }
  }
}
{"crawlId":"TjAwNUNyYXdsNDAwMA","reportTemplateCode":"http_pages"}
cURL
curl -X POST \
  -H "Content-Type: application/json" \
  -H "apollographql-client-name: docs-example-client" \
  -H "apollographql-client-version: 1.0.0" \
  -H "x-auth-token: YOUR_API_SESSION_TOKEN" \
  --data '{"query":"query GetReportStatForCrawl( $crawlId: ObjectID! $reportTemplateCode: String! $after: String ) { getReportStat( input: {crawlId: $crawlId, reportTemplateCode: $reportTemplateCode} ) { crawlUrls(after: $after, reportType: Basic) { nodes { pageTitle url description foundAtUrl deeprank level https foundInGoogleAnalytics foundInGoogleSearchConsole foundInBacklinks foundInList foundInLogSummary foundInWebCrawl foundInSitemap } totalCount pageInfo { endCursor hasNextPage } } } }","variables":{"crawlId":"TjAwNUNyYXdsNDAwMA","reportTemplateCode":"http_pages"}}' \
  https://api.lumar.io/graphql
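The crawlUrls connection is paginated, so a client should pass pageInfo.endCursor back as the after variable until hasNextPage is false. The following is a minimal Python sketch of that loop, assuming the requests library is installed and the session token is available in an API_SESSION_TOKEN environment variable (a placeholder name); the endpoint and headers mirror the cURL example, and the field selection is trimmed for brevity.

# A minimal pagination sketch, not an official client. Assumes the `requests`
# library and an API_SESSION_TOKEN environment variable (placeholder name).
import os
import requests

ENDPOINT = "https://api.lumar.io/graphql"
HEADERS = {
    "Content-Type": "application/json",
    "apollographql-client-name": "docs-example-client",
    "apollographql-client-version": "1.0.0",
    "x-auth-token": os.environ["API_SESSION_TOKEN"],
}

# Trimmed version of the GetReportStatForCrawl query shown above.
QUERY = """
query GetReportStatForCrawl($crawlId: ObjectID!, $reportTemplateCode: String!, $after: String) {
  getReportStat(input: {crawlId: $crawlId, reportTemplateCode: $reportTemplateCode}) {
    crawlUrls(after: $after, reportType: Basic) {
      nodes { url https foundInWebCrawl foundInSitemap }
      totalCount
      pageInfo { endCursor hasNextPage }
    }
  }
}
"""

def fetch_http_pages(crawl_id, report_template_code="http_pages"):
    """Yield every node in the report by following pageInfo.endCursor."""
    after = None
    while True:
        payload = {
            "query": QUERY,
            "variables": {
                "crawlId": crawl_id,
                "reportTemplateCode": report_template_code,
                "after": after,
            },
        }
        response = requests.post(ENDPOINT, json=payload, headers=HEADERS)
        response.raise_for_status()
        crawl_urls = response.json()["data"]["getReportStat"]["crawlUrls"]
        yield from crawl_urls["nodes"]
        page_info = crawl_urls["pageInfo"]
        if not page_info["hasNextPage"]:
            break
        after = page_info["endCursor"]

# Example usage with the crawl ID from the Variables example above:
# for node in fetch_http_pages("TjAwNUNyYXdsNDAwMA"):
#     print(node["url"])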