Pages with Large Network Payloads
Priority: Critical
Impact: Neutral
What issues it may cause
Slower Page Load Times: Heavy webpages with excessive byte weight can result in longer load times, particularly on slower network connections or devices with limited processing power.
Increased Bounce Rates: Large resource sizes can contribute to slower page rendering, impacting user experience and potentially increasing bounce rates.
Higher Bandwidth Costs: High total byte weight increases the amount of data visitors must download, which is especially costly on mobile devices with data caps.
How do you fix it
Optimize and Compress Files: Optimize resource sizes by compressing images, minifying CSS and JavaScript files, and removing unnecessary code or resources.
Use Efficient Formats: For images, use formats like WebP, which deliver high quality at smaller file sizes than traditional formats like JPEG or PNG (see the image-conversion sketch after this list).
Lazy Loading: Implement lazy loading techniques for images and other non-critical resources to defer their loading until they are needed.
Minimize and Merge Resources: Combine multiple CSS or JavaScript files into a single file to reduce the number of HTTP requests needed to load a page.
Use CDNs: Utilize content delivery networks (CDNs) to distribute resources geographically closer to visitors, reducing latency and download times.
Learn how to reduce payload size.
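As an illustration of the image-format advice above, here is a minimal sketch that converts JPEG and PNG images to WebP using the Pillow library. The directory path and quality setting are illustrative assumptions, not values taken from this report.

# Minimal sketch: convert JPEG/PNG images to WebP with Pillow and report the size savings.
# The source directory and quality value are assumptions for illustration only.
from pathlib import Path
from PIL import Image

def convert_to_webp(source_dir: str, quality: int = 80) -> None:
    for pattern in ("*.jpg", "*.jpeg", "*.png"):
        for path in Path(source_dir).glob(pattern):
            webp_path = path.with_suffix(".webp")
            # Re-encode the image as WebP alongside the original file.
            Image.open(path).save(webp_path, "WEBP", quality=quality)
            saved = path.stat().st_size - webp_path.stat().st_size
            print(f"{path.name}: saved {saved / 1024:.1f} KiB")

# Example usage (hypothetical path):
# convert_to_webp("static/images")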
What is the positive impact
Improved Page Load Times: Optimizing total byte weight results in faster page load times, enhancing user experience and reducing bounce rates.
Reduced Data Usage: By minimizing the size of resources, you can decrease the amount of data transferred, benefiting users with limited data plans.
Enhanced Performance: Lighter webpages load more quickly and efficiently, leading to smoother interactions and improved overall performance.
How to fetch the data for this report template
You will need to run a crawl for the report template to generate the report. Once the report has been generated and you have the crawl ID, you can fetch the data for the report using the following query (a Python sketch for paging through the results follows the cURL example):
Query:
query GetReportStatForCrawl(
$crawlId: ObjectID!
$reportTemplateCode: String!
$after: String
) {
getReportStat(
input: {crawlId: $crawlId, reportTemplateCode: $reportTemplateCode}
) {
crawlSiteSpeedAudits(after: $after, reportType: Basic) {
nodes {
url
auditId
title
displayValue
totalSizeKib
auditResult
}
totalCount
pageInfo {
endCursor
hasNextPage
}
}
}
}
{"crawlId":"TjAwNUNyYXdsNDAwMA","reportTemplateCode":"total_byte_weight_failed_audits"}
cURL:
curl -X POST -H "Content-Type: application/json" -H "apollographql-client-name: docs-example-client" -H "apollographql-client-version: 1.0.0" -H "x-auth-token: YOUR_API_SESSION_TOKEN" --data '{"query":"query GetReportStatForCrawl( $crawlId: ObjectID! $reportTemplateCode: String! $after: String ) { getReportStat( input: {crawlId: $crawlId, reportTemplateCode: $reportTemplateCode} ) { crawlSiteSpeedAudits(after: $after, reportType: Basic) { nodes { url auditId title displayValue totalSizeKib auditResult } totalCount pageInfo { endCursor hasNextPage } } } }","variables":{"crawlId":"TjAwNUNyYXdsNDAwMA","reportTemplateCode":"total_byte_weight_failed_audits"}}' https://api.lumar.io/graphql
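Because results are paginated via endCursor and hasNextPage, you typically need to request successive pages. Here is a minimal sketch in Python, assuming the requests library; the crawl ID is the example value from the variables above, and YOUR_API_SESSION_TOKEN is a placeholder for a real API session token.

# Minimal sketch: page through all failed total-byte-weight audits for a crawl.
# Assumes the `requests` library; YOUR_API_SESSION_TOKEN is a placeholder.
import requests

API_URL = "https://api.lumar.io/graphql"

QUERY = """
query GetReportStatForCrawl($crawlId: ObjectID!, $reportTemplateCode: String!, $after: String) {
  getReportStat(input: {crawlId: $crawlId, reportTemplateCode: $reportTemplateCode}) {
    crawlSiteSpeedAudits(after: $after, reportType: Basic) {
      nodes { url auditId title displayValue totalSizeKib auditResult }
      totalCount
      pageInfo { endCursor hasNextPage }
    }
  }
}
"""

def fetch_all_audits(crawl_id: str, token: str) -> list:
    headers = {"Content-Type": "application/json", "x-auth-token": token}
    variables = {
        "crawlId": crawl_id,
        "reportTemplateCode": "total_byte_weight_failed_audits",
        "after": None,
    }
    nodes = []
    while True:
        response = requests.post(API_URL, json={"query": QUERY, "variables": variables}, headers=headers)
        response.raise_for_status()
        audits = response.json()["data"]["getReportStat"]["crawlSiteSpeedAudits"]
        nodes.extend(audits["nodes"])
        # Follow the cursor until the API reports no further pages.
        if not audits["pageInfo"]["hasNextPage"]:
            break
        variables["after"] = audits["pageInfo"]["endCursor"]
    return nodes

# Example usage with the crawl ID from the docs example:
# audits = fetch_all_audits("TjAwNUNyYXdsNDAwMA", "YOUR_API_SESSION_TOKEN")
# for audit in audits:
#     print(audit["url"], audit["totalSizeKib"])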