Pages with Duplicate JS
Priority: Critical
Impact: Negative
What issues it may cause
- Increased File Size: Including duplicate code increases the total size of JavaScript files, leading to longer download times. 
- Longer JavaScript Execution Time: More code takes longer to execute, which can delay the time it takes for a page to become interactive. 
- Resource Wastage: Redundant code consumes unnecessary bandwidth, memory, and CPU resources, which can affect overall performance. 
- Debugging Challenges: Having duplicated code can complicate the debugging process, making it harder to maintain clean and efficient code. 
How do you fix it
- Identify and Remove Redundant Code: Analyze your JavaScript code to find and remove any duplicates.
- Use Common Libraries: Instead of repeating similar code across scripts, use common libraries or shared functions. 
- Optimize Bundling and Loading: Bundle your scripts and load them efficiently to avoid duplications and minimize requests. 
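As an illustration of the bundling step, most bundlers can extract modules shared by several entry points into a single chunk so the same library code is not copied into every bundle. The sketch below uses webpack's `splitChunks` option; the entry names and paths are hypothetical, not part of this report.

```javascript
// Hypothetical webpack configuration sketch: extract code that is shared
// by multiple entry points into one chunk instead of duplicating it.
const config = {
  entry: {
    home: './src/home.js',         // hypothetical entry points
    checkout: './src/checkout.js',
  },
  optimization: {
    splitChunks: {
      chunks: 'all',   // consider both initial and lazy-loaded chunks
      minChunks: 2,    // extract a module once 2+ chunks use it
      cacheGroups: {
        vendors: {
          // shared third-party libraries go into a single "vendors" chunk
          test: /[\\/]node_modules[\\/]/,
          name: 'vendors',
        },
      },
    },
  },
};

module.exports = config;
```

With this in place, a utility imported by both `home.js` and `checkout.js` is emitted once and downloaded once, rather than shipped inside each page's bundle.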
What is the positive impact
- Reduced File Size: Removing duplicated JavaScript reduces the total file size, improving download times and page speed. 
- Faster JavaScript Execution: Less redundant code means faster script execution, resulting in quicker time to interactive. 
- More Efficient Resource Usage: Streamlining JavaScript code improves the efficiency of bandwidth, memory, and CPU resources. 
- Easier Maintenance and Debugging: Cleaner, non-duplicated code is easier to maintain and debug, leading to better development practices. 
How to fetch the data for this report template
You will need to run a crawl for the report template to generate the report. Once the report has been generated and you have the crawl ID, you can fetch the data for the report using the following query:
- Query
- Variables
- cURL
query GetReportStatForCrawl(
    $crawlId: ObjectID!
    $reportTemplateCode: String!
    $after: String
   ) {
      getReportStat(
        input: {crawlId: $crawlId, reportTemplateCode: $reportTemplateCode}
      ) {
        crawlSiteSpeedAudits(after: $after, reportType: Basic) {
          nodes {
            url
            auditId
            title
            displayValue
            savingsKib
            pageviews
            productOfSavingsKibAndPageviews
            itemsCount
            auditResult
          }
          totalCount
          pageInfo {
            endCursor
            hasNextPage
          }
        }
     }
   }
{"crawlId":"TjAwNUNyYXdsNDAwMA","reportTemplateCode":"duplicated_javascript_failed_audits"}
curl -X POST -H "Content-Type: application/json" -H "apollographql-client-name: docs-example-client" -H "apollographql-client-version: 1.0.0" -H "x-auth-token: YOUR_API_SESSION_TOKEN" --data '{"query":"query GetReportStatForCrawl( $crawlId: ObjectID! $reportTemplateCode: String! $after: String ) { getReportStat( input: {crawlId: $crawlId, reportTemplateCode: $reportTemplateCode} ) { crawlSiteSpeedAudits(after: $after, reportType: Basic) { nodes { url auditId title displayValue savingsKib pageviews productOfSavingsKibAndPageviews itemsCount auditResult } totalCount pageInfo { endCursor hasNextPage } } } }","variables":{"crawlId":"TjAwNUNyYXdsNDAwMA","reportTemplateCode":"duplicated_javascript_failed_audits"}}' https://api.lumar.io/graphql
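The `crawlSiteSpeedAudits` connection is paginated through the `$after` cursor, so fetching every row means repeating the query until `pageInfo.hasNextPage` is false. A minimal sketch of that loop, written against a generic `executeQuery` function (a placeholder standing in for whichever GraphQL client you use to send the query above):

```javascript
// Collect every node from the paginated crawlSiteSpeedAudits connection.
// `executeQuery(variables)` is a hypothetical helper: it should send the
// GetReportStatForCrawl query with the given variables and resolve to the
// response's `data` object.
async function fetchAllAudits(executeQuery, crawlId, reportTemplateCode) {
  const nodes = [];
  let after = null;       // first page: no cursor
  let hasNextPage = true;

  while (hasNextPage) {
    const data = await executeQuery({ crawlId, reportTemplateCode, after });
    const page = data.getReportStat.crawlSiteSpeedAudits;

    nodes.push(...page.nodes);
    // advance the cursor for the next request
    after = page.pageInfo.endCursor;
    hasNextPage = page.pageInfo.hasNextPage;
  }

  return nodes;
}

module.exports = { fetchAllAudits };
```

Each `variables` object the helper sends matches the query's `$crawlId`, `$reportTemplateCode`, and `$after` parameters, so the same query text shown above can be reused unchanged for every page.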