LinhGo Labs
How I Achieved a Perfect PageSpeed Insights Score

Learn how I optimized the frontend site to achieve excellent PageSpeed Insights scores through strategic compression, caching, minification, and CDN configuration.

Okay, let’s talk about performance. I’ll be honest - when I first ran our site through Google PageSpeed Insights, I thought the scores would be pretty good. They weren’t terrible, but they definitely weren’t where I wanted them to be either.

So I spent some time diving deep into optimization, and wow, the difference was huge. This post is basically me documenting everything I learned along the way - the stuff that worked, what didn’t, and all the “aha” moments I had while figuring this out.

Here’s our final PageSpeed Insights score for linhgo.com after optimization:

PageSpeed Insights scores

Looking back, I realized we had some pretty obvious issues that I’d just… overlooked:

  • โŒ No gzip/brotli compression - yeah, serving everything uncompressed like it’s 2005
  • โŒ Caching was basically non-existent
  • โŒ CSS, JS, and HTML were all unminified (why was I shipping whitespace to users?)
  • โŒ Loading fonts from external CDNs
  • โŒ Cloudflare features? Turned off. All of them.

The result? Files were way too big (like 3-5x bigger than they needed to be), pages loaded slowly, and returning visitors didn’t get any benefit from caching. Not great.

I broke it down into bite-sized chunks so I wouldn’t get overwhelmed:

  1. Compression - Make files smaller (duh)
  2. Browser Caching - Stop making users download the same stuff over and over
  3. Minification - Strip out all the fluff
  4. Self-hosting Fonts - Bring everything in-house
  5. CDN Configuration - Actually use the features we’re paying for

Let me walk you through each one.

So here’s the thing - I was serving everything uncompressed. Every HTML file, every CSS stylesheet, every JS file. Just raw, uncompressed text flying across the internet. No wonder things were slow.

Turns out, fixing this was surprisingly straightforward. I created a _headers file in the /static/ directory, and Cloudflare Pages just… picks it up automatically. Pretty neat.

# Cache and compress CSS files
/css/*
  Cache-Control: public, max-age=31536000, immutable
  Content-Encoding: gzip

# Cache and compress JavaScript files
/js/*
  Cache-Control: public, max-age=31536000, immutable
  Content-Encoding: gzip

# Cache and compress fonts
/fonts/*
  Cache-Control: public, max-age=31536000, immutable
  Content-Encoding: gzip

# Cache and compress SVG images
/*.svg
  Cache-Control: public, max-age=31536000, immutable
  Content-Encoding: gzip

# Cache HTML pages with revalidation
/*.html
  Cache-Control: public, max-age=3600, must-revalidate

  • ✅ Files shrunk by 60-80% - seriously, that much!
  • ✅ CSS went from ~200KB down to ~40KB
  • ✅ JavaScript dropped from ~150KB to ~30KB
  • ✅ PageSpeed Insights finally gave me that sweet Grade A

I honestly couldn’t believe the difference. Why didn’t I do this sooner?
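
To see why compression matters this much, you can gzip a file locally and compare sizes - a quick sketch with standard Unix tools (sample.css is just a throwaway test file):

```shell
# Build a repetitive CSS-like file, then compare raw vs gzipped size
for i in $(seq 1 200); do echo "body { margin: 0; padding: 0; }"; done > sample.css
orig=$(wc -c < sample.css)
gzip -kf sample.css                # -k keeps the original, -f overwrites any old .gz
comp=$(wc -c < sample.css.gz)
echo "original: ${orig} bytes, gzipped: ${comp} bytes"
```

Repetitive text like CSS compresses extremely well, which is why the 60-80% figure above is realistic for real stylesheets.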

This one hit me when I checked the Network tab - browsers were downloading the same CSS and JS files every single time someone visited. Every. Single. Time. What a waste.

I set up a tiered caching approach (fancy way of saying “different cache times for different file types”):

Long-term cache (1 year) for static assets:

  • CSS, JavaScript, fonts, and images
  • These files are versioned/fingerprinted by Hugo, so we can safely cache them forever

Short-term cache (1 hour) with revalidation for HTML:

  • HTML pages can change frequently
  • 1-hour cache with must-revalidate ensures fresh content

No cache for service workers:

  • Service workers need to be always up-to-date

# Cache static assets for 1 year
/images/*
  Cache-Control: public, max-age=31536000, immutable

/*.jpg
  Cache-Control: public, max-age=31536000, immutable
  
/*.webp
  Cache-Control: public, max-age=31536000, immutable

# Cache HTML for 1 hour
/*.html
  Cache-Control: public, max-age=3600, must-revalidate

# Don't cache service worker
/sw.js
  Cache-Control: no-cache, no-store, must-revalidate

  • ✅ Repeat visitors? Their pages load almost instantly now (5-10x faster)
  • ✅ Bandwidth dropped by 70-80% - my hosting provider probably wonders what happened
  • ✅ Another Grade A from PageSpeed Insights
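
By the way, that max-age=31536000 value isn’t arbitrary - it’s one year in seconds:

```shell
# Cache-Control max-age is expressed in seconds: 365 days x 24 hours x 3600 seconds
echo $((365 * 24 * 3600))   # prints 31536000
```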

Hugo, bless its heart, generates nice readable code with proper indentation and comments. Great for debugging. Terrible for production. I was literally shipping whitespace and comments to end users.

Luckily, Hugo has minification built right in. Just needed to flip some switches in hugo.toml:

[minify]
  disableCSS = false
  disableHTML = false
  disableJS = false
  disableJSON = false
  disableSVG = false
  disableXML = false
  minifyOutput = true
  
  [minify.tdewolff.html]
    keepWhitespace = false
    
  [minify.tdewolff.css]
    precision = 2
    
  [minify.tdewolff.js]
    precision = 2

Then we build the site with the --minify flag:

hugo --gc --minify

  • ✅ HTML files reduced by 15-20%
  • ✅ CSS files reduced by 20-30%
  • ✅ JavaScript files reduced by 25-35%
  • ✅ Faster parsing and execution times
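
If you’re curious what minification actually strips, here’s a crude approximation on a toy stylesheet using sed and tr (purely illustrative - Hugo’s tdewolff minifier is far more sophisticated and safe):

```shell
# A toy stylesheet with a comment and indentation
cat > toy.css <<'EOF'
/* site-wide resets */
body {
    margin: 0;
    padding: 0;
}
EOF
# Strip comments, newlines, and repeated spaces - a crude stand-in for a real minifier
sed 's|/\*[^*]*\*/||g' toy.css | tr -d '\n' | sed 's/  */ /g' > toy.min.css
wc -c toy.css toy.min.css
```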

I was loading JetBrains Sans from their CDN. Every page load meant:

  • Another DNS lookup
  • Another HTTP connection
  • Privacy concerns (third-party requests)
  • Can’t cache them for as long

Plus, what happens if their CDN goes down? My site looks broken.

Time to bring the fonts home. Downloaded them and dropped them in /static/fonts/:

# Download JetBrains Sans fonts
Invoke-WebRequest -Uri "https://resources.jetbrains.com/storage/jetbrains-sans/JetBrainsSans-Regular.woff2" -OutFile "static/fonts/JetBrainsSans-Regular.woff2"
Invoke-WebRequest -Uri "https://resources.jetbrains.com/storage/jetbrains-sans/JetBrainsSans-SemiBold.woff2" -OutFile "static/fonts/JetBrainsSans-SemiBold.woff2"

Then configured aggressive caching for fonts:

/fonts/*
  Cache-Control: public, max-age=31536000, immutable

/*.woff2
  Cache-Control: public, max-age=31536000, immutable

  • ✅ Eliminated external font requests
  • ✅ Reduced DNS lookups
  • ✅ Fonts cached for 1 year
  • ✅ Improved privacy (no third-party requests)
  • ✅ Faster font loading on repeat visits
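
For completeness, the matching @font-face rules look roughly like this (paths follow the /static/fonts/ layout above; font-display: swap keeps text visible while the font loads; the weight values are my assumption for Regular/SemiBold):

```css
@font-face {
  font-family: "JetBrains Sans";
  src: url("/fonts/JetBrainsSans-Regular.woff2") format("woff2");
  font-weight: 400;
  font-display: swap;
}

@font-face {
  font-family: "JetBrains Sans";
  src: url("/fonts/JetBrainsSans-SemiBold.woff2") format("woff2");
  font-weight: 600;
  font-display: swap;
}
```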

So I’m using Cloudflare Pages, right? And they have all these amazing optimization features. Guess how many I had enabled? Zero. I was literally leaving performance on the table.

I spent an afternoon clicking through the Cloudflare dashboard enabling stuff:

Speed → Optimization:

  • ✅ Auto Minify (HTML, CSS, JavaScript)
  • ✅ Brotli compression (better than gzip)
  • ✅ Early Hints (preload critical resources)
  • ✅ HTTP/2 to Origin
  • ✅ HTTP/3 (QUIC)

Speed → Optimization → Images:

  • ✅ Polish (lossless image compression)
  • ✅ WebP conversion (automatic format conversion)
  • ✅ Mirage (responsive images for mobile)

Network:

  • ✅ HTTP/2 (multiplexing requests)
  • ✅ HTTP/3 with QUIC (faster connections)
  • ✅ 0-RTT Connection Resumption

Caching → Configuration:

  • Browser Cache TTL: Respect Existing Headers
  • ✅ Always Online (serve cached version if origin is down)

The payoff:

  • ✅ Brotli compression squeezed out another 10-15% in savings
  • ✅ HTTP/3 and Early Hints made initial loads noticeably snappier
  • ✅ Images get optimized automatically now - I don’t have to think about it
  • ✅ Mobile scores jumped up significantly
The theme already includes lazy loading for images, which defers loading offscreen images until they’re needed.
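
Concretely, that just means the rendered <img> tags carry the browser-native lazy-loading attribute - roughly like this (the filename here is a made-up example):

```html
<img src="/images/diagram.webp" loading="lazy" decoding="async" alt="Architecture diagram">
```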

We added preconnect hints for critical resources:

<link rel="preconnect" href="https://cdn.example.com">
<link rel="dns-prefetch" href="https://analytics.example.com">

Hugo automatically handles cache busting through content-based fingerprinting. When assets change, Hugo generates new filenames with unique hashes:

/css/style.abc123def.min.css

This allows aggressive caching without worrying about stale resources.
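
Inside a Hugo template, the asset pipeline that produces those hashed filenames looks something like this (a sketch - the exact partials vary by theme, and css/style.css is an assumed asset path):

```go-html-template
{{ $css := resources.Get "css/style.css" | minify | fingerprint }}
<link rel="stylesheet" href="{{ $css.RelPermalink }}" integrity="{{ $css.Data.Integrity }}">
```

Hugo regenerates the hash whenever the file’s content changes, so the long-lived Cache-Control headers above never serve a stale stylesheet.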

Time for the fun part - seeing if all this work actually paid off. I ran the site through a bunch of different tools to get a complete picture:

  • Desktop Score: 95-100
  • Mobile Score: 85-95
  • First Contentful Paint: < 1.5s
  • Largest Contentful Paint: < 2.5s
  • Cumulative Layout Shift: < 0.1
  • Performance Grade: A (95-100%)
  • Structure Grade: A (95-100%)
  • Fully Loaded Time: < 2s
  • Total Page Size: < 500KB
  • Time to First Byte: < 200ms
  • Start Render: < 1.2s
  • Speed Index: < 1.5s

Metric                       Before   After    Improvement
PageSpeed Score (Desktop)    75       98       +31%
PageSpeed Score (Mobile)     65       90       +38%
Page Load Time               4.2s     1.3s     69% faster
Total Page Size              2.8MB    450KB    84% smaller
CSS Size                     220KB    38KB     83% smaller
JS Size                      180KB    45KB     75% smaller
Time to Interactive          5.1s     1.8s     65% faster

Here’s what my deployment process looks like now (it’s pretty simple):

# 1. Build with minification and garbage collection
hugo --gc --minify

# 2. Deploy to Cloudflare Pages (automatic via Git)
# Or manually with Wrangler:
wrangler pages deploy public --project-name=linhgo

# 3. Purge Cloudflare cache (if needed)
# Dashboard → Caching → Purge Everything

If I had to sum up what I learned:

  1. Compression is non-negotiable - I’m talking 60-80% file size reduction. Just do it.
  2. Cache aggressively - Static assets? Cache them for a year. Hugo handles versioning, so you won’t have stale file issues.
  3. Minify everything - Whitespace costs bytes. Bytes cost time. Especially on mobile.
  4. Self-host when it makes sense - Fonts, critical scripts, whatever. Less external dependencies = more control.
  5. Actually use your CDN’s features - I was paying for Cloudflare and using like 10% of what they offer. Don’t be me.
  6. Keep testing - These tools are free. Use them regularly, not just once.

And a few mistakes I’d warn you about:

  1. Not testing on real devices - Synthetic tests don’t always reflect real-world performance
  2. Over-caching HTML - Dynamic content needs shorter cache times
  3. Forgetting cache busting - Make sure asset URLs change when content changes
  4. Ignoring mobile performance - Most users are on mobile devices
  5. Not monitoring over time - Performance can degrade as you add features

Look, I’ll be real with you - I started this because PageSpeed Insights was making me look bad. But somewhere along the way, I realized it’s not about the score. It’s about the fact that people on slower connections can actually use the site now. That someone on mobile in a rural area doesn’t have to wait 5 seconds for my blog to load.

The numbers speak for themselves:

  • ~70% faster load times
  • ~85% smaller page sizes
  • 30-40 points higher on PageSpeed
  • Actually feels snappy when you use it

Best part? Most of this was just one-time setup. The site keeps benefiting from these optimizations as I add more content.

If you’re running a Hugo site (or really any static site), give some of these techniques a shot. Start with compression and caching - those are the low-hanging fruit that give you the biggest wins.

What about you? Found any cool optimization tricks I missed? Let me know in the comments - always happy to learn something new!