Smashing The Webpage Speed Barrier

Search Engine Optimisation Tips

Getting out of the red with your page speed

So, you are looking to speed up your website and provide that better user experience to your clientele, right? Well! Having studied page speed issues in great depth over the years, I’m going to give you some tips and tricks on some common faults that most people will encounter. We’ll go through where to start, some useful resources and how to overcome these to get your site into the green or maybe even hit that magical 100% goal.

Quick disclaimer, our website is not at the 100% mark for page speed, but it is mightily close. The reason is that we load in a lot of external scripts which we have tried to optimise, but it’s not always possible. That said, we have implemented every single one of these best practices where we can to keep it running very fast.

One last thing: Please make sure you take a backup of everything before doing any of these tasks.

You may struggle to understand each issue at first so I’ll try to explain without being too technical. I’ll explain the things to do for each of these and finish up with some neat tricks we have done on our own website to fix some more complex problems.

It is quite an in-depth article, but I do hope it benefits you and your business and helps make the web a better place. So here goes!

Problem 1. Optimize images

This is the first and foremost problem that will slow your website down. Some sites have loads of images that are taking a long time to load so you need to tackle this head-on. There are two problems that can be causing this, so firstly run a Google page speed test and download all your images that it lists to a folder on your desktop.

Before you go down the road of compressing them, note that they may need resizing. This is because your HTML or CSS may be scaling them down to the required size before rendering them in the browser. If this is the case, we need to resize the image to the correct pixel size before we optimize it. If we optimize first and then resize, we lose out on precious compression. The simplest way I have found to do this is in Microsoft Paint. Simply open up your image and resize it to the correct dimensions for the slot on your site.
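To make the idea concrete, here is a small sketch (the filenames and dimensions are made up for illustration): if your page slot is 300×200, serve a file that is actually 300×200 rather than letting the browser squeeze a much larger original.

```html
<!-- Before: a 1200x800 original squeezed into a 300x200 slot -->
<!-- the browser still downloads all 1200x800 worth of bytes -->
<img src="images/team-photo.jpg" width="300" height="200" alt="Our team">

<!-- After: the file itself re-sized to 300x200 before compression -->
<img src="images/team-photo-300x200.jpg" width="300" height="200" alt="Our team">
```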

What Next?

So by now you have resized all your images to the correct pixel size. Now head over to TinyPNG, which is a great resource for compressing images. It can compress both PNG and JPG. I’ve never seen any loss of quality while using it, so I can vouch for it. Of course, you may have an option that you prefer, so feel free. You can upload 20 images at a time with TinyPNG but remember, when you download them back to your folder, to keep the same image names.

Just to recap, so far, we have:

1: Re-sized all your images

2: Compressed all your images

You can go ahead and upload these back to your site and re-run a Google page speed test. If you had many images causing problems, you should see a good improvement. If you only had one or two images causing problems, you should have one less problem listed in the Google page speed results.

PLEASE NOTE: If your site is responsive, you may have frames around images and so on that are a different size on mobile devices. These issues will have to be handled in your responsive styling and markup.


Problem 2. Eliminate render-blocking Javascript and CSS in above the fold content

This can be a big headache and there can be a number of problems which could cause these issues. The goal here is to get the CSS loaded before the Javascript or, even better, to get the critical CSS loaded first.

There are two parts to the CSS: the critical CSS for the “above the fold” content and the non-critical CSS for the “below the fold” content.

Critical CSS: This is the CSS for the “above the fold” content, meaning what the user sees on their screen or device without any scrolling.

Non-critical CSS: This is the CSS for the “below the fold” content, which the user doesn’t see at first (maybe a footer section, footer menu or something of that nature).

We can defer non-critical CSS as it’s not needed until the user wants to see more. The idea is to get the user experience to them as fast as possible, and by the time they scroll down to the footer, the “below the fold” content should have loaded. Generally, the “above the fold” content is around 600px in depth.
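One common way to put this split into practice (a sketch, not a definitive setup — the stylesheet name is illustrative) is to inline the critical rules in the head and load the rest without blocking rendering:

```html
<head>
  <!-- Critical "above the fold" rules inlined, so the first paint needs no extra request -->
  <style>
    /* header, hero and roughly the first 600px of layout styles go here */
  </style>

  <!-- Non-critical stylesheet fetched without blocking rendering; applied once loaded -->
  <link rel="preload" href="css/below-fold.css" as="style"
        onload="this.onload=null;this.rel='stylesheet'">
  <noscript><link rel="stylesheet" href="css/below-fold.css"></noscript>
</head>
```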

Every website is different in the way its files are loaded, but the above should give you an idea of how things work. Going in depth to rectify all the issues is a topic for another article, but there are some small fixes you can implement to minimise the impact of these:


1: Asynchronously load your JavaScript. Some people go for the defer attribute but, to be honest, I wouldn’t recommend it, as it is not supported by some older browsers and I’m not sure of its future support.

The tag will look something like this:

<script src="js/yourscript.js" async></script>

The below diagram shows a simplified view of how async works:

Please note: On responsive builds, it may be a JavaScript library file that is required when rendering something like a scrolling form, so you will need to research the issue further. Always make sure to test continuously.

2: Minify your JavaScript. This will not stop it blocking your CSS, but it ties in with Problem 4 below.

3: Optimize the delivery of your CSS. As explained earlier, try to deliver the CSS above and below the fold separately. Also, try to combine all your CSS files into one. Most of the time you can simply copy and paste one CSS file onto the bottom of the other and then remove the calls to the old file (a great resource for further information on this is Patrick Sexton’s Varvy).


Problem 3. Leverage Browser Caching

When your website is displayed, many calls are made for your images/logo, CSS files, JavaScript files and so on. These can take up vital milliseconds. With browser caching, your web browser has already loaded these resources once, so we want it to serve them from memory without making any further external calls. This is where leveraging browser caching becomes useful. If you are familiar with the .htaccess file, you can make use of it by specifying the resources you wish to cache and for what period of time.

You will see something like this:

<IfModule mod_expires.c>

ExpiresActive On

ExpiresByType image/gif "access plus 1 week"

ExpiresByType image/jpeg "access plus 1 week"

ExpiresByType image/jpg "access plus 1 week"

ExpiresByType image/png "access plus 1 week"

ExpiresByType image/x-icon "access plus 1 week"

ExpiresByType text/css "access plus 1 week"

ExpiresByType text/javascript "access plus 1 month"

ExpiresByType text/x-javascript "access plus 1 month"

ExpiresByType application/x-javascript "access plus 1 week"

ExpiresDefault "access plus 1 day"

</IfModule>


What’s happening here is that you are telling Apache, via the .htaccess file, to cache resources such as static jpg and png images for 1 week so they are available from memory.

Also remember to set an ExpiresDefault "access plus 1 day" to ensure your document is fetched from the cache rather than the source until that time has passed, e.g. 1 day. After that, the cached copy is considered “expired” and “invalid” and a fresh copy must be obtained from the source. This is a good idea, especially if you do change things, as users are not going to see out-of-date resources held in the cache.


Problem 4. Minify CSS, Minify Javascript & Minify HTML

Minify CSS, Minify HTML and Minify JavaScript are all similar concepts. Basically, we need to strip out any unnecessary white space and comments and compact our code into a minified state; this will ensure faster loading of the resources in question.

You may not want to minify every file, because you may want to re-code some of your HTML, JavaScript or CSS and, once minified, files can be frustrating to read and understand. If you are going to do this, a good measure is to keep a backed-up copy of the files in a non-minified state for readability. If you know they are pretty much never going to change, then minifying your files can be a great idea.
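As a small illustration of what a minifier does (hand-minified here for the example; real tools automate this):

```css
/* Readable source – keep a backup of this version for future editing */
.button {
    color: #ffffff;            /* white text */
    background-color: #0066cc; /* brand blue */
}

/* Minified output – comments and white space stripped, colours shortened */
.button{color:#fff;background-color:#06c}
```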

There are plenty of tools out there and you might have a preferred one. Here are some I’ve found useful in the past:


Problem 5. Enable Compression

Enabling compression of your files will allow you to load them using less bandwidth, which can save a great deal of time.

Let’s not get confused between bandwidth and latency. Think of bandwidth as the width of a pipe which your data has to travel down. Think of latency as the length of the pipe (a fixed length): no matter how much bandwidth you have, your data still has to travel the same distance to and from each end of the pipe. All we are doing is squeezing more data through the pipe by saving on bandwidth, thus reducing the loading times of your resources.

To enable compression, we need to return to the .htaccess file, where we can enable this with some commands like:

<IfModule mod_gzip.c>

mod_gzip_on Yes

mod_gzip_dechunk Yes

mod_gzip_item_include file \.(html?|txt|css|js|php|pl)$

mod_gzip_item_include handler ^cgi-script$

mod_gzip_item_include mime ^text/.*

mod_gzip_item_include mime ^application/x-javascript.*

mod_gzip_item_exclude mime ^image/.*

mod_gzip_item_exclude rspheader ^Content-Encoding:.*gzip.*

</IfModule>


Once again remember to take backups and rigorously test.

I’ll point you to some more resources at the end of the article that go more in-depth into these topics.


Problem 6. Avoid landing page redirects

Avoid redirects from one URL to another wherever you can, to cut out additional wait time for your users. Every extra redirect adds round-trip latency.

For instance, you may have a resource sitting at destination A, and you might be calling it from destination B (HTTP), destination C (HTTPS) and maybe even D (a non-WWW version of your domain). What we have here is 3 different starting points, so B, C and D should each redirect directly to A; B, C and D should not redirect to each other. Again, this could turn into an in-depth article in itself, so if you want to learn more, I’ll point you to another great resource in the helpful resources list below.
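On an Apache site, one sketch of collapsing B, C and D into a single hop (example.com stands in for your own domain; adapt to your setup) is a pair of rewrite conditions in .htaccess:

```apache
# Send HTTP, non-www and any other variant straight to the
# canonical https://www version in ONE redirect, not a chain
RewriteEngine On
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [L,R=301]
```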

With our own website, we found some real problems, such as the external calls to Google Analytics and external Google Fonts causing our webpage to run a whole lot slower. We also found that we were making unnecessary calls for extra Google Fonts we did not need. So here are some neat tricks we used to eliminate those problems.

Resolving the Google Fonts issue

1: Our external calls to Google Fonts looked like this:

<link href="//,400italic,600italic,700italic,800italic,400,300,600,700,800" rel="stylesheet" type="text/css">

and this:

<link href="//,300,700" rel="stylesheet" type="text/css">

This was always an issue during our page speed testing, so we thought: what on earth can we do to speed up these two calls to Google Fonts? Here’s what we found out:

A: Firstly, we found that we could actually combine these two resources using the pipe character:

<link href="//,400italic,600italic,700italic,800italic,400,300,600,700,800|Oswald:400,300,700" rel="stylesheet" type="text/css">

Great! Now we only need to make 1 call!

B: We slowly went through all these fonts to see if we actually needed them all and, hey presto, we didn’t! We found we actually only needed:

<link href="//,400|Oswald:400" rel="stylesheet" type="text/css">

Even better: now we have only one external call to Google Fonts and fewer fonts to load. We wanted to see what else we could do, so we broke it down further. Next step: paste the URL into the browser to see how big the file is and what it is doing.


We saw it was a very small CSS file which actually makes further calls to more resources. We found we could download this CSS file and save it to our web server. We then added it to the bottom of our bootstrap.css file, so it loads in the one call along with the rest of our resources.


So, what have we solved here?

  • Rolled the two calls to external Google Fonts into one.
  • Stripped out the unnecessary fonts and called fewer resources from Google.
  • Downloaded the CSS file it calls and merged this into our main CSS file.
  • Now we have only one main CSS file, which we can then split further into above- and below-the-fold styles to serve the resources accordingly.

Resolving the Google Analytics issue

Right, let’s now deal with Google Analytics. It’s well known that this file is fairly weighty.

During our page speed tests and when our webpage loads, we are making an external call to Google Analytics which is hindering our page speed.

We had this in our code:



<script>
(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
(i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
})(window,document,'script','//','ga');

ga('create', 'UA-108371576-1', '');
ga('send', 'pageview');
</script>


You will see here that the call to grab and load the analytics.js script (formerly known as ga.js) goes out to Google’s servers.

We did further research and found that we could set up a cron job to copy analytics.js onto our own web server on a nightly basis. That allows us to serve it from our own domain and make use of our leverage-browser-caching rules. By changing just one line in your Analytics code, you can call it directly from your web server.
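As a sketch of the two pieces involved (all paths here are assumptions for illustration, not our real setup): a nightly cron entry to refresh the local copy, and the one changed source in the snippet.

```shell
# Nightly cron entry (assumed paths): pull a fresh copy of analytics.js
# into the web root so it can be served under our own caching rules
0 2 * * * curl -sS -o /var/www/html/js/analytics.js

# ...then, in the Analytics snippet, the script source becomes
# '/js/analytics.js' instead of '//'
```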

Rather than turn this into a real in-depth, step-by-step guide, I’m going to point you to a fantastic resource which we actually took reference from: the DIY WP blog listed in the resources below. It explains, in depth, how to do this correctly.

another site optimised

We hope this article can assist you in conquering your page speed goals. Please feel free to share with your colleagues and friends.

Below are some useful tools that will get you well on your way to learning more about the complex issues you may be facing. As always, if you need any help, feel free to contact us.

Helpful resources

Patrick Sexton’s Varvy (for optimising your CSS delivery)

TinyPNG (for compressing your images)

diy wp blog (for eliminating external calls to Google Analytics)

 (for your HTML)

Google page speed


This article was written by Alan Lynch, Head of Search at Ascent Digital. He has many years’ experience and oversees the SEO division. He is a skilled developer with in-depth knowledge of structured markup, page speed issues, all things SEO and, of course, an interest in footy.

This article was co-written by Liam Normanton, an apprentice/test subject at Ascent Digital.

Sign up to our newsletter