5 Tips For Faster Loading Web Sites

Recently I came across Aaron Hopkins' Optimizing Page Load Time article. It explains in depth how you should optimize your web pages for a faster browsing experience. It's full of brilliant theory, but not so much practical advice for the average blogger. So here is a list of 5 things you can do to optimize your web pages without having to redo your site design or set up distribution caches around the planet. These tips helped me reduce the load time of this web site by about 70% on average.

Some of these tips require a degree of control over your web server, like being able to edit .htaccess or even the Apache server config. Where that is the case, they are marked accordingly. The others work with any shared hosting plan.

Stylesheets and JavaScript

Hopkins states that many small external objects like images or stylesheets hurt your performance more than just a few big objects with the same total size. This is due mostly to the overhead created by multiple TCP connections, but also to the fact that the average user is connected via DSL and has a decent download rate but only limited upload bandwidth. Many browser requests for small objects can therefore create a bottleneck in the upstream. And unless you serve static content from a subdomain, cookies are sent to the server with every request - even for static files.
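
For example, referencing static files from a separate host keeps those requests free of cookies, as long as your cookies are not set for the whole domain (static.example.com is just a placeholder here):

<link rel="stylesheet" type="text/css" href="http://static.example.com/style.css" />
<img src="http://static.example.com/logo.png" alt="Logo" />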

It gets even worse: while most browsers load up to four images in parallel, stylesheets and JavaScript files are loaded one at a time. The browser waits until the first stylesheet has finished loading, then requests the second one, and so on. In my tests, Firefox didn't load any images at all before all stylesheets and JavaScript files were done loading. A test with a simple HTML page over a DSL connection shows that one big stylesheet of 50 KB can cut load time roughly in half compared to five stylesheets of 10 KB each (look further down in this article for a detailed chart):

  • 5 Stylesheets (10KB each): 1100ms
  • 1 Stylesheet (50KB): 500ms

So here's my first tip: use a single stylesheet and a single JavaScript file instead of many small ones. I know one big file is harder to maintain, but once your site goes into production, you shouldn't have to change these files often anyway. Some sites like slashdot.org or digg.com reference dozens of .css and .js files from their front page. Slashdot takes about twenty seconds to load for me from a clear browser cache. Don't go this mad with your CSS unless you already have enough regular readers. For someone new to your site coming over from Google, this will be a major turn-off.

If your stylesheets are static files and you would like to keep them separate for easier maintenance, you can bundle them into one request dynamically:

<?php
# File css.php
readfile("stylesheet1.css");
readfile("stylesheet2.css");
?>

As Kevin Palms points out in the comments, you will have to set the Content-Type header manually in PHP, like this:

# File css.php
header('Content-type: text/css');
# ...

Save this code as a file called something like css.php and reference it from your HTML:

<link rel="stylesheet" type="text/css" href="/css.php" />
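
Putting the two snippets together, a complete css.php might look like this (the stylesheet names are placeholders for your own files):

<?php
# File css.php - bundles several static stylesheets into one response
header('Content-type: text/css');
readfile("stylesheet1.css");
readfile("stylesheet2.css");
?>

The same trick works for JavaScript: concatenate your .js files in, say, a js.php and send a Content-type of text/javascript instead.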

Caching is your friend

Needs Apache module mod_expires and .htaccess

Many webmasters don't like the fact that their pages are being cached, because they fear losing control over their content and not being able to track statistics. They put meta statements in the head of their HTML documents that tell proxies and browsers not to cache the page. But caching works on a lower level: it would be silly to download and read the file first just to find out whether the cached version should have been used. Proxies and browsers will try to cache your pages regardless of such meta tags. By setting up sensible caching rules with HTTP headers, you can at least gain some control over it.

You need Apache's mod_expires on your server for this to work. If you have access to your server's config files, make sure the following line is not commented out in the LoadModule section:

LoadModule expires_module modules/mod_expires.so

In your .htaccess or, preferably, your virtual host container, insert something like this:

ExpiresActive On
ExpiresByType text/html "access plus 30 seconds"
ExpiresByType text/css "access plus 2 weeks"
ExpiresByType text/javascript "access plus 2 weeks"
ExpiresByType image/png "access plus 1 month"

Modify this to your needs and add an extra statement for every file type you would like to have cached. On my server, I don't want text/html to be cached for long, because those pages are dynamic anyway and I want to see how often they are requested. I use compression for these files instead, but more on that later.

If a file has changed, use a new file name. You can also trick browsers into thinking the URL has changed by appending a dummy query string to static files, like this: "stylesheet.css?new".
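
If your pages are generated with PHP anyway, one way to automate this (just a sketch, not the only way) is to append the stylesheet's modification time as the query string, so the URL changes exactly when the file does; adjust the path to wherever the file actually lives:

<link rel="stylesheet" type="text/css" href="/stylesheet.css?<?php echo filemtime('stylesheet.css'); ?>" />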

If your weblog software does some kind of HTTP header manipulation for caching itself, you should turn it off so it doesn't interfere with your settings. I wasted a lot of time trying to figure out why caching on a Textpattern website was not working right until I noticed that a setting called "Send Last-Modified Header" was turned on in the admin preferences. Textpattern was forcing the Last-Modified header from within PHP for pages like css.php?n=default, which resulted in unnecessary conditional GET requests. If you set a decent expiration date for your files, there's no need for conditional GET!
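
For dynamically generated responses like the css.php bundle above, you can also send the expiration date straight from PHP. A minimal sketch, assuming you want the same two-week lifetime as in the mod_expires rules above:

# In css.php, before any readfile() call
$lifetime = 14 * 24 * 60 * 60;  # two weeks, in seconds
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + $lifetime) . ' GMT');
header('Cache-Control: max-age=' . $lifetime);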

For more information about caching, head over to Mark Nottingham's Caching Tutorial.

Compress text files

Needs Apache 2, mod_deflate, mod_headers and access to server config.

So now your static content is being cached properly, but it still needs to be downloaded at least once. And then there is your dynamic HTML. Compression can save you a huge amount of bandwidth and deliver pages faster. In the stylesheet example above, compression decreased the download size from 50 KB to about 13 KB. Here is a complete graph for the same test, run again with and without compression, for an uncompressed download size of about 50 KB in total:

Compression not only saves bandwidth but also makes pages load faster, and because pages are served faster, your server can deliver more pages in the same time. The difference may be even bigger for users with low bandwidth. Keep in mind that a slow client on dial-up from New Zealand will keep one Apache child process (or thread, depending on your MPM) busy for as long as it is downloading the content.

To make this happen, you will need to have two Apache modules loaded:

LoadModule headers_module modules/mod_headers.so
LoadModule deflate_module modules/mod_deflate.so

The following directives don't work in .htaccess. You need to place them in your server's config, e.g. your virtual host container:

# Compress some text file types
AddOutputFilterByType DEFLATE text/html text/css text/xml application/x-javascript

# Deactivate compression for buggy browsers
BrowserMatch ^Mozilla/4 gzip-only-text/html
BrowserMatch ^Mozilla/4\.0[678] no-gzip
BrowserMatch \bMSIE !no-gzip !gzip-only-text/html

# Set header information for proxies
Header append Vary User-Agent

The first line says which file types should be compressed; add the uncompressed file types you host on your server. It makes no sense to compress already compressed files like images or archives - compressing them again will only increase your server's load. The next three lines exclude some known buggy browsers from the benefits of compression. The last line tells proxies not to deliver the same cached response to all user agents, because some of them just can't handle compressed content.

The downside of mod_deflate is its CPU time consumption. You may have to tweak the compression level to meet your needs:

# n = 1..9, with 9 being the highest compression level. The default is 6.
DeflateCompressionLevel n

For more in depth information see the Apache 2 mod_deflate Benchmark by Frank Schoep.

Compress your images the right way

Saving your images with Photoshop's "Save for Web" doesn't mean they are optimized. There is more to image optimization than saving photos as JPEG and navigation elements as PNG. Even lossless formats like PNG can often be compressed to 50% of their original size without quality loss if you use the right software.

I have successfully used the freely available OptiPNG. OptiPNG recompresses PNG files and can also convert other formats such as GIF into optimized PNGs. It comes as source code for Unix or as a precompiled binary for Windows; build instructions for Unix are in the README. It is a command line tool and can be run with wildcards to optimize all images in a given directory at once.
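
For example, on a Unix shell something like the following optimizes every PNG in the current directory; -o2 is a moderate optimization level, higher levels are slower but may save a few more bytes:

optipng -o2 *.png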

Apache fine tuning

Needs access to .htaccess or server config

If you have access to your server's configuration, there are some more settings you can play with. For one, disable host name lookups; it will save time on every request. You will no longer have client host names available in CGI scripts and the access_log, but there are better tools for that anyway when post-processing your log files. Turn it off, preferably in your server config, or at directory level in your .htaccess, like this:

HostnameLookups Off

As a last step, configure persistent connections in Apache's config. Persistent connections let the browser request multiple files over one TCP connection, reducing the overhead of TCP handshakes and speeding up requests considerably. These are the default settings in Apache 2.2:

KeepAlive On
MaxKeepAliveRequests 100
KeepAliveTimeout 15

The defaults are actually fine, but it's a good idea to experiment with these under your server's regular load conditions. MaxKeepAliveRequests is the maximum number of requests served over one persistent connection. More is better, but badly behaved spambots that request many pages at once may tie up all available connections if that number is too high.

KeepAliveTimeout is the timeout after which a persistent connection is dropped. The optimal value depends on your server load and on how quickly visitors click through your site. If visitors make many clicks in a short time, you can raise the value so that one visitor's clicks all happen within a single persistent connection. Note that this is only advisable if your overall traffic is rather low: if you expect high traffic peaks, don't set this value too high, because you might end up with no connections left to serve new incoming requests.
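
As an illustration only (the numbers are not a recommendation, test under your own load), a busy server might keep the timeout short so idle connections are freed quickly:

KeepAlive On
MaxKeepAliveRequests 100
KeepAliveTimeout 5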

That's all folks.

Comments (46)

  1. Great guide! Only thing you might want to add: you should put a
    header("Content-type: text/css;") in your dynamic PHP stylesheet.

  2. Thanks, Kevin! I have updated the article above.

  3. Cheers for this round-up. I also found that compressing JavaScript with tools like Dean Edwards' JS-Packer or the Dojo Toolkit's JavaScript compressor can reduce the size of your JS files quite a bit and thus speed up page loading time significantly.

  4. Implemented. One question: I tried this above-mentioned hack in my .htaccess:
    HostnameLookups Off
    and it gave a 500 error. Any ideas?

  5. Jonathan: I doubt it’s the ‘HostnameLookups’ that gives you the error. Check your Apache’s error log file to see what caused the problem.

  6. “This is due mostly to the overhead created by multiple TCP connections but also by the fact, that the average user is connected with DSL”

    Actually, this is wrong. In the highly populated countries of Asia, DSL/broadband is not a common connection type, yet they are 75% of those online. They use DIALUP.

    Even here in North America there are millions of people who cannot get a high speed internet connection, they have no option but dialup if they are going to be online. I do wish that people would stop spreading the lie that dsl / broadband is the most common connection type.

  7. Jaqui - I have no idea where you get your stats from, unless it’s oooold search engine data. On the sites I manage, checking logs generally daily, even in the UK, which has a relatively slow broadband uptake, the average is 95% DSL/Cable and approximately 3% dialup. And yes, these are sites with several hundred unique visitors per day, strictly from UK domains.

    Sidenote, IE7 has overtaken IE6, and FireFox trails at about 30% penetration.

  8. Interesting comments on using CSS. Actually, CSS experts regard using more than 1 stylesheet or internal CSS as improper to start with. Using proper containers, one should be able to cover all possible style requirements on only 1 external stylesheet, and as such it would not be called on every page load.
    I agree with Kevin, but also suggest including the “content-type: text/css” in the stylesheet. A proper DOCTYPE statement is critical to correct display of a page.
    RE IE7/6/etc … Most CSS experts I know have long given up trying to code for IE, since MS refuses to conform to HTML4 and CSS 1/2 standards. I don’t even bother using IE hacks anymore - most people I know use either Firefox or Opera, which do conform.

  9. Robb,
    Here in Canada, where there is a high incidence of dsl adoption, towns under 40,000 in population have dial up access only, and we have several thousand such towns.

    I have spoken with people in most countries and outside of major metropolitan centers dsl is not commonly available to them.

    by refusing to accept that not every country has 90+ % dsl availability, designing websites focused on the bloat that dsl enables is chasing customers from the company site, costing them in BAD public relations and lost business.
    Europe and North America have the highest incidence of dsl availability in the world. the Middle East, North and South Africa the next.

    India and China, largest segment of world population, with the least access to high speed / dsl internet. do the math, 66% of world population can’t access through high speed / dsl, it is not available.

    My own server logs show a fairly high level of dsl clients, but I’m in North america so I expect to see that, I do see an increasing percentage in dialup access, including access by cell phones. [ at 5 cents per kb, cell phones are expensive to use online, and that is the rate here for cell access to the internet ]

  10. You can as well consider bundling your css and js files. Rather than asking ur developers to build a monolithic code

    Refer this great article about bundling

  11. Subramanian: I don't think you read/understood this article.

  12. although I know most of your suggestion it is fine to see some graphs ..

    thank you ..

  13. Hi Nico

    Would grouping css files together also work if you are compressing them. I’ve gzipped my css files separately with 7-zip and serve them under htaccess. So if your method would also work with my gzipped one it would be icing on the cake.

  14. I try to compress my JS, but always forget to compress my CSS file(s).

    I am one of those who thought that PS optimized the image enough for web use, thanks for the tip.

  15. “Hopkins states that many small external objects like images or stylesheets hurt your performance more then just a few big objects with the same total size.”

    more THAN

    You FAIL. Go back to grade school.

  16. Phil E. Drifter, nobody cares.

  17. Today I released a tool for analyzing your web page composition.

    It categorizes and orders the resources of your web page, and shows whether or not compression and caching is supported.

  18. I agree with another poster, to save time you want to combine your style sheets using php, less server requests equals faster loading pages. You might also want to mention the use of sprites.

  19. Thank you for this explanation. I am setting up a cloud server on rackspace because I keep running into resource issues with the scripts I’m developing when I was using a virtual dedicated server. Your article has helped me to rethink how I do web development, and given me many things to try.

    I might suggest that you add information about testing memory usage with tools like free, top, and vmstat as well as looking at what happens under various loads with ab and httperf.

  20. It is beyond me why there’s no standard for resource bundling (packaging all required files) for HTML files.

  21. Great article. I have a question about the css.php thing. Will this same method work for javascript? Would you use it for js?

    I have a lot of different js pages and it makes a lot http requests.

  22. Great tips. Sites really should consider loading time a lot more, when they are media planning.

  23. It breaks my heart when I see people doing so much optimization ‘by hand’ - I remember it taking so much time. Why not use a website accelerator? We use Aptimize WAX to do it all automatically for us. It also allows us to to expert level optimization but we haven’t found a need to. Ok, i’ve just noticed this article is nearly 3 years old. It would be great to see this updated to reflect technology developments in this area.

  24. Thanks for the information. Your tips are very helpful . I found that Google Webmaster Tools offers a firefox add on that can help speed up loading pages . The tool is Page Speed browser .
    Moreover Google offers a lot a changes to make a website more faster loading . I hope this will help . :)

  25. Thank you for the tips.
    I use readfile for load my css dynamically.
    I dont use HostnameLookups on apache.
    If use HostnameLookups to off and restart my apache server, my server crash.
    Have you this problem ?
    For information: i have three different domains on this server, bug possible ?

    Thank you.

  26. See my thoughts here

  27. Don’t forget that poorly constructed SQL in dynamic sites can slow your web site down as well. The mysql site will give details of caching SQL requests

  28. Wouldn’t making all of your images into sprites make the site load faster? Just like having 1 stylesheet instead of 5, wouldn’t it be faster to have 1 image instead of 10-15 per say? it may be more work, but a fast site means happy viewers.

  29. All 5 of these tips are good for normal websites. What about wordpress sites? When you get different plugins with different css files and different js files, most of them just append them to the header. Which gives you multiple calls, most of my friends have this a problem.

    Thanks for the optimization tips :)

  30. These are five handy tips. Having been a person that utilized tables in html as well as in CSS extensively a few years ago this is a good reminder. I quit using them unless they were an essential function and it’s definitely speeded up the page load times.
    I can see why we’d want to avoid utilizing html to code the image to scale it down. It’s not a function that I use, but it’s now one I’ll never use. :) Thanks for an informative post that will stay bookmarked.

  31. These are some good tips. Thanks. Also, I’ve been looking for a wordpress plugin that has js & css compression with a gzip option. My current hostgator account does not support gzip unless I upgrade to VPS or dedicated server. Not sure I want to spend the money to do so though.

    I do use a cache plugin but it doesn’t necessarily make my pages load faster. Any suggestions would be great. Thanks again for your tips!

  32. JS-Packer or the dojoo toolkit javascript compressor system can reduce the size of your JS files quite a bit and thus speed up page loading time significantly.

  33. I really recommend the pack tag library for jsp pages. It compresses css and js files incredible easy. Very good article!

  34. Hi!

    You say “The following directives [AddOutputFilterByType] don’t work in .htaccess. You need to place them in your servers config, i.e your virtual host container”

    But according to the Apache docs, AddOutputFilterByType is available in .htaccess too:

    Can you explain more why you said it isn’t available in .htaccess?

  35. Hi Yuzz,

    this blog entry :) It’s possible Apache has changed the directive since then.

  36. Thanks for sharing but can somebody help me to do that in Windows server?

    Thanks,

  37. One of the existing tools which can help you to apply these best practices is: wro4j (web resource optimizer for java)

  38. Kudos to you for this post, four years later your optimization information is still helpful and relevant.

    We optimized last month and applied some of your suggestions. User feedback has been positive.

    A few additional tips from our experience that may benefit your readers are to use sprite graphics where possible, and also to load external css files in the head of the page. Using graphics sprites made a big difference for our mobile website version.

  39. thanks for sharing but can somebody help me to do that in Windows server?…

  40. Thank for sharing!
    I think, you should reduce KeepAliveTimeout to avoid overload server when serving alot of connections at the same time. About 3 - 5 is recommend :)

    KeepAlive On
    MaxKeepAliveRequests 100
    KeepAliveTimeout [3-5]

  41. Nice tips to be musing on, thanks Niko.

    One will note that when using a css.php file method, apache will serve that file each time (200) on each page reload, as opposed to refreshing from cache (304), whereas the plain old css file(s) will always refresh from cache (no 200) on reload.

    This is easily addressed using php page caching method of your choosing. At the outset I am using a method employing the Last-Modified and If-Modified-Since headers with apache’s mod_headers, and its very manageable. However, this is an old thread, and my chosen method is also a bit dated, but finding this article very helpful I thought I’d chime in.

    Thanks again, Niko!

    (trouble posting - hope it didn’t multiple)

  42. I recently published a standard for all the front end developers as a MUST follow

  43. Image optimization is good, but you also should add a little tip that you should always define height and width of an image. That way browser can load that image after it have loaded text. That way user have something to do while your page loads, and more likely he/she will not leave your page.

  44. Its really help to develop website loading fast. I try some tips in my site and it work. Thank you!

  45. What about PNG-8?
    The header of this site, 10.png, compressed

    Source: 14K RGB PNG
    Compressed: 7.1K (-46%) Indexed-RGB PNG

    Absolutely no visible difference.