
Using YSlow to Optimize Web Site Performance Continued

The second part of a tutorial on using the YSlow Firebug extension to optimize web site performance.

This article is a continuation of a previous post. If you haven’t read it yet, start with the first part: Using YSlow to Optimize Web Site Performance.

In this post, I’ll cover rules 5-13 and summarize the results of the optimizations that I made to my site.

YSlow’s 13 Rules to Improve Web Site Performance (rules 5-13)

A number of these rules are fairly simple, so I’ll cover some together.

5. Put CSS at the top

6. Put JS at the bottom

These rules are fairly easy to follow, and most sites shouldn’t have to change anything. Putting CSS at the top of the document, in the head, makes the page appear to load faster because the browser can render the page progressively. JavaScript, on the other hand, blocks parallel downloads, so ideally you want to load it at the bottom, after everything else has already loaded. As a bonus, if a script is hosted externally and the external server is slow, the rest of the page can still load without problems.

7. Avoid CSS expressions

I’ll be honest here: I didn’t even know CSS had expressions until I saw this rule. YSlow recommends avoiding them because the browser re-evaluates them an absurd number of times, including on every resize, scroll, and even mouse movement.

8. Make JS and CSS external

External files can be cached independently. If your users usually browse to more than one page on your site, or you have frequent returning visitors, this should improve load times.

9. Reduce DNS lookups

DNS lookups can still take a fair amount of time, even on a broadband connection, so try to keep the number of unique host names fairly low. There is one competing performance consideration here: most browsers will only make two simultaneous connections to a given host, so serving content from multiple hosts can actually speed up page load times, especially if there are a lot of components. The trick is to balance fewer lookups against more parallel downloads.
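A quick way to gauge this on your own pages is to pull the host names out of a copy of the HTML. This is only a sketch: the sample markup is made up for illustration, and the regex only catches absolute URLs in the markup, not hosts referenced from CSS or scripts.

```sh
# "page.html" stands in for a local copy of the page's HTML source;
# here we create a tiny sample so the command below has something to scan.
printf '<img src="http://a.example.com/x.png"><script src="http://b.example.com/y.js"></script>\n' > page.html

# List the unique host names referenced by absolute URLs in the page.
grep -oE 'https?://[^/"]+' page.html | sort -u
```

Each host in that list is a potential DNS lookup for a first-time visitor.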

10. Minify JS and CSS

Actually, in YSlow this rule just says Minify JS, but the explanation page adds CSS as well. Minifying these files makes them smaller even if you’re already compressing them with gzip. The best minifier I found is the YUI Compressor, which handles both CSS and JavaScript. Here are the results of minifying my site’s stylesheet:

             Uncompressed    Gzipped
Original     9814 bytes      2806 bytes
Minified     6164 bytes      1645 bytes
Savings      37.2%           41.4%
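Measurements like these are easy to reproduce: wc -c reports the raw byte count and gzip -c | wc -c the compressed one. The stylesheet below is a generated placeholder, not my real one.

```sh
# "style.css" stands in for a real stylesheet; generate a repetitive sample.
i=0
while [ $i -lt 50 ]; do
    printf 'body { color: red; margin: 0; }\n'
    i=$((i + 1))
done > style.css

# Raw size versus gzipped size, in bytes.
wc -c < style.css
gzip -c style.css | wc -c
```

Run the same two commands against the minified file to fill in the other row of the table.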

The YUI Compressor is pretty easy to use, but running it by hand gets annoying quickly, so it pays to automate. To solve this problem, I once again modified my CSS publish script to do the minifying automatically. If you read the first part of this article and you’re wondering how many times I’m going to modify my publish script, this is the final version:

#!/bin/sh

# Publish script: concatenates, minifies, and versions the site's CSS.
SERIAL_FILE=serial.txt
OLD_SERIAL=`cat ${SERIAL_FILE}`
SERIAL=$((${OLD_SERIAL} + 1))
YUICOMPRESSOR_PATH=/home/avery/yuicompressor-2.4.2/build/yuicompressor-2.4.2.jar

# Combine the stylesheets and minify the result with the YUI Compressor.
cat style.css.original \
	../../plugins/wp-recaptcha/recaptcha.css \
	../../plugins/deko-boko-a-recaptcha-contact-form-plugin/display/dekoboko.css \
	| java -jar ${YUICOMPRESSOR_PATH} --type css > style-${SERIAL}.css

# Point the header template at the newly versioned stylesheet.
sed "s/REPLACE_WITH_SERIAL/${SERIAL}/g" < header.php.original > header.php

# Remove the previous version and record the new serial number.
rm style-${OLD_SERIAL}.css
echo ${SERIAL} > ${SERIAL_FILE}

Using publish scripts for stylesheets and javascript might seem like overkill at first, but that simple script has allowed me to make significant improvements that otherwise would be too time consuming to implement.

11. Avoid redirects

12. Remove duplicate scripts

These two rules are fairly easy to follow. Sometimes you can’t avoid redirects, such as when you move to a new domain or restructure your site. That sort of redirect is good: it preserves the reputation your site has built with search engines. That said, a lot of redirects can be eliminated simply by adding the trailing slash to a URL. Be especially mindful if you’re working on hand-coded sites rather than CMS-based sites. Removing duplicate scripts is pretty obvious. Unfortunately, Google AdSense serves the same scripts for each ad on a page, and there’s no way to fix that.
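A quick way to spot a trailing-slash redirect is to request the URL without the slash and look at the status line and Location header. This is a sketch: check_redirect is a helper name made up here, and the URL in the example is a placeholder.

```sh
# Print the status line and any Location header for a URL, so a stray
# 301 (e.g. from a missing trailing slash) is easy to spot.
# check_redirect is a helper name invented for this sketch.
check_redirect() {
    curl -sI "$1" | grep -iE '^(HTTP|Location)'
}
# Example (placeholder URL):
#   check_redirect 'https://example.com/blog'
```

A 301 whose Location is the same path plus a slash means every visitor to that URL pays for an extra round trip.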

13. Configure ETags

I won’t say much about ETags. This YSlow rule really means configure ETags correctly or remove them entirely. For a site served from a single web server, ETags offer no benefits and some drawbacks, including the fact that Apache generates invalid ETags for gzipped content. I recommend turning them off unless you’re willing to configure them per Yahoo’s ETags best practices. To disable ETags in Apache, add the following to the virtual host configuration or the main .htaccess:

FileETag None
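If Apache still emits an ETag header after that (some module combinations do), and mod_headers is loaded, you can strip the header explicitly as well. This pairing is a common recommendation rather than something from my own configuration:

```apache
# Stop Apache from generating ETags, and strip any that still appear.
FileETag None
Header unset ETag
```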

Results

Finally, here are the results that I obtained. First, my YSlow score:

Final YSlow Performance Tab

I’m somewhat disappointed that, after all of that, my score barely improved, moving from an F (59) to a D (63). Unfortunately, most of the remaining areas of optimization are either unrealistic, as in Use a CDN, or out of my control.

What about real performance? The following tables include the minimum, maximum, and average times out of 10 page loads. First, with a cold cache:

Page Load Times Before and After Optimization (cold cache)

                      Minimum   Maximum   Average
Before Optimization   1.243s    1.890s    1.422s
After Optimization    1.024s    1.410s    1.215s
Speed Increase        17.6%     25.4%     14.6%

I’m actually fairly satisfied. Most of the optimizations I did would benefit repeat visitors rather than new visitors. A 15% performance improvement is not bad at all. Next, with a warm cache:

Page Load Times Before and After Optimization (warm cache)

                      Minimum   Maximum   Average
Before Optimization   0.817s    1.176s    0.968s
After Optimization    0.732s    0.920s    0.815s
Speed Increase        10.4%     21.8%     15.8%

These results baffled me at first. The improvements are only slightly better than the cold cache numbers, and I really thought page loads would speed up more for repeat visitors. After some investigation, it turned out I was loading pages the wrong way: I was hitting the F5 key to reload the page in my warm cache tests. Reloading with F5 has a special meaning in at least Firefox. It tells the browser to revalidate every item in the cache with the server, even if the item is set to expire in 10 years. Here are the numbers without using F5:

Non-F5 Page Load Times After Optimization (warm cache)

Minimum   Maximum   Average
0.397s    0.525s    0.448s

Now, those are some fast page loads. I’m not interested enough to revert all my changes and re-test, but some statistics from YSlow suggest that I probably sped up warm cache page loads by a fair amount. Here’s the initial Stats tab, prior to any optimization:

Initial YSlow Stats Tab

And here’s the final Stats tab after all the optimizations:

Final YSlow Stats Tab

Of particular note: in the case of a primed cache, I reduced the number of HTTP requests from 25 to 15.
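For anyone repeating this kind of timing without a browser, curl can report total fetch time directly, and it never consults a cache. Note this measures network transfer only, not rendering, so it complements rather than replaces stopwatch numbers. The example runs against a local file so it works anywhere; substitute your page’s URL in practice.

```sh
# Time a fetch with curl, which ignores any browser cache entirely.
# Demonstrated against a local file; for a real page you would run e.g.
#   curl -s -o /dev/null -w 'total: %{time_total}s\n' 'https://example.com/'
printf 'placeholder page\n' > page.html
curl -s -o /dev/null -w 'total: %{time_total}s\n' "file://$PWD/page.html"
```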

Overall, I’m satisfied with the performance improvements. Even if they aren’t amazing, working with YSlow is neither hard nor particularly time consuming; in fact, writing this article took much longer than making the actual changes to my web site. A 15% reduction in load times is worth an hour or two of writing scripts and changing server configurations.
