The Google always has its eye on you whether you like it or not, but sometimes you can turn that Panopticonic gaze around to do you some good, as in the case of the recent “Site Performance” release, a new part of the Google Webmaster Tools suite. Sometimes just strapping fans to your websites to propel them faster into the internet ether isn’t enough to make a difference with a distinction.
When you log in to Webmaster Tools, you’ll see a “Labs” link and, beneath it, a “Site Performance” sub-link — click it and be prepared to get sick as you learn just how slow your websites appear to Google.
I pay a lot of money for a dedicated QuickServe server from Pair Networks to host all of my websites. My server is superfast, crisp, and not as sloggy as Google, at times, suggested.
Here is the Google report for my BolesBooks.com website. My site is faster than 94% of the sites on the internet. I was pleased with that speed report even though Google suggested I start using GZIP to make my site even faster.
Other plain HTML sites of mine — built the same way as Boles Books — were much slower… like being in the lower 50% of speedy sites on the web. It made no sense.
I decided to check the speed of some of my Movable Type 5.01 blogs hosted on the same server.
Here is the Google Site Performance report for DramaticMedicine.com — and that blog is slower than a shocking 88% of the other websites on the internet. Huh? What? Not possible!
I’m on all my websites all day long and none of them — not even the Movable Type blogs — load slower than 3 seconds at any time. Google claims Dramatic Medicine loads in 8.1 seconds? Confounding!
I fired up the Google Site Performance result for another Movable Type 5.01 blog hosted in the same install and CarceralNation.com is FASTER than 83% of sites with a load time of 1.6 seconds.
Egad! Huh? That doesn’t make any sense either compared to Dramatic Medicine.
So do my blogs suck speed or burn it up?
Google was not really being helpful, so I decided to ask Pair Networks QuickServe support directly about the issues and to get more information on using GZIP to speed up all my sites, since Google was adamant in every Page Speed suggestion that I start employing that compression scheme.
Here is the Pair reply from the fantastic, and always helpful, Kevin O. —
It can help in certain cases. If a page is returning a large amount of data that is text / HTML / code, using GZIP on it can reduce the download time of that page. For many HTML pages, though, the page load time is mainly a factor of how many page elements there are, which determines how many individual requests the visitor must make to the server to download all elements. Likewise, if the bulk of the data is images rather than text and code, using GZIP wouldn’t help much (almost all image formats are already compressed, so trying to compress them again doesn’t yield a decrease in size).
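Kevin’s point about text versus images is easy to demonstrate. Here’s a quick Python sketch (the sample payloads are made up — repetitive HTML standing in for a real page, random bytes standing in for an already-compressed JPEG or PNG):

```python
import gzip
import os

# GZIP shines on repetitive text like HTML; already-compressed data barely budges.
html = b"<p>The quick brown fox jumps over the lazy dog.</p>" * 200
random_ish = os.urandom(len(html))  # stands in for JPEG/PNG payloads

html_gz = gzip.compress(html)
image_gz = gzip.compress(random_ish)

print(len(html), len(html_gz))         # the text shrinks dramatically
print(len(random_ish), len(image_gz))  # the "image" does not shrink at all
```

Run it and the HTML collapses to a tiny fraction of its size, while the random “image” bytes actually come out slightly larger thanks to the GZIP header overhead — exactly why compressing images twice is pointless.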
We do have Apache mod_deflate enabled by default on the new servers running FreeBSD 7.2. As that is now the default, we can install mod_deflate on your server for free if you like. That can turn on the compression through a .htaccess file in your Web space. The mod_deflate module is explained at:
Using mod_deflate will add a tiny bit of overhead to the CPU in the compression of the content, but typically it isn’t enough to be noticed. You can also turn it on and off in your .htaccess file, so it shouldn’t hurt to try it. There is no downtime aside from a restart of Apache (which typically just takes a few seconds) for the installation. If you would like us to install that, just let us know.
Otherwise, you may want to check how many elements the slower pages have, and see if they are relying on any external content. Some external advertisement sources can run slowly, as can some external site statistics packages. Commenting out some of the elements and testing load time, then gradually adding each element back in while re-testing, can give you a good idea of exactly where in the page code the delay is coming from.
I asked Kevin O. to install mod_deflate for me, and he quickly did.
My pages are generally coded to be pretty quick. I use Google Analytics, a Twitter Widget and a FeedJit widget — but that’s about it when it comes to anything fancy or special on the blogs. We run a pretty vanilla gig to keep everything fast and clean. We run no advertising. All images load from the same dedicated QuickServe Pair server.
Here’s the line of code from the Apache site Kevin O. mentioned that I added to the .htaccess files for all my websites — yes, I took the “impatient route” — but it seems to be working just fine, and you should now feel a noticeable speed increase across all my websites:
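(The exact line I used isn’t reproduced here, but a typical “impatient route” directive from the Apache mod_deflate documentation looks like this — a sketch, so check the docs for the current form before pasting it into your own .htaccess:)

```apache
# Compress common text formats on the fly; images are left alone
# because they are already compressed.
AddOutputFilterByType DEFLATE text/html text/plain text/xml
```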
I then jumped over to this cool HTTP Compression Test page to see if my GZIP was working.
Here’s the report for Boles Books:
Dramatic Medicine found some dramatic speed increases:
Even Carceral Nation found a GZIPped speed bump:
I ran that test on all my sites, and they are all reporting GZIP compression.
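If you’d rather run that check yourself than lean on a web page, a few lines of Python will do the same thing: offer GZIP in the request and see whether the server answers in kind. (The URL below is only a placeholder, not one of my sites.)

```python
import urllib.request

def is_gzipped(headers) -> bool:
    """True if the response headers declare gzip content encoding."""
    return "gzip" in headers.get("Content-Encoding", "").lower()

def check_site(url: str) -> bool:
    # Offer gzip explicitly -- servers only compress when the client asks.
    req = urllib.request.Request(url, headers={"Accept-Encoding": "gzip"})
    with urllib.request.urlopen(req) as resp:
        return is_gzipped(resp.headers)

# Example (placeholder URL): check_site("http://example.com/")
# returns True when mod_deflate is answering with compressed output.
```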
Google admits its Site Performance test uses fewer than 100 negotiating points with your website, so they’re having your cake while eating you, too — because it doesn’t matter if Google is doing a formal test or a sub-100 test: when Google spanks you, you squeal, you never want to ask for another, and so you spend your Sundays looking for site speed tweaks and invoking compression schemes you hope will inflate your site status with Google while deflating load times for the places you build for good people like you to visit.