Over the weekend I deployed a new plugin I wrote for Perlbal that tracks the bandwidth usage of certain features on Amie Street. The plugin is only running on 25% of our traffic for now, but after a couple of days it had collected plenty of data, and I sat down to analyze it this morning.
The first thing I noticed was that a single image was accounting for 10x as much bandwidth as the next most-transferred image on the site. I loaded it up to find that it was only a 128x128 album thumbnail, yet the file was 560KB. I downloaded it and tried to figure out why it was so big, without much luck: even using convert -scale 1x1 to scale it down to a single pixel still left a file of over 500KB.
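For the curious, the check looked something like this (the filenames here are hypothetical stand-ins):

    # Confirm the size of the suspect thumbnail
    ls -lh thumb.jpg                      # ~560K for a 128x128 image

    # Scale it to a single pixel with ImageMagick; the result was
    # still over 500KB, so the bloat wasn't in the pixel data
    convert -scale 1x1 thumb.jpg out.jpg
    ls -lh out.jpg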
After a bit of googling I came upon jpegoptim, a utility that performs lossless compression on JPEGs by rebuilding their Huffman tables without changing the image contents. By simply running jpegoptim image.jpg, I cut the file down to 28KB and reuploaded it to production.
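The invocation really is that simple; jpegoptim rewrites the file in place and reports the savings. If you want to see the potential savings before touching anything, it also has a dry-run flag:

    # Losslessly rebuild the Huffman tables; pixel data is untouched
    jpegoptim image.jpg

    # -n / --noaction simulates the run and reports what it would
    # save, without modifying the file
    jpegoptim --noaction image.jpg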
After scanning the top 1000 largest thumbnails on the site, I found that the majority of them could be shrunk by a good 50% or more with this tool alone. So if you've got a site with a lot of JPEGs (particularly user-uploaded content that's automatically scaled with ImageMagick), give jpegoptim a try. The worst case is that it reports the file is already optimal; the best case is a 95% reduction in load time for users on slow connections.
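If you want to run a similar scan yourself, here's a sketch of one way to do it with GNU find and xargs (the thumbs/ directory is a stand-in, and this assumes filenames without embedded newlines; --preserve keeps file timestamps and --totals prints a summary at the end):

    # List JPEGs by size, keep the 1000 largest, optimize them all
    find thumbs/ -name '*.jpg' -printf '%s %p\n' \
        | sort -rn | head -n 1000 | cut -d' ' -f2- \
        | xargs -d '\n' jpegoptim --preserve --totals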
