Compressing web server images

A Spanish version of this post is available.
You may have a web server hosting very large images (in both file size and resolution), which can cause problems on your web pages such as:

  • Slower web page loading
  • Higher hard disk usage
  • Increased bandwidth consumption
  • A poor user experience when visiting your site

One obstacle to compressing images is that a site may contain lots of them, and checking their properties one by one is simply not an option. To work around this, we can use a bash script and a couple of handy Linux tools: ImageMagick and jpegoptim. The first is used to scale an image to a specific width and height; the second compresses images to reduce their file size.

To install these tools as Linux packages on a Debian-like distribution, run:

sudo apt-get install imagemagick jpegoptim

Next, we show the bash script with the image optimization function that was created:

function optimize_web_images() {
    # Directories to scan recursively for JPEG images
    declare -a search_dirs=(
        "/var/www/mypage.com/images"
        "/var/www/anotherpage.com/images"
    )

    local max_width="1024"   # resize images wider than this
    local max_optim="80"     # maximum JPEG quality for jpegoptim
    local counter=1

    ## now loop through the dirs array
    for dir in "${search_dirs[@]}"
    do
        echo "Processing folder: ${dir} .."
        # Process substitution (instead of a pipe) keeps the loop in the
        # current shell, so $counter survives across directories; -print0
        # plus 'read -d '''' handles filenames with spaces or colons.
        while IFS= read -r -d '' image_file
        do
            # Check the real MIME type instead of trusting the extension
            local mime_type
            mime_type=$(file -b --mime-type "${image_file}")
            [[ "${mime_type}" == "image/jpeg" ]] || continue

            local image_width
            image_width=$(identify -format '%w' "${image_file}")

            if [[ "${image_width}" -gt "${max_width}" ]]; then
                echo "Optimizing image ${image_file} .. old width: ${image_width}px, new width: ${max_width}px"
                # The trailing '>' tells convert to only shrink, never enlarge
                convert "${image_file}" -resize "${max_width}>" "${image_file}"
                jpegoptim --strip-all -q --max="${max_optim}" "${image_file}"
            fi

            # every 500 processed images, print progress
            if ! (( counter % 500 )); then
                echo "Processed ${counter} images .."
            fi

            # count processed images
            ((counter++))
        done < <(find "${dir}" -type f -print0)
    done
}

Basically, this function searches recursively for all image files in each directory of a hard-coded list. To determine whether a file is actually an image, we query its MIME type rather than relying on the file extension, which avoids the pitfalls of glob-based approaches such as ‘*.jpg’.
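To see why the MIME-type check is more reliable than the extension, here is a small demonstration (the file names and contents below are fabricated stubs for illustration):

```shell
# A stub file with a JPEG magic header but a misleading name
printf '\xff\xd8\xff\xe0\x00\x10JFIF\x00' > photo.txt
# A plain text file with a .jpg extension
echo "not really an image" > notes.jpg

# file inspects the content, not the name
file -b --mime-type photo.txt   # image/jpeg
file -b --mime-type notes.jpg   # text/plain
```

A glob like ‘*.jpg’ would have matched the text file and missed the real JPEG entirely.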

Once an image is found, we determine whether it is a candidate for optimization by looking at its dimensions, using the identify command from the ImageMagick package. If the image is indeed a candidate, we first use the convert command from the same package to scale the image down to the desired width, and then use the jpegoptim command to compress the result.
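The same three steps can be run by hand on a single image, which is handy for testing before letting the script loose on a whole directory (sample.jpg here is a hypothetical file name):

```shell
# Check the current width of a single image (hypothetical file name)
identify -format '%w' sample.jpg

# Scale it down to at most 1024px wide; the '>' suffix means
# "only shrink, never enlarge", so smaller images are left untouched
convert sample.jpg -resize '1024>' sample.jpg

# Re-encode at quality 80 at most, stripping EXIF/comment metadata
jpegoptim --strip-all -q --max=80 sample.jpg
```

Note that convert overwrites the file in place here; keep a backup copy if you want to compare before and after.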

In this way we optimize a list of images on a web server in an automated fashion that can be executed periodically, keeping images small and page loads fast. Of course, this script is just an example of what can be done, and it fits our use case well; feel free to adapt it to your needs, and we hope it is helpful to you.
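For the periodic execution, one option is a cron entry; this is only a sketch, and the script path, log path, and schedule below are assumptions you would adjust for your server:

```shell
# /etc/cron.d/optimize-images (hypothetical path) -- run nightly at 03:00.
# Assumes the function above was saved to /usr/local/bin/optimize_web_images.sh
0 3 * * * root bash -c 'source /usr/local/bin/optimize_web_images.sh && optimize_web_images' >> /var/log/image-optim.log 2>&1
```

Since the script only touches images wider than the configured maximum, re-running it nightly is safe: already-optimized images are skipped.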

Cheers!
-Yohan
