This article contains everything you need to know to adapt your website to the image compression guidelines of the new Google PageSpeed Insights (PSI) tool and get a good performance score.
- Note: For those who don't know what the Google PageSpeed Insights tool is, in a few words: it is a tool that measures the performance of a web page, and the score it gives at the end is an important factor in Google's search engine ranking algorithm. In other words, this tool helps you determine which optimizations your web page needs in order to score higher. The faster and better optimized a web page is, the higher the score, and the higher the score, the higher the search engine ranking.
Google's guideline for raster formats reads: "Experiment with optimal quality settings: don't be afraid to dial down the 'quality' settings; the results are often very good and the byte savings are significant."
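To see what that experiment looks like in practice, here is a minimal sketch using the Pillow imaging library (my choice of library, not Google's; any JPEG encoder behaves the same way). The generated gradient image is just a stand-in for your own photos:

```python
# Compare JPEG sizes at different 'quality' settings (sketch, assumes Pillow).
from io import BytesIO
from PIL import Image

def jpeg_size(img, quality):
    """Return the byte size of `img` encoded as a JPEG at the given quality."""
    buf = BytesIO()
    img.save(buf, format="JPEG", quality=quality, optimize=True)
    return buf.tell()

# Build a simple 256x256 RGB gradient as sample content.
img = Image.new("RGB", (256, 256))
img.putdata([(x, y, (x + y) % 256) for y in range(256) for x in range(256)])

for q in (95, 85, 60):
    print(f"quality={q}: {jpeg_size(img, q)} bytes")
```

Running this against your own images lets you see exactly where the byte savings flatten out and the visual quality starts to suffer.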
But what does that really mean? I took the time to run a few tests on the new PSI tool and found that in some cases, no matter what you do to optimize your images, the tool will still recommend further compression/optimization. If you think about it, this is to be expected: with lossy image compression there is no end to it. With lossless image compression/optimization you can only go up to a certain level, and beyond that point there is nothing more you can do to better optimize your image. However, if you follow Google's recommendation to reduce quality to 85 when it is higher, there is no way you can ever hit an end, and I can explain why.
Imagine you have a JPEG picture whose size is 100KB. If you apply lossy compression and reduce the quality to 85, you will probably end up with a slightly lower-quality picture and a serious decrease in size. However, if you keep lossy-compressing the image further, without comparing it to your original lossless image each time, the file will get smaller with every pass, but the quality will drift further and further from the lossless original. Now consider this: when you do this manually, you have a point of reference, because you still have the original lossless image to compare against; PSI does not. No matter how many times you lossy-compress your images, PSI will recommend compressing even more, even if you have already saved more than its original recommendation from the first time you ran it. At a certain point you will see recommendations to save 1% of file size, which could amount to only a few bytes. That is what I mean when I say there is no way you can ever hit an end. So if that is the case, what is the way to go?
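As a back-of-the-envelope illustration (the 15% savings per pass is an assumed figure for the sake of the arithmetic, not a PSI measurement), a fixed-percentage saving applied repeatedly never reaches an endpoint, so there is always "more" to save:

```python
# Each recompression pass shaves an assumed 15% off the file size.
# The size keeps shrinking but never reaches zero, so a tool that
# only looks at the current file can always recommend "more savings".
size_kb = 100.0
for i in range(1, 6):
    size_kb *= 0.85
    print(f"pass {i}: {size_kb:.1f} KB")
```

After five passes the file is down to roughly 44 KB, yet by the same logic a sixth pass would still "save" another 15%.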
The best approach, in my opinion, is to perform a good lossless compression and then let the Google PageSpeed module on your web server do its magic after that. Since there is no end to it anyway, let Google decide the degree of lossy compression automatically. You will of course still see recommendations to optimize more, but there is no way to win this battle. I believe Google knows this, and your website will not be penalized because of it. Just try to keep your total web page sizes relatively small and your loading times fast, and you will be fine.
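For example, if your server runs Apache with the PageSpeed module (mod_pagespeed), a minimal configuration along these lines enables automatic image recompression; treat the exact directive names as something to verify against the mod_pagespeed documentation for your version, and note that the quality value of 85 simply mirrors Google's own guideline:

```apache
# Turn the PageSpeed module on.
ModPagespeed on
# recompress_images is a filter group covering JPEG/PNG recompression.
ModPagespeedEnableFilters recompress_images
# Cap recompression quality at 85, per Google's guideline.
ModPagespeedImageRecompressionQuality 85
```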
Now, if you are wondering how to implement the above and how hard it is: if you host your website on one of our web servers, you are in luck. There is nothing you need to do, since our servers are already equipped with the Google PageSpeed module, which uses the same optimization algorithms as Google's PSI tool. If you don't, you can always bring your site here and have peace of mind.