Yeah, lossless compression never yields additional savings the second time around, unless you switch to a different tool, so that's what I was expecting.
There are three major jpeg libraries. The local version of jpegtran uses the stock libjpeg. Then there is libjpeg-turbo, which is used on most Ubuntu systems and, I believe, is what Google uses for their tests, since it is fast even if it doesn't compress as well. Finally there is mozjpeg, which the API uses and which gives the highest compression. If you re-optimized everything with a different version of jpegtran, you might get different results, but otherwise it should say "no savings" every time.
What we can do afterwards is run a query on the database to see if there were in fact any unoptimized images. So if you can, try not to upload any new images until after it finishes.
When it does finish, there are two ways of checking the results. One is to look at the settings page, and see what it shows for savings. It should say 0, or pretty near that.
You could also run a manual query on the database to see if there are any records where orig_size is not equal to (!=) image_size.
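For example, assuming the default wp_ table prefix and that the plugin's records live in a wp_ewwwio_images table (adjust both to match your install), a query like this would show whether anything was left with a size difference:

SELECT COUNT(*) FROM wp_ewwwio_images WHERE orig_size != image_size;

If the count comes back as anything other than 0, you can swap COUNT(*) for * (and add a LIMIT) to see which records those are.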
EDIT: I should clarify that this only applies to JPG images. The PNG tools have a slight bit of randomness built-in when they select the best compression parameters, and it is sometimes possible (although rare) to get different results the second time around on PNGs even when doing lossless.