What is the best nginx gzip compression level?
Solution 1:
I tested this under nginx 1.3.9 with two files, and these were the results I got for the various levels:
text/html - phpinfo() (level, compressed size, % of original):
0 55.38 KiB (100.00% of original size)
1 11.22 KiB ( 20.26% of original size)
2 10.89 KiB ( 19.66% of original size)
3 10.60 KiB ( 19.14% of original size)
4 10.17 KiB ( 18.36% of original size)
5 9.79 KiB ( 17.68% of original size)
6 9.62 KiB ( 17.37% of original size)
7 9.50 KiB ( 17.15% of original size)
8 9.45 KiB ( 17.06% of original size)
9 9.44 KiB ( 17.05% of original size)
application/x-javascript - jQuery 1.8.3, uncompressed (level, compressed size, % of original):
0 261.46 KiB (100.00% of original size)
1 95.01 KiB ( 36.34% of original size)
2 90.60 KiB ( 34.65% of original size)
3 87.16 KiB ( 33.36% of original size)
4 81.89 KiB ( 31.32% of original size)
5 79.33 KiB ( 30.34% of original size)
6 78.04 KiB ( 29.85% of original size)
7 77.85 KiB ( 29.78% of original size)
8 77.74 KiB ( 29.73% of original size)
9 77.75 KiB ( 29.74% of original size)
I'm not sure how representative this is, but it should serve as an example. Also, I haven't taken CPU usage into account, but from these results the ideal compression level seems to be between 4 and 6.
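For reference, here is a minimal sketch of the matching nginx configuration; picking 5 from the middle of that range is my own assumption, and the gzip_types list is only illustrative (text/html is always compressed):

gzip              on;
gzip_comp_level   5;
gzip_min_length   256;
gzip_types        text/plain text/css application/json application/x-javascript application/xml text/javascript;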
Additionally, if you use the gzip_static module, you may want to pre-compress your files (in PHP):
function gzip_static($path)
{
    if ((extension_loaded('zlib') === true) && (is_file($path) === true))
    {
        $levels = array();
        $content = file_get_contents($path);

        // Try every compression level and record the resulting size.
        foreach (range(1, 9) as $level)
        {
            $levels[$level] = strlen(gzencode($content, $level));
        }

        // Only write a .gz file if at least one level worked and the best
        // result is actually smaller than the original.
        if ((count($levels = array_filter($levels)) > 0) && (min($levels) < strlen($content)))
        {
            if (file_put_contents($path . '.gz', gzencode($content, array_search(min($levels), $levels)), LOCK_EX) !== false)
            {
                // Keep the .gz timestamps in sync with the original file.
                return touch($path . '.gz', filemtime($path), fileatime($path));
            }
        }
    }

    return false;
}
This allows you to get the best possible compression without sacrificing the CPU on every request.
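To have nginx actually serve those .gz files, the gzip_static module (available when nginx is built with --with-http_gzip_static_module) just needs to be switched on; a minimal sketch:

# Serve foo.js.gz in place of foo.js when it exists and the client accepts gzip.
gzip_static on;

nginx then looks for a pre-compressed variant next to the requested file and falls back to the original if none exists.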
Solution 2:
The gzip compression level simply determines how compressed the data is, on a scale from 1 to 9, where 9 is the most compressed. The trade-off is that more heavily compressed data usually requires more work to compress/decompress, so if you set it fairly high on a high-volume website, you may feel its effect.
It sounds like your issues are more related to the HTTP headers on the requests. Gzip-compressed HTTP traffic is normally accompanied by the Content-Encoding: gzip header. If that header is being dropped somewhere along the way, the client may not know it has to decompress the response.
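If the responses pass through a proxy or CDN, two standard nginx directives are worth checking; this is a sketch of a common setup, not a diagnosis of your exact configuration:

# Emit "Vary: Accept-Encoding" so caches store compressed and uncompressed copies separately.
gzip_vary on;

# By default nginx does not gzip responses to proxied requests (those carrying a Via header);
# "any" enables compression for all of them.
gzip_proxied any;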
Solution 3:
If you can really spare the CPU resources, you can use 9, but for most sites a value of 2 is enough, since gzip doesn't reduce the file much beyond level 1.
Edit: I looked at Amazon CloudFront and it seems to use level 6, probably because decompression at that level is fast, which improves page render performance.