Last modified: 2009-07-28 01:06:32 UTC
If I go to http://wikipedia.org/ the resulting page is not gzip-compressed, even though my browser requests gzip compression. Compression would yield roughly a 70% saving in size (about a 3x transfer speed improvement): http://www.port80software.com/tools/compresscheck.asp?url=http%3A%2F%2Fwikipedia.org%2F

This would reduce the bandwidth consumed by requests to that page and improve performance for users. I suspect there is some way to achieve this while keeping the response cacheable (so it would not increase the CPU load on the servers).

The same applies to other front pages such as http://wikinews.org/, and to CSS files, though to a lesser extent since those are not as large.
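For what it's worth, the standard way to serve compressed responses without breaking caching is HTTP content negotiation: compress only when the client sends Accept-Encoding: gzip, and add a Vary: Accept-Encoding header so caches key the compressed and uncompressed variants separately. A minimal self-contained sketch of the idea (Python standard library only, using synthetic HTML rather than the real front page; not how MediaWiki actually implements it):

```python
import gzip
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

# Synthetic, repetitive HTML standing in for a front page.
BODY = b"<html><body>" + b"<p>Hello, world.</p>" * 200 + b"</body></html>"

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        accepts_gzip = "gzip" in self.headers.get("Accept-Encoding", "")
        payload = gzip.compress(BODY) if accepts_gzip else BODY
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        # Vary tells caches to store separate entries per Accept-Encoding,
        # so compressed and uncompressed variants can both be cached safely.
        self.send_header("Vary", "Accept-Encoding")
        if accepts_gzip:
            self.send_header("Content-Encoding", "gzip")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

    def log_message(self, *args):
        pass  # silence per-request logging

server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# A client that advertises gzip support gets the compressed variant.
conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
conn.request("GET", "/", headers={"Accept-Encoding": "gzip"})
resp = conn.getresponse()
data = resp.read()
print(resp.getheader("Content-Encoding"))  # gzip
print(len(data) < len(BODY))               # True: compressed body is smaller
server.shutdown()
```

Since the compressed variant is identical for every client that accepts gzip, a front-end cache (e.g. Squid) can serve it without the back end re-compressing on every hit.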
Probably needs a one-line fix to extract2.php...
Appears to have been fixed: 50.2 KB -> 13.5 KB (a 73% saving).