Last modified: 2011-03-13 18:05:48 UTC
So I remove useless bits of code, optimise what remains, and so on, with the aim of making the scripts as light as possible and therefore less heavy on the servers (I think every single byte matters in the common.js and monobook.css of big projects).
But I also write documentation for the functions I optimise, so the weight saved by the optimisation is gained back by the comments.
The full script pages should of course remain intact when requested directly.
I think this feature would save Wikimedia a significant amount of money, and besides the cost consideration, it would be a nice feature in its own right.
If you want something to optimise, try the 51KB of edit tools we send out every time someone clicks a red link (over 100KB on Commons).
I agree with Tim. We already compress it, using gzip. The benefits of any further compression are outweighed by the annoyance of anyone who actually wants to read the stuff.
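To illustrate the point about gzip: comments and whitespace are highly repetitive, so gzip already removes most of their cost on the wire, and stripping them by hand saves far less than the raw byte counts suggest. A minimal sketch (the JS snippet and its repetition count are invented for illustration):

```python
import gzip

# Hypothetical documented JS, repeated to mimic a file containing
# many similarly commented functions (snippet invented for this demo).
commented = b"""\
/**
 * Adds an onload handler without clobbering any previously
 * registered handler.
 */
function addOnloadHook(hookFunction) {
    var previous = window.onload;          // keep the old handler
    window.onload = function () {
        if (previous) previous();
        hookFunction();
    };
}
""" * 50

# The same code with the comments removed by hand.
stripped = b"""\
function addOnloadHook(hookFunction) {
    var previous = window.onload;
    window.onload = function () {
        if (previous) previous();
        hookFunction();
    };
}
""" * 50

raw_saving = len(commented) - len(stripped)
gz_saving = len(gzip.compress(commented)) - len(gzip.compress(stripped))
print(raw_saving, gz_saving)  # after gzip, the gap shrinks dramatically
```

In other words, the bytes the comments cost before compression mostly vanish after it, which is why keeping the readable source is cheap.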
According to YSlow (http://developer.yahoo.com/yslow/), a number of JS files from Commons are not gzipped. Try editing a page on Commons in Firefox and look at the response headers of wikibits.js, for example. YSlow also complains that the Commons logo has no Expires header. There is probably a good reason for that which is beyond me.
Raw files are not currently compressed because they are served directly by the web server, which we have not yet configured to compress them (though we plan to do so).
Any JS served directly out of the wiki gets compressed.
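For anyone who wants to check which responses actually come back compressed, the Content-Encoding response header tells you. A small sketch (the helper names are my own, and any URL you pass must of course be reachable):

```python
import urllib.request


def is_gzipped(headers):
    """True if a header mapping indicates a gzip-encoded body."""
    return headers.get("Content-Encoding", "").lower() == "gzip"


def fetch_content_encoding(url):
    """Fetch a URL while advertising gzip support, and report the
    Content-Encoding the server actually used ('' if none)."""
    req = urllib.request.Request(url, headers={"Accept-Encoding": "gzip"})
    with urllib.request.urlopen(req) as resp:
        return resp.headers.get("Content-Encoding", "")
```

Comparing the result for a wiki-served script URL against a raw-file URL would show the difference described above.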
This feature request though was about *obfuscation* for the purposes of making the files smaller; that's something we'll reject regardless of the use or non-use of gzip compression, as it makes debugging and customization much more difficult.
I've updated the summary to clarify.