Last modified: 2011-07-12 18:51:18 UTC
MediaWiki's XML export format has an XML Schema definition, which must be updated for each version. The XML document element of each export includes a reference to a master copy, e.g. http://www.mediawiki.org/xml/export-0.5.xsd. Version 0.5 is missing; one was checked in in r79877 / r79878, but it still needs to actually be put online.
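For reference, the schema reference lives on the root element of a dump. The opening tag of a 0.5-format export looks roughly like this (attribute values shown are illustrative of the 0.5 format, not copied from a specific dump):

```xml
<mediawiki xmlns="http://www.mediawiki.org/xml/export-0.5/"
           xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
           xsi:schemaLocation="http://www.mediawiki.org/xml/export-0.5/
                               http://www.mediawiki.org/xml/export-0.5.xsd"
           version="0.5" xml:lang="en">
```

Any validator that follows xsi:schemaLocation will fetch the .xsd URL above, which is why the file being unreachable matters.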
Well, the file is physically there... But it is still giving a 404:

reedy@srv195:/apache/common/docroot/mediawiki/xml$ ls -al
total 64
drwxrwxr-x 8 mwdeploy mwdeploy 4096 2011-07-11 23:14 .
drwxrwxr-x 3 mwdeploy mwdeploy 4096 2011-02-23 21:24 ..
drwxrwxr-x 2 mwdeploy mwdeploy 4096 2011-05-13 16:12 api
drwxrwxr-x 2 mwdeploy mwdeploy 4096 2005-01-25 14:22 export-0.1
-rw-rw-r-- 1 mwdeploy mwdeploy 2365 2005-01-25 14:23 export-0.1.xsd
drwxrwxr-x 2 mwdeploy mwdeploy 4096 2005-08-17 00:15 export-0.2
-rw-rw-r-- 1 mwdeploy mwdeploy 3171 2005-05-19 02:36 export-0.2.xsd
drwxrwxr-x 2 mwdeploy mwdeploy 4096 2005-08-17 00:16 export-0.3
-rw-rw-r-- 1 mwdeploy mwdeploy 4875 2005-07-05 00:01 export-0.3.xsd
drwxrwxr-x 2 mwdeploy mwdeploy 4096 2009-12-15 10:35 export-0.4
-rw-rw-r-- 1 mwdeploy mwdeploy 7162 2010-04-09 05:35 export-0.4.xsd
drwxrwxr-x 2 mwdeploy mwdeploy 4096 2011-07-11 23:14 export-0.5
-rw-rw-r-- 1 mwdeploy mwdeploy 7518 2011-07-11 23:11 export-0.5.xsd
Did y'all clear the URL from the squid caches?
Have now
Thanks! I had no idea that the cache-clearing step was involved. Is this the type of change that needs to be marked "scaptrap" in the future, or is there a more robust way to ensure that the squid caching step happens?
That should generally be a standard step when deploying new or updated web-accessible files; alternatively, the configuration for the directory the files live in could be changed so that responses don't claim to be cacheable by the squids. (Not sure offhand how to do that.)
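One possible way to do the latter, sketched as an Apache configuration fragment (this assumes mod_headers is loaded; the path matches the listing above, but the exact header policy is an illustration, not what the cluster actually runs):

```apache
<Directory /apache/common/docroot/mediawiki/xml>
    # Tell squid (and any other shared cache) to revalidate on every
    # request, so a stale response such as a cached 404 is never served.
    Header set Cache-Control "no-cache, must-revalidate"
</Directory>
```

The trade-off is extra load on the origin for files that rarely change, which is why explicit purging on deploy is the more common approach.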
(In reply to comment #4)
> Thanks! I had no idea that the cache-clearing step was involved. Is this the
> type of change that needs to be marked "scaptrap" in the future, or is there a
> more robust way to ensure that the squid caching step happens?

There wasn't, per se. It required someone with shell access to grab the file onto fenari, then sync out the docroots. As Brion, myself, and presumably others had tried to access the file before it was deployed, the squids cached the 404 error, meaning that when the file did actually exist, the squids were still serving up the error. Purging the URL from the squid cache exposed the actual file. Job done.
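For the record, a single URL can be purged from a squid frontend with an HTTP PURGE request; a minimal sketch (the squid hostname here is a placeholder, and Wikimedia's own purge tooling may wrap this differently):

```
# Send a PURGE request for the cached (404) response.
# sq1.example.org stands in for a real squid frontend.
squidclient -h sq1.example.org -p 80 -m PURGE 'http://www.mediawiki.org/xml/export-0.5.xsd'
```

The squid must be configured with an acl permitting PURGE from the requesting host, or it will refuse the request.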