Last modified: 2010-05-15 15:38:09 UTC
I'm trying to set up a mirror of the Spanish Wikipedia. I've installed MediaWiki
1.5rc4 because I wanted to import an XML dump
(http://download.wikimedia.org/wikipedia/es/pages_full.xml.gz) using the
importDump.php script. At first I discovered that my PHP installation couldn't
open files larger than 2 GB, so I recompiled it with -D_FILE_OFFSET_BITS=64,
following the instructions at http://es2.php.net/manual/en/function.fopen.php#37791.
PHP can apparently open such files now, so I retried the import, but it always
stops at the same point, whether I cat the .gz file to the script or pass
the uncompressed file to the script directly:
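For reference, the rebuild amounted to passing the define through CFLAGS at configure time; the PHP version and configure options below are just examples, not my exact build line:

```shell
# Rebuild PHP with 64-bit file offsets so fopen() can address files > 2 GB.
# Version and configure flags are placeholders for the real build options.
cd php-4.4.x
CFLAGS="-D_FILE_OFFSET_BITS=64" ./configure --with-apxs2 --with-mysql
make && make install
```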
leonov:/var/www/wiki/maintenance# php importDump.php
100 (154.975063164 pages/sec 154.975063164 revs/sec)
2000 (124.46568431 pages/sec 124.46568431 revs/sec)
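The two invocations I mean are roughly the following (I'm using zcat for the compressed case); the streaming-decompression step itself can be sanity-checked with toy data:

```shell
# The two ways of feeding the dump to the script (sketch, not run here):
#   zcat pages_full.xml.gz | php importDump.php
#   php importDump.php pages_full.xml
# Sanity check that streaming decompression round-trips with toy data:
printf '<mediawiki/>' | gzip > /tmp/toy.xml.gz
zcat /tmp/toy.xml.gz
```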
I tried the script linked from Bug 2979 ("import script is broken (or dump
corrupted?)", http://bugzilla.wikimedia.org/show_bug.cgi?id=2979) with the same
result, and also tried the workaround mentioned in Bug 3182 ("Not enough memory for
…"). The PHP process does not seem to grow in memory size anyway.
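The memory workaround I tried boils down to raising PHP's memory limit for the import run; something along these lines, where the 512M value is only an example:

```shell
# Override memory_limit for this one CLI run (value is a guess, not from the bug):
php -d memory_limit=512M importDump.php < pages_full.xml
```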
The server is a dual Xeon 3.0 GHz with 4 GB of RAM.
Is there any other version of the script I should try?
*** This bug has been marked as a duplicate of 3473 ***