Last modified: 2010-05-15 15:38:42 UTC

Wikimedia Bugzilla is closed!

Wikimedia has migrated from Bugzilla to Phabricator. Bug reports should be created and updated in Wikimedia Phabricator instead. Please create an account in Phabricator and add your Bugzilla email address to it.
Wikimedia Bugzilla is read-only. If you try to edit or create any bug report in Bugzilla you will be shown an intentional error message.
In order to access the Phabricator task corresponding to a Bugzilla report, just remove "static-" from its URL.
You can still run searches in Bugzilla and access your list of votes, but bug reports in Bugzilla are obviously no longer up to date.
Bug 4268 - Search fails after running compressOld.php
Product: MediaWiki
Classification: Unclassified
General/Unknown (Other open bugs)
Hardware: All  OS: Linux
Priority: Normal  Severity: normal (vote)
Target Milestone: ---
Assigned To: Nobody - You can work on this!
Depends on:
  Show dependency treegraph
Reported: 2005-12-14 04:01 UTC by dharmaweb
Modified: 2010-05-15 15:38 UTC (History)
0 users

See Also:
Web browser: ---
Mobile Platform: ---
Assignee Huggle Beta Tester: ---


Description dharmaweb 2005-12-14 04:01:15 UTC
This is the error I get:

Fatal error: Call to undefined function: uncompress() 
in /public/vhost/d/dharmaweb/html/includes/HistoryBlob.php on line 256

The search fails when I enter the value "test", but it works with some other values such as "Van".

Here is the failed URL:

Here is the normal URL:

The bug occurred right after I ran the compressOld.php script.
Comment 1 Rob Church 2005-12-14 04:13:56 UTC
At first glance, without thinking about it too much, it looks like you're missing a
library or extension needed to handle decompression. Off the top of my head, I
don't know whether that is a built-in (which would indicate a dodgy/broken PHP
configuration) or something else, but that's what it appears to be.

What version of PHP?
Comment 2 dharmaweb 2005-12-14 19:05:20 UTC
We're running PHP version 4.3.11.  The BZip2 module, version 1.0.2, is enabled, as 
well as ZLib version 1.1.4.
Comment 3 Brion Vibber 2005-12-15 22:23:12 UTC
This call failing probably indicates that the unserialize() failed in some way. Stick in 
something like var_dump($obj) on the prior lines to see what you've got.
Comment 4 dharmaweb 2005-12-16 00:00:23 UTC
This is what I get when I added the var_dump($obj) line 
to the HistoryBlob.php include file:

if( !is_object( $obj ) ) {
    // Correct for old double-serialization bug.
    $obj = unserialize( $obj );


// Save this item for reference; if pulling many
// items in a row we'll likely use it again.
$wgBlobCache = array( $this->mOldId => $obj );

Parse error: parse error, unexpected T_VARIABLE 
in /public/vhost/d/dharmaweb/html/includes/HistoryBlob.php on line 258

Comment 5 Brion Vibber 2005-12-16 00:20:33 UTC
You need a semicolon at the end of the line.
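The correction for the "old double-serialization bug" in the HistoryBlob.php snippet above (the is_object() check followed by a second unserialize()) can be illustrated with a Python analogy, using pickle in place of PHP's serialize()/unserialize(). The Blob class and variable names here are purely hypothetical, not MediaWiki code:

```python
import pickle

class Blob:
    """Stand-in for a history blob object."""
    def __init__(self, text):
        self.text = text

blob = Blob("hello")

# Simulate the old double-serialization bug: the object was
# accidentally serialized twice before being stored.
stored = pickle.dumps(pickle.dumps(blob))

obj = pickle.loads(stored)       # first load yields bytes, not a Blob
if not isinstance(obj, Blob):    # mirrors PHP's !is_object( $obj ) check
    obj = pickle.loads(obj)      # correct for double serialization

assert isinstance(obj, Blob)
```

If the second unserialize also fails to produce an object, the later uncompress() call is made on a non-object and blows up, which matches the fatal error reported above.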
Comment 6 dharmaweb 2005-12-19 22:49:35 UTC
object(concatenatedgziphistoryblob)(6) { ["mVersion"]=> 
int(0) ["mCompressed"]=> bool(false) ["mItems"]=> array
(3) { ["ccd26dfde854fdb73601545c58811298"]=> string
(73370) " 

Long text of the article here...

 ["mDefaultHash"]=> string
(32) "10dea8ff26d89a7e83531f694ad7b535" ["mFast"]=> int
(0) ["mSize"]=> int(0) } object(historyblobstub)(3) { 
["mOldId"]=> string(4) "3029" ["mHash"]=> string
(32) "1b661dabd3208de1c359fd47713ad77a" ["mRef"]=> 

Comment 7 dharmaweb 2006-01-03 18:56:41 UTC
Do you know how to undo the CompressOld process?  
Comment 8 dharmaweb 2006-02-21 19:29:53 UTC
I just found out that viewing some articles runs into the same problem.

Here is the error, along with the output the var_dump($obj); statement produces:

object(historyblobstub)(3) { ["mOldId"]=> string(4) "1791" ["mHash"]=> string
(32) "795e63467eeabd9e8f5548fedd0a0794" ["mRef"]=> NULL } 
Fatal error: Call to undefined function: uncompress() 
in /public/vhost/d/dharmaweb/html/includes/HistoryBlob.php on line 258

Is there a way to uncompress? Can I just take the current articles, reinstall 
the database, and import them back in later?
Comment 9 Travis D 2007-02-02 15:21:27 UTC
It turns out that when you run compressOld.php on your table, you can lose some
data.

For articles whose 2nd revision is an article move, data will be lost. 

For the revisions, compressOld.php will first set the initial text entry to a
ConcatenatedGzipHistoryBlob object. 

It will then look at the 2nd revision entry which represents the move:

*************************** 1. row ***************************
        rev_id: 265949
      rev_page: 58415
      rev_user: 0
 rev_timestamp: 20060825022115
rev_minor_edit: 0
   rev_deleted: 0
   rev_text_id: 265193
*************************** 2. row ***************************
        rev_id: 265984
      rev_page: 58415
   rev_comment: [[Make You Paper or Essay Longer Than It Is]] moved to [[Make an
Essay Appear Longer Than It Is]]: correct spelling, make title more exact
      rev_user: 1254433
 rev_user_text: Ladanea
 rev_timestamp: 20060825024904
rev_minor_edit: 1
   rev_deleted: 0
   rev_text_id: 265193

Unfortunately this rev_text_id is exactly the same as the first one, and a
HistoryBlobStub gets stored here overwriting the ConcatenatedGzipHistoryBlob
object and data is lost. 

The way to prevent this from occurring again is a change along these lines:
<    # Skip if not compressing and don't overwrite the first revision
<    if ( $stubs[$j] !== false && $revs[$i + $j]->rev_text_id != $primaryOldid ) {
>    # Skip if not compressing
>    if ( $stubs[$j] !== false ) {

Unfortunately there is no way to undo this process and get back your data. Your
only alternative is to somehow import the lost data from a backup, which you
hopefully have, into the text table. 
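The failure mode and the guard described above can be sketched as a small, self-contained simulation. This is a loose Python analogy, not MediaWiki's actual code; all function and field names are illustrative:

```python
# Simulate compressOld's two steps on a page whose second revision is a
# page move sharing the same rev_text_id as the first revision.

def compress_revisions(text_table, revs, guard_primary=True):
    """Concatenate revision texts into the row of the first (primary)
    text id, replacing later rows with stubs pointing back at it."""
    primary_old_id = revs[0]["rev_text_id"]

    # Step 1: the primary text row becomes the concatenated blob.
    text_table[primary_old_id] = {
        "type": "ConcatenatedGzipHistoryBlob",
        "items": [r["rev_id"] for r in revs],
    }

    # Step 2: later revisions get stubs written over their text rows.
    for rev in revs[1:]:
        tid = rev["rev_text_id"]
        if guard_primary and tid == primary_old_id:
            continue  # the fix: never overwrite the primary blob row
        text_table[tid] = {"type": "HistoryBlobStub", "ref": primary_old_id}

    return text_table

# A page move produces a second revision that reuses rev_text_id 265193.
revs = [
    {"rev_id": 265949, "rev_text_id": 265193},
    {"rev_id": 265984, "rev_text_id": 265193},
]

broken = compress_revisions({}, revs, guard_primary=False)
fixed = compress_revisions({}, revs, guard_primary=True)
```

Without the guard, the stub for the second revision overwrites the blob row and points at itself, so the concatenated text is gone; with the guard, the blob survives.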
Comment 10 Brion Vibber 2007-02-02 18:14:35 UTC
Thanks, Travis!

Applied fix on trunk in r19726 and rel1.9 in r19727.
Comment 11 Alejandro Sánchez Marín 2007-05-07 16:09:08 UTC
I have the same problem but I don't know how to fix it. What should I do to solve it?
