Last modified: 2014-01-10 18:26:05 UTC

Wikimedia Bugzilla is closed!

Wikimedia migrated from Bugzilla to Phabricator. Bug reports are handled in Wikimedia Phabricator.
This static website is read-only and kept for historical purposes. Logging in is not possible, and apart from displaying bug reports and their history, links may be broken. See T53730, the corresponding Phabricator task, for complete and up-to-date bug report information.
Bug 51730 - Unknown error: "stasherror" for >100 MB PDF file upload to Commons
Status: REOPENED
Product: MediaWiki
Classification: Unclassified
Component: Uploading
Version: 1.22.0
Hardware: All
OS: All
Priority: High
Severity: major (1 vote)
Target Milestone: ---
Assigned To: Gilles Dubuc
Depends on:
Blocks: chunked-upload
Reported: 2013-07-20 02:38 UTC by Shiju Alex
Modified: 2014-01-10 18:26 UTC
CC: 15 users

See Also:
Web browser: ---
Mobile Platform: ---
Assignee Huggle Beta Tester: ---


Attachments
Screenshot of error (12.39 KB, image/png)
2013-07-20 15:41 UTC, Sreejith K

Description Shiju Alex 2013-07-20 02:38:15 UTC
This bug is filed as per [[:commons:Help:Server-side_upload]].

Please upload the PDF of this public-domain document to Commons: http://archive.org/details/englishmalayalam00tobirich. It is around 130 MB.

Rename the file to "English_malayalam_sabdakosam_tobias_1907.pdf" while uploading.
Comment 1 Tomasz W. Kozlowski 2013-07-20 10:02:24 UTC
Please enable chunked uploads in your Preferences (Preferences => Uploads => Experimental features) and upload it yourself; no server-side upload is required for such small files.
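(For background, chunked uploading works by splitting the file client-side and sending the pieces as separate API requests, which are reassembled server-side. A minimal, hypothetical Python sketch of the splitting step — the chunk size and function name are illustrative, not MediaWiki's actual code:)

```python
# Hypothetical sketch of the client-side chunking idea behind chunked
# uploads: split the file into fixed-size pieces, send them one API
# request at a time, and let the server reassemble them.
CHUNK_SIZE = 1024 * 1024  # 1 MiB per chunk (illustrative, not the real default)

def split_into_chunks(data: bytes, chunk_size: int = CHUNK_SIZE) -> list[bytes]:
    """Split a byte string into consecutive chunks of at most chunk_size."""
    return [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

# A 130 MB file would thus be sent as ~130 separate requests, each small
# enough to fit within normal per-request upload limits.
```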
Comment 2 Shiju Alex 2013-07-20 11:00:35 UTC
Actually, I initially tried that, but it shows an error and the upload does not happen. Browser: Firefox 22.0. OS: Windows 7. That is why I filed a bug.
Comment 3 Andre Klapper 2013-07-20 13:49:53 UTC
(In reply to comment #2)
> But it is showing error

If there is an error, see https://www.mediawiki.org/wiki/How_to_report_a_bug
Comment 4 Tomasz W. Kozlowski 2013-07-20 14:45:28 UTC
Unknown error: "stasherror" is what I'm getting. Not especially helpful.
Comment 5 Sreejith K 2013-07-20 15:41:10 UTC
Created attachment 12903 [details]
Screenshot of error

This is the error I am getting and the message is not helpful.
Comment 6 Tomasz W. Kozlowski 2013-07-20 16:58:27 UTC
I can confirm I also received this error. The error mentioned in comment 4 occurred when I tried uploading the file for the second time.
Comment 7 Andre Klapper 2013-07-22 08:00:29 UTC
Moving to UploadWizard.
Comment 8 Bawolff (Brian Wolff) 2013-07-22 13:30:51 UTC
Moving to MediaWiki/Uploading; this looks to be an issue in the backend part of uploading.
Comment 9 Shiju Alex 2013-07-24 06:46:28 UTC
@Tomasz W. Kozlowski: could you please upload the file mentioned in comment 1 to Commons?
Comment 10 Tomasz W. Kozlowski 2013-07-24 09:31:56 UTC
See comment 4 and comment 6: I tried, and could not upload it.
Comment 11 Shiju Alex 2013-07-29 08:54:09 UTC
I am changing the priority of this bug to "high". Due to this bug, there is currently no way to upload files larger than 100 MB to Wikimedia Commons. This bug requires immediate attention from developers.
Comment 12 Bawolff (Brian Wolff) 2013-07-29 14:07:19 UTC
I can confirm that when I tried to upload it, it hung at the {"upload":{"result":"Poll","stage":"queued"}} stage.
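(For reference, a hypothetical sketch of how a client might interpret the async-upload poll response quoted above; the status logic is illustrative, not MediaWiki's actual client code:)

```python
import json

def should_keep_polling(response_text: str) -> bool:
    """Return True while the server reports the upload is still pending."""
    upload = json.loads(response_text).get("upload", {})
    # A result of "Poll" means the assembly/publish job has not finished yet;
    # the "stage" field ("queued", etc.) says where in the pipeline it is.
    return upload.get("result") == "Poll"

# The hang described above corresponds to this returning True indefinitely
# while the stage stays "queued".
```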
Comment 13 Bawolff (Brian Wolff) 2013-07-29 15:23:08 UTC
Testing locally, it seems a significant amount of time is spent getting the file metadata/text layer when assembling the chunks (18 seconds vs. about 2 minutes).

I suspect this is what's causing problems in this specific case. (However, there's the larger issue of poor error reporting, and the fact that anything is timing out this quickly at all.)
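(A hypothetical harness for the kind of timing comparison described above — time how long a stand-in metadata-extraction step takes per format; the function names are placeholders, not real MediaWiki code:)

```python
import time

def time_call(fn, *args):
    """Return (result, elapsed_seconds) for a single call to fn(*args)."""
    start = time.perf_counter()
    result = fn(*args)
    return result, time.perf_counter() - start

# e.g. time_call(extract_pdf_metadata, "file.pdf") vs. the DjVu equivalent,
# where extract_pdf_metadata is a placeholder for the extraction step.
```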
Comment 14 Andre Klapper 2013-07-29 15:39:22 UTC
Mark: Could you take a look at this?
Comment 15 Bawolff (Brian Wolff) 2013-07-29 16:20:55 UTC
On the other hand, locally I'm also getting the following during the check-status phase:

{"upload":{"result":"Poll","stage":"queued"}}
Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 2 bytes) in /var/www/w/git/includes/normal/UtfNormal.php on line 295

Which is baffling, since that type of request should not take much memory. However, that more likely indicates something wrong with my local setup.
Comment 16 Bawolff (Brian Wolff) 2013-07-29 17:26:25 UTC
For reference, I was able to successfully upload the DjVu version: http://commons.wikimedia.org/wiki/File:English_malayalam_sabdakosam_tobias_1907.djvu, so I guess the problem is with the PDF version of the file.

Shiju Alex: could you update the file info for that DjVu file as appropriate?
-----

Note, just to be clear, I'm not suggesting that just because the djvu file worked, that this bug is solved. Obviously there are still issues.
Comment 17 Shiju Alex 2013-07-29 17:30:34 UTC
For DjVu it worked because its size is around 89 MB (that is, less than 100 MB). So I guess this bug (not able to upload files with size > 100 MB) is still valid.

Let me locate another PDF that is more than 100 MB to verify this.
Comment 18 Bawolff (Brian Wolff) 2013-07-29 17:42:48 UTC
(In reply to comment #17)
> For DjVu it worked because its size is around 89 MB (that is, less than
> 100 MB). So I guess this bug (not able to upload files with size > 100 MB)
> is still valid.

I suspect it has more to do with it being a PDF over 100 MB. It may be a combination of the file format and the smaller size that made the DjVu file work. (Note I still used chunked uploading to upload the DjVu file.)

>
> Let me locate another PDF that is more than 100 MB to verify this.

More interesting would be a PDF file that is ~89 MB, to check whether the problem has something to do with our metadata extraction on PDF files.
Comment 19 Bawolff (Brian Wolff) 2013-07-29 18:12:23 UTC
> More interesting would be a PDF file that is ~89 MB, to check whether the
> problem has something to do with our metadata extraction on PDF files.

I tried uploading [[commons:File:Eugene and Frederick Sutermeister 1906.pdf]] (85 MB) to test2.wikipedia.org using chunked uploading. It did not work. This further suggests it's related to our PDF handling.
Comment 20 Bawolff (Brian Wolff) 2013-07-29 20:28:32 UTC
(In reply to comment #19)
> > More interesting would be a PDF file that is ~89 MB, to check whether the
> > problem has something to do with our metadata extraction on PDF files.
>
> I tried uploading [[commons:File:Eugene and Frederick Sutermeister
> 1906.pdf]] (85 MB) to test2.wikipedia.org using chunked uploading. It did
> not work. This further suggests it's related to our PDF handling.

Err, never mind: Eugene and Frederick Sutermeister 1906-test.pdf did get uploaded to test2; I just didn't notice the success message. I still suspect the issue is related to PDFs.
Comment 21 Nemo 2013-08-23 04:46:00 UTC
Isn't this bug just the revival of bug 36587?
Comment 22 Bawolff (Brian Wolff) 2013-08-23 06:33:09 UTC
(In reply to comment #21)
> Isn't this bug just the revival of bug 36587?

Probably related, but not precisely the same. It seems PDFs are much more likely to trigger it, due to a much more expensive metadata operation, which could be optimized.
Comment 23 Nemo 2013-08-23 07:16:39 UTC
(In reply to comment #22)
> Probably related, but not precisely the same. It seems PDFs are much more
> likely to trigger it, due to a much more expensive metadata operation,
> which could be optimized.

If that's the issue, it looks similar to what happened with TIFF files, a problem which IIRC was resolved by this piece of config:
    $wgTiffMaxMetaSize = 1048576;
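(For reference, the value assigned to $wgTiffMaxMetaSize above is in bytes; a trivial sanity check, illustrative arithmetic only:)

```python
# The TIFF metadata limit above is expressed in bytes;
# 1048576 bytes is exactly 1 MiB.
ONE_MIB = 1024 ** 2  # 1,048,576 bytes

assert ONE_MIB == 1048576
```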
