Last modified: 2013-03-16 21:47:52 UTC

Wikimedia Bugzilla is closed!

Wikimedia migrated from Bugzilla to Phabricator. Bug reports are now handled in Wikimedia Phabricator.
This static website is read-only and preserved for historical purposes. Logging in is not possible, and aside from displaying bug reports and their history, links may be broken. See T24356, the corresponding Phabricator task, for complete and up-to-date bug report information.
Bug 22356 - Provide option to use XML_PARSE_HUGE to avoid: Error: DOMDocument::loadXML(): Excessive depth in document: 256
Status: RESOLVED FIXED
Product: MediaWiki
Classification: Unclassified
Component: Parser (Other open bugs)
Version: unspecified
Hardware: PC Windows XP
Importance: Low enhancement (vote)
Target Milestone: ---
Assigned To: Nobody - You can work on this!
Depends on:
Blocks:
Reported: 2010-02-02 10:23 UTC by qin li
Modified: 2013-03-16 21:47 UTC (History)
3 users

See Also:
Web browser: ---
Mobile Platform: ---
Assignee Huggle Beta Tester: ---


Attachments
It's how the error shows (123.46 KB, image/png)
2010-02-02 10:23 UTC, qin li

Description qin li 2010-02-02 10:23:00 UTC
Created attachment 7066 [details]
It's how the error shows 

I built a wiki on my computer using the data dump of January 21st.
The software environment is WampServer (PHP 5.3.0, Apache 2.2.11, MySQL 5.1.36).
My MediaWiki version is 1.15.1, with no extensions, and $wgLanguageCode = "zh-hans".
I used mwdumper.jar with the command "java -jar mwdumper.jar --format=sql:1.5 zhwiki-20100121-pages-articles.xml.bz2 | mysql -u wikiuser -p wikidb" to import the data.
Every time, after I finish importing the data, I set
   [mysqld]
   max_allowed_packet = 128M
   innodb_log_file_size = 100M
   [mysql]
   max_allowed_packet = 128M
in my.ini, set max_execution_time = 300 in php.ini, and set $wgShowExceptionDetails = true;.
When I search for Template:Babel (in Chinese, 模板:Babel) on my local wiki, it always shows the error:

Warning: DOMDocument::loadXML() [domdocument.loadxml]: Excessive depth in document: 256 use XML_PARSE_HUGE option in Entity, line: 256 in D:\wamp\www\mediawiki\includes\parser\Preprocessor_DOM.php on line 107

Warning: DOMDocument::loadXML() [domdocument.loadxml]: Extra content at the end of the document in Entity, line: 256 in D:\wamp\www\mediawiki\includes\parser\Preprocessor_DOM.php on line 107


Preprocessor_DOM::preprocessToObj generated invalid XML

Backtrace:

#0 D:\wamp\www\mediawiki\includes\parser\Parser.php(2579): Preprocessor_DOM->preprocessToObj('{| name="userbo...', 0)
#1 D:\wamp\www\mediawiki\includes\parser\Parser.php(2630): Parser->preprocessToDom('{| name="userbo...')
#2 D:\wamp\www\mediawiki\includes\parser\Parser.php(875): Parser->replaceVariables('{| name="userbo...')
#3 D:\wamp\www\mediawiki\includes\parser\Parser.php(327): Parser->internalParse('{| name="userbo...')
#4 [internal function]: Parser->parse('{| name="userbo...', Object(Title), Object(ParserOptions), true, true, 9779191)
#5 D:\wamp\www\mediawiki\includes\StubObject.php(58): call_user_func_array(Array, Array)
#6 D:\wamp\www\mediawiki\includes\StubObject.php(76): StubObject->_call('parse', Array)
#7 [internal function]: StubObject->__call('parse', Array)
#8 D:\wamp\www\mediawiki\includes\Article.php(3557): StubObject->parse('{| name="userbo...', Object(Title), Object(ParserOptions), true, true, 9779191)
#9 D:\wamp\www\mediawiki\includes\Article.php(979): Article->outputWikiText('{| name="userbo...')
#10 D:\wamp\www\mediawiki\includes\Wiki.php(450): Article->view()
#11 D:\wamp\www\mediawiki\includes\Wiki.php(63): MediaWiki->performAction(Object(OutputPage), Object(Article), Object(Title), Object(User), Object(WebRequest))
#12 D:\wamp\www\mediawiki\index.php(116): MediaWiki->initialize(Object(Title), Object(Article), Object(OutputPage), Object(User), Object(WebRequest))
#13 {main}

I tried three times on two computers, and I don't know why. I first encountered this error on 2009-01-30.
Comment 1 Andre Klapper 2013-01-29 19:30:32 UTC
This was also asked in http://lists.wikimedia.org/pipermail/wikitech-l/2010-February/046738.html

This is PHP's XML parser bailing out due to excessive nesting; it is not something that could be fixed in the MediaWiki software itself.
I wonder whether our data dumps could do something differently, though.

<andre__> wondering whether something could be fixed in creation of datadumps.
<^demon> Not really, no.
<^demon> We could possibly add an option to MW core to use XML_PARSE_HUGE, but you probably wouldn't want it on by default.
Comment 2 Umherirrender 2013-03-16 21:47:52 UTC
XML_PARSE_HUGE has been passed every time since r96655, so this should be fixed.
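(Editorial note: PHP exposes libxml2's XML_PARSE_HUGE flag as the constant LIBXML_PARSEHUGE, which DOMDocument::loadXML() accepts in its second, bitmask-of-options argument. A minimal sketch of the effect follows; the depth of 300 is an arbitrary illustrative value chosen only to exceed libxml2's default limit of 256, and LIBXML_PARSEHUGE requires PHP 5.3.2+ built against libxml2 >= 2.7.0.)

```php
<?php
// Build a document nested deeper than libxml2's default depth limit of 256.
$depth = 300;
$xml = str_repeat('<node>', $depth) . 'leaf' . str_repeat('</node>', $depth);

$doc = new DOMDocument();

// Without the flag, parsing fails with a warning like
// "Excessive depth in document: 256 use XML_PARSE_HUGE option"
// (the @ suppresses the warning so we only see the boolean result).
$withoutFlag = @$doc->loadXML($xml);

// With LIBXML_PARSEHUGE, libxml2's hard-coded depth and size limits
// are lifted and the same document parses successfully.
$withFlag = $doc->loadXML($xml, LIBXML_PARSEHUGE);

var_dump($withoutFlag, $withFlag); // expect bool(false), then bool(true)
```

As ^demon noted in comment 1, this should not be enabled unconditionally without thought: the limits exist to protect the parser against pathological or malicious input.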


