Last modified: 2014-01-14 02:52:01 UTC
I was running PostgreSQL and MySQL tests in parallel on one machine, and it seems that one test instance nuked /tmp/Foobar.svg needed by the other one :)

There were 2 errors:

1) ParserTests::testParserTest with data set #41
('Italicized possessive', 'The \'\'[[Main Page]]\'\'\'s talk page.', '<p>The <i><a href="/wiki/Main_Page" title="Main Page">Main Page</a>\' </i>s talk page. </p>', '', '')

filesize(): stat failed for /tmp/Foobar.svg

/usr/home/saper/test/mytest/includes/filebackend/FileBackendStore.php:163
/usr/home/saper/test/mytest/includes/filebackend/FileBackendStore.php:1097
/usr/home/saper/test/mytest/includes/filebackend/FileBackend.php:592
/usr/home/saper/test/mytest/includes/filebackend/FileBackend.php:612
/usr/home/saper/test/mytest/includes/filebackend/FileBackend.php:640
/usr/home/saper/test/mytest/tests/phpunit/includes/parser/NewParserTest.php:459
/usr/home/saper/test/mytest/tests/phpunit/includes/parser/NewParserTest.php:388
/usr/home/saper/test/mytest/tests/phpunit/includes/parser/NewParserTest.php:586
/usr/home/saper/test/mytest/tests/phpunit/MediaWikiTestCase.php:123
/usr/home/saper/test/mytest/tests/phpunit/MediaWikiPHPUnitCommand.php:80
/usr/home/saper/test/mytest/tests/phpunit/MediaWikiPHPUnitCommand.php:64
/usr/home/saper/test/mytest/tests/phpunit/phpunit.php:119

2) ParserTests::testParserTest with data set #196
('BUG 787: Links with one slash after the url protocol are invalid', 'http:/example.com [http:/example.com title]', '<p>http:/example.com </p><p>[http:/example.com title] </p>', '', '')

filesize(): stat failed for /tmp/Foobar.svg

/usr/home/saper/test/mytest/includes/filebackend/FileBackendStore.php:163
/usr/home/saper/test/mytest/includes/filebackend/FileBackendStore.php:1097
/usr/home/saper/test/mytest/includes/filebackend/FileBackend.php:592
/usr/home/saper/test/mytest/includes/filebackend/FileBackend.php:612
/usr/home/saper/test/mytest/includes/filebackend/FileBackend.php:640
/usr/home/saper/test/mytest/tests/phpunit/includes/parser/NewParserTest.php:459
/usr/home/saper/test/mytest/tests/phpunit/includes/parser/NewParserTest.php:388
/usr/home/saper/test/mytest/tests/phpunit/includes/parser/NewParserTest.php:586
/usr/home/saper/test/mytest/tests/phpunit/MediaWikiTestCase.php:123
/usr/home/saper/test/mytest/tests/phpunit/MediaWikiPHPUnitCommand.php:80
/usr/home/saper/test/mytest/tests/phpunit/MediaWikiPHPUnitCommand.php:64
/usr/home/saper/test/mytest/tests/phpunit/phpunit.php:119
Yup we should mock the file system access :-(
(In reply to comment #1)
> Yup we should mock the file system access :-(

OMG, no! Just use /tmp/Foobar${uniquetestinstanceidentifier}.svg. The test instance ID might come from Jenkins (maybe?), or just random data combined with the process ID and the start time if available. A UUID might be an idea. For a nicer solution, why don't we offer a $this->tempFile() for tests that creates a directory fulfilling the above criteria and removes it on every tearDown()? A sketch of what I mean is below.
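Roughly something like this in MediaWikiTestCase; purely an untested sketch, the tempFile() name and the directory naming scheme are just assumptions of mine:

  // Hypothetical helper for MediaWikiTestCase (sketch only).
  private $tempFiles = array();

  /**
   * Return a temp file path unique to this test instance,
   * cleaned up again in tearDown().
   */
  protected function tempFile( $suffix = '' ) {
      // Per-instance directory under the system temp dir
      $dir = wfTempDir() . '/mwtest-' . getmypid() . '-' . dechex( mt_rand() );
      if ( !is_dir( $dir ) ) {
          mkdir( $dir, 0700, true );
      }
      $path = $dir . '/' . dechex( mt_rand() ) . $suffix;
      $this->tempFiles[] = $path;
      return $path;
  }

  protected function tearDown() {
      // Remove the files registered above (directory cleanup omitted for brevity)
      foreach ( $this->tempFiles as $path ) {
          if ( file_exists( $path ) ) {
              unlink( $path );
          }
      }
      parent::tearDown();
  }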
I wrote MockFileBackend and MockFSFile, which let you create fake files that are purely virtual. In phpunit/includes/parser/NewParserTest.php we explicitly create a local file, i.e.: $image = wfLocalFile( Title::makeTitle( NS_FILE, 'Foobar.jpg' ) ); That call should be replaced by using the mocked file backend and storing a file named Foobar.jpg in it. That would solve the issue and avoid creating a file on disk.
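For illustration, something along these lines could replace the wfLocalFile() call; this is only a sketch, and the backend name, container and constructor options here are my assumptions, not the actual test code:

  // Sketch: store a purely virtual Foobar.jpg in the mock backend.
  $backend = new MockFileBackend( array(
      'name' => 'local-mock',       // assumed backend name
      'wikiId' => wfWikiID()
  ) );
  $fakeJpegBytes = str_repeat( 'x', 12345 ); // placeholder payload
  $dst = 'mwstore://local-mock/local-public/Foobar.jpg';
  $backend->prepare( array( 'dir' => dirname( $dst ) ) );
  $status = $backend->create( array(
      'dst' => $dst,
      'content' => $fakeJpegBytes
  ) );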
But that's not the mocking we need in this test (pretending that the file is created) - we need to create an actual file (ok, just a stream of bytes). We care about the contents, not about faking the mere existence of *some* mocked entity. I think we should change wfTempDir(), or introduce something similar for tests along the lines of NewParserTest::getUploadDir, so that a unique temp directory is created per test instance and torn down afterwards. Call it an integration test if that's easier :)
Maybe MockFSFile can be slightly extended to let us write some content into it, except the file payload would be an object property in memory instead of being written to disk. getSha1Base36() would then simply hash that property. Note that this bug is preventing the Jenkins job mediawiki-core-phpunit-parser from being run in parallel ( https://gerrit.wikimedia.org/r/#/c/102152/ ).
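Roughly what I have in mind (a sketch only; the class name ContentMockFSFile is hypothetical and the getSha1Base36() signature is assumed to match FSFile's):

  // Sketch: keep the payload in memory instead of on disk.
  class ContentMockFSFile extends MockFSFile {
      /** @var string In-memory file payload */
      protected $content = '';

      public function setContent( $content ) {
          $this->content = $content;
      }

      public function getSize() {
          return strlen( $this->content );
      }

      public function getSha1Base36( $recache = false ) {
          // Same base-36 SHA-1 that FSFile computes for on-disk files
          return wfBaseConvert( sha1( $this->content ), 16, 36, 31 );
      }
  }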
I have a simpler/better idea (I think). What about teaching wfTempDir() to create a per-wiki-instance directory under /tmp... and adding a new function wfTempFile(), à la UNIX mkstemp(), that would allow us to:

(1) bind files to the Request instance or MediaWikiTestCase instance and clean them up on exit or tearDown();
(2) run multiple wikis in parallel (interactive wikis or test instances, it does not matter) on the same machine.

wfTempDir()/wfTempFile() could use some Universal Wiki Id if available. Is Jenkins providing a unique test instance ID for the scripts (didn't check)? A sketch of the idea follows.
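To make this concrete, a rough sketch of a hypothetical wfTempFile(); the per-wiki directory scheme and the use of wfWikiID() plus the process ID are just assumptions:

  /**
   * Sketch of a mkstemp()-like helper: a temp file bound to this wiki
   * instance, so parallel wikis or test runs on one box don't collide.
   */
  function wfTempFile( $prefix = 'mw', $suffix = '' ) {
      // Per-wiki-instance directory under the system temp dir
      $dir = wfTempDir() . '/' . wfWikiID() . '-' . getmypid();
      if ( !is_dir( $dir ) ) {
          mkdir( $dir, 0700, true );
      }
      // tempnam() atomically creates a unique file, mkstemp()-style
      $path = tempnam( $dir, $prefix );
      if ( $suffix !== '' ) {
          rename( $path, $path . $suffix );
          $path .= $suffix;
      }
      return $path;
  }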
Change 102621 had a related patch set uploaded by Aaron Schulz: Added a MemoryFileBackend class and made MockFileBackend subclass it https://gerrit.wikimedia.org/r/102621
Change 102621 merged by Aaron Schulz: Added a MemoryFileBackend class and made MockFileBackend subclass it https://gerrit.wikimedia.org/r/102621
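For reference, a minimal sketch of how a test might use the new MemoryFileBackend so nothing touches /tmp at all; the constructor options and storage paths here are my assumptions from the FileBackend interface, not taken from the patch:

  $backend = new MemoryFileBackend( array(
      'name' => 'unittest-backend',
      'wikiId' => wfWikiID()
  ) );
  $dst = 'mwstore://unittest-backend/unittest-cont/Foobar.svg';
  $backend->prepare( array( 'dir' => dirname( $dst ) ) );
  $backend->create( array( 'dst' => $dst, 'content' => '<svg/>' ) );

  // Size and contents are now answered from memory, so the
  // "filesize(): stat failed" race above cannot happen:
  $size = $backend->getFileSize( array( 'src' => $dst ) );
  $data = $backend->getFileContents( array( 'src' => $dst ) );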
I think Aaron's change fixed it. The Jenkins job running the parser tests has been made concurrent and does not seem to fail so far. Assuming it is properly fixed.
(In reply to comment #9)
> I think Aaron's change fixed it. The Jenkins job running the parser tests
> has been made concurrent and does not seem to fail so far.

This bug has appeared intermittently when Mark Hershberger tries to merge changes into REL1_22. Can the fix be backported, or the Jenkins configuration changed to work around this?
Oops. I guess all release branches we still support are affected by that race condition, so if several parser test jobs are triggered while preparing a new release, we will most probably have failed jobs. I have no good way to prevent that via Jenkins besides creating non-concurrent jobs for the old release branches, which I would prefer to avoid.
Kevin, thanks for adding me to this bug. I didn't know about it before. I was going to talk to Antoine about this but was focused on trying to do the release. This is incredibly annoying.