Last modified: 2014-11-15 06:47:45 UTC

Wikimedia Bugzilla is closed!

Wikimedia migrated from Bugzilla to Phabricator. Bug reports are handled in Wikimedia Phabricator.
This static website is read-only and kept for historical purposes. It is not possible to log in, and apart from displaying bug reports and their history, links might be broken. See T65327, the corresponding Phabricator task, for complete and up-to-date bug report information.
Bug 63327 - Migrate i18n file format to JSON
Status: PATCH_TO_REVIEW
Product: Pywikibot
Classification: Unclassified
Component: i18n
Version: unspecified
Hardware: All
OS: All
Priority: High
Severity: critical
Assigned To: Pywikipedia bugs
Keywords: i18n
Depends on:
Blocks: pwb20

Reported: 2014-03-31 21:12 UTC by Siebrand Mazeland
Modified: 2014-11-15 06:47 UTC
CC: 9 users


Description Siebrand Mazeland 2014-03-31 21:12:18 UTC
We've recently started using JSON for i18n files in MediaWiki and for some other products. We think it's a good non-executable format for i18n/L10n.

It would be nice if Pywikibot also started using this format. It would also allow us to remove the less-than-excellent Python file format support from the Translate extension.
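
For reference, a MediaWiki-style JSON i18n file (e.g. i18n/en.json in an extension) looks roughly like this; the message keys below are invented for illustration:

{
    "@metadata": {
        "authors": [
            "Example Author"
        ]
    },
    "examplebot-summary": "Bot: updating links",
    "examplebot-pages-done": "Processed $1 pages"
}

Each language gets its own file (en.json, de.json, ...), and "@metadata" holds non-message information such as translator credits.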
Comment 1 Merlijn van Deen (test) 2014-04-02 16:17:54 UTC
One of the reasons we are using the current file format is the structure: in Pywikibot, the typical use case is 'getting a few strings from a lot of languages', while the general use case (e.g. for MediaWiki) is 'getting a lot of strings from a single language'.

On tools-login, parsing all of MediaWiki's JSON files takes 3-5 seconds. For many bot users this might take a lot longer (slower computer, slower hard drive), which (I think) is not reasonable.

Switching to a JSON-based format with the same file structure would be OK with me.
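
For comparison, a JSON file with 'the same file structure' would hold all languages for one component in a single file, mirroring the msg dict of the existing Python modules (the component and key names here are invented):

{
    "en": {
        "examplebot-summary": "Bot: updating links"
    },
    "de": {
        "examplebot-summary": "Bot: Aktualisiere Links"
    }
}

This keeps the 'few strings from a lot of languages' access pattern at one file read per component.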
Comment 2 Niklas Laxström 2014-04-02 17:29:33 UTC
Whether N messages are spread over 50 or 1000 files (made-up numbers) shouldn't matter too much for the time it takes to parse them.
Comment 3 Betacommand 2014-04-02 17:45:04 UTC
Actually it would: each additional file adds disk read time and file open/close time. Then, depending on the language, it may have to create a new JSON parser, parse the file, and destroy the parser again. With 1-2 files it's not that big of a deal, but as it scales up this becomes more and more of a bottleneck.
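
A rough way to test this claim (a sketch only; the directory layout and the existence of a combined all.json dump are assumptions for illustration):

import glob
import json
import time

# Time parsing the same messages split over many small files...
start = time.perf_counter()
for path in glob.glob("i18n/*/*.json"):
    with open(path, encoding="utf-8") as f:
        json.load(f)
print("many small files:", time.perf_counter() - start)

# ...versus one combined file holding all the messages.
start = time.perf_counter()
with open("all.json", encoding="utf-8") as f:
    json.load(f)
print("one combined file:", time.perf_counter() - start)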
Comment 4 Niklas Laxström 2014-04-02 18:01:28 UTC
I can understand that, but I'm not convinced it is a bottleneck currently.

I am weighing this against the development effort needed in Translate to support it.
Comment 5 Merlijn van Deen (test) 2014-04-02 18:32:33 UTC
(In reply to Niklas Laxström from comment #4)
> I can understand that, but I'm not convinced it is a bottle neck currently.

As I mentioned, I *measured* the time needed to parse JSON files. For MediaWiki core, this takes *multiple seconds*. That's *multiple seconds* during *startup* of *each* script.

Currently, each script only needs to read a single file, which is also in a format that loads quickly (the Python bytecode is cached).

(In reply to Niklas Laxström from comment #4)
> I am weighing this against the development effort needed in Translate to
> support it.

I assume you are aware this switch won't exactly be free in terms of *our* time either, right? I'd rather work on site-based configuration or Python 3 support than change a file format for the sake of changing it.
Comment 6 Niklas Laxström 2014-04-03 13:56:08 UTC
Are you saying that you parse *all* of *MediaWiki's* i18n files on *startup* for all Pywikibot scripts?
Comment 7 Merlijn van Deen (test) 2014-04-03 14:11:02 UTC
No, I'm saying we would need to parse *all* of *Pywikibot's* i18n files on *startup* of all scripts if we used the JSON format currently used by MediaWiki.
Comment 8 Niklas Laxström 2014-04-03 14:30:33 UTC
MediaWiki core has about 500000 strings (all languages together); Pywikibot has 11000. Assuming parsing time is linear in the number of strings, your example would take about 0.1 seconds for Pywikibot (3-5 s × 11000/500000 ≈ 0.07-0.11 s). This leaves some room for growth, non-linear parsing time and slower machines.

I would also imagine that, if necessary, you could dump all the messages on startup into an efficient serialized format provided by Python and use that on subsequent loads. I might be able to help with that.
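
One possible shape for such a cache (a sketch only; the i18n directory layout and the cache path are assumptions, not Pywikibot's actual code):

import glob
import json
import os
import pickle

CACHE = "i18n.pickle"  # assumed cache location

def load_messages():
    sources = glob.glob("i18n/*/*.json")
    # Reuse the pickle while it is newer than every JSON source file.
    if os.path.exists(CACHE) and all(
            os.path.getmtime(CACHE) >= os.path.getmtime(p) for p in sources):
        with open(CACHE, "rb") as f:
            return pickle.load(f)
    # Otherwise parse all the JSON files once and rebuild the cache.
    messages = {}
    for path in sources:
        package, filename = path.split(os.sep)[-2:]
        lang = filename[:-len(".json")]
        with open(path, encoding="utf-8") as f:
            messages.setdefault(package, {})[lang] = json.load(f)
    with open(CACHE, "wb") as f:
        pickle.dump(messages, f)
    return messages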
Comment 9 Merlijn van Deen (test) 2014-04-04 08:54:05 UTC
You're right, MW is much bigger. Could you provide the json formatted output for pywikibot somewhere?
Comment 10 Siebrand Mazeland 2014-04-04 09:08:48 UTC
(In reply to Merlijn van Deen from comment #9)
> You're right, MW is much bigger. Could you provide the json formatted output
> for pywikibot somewhere?

There isn't such a thing yet: there's a file format, but the conversion script does not exist yet. There are currently 41 distinct i18n components with 129 strings for Pywikibot.

Let's double that: 100 components and 300 strings, in say 70 languages. That's some 5000 files and 21000 strings.

MediaWiki core has 3000 strings for English alone, in a single file. It's hard to compare. I'd suggest creating some dummy data based on my assumptions above.

Sample files are abundant in any MediaWiki extension in i18n/en.json.
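
Such dummy data could be generated along these lines (a sketch following the assumptions above; the component and message names are invented):

import json
import os

# 100 components x 70 languages, 3 strings each: 21000 strings total.
for c in range(100):
    component = "component%03d" % c
    os.makedirs(component, exist_ok=True)
    for l in range(70):
        messages = {"%s-msg%d" % (component, i): "Dummy string %d" % i
                    for i in range(3)}
        path = os.path.join(component, "lang%02d.json" % l)
        with open(path, "w", encoding="utf-8") as f:
            json.dump(messages, f, ensure_ascii=False)

Note that this yields 7000 files rather than the 5000 estimated above; the lower estimate allows for components that are not translated into every language.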
Comment 11 Niklas Laxström 2014-04-05 10:58:09 UTC
Or you can use the following Python script to create JSON files for testing:

import glob
import importlib
import json
import os

for path in glob.glob("*.py"):
    module = path[:-3]
    try:
        # Each legacy i18n module exposes a `msg` dict keyed by language code.
        messages = importlib.import_module(module).msg
    except AttributeError:
        continue
    os.makedirs(module, exist_ok=True)
    for lang in messages:
        # Write one JSON file per language, e.g. <module>/en.json.
        with open(os.path.join(module, lang + ".json"), "w",
                  encoding="utf-8") as f:
            json.dump(messages[lang], f, ensure_ascii=False)
Comment 12 Gerrit Notification Bot 2014-08-01 17:20:02 UTC
Change 151113 had a related patch set uploaded by Ladsgroup:
[BREAKING] [Bug 63327] Use JSON for i18n files

https://gerrit.wikimedia.org/r/151113
Comment 13 Gerrit Notification Bot 2014-08-01 17:20:15 UTC
Change 151114 had a related patch set uploaded by Ladsgroup:
[BREAKING] [Bug 63327] Use JSON for i18n files

https://gerrit.wikimedia.org/r/151114
Comment 14 John Mark Vandenberg 2014-11-09 20:03:54 UTC
This needs to be done before the next version is pushed to PyPI, which would be another beta or a release candidate.
Comment 15 Gerrit Notification Bot 2014-11-14 16:37:32 UTC
Change 151114 had a related patch set uploaded by John Vandenberg:
Use JSON for i18n files

https://gerrit.wikimedia.org/r/151114
