Last modified: 2013-02-05 12:53:02 UTC
What about including SMW in the core system? How about installing it on "official" Wikipedia projects? Thanks in advance for your answers. Sincerely, E.
Semantic MediaWiki is an extension and will remain an extension for the foreseeable future. I don't know if it's ready to be enabled on Wikimedia Foundation projects. Possibly a pilot run on one of the smaller wikis wouldn't be a bad idea, to gauge performance and so forth, but I don't know how far it's gotten.
Thank you for your answer. The question is: does the Semantic MediaWiki team think the extension is operational enough to be released as a supported extension that could be included in all projects? In other words, would it be worthwhile to launch a poll in my language project to ask for it to be included? Is there any technical reason to say no? Thanks in advance for your answers. Sincerely, E.
Excuse me for insisting... I would just like an answer so I know whether I can propose this functionality to my chapter... Thanks in advance, E.
Last I checked, the extension breaks the whole regression testing system, since it makes database queries when the extension file is included, instead of waiting until the extension is loaded. Although it's trivial to fix, until that happens I think it would rule out installing it on most wikis, since you could no longer test upgrades.
Hmm... it seems like that will never be fixed... But the features are really interesting! How many categories on Wikipedia could be replaced by such an extension! No more need for bots to check whether listings and inclusions are symmetric... But wait a minute... I don't understand why it breaks the regression tests when no modification of MediaWiki files is needed to install it... Is it the database structure? E.
(In reply to comment #5)
> Is it the database structure?

Yes, the database schema. The tests use a different database from the normal wiki, so they can add their own articles as needed. The extension uses some additional tables, which are created by a maintenance script rather than automatically, and that would cause problems for testing. But the real problem here is that it accesses these tables immediately, instead of waiting for the module to be loaded via the normal hook. This means you can't test anything on a server with the extension installed, because the module will fail to load even when it wasn't meant to be used. This is simple to fix by moving the initialisation into the appropriate hook, although I can't say when, or whether, they will want to address it; I hope it won't be never.
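To make the described fix concrete, here is a minimal, hypothetical PHP sketch of the difference between touching the database at include time and deferring that work to a registered setup function. It is not SMW's actual code: the function name smwDeferredSetup is invented for the example, while $wgExtensionFunctions and wfGetDB() are standard MediaWiki.
---
<?php
// Hypothetical sketch only: smwDeferredSetup is an invented name;
// $wgExtensionFunctions and wfGetDB() are standard MediaWiki.

// Problematic pattern: a query that runs as soon as the extension file is
// included, before the wiki (or the test harness) has finished setting up.
// $dbr = wfGetDB( DB_SLAVE );
// $dbr->select( 'smw_relations', '*', array( 'subject_id' => $id ) );

// Deferred pattern: register a setup function and access the database only
// when MediaWiki calls it during normal initialisation.
$wgExtensionFunctions[] = 'smwDeferredSetup';

function smwDeferredSetup() {
	// Safe to access the database here: connections and table prefixes
	// (including the parser tests' temporary prefix) are in place by now.
	$dbr = wfGetDB( DB_SLAVE );
	// ... perform SMW's table access or other initialisation here ...
	return true;
}
---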
Where can this regression test be found? I tried searching on Meta and MediaWiki, but found nothing :(
(In reply to comment #7)
> Where can this regression test be found? I tried searching on Meta and MediaWiki, but found nothing :(

See bug 8010. In general, if something breaks our regression testing (such as it is), then it may well not be installed live, as we absolutely *have* to be able to run these tests. However, Steve claims the solution is simple, so it's just a matter of having someone who's working on Semantic MediaWiki fix the problem.
I mean, I have already read that bug. The question is: where are the "official" regression tests? I didn't find them...
I'm guessing Steve means the parser test suite, which can be found within the maintenance/ directory of any current MediaWiki installation.
This is the error I get when I try to run that test suite, with just the standard parser tests:
---
>php maintenance/parserTests.php -quick -quiet
A database query syntax error has occurred.
The last attempted database query was:
"DELETE FROM `parsertest_smw_relations` WHERE subject_id = '1'"
from within function "SMW::DeleteRelations".
MySQL returned error "1146: Table 'refwikidb.parsertest_smw_relations' doesn't exist (localhost)"
---
The standard tests don't load the SMW extension hook, but it tries to access the non-existent table when the extension is included, so the parser test suite is completely non-functional.
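For context, the error above is the parser test harness failing to find SMW's extra table in its temporary database. Below is a minimal, hypothetical sketch of one way an extension can make such a table visible to the harness, assuming the ParserTestTables hook is available in the MediaWiki version in use; the handler name is a placeholder, not SMW's actual code.
---
<?php
// Hypothetical sketch: register an extra table with the parser test harness
// so it gets a temporary parsertest_-prefixed copy like the core tables.
// The handler name is invented; only the ParserTestTables hook is standard.
$wgHooks['ParserTestTables'][] = 'smwParserTestTables';

function smwParserTestTables( &$tables ) {
	// With this in place, the DELETE shown in the error above would find
	// parsertest_smw_relations in the scratch database instead of failing.
	$tables[] = 'smw_relations';
	return true;
}
---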
Hmm... Then we should either:
- not initialize the SMW extension (what you've done), or
- initialize the installation of the extension!
No?
We should fix bug 8010; that's why this report depends on it. Further discussion of that bug should take place on its page.
Hi! What is the status of this? Would it be feasible to install SMW on Wikimedia projects?
(In reply to comment #14)
> Hi! What is the status of this? Would it be feasible to install SMW on Wikimedia projects?

Any updates?
Hi, we are very interested in using SMW. Our project AcaWiki uses it, and we are working to get it added. The other option would be a better-performing extension that the WMF prefers.
Wikinews might have a need for this. It would greatly help with handling news articles in a structured, queryable manner that goes far beyond the limits of what categories can do.
See also bug 39843. Is SMW running on any projects on the cluster?
(In reply to comment #18)
> See also bug 39843. Is SMW running on any projects on the cluster?

SMW is running on labsconsole. I'm not familiar with SMW at all, but I am very doubtful it meets the performance requirements necessary for Wikipedia...
Right, that's what Wikidata is for. SMW could work on the smaller projects, though. From what little I know of Wikidata, it should be straightforward to convert mundane SMW usage into Wikidata usage if, in the future, those smaller projects aren't so small anymore.

In general, I suspect that externalizing the semantic data processing load, as Wikidata is doing, is probably the only way to approach the scaling problem. I'd have to study it more to be sure, but I'm guessing SMW's poor scaling will still outpace what Moore's law can compensate for in most use cases. Then again, maybe it's possible to externalize SMW itself, but that's useless observer speculation for now.

I know Mozilla uses SMW, and if it's working for them, there's a good chance something like Wikinews can use it too. I'm not sure how "big" Mozilla's usage of SMW is compared to Wikinews, but my impression is that Wikinews is tiny. It's definitely tiny compared to Wikipedia.
See https://www.mediawiki.org/wiki/Writing_an_extension_for_deployment for information on what is needed to get an extension reviewed before potentially deploying it on a Wikimedia site.
(In reply to comment #20)
> I know Mozilla uses SMW, and if it's working for them, there's a good chance something like Wikinews can use it too. I'm not sure how "big" Mozilla's usage of SMW is compared to Wikinews, but my impression is that Wikinews is tiny. It's definitely tiny compared to Wikipedia.

There was a fork of Wikinews called OpenGlobe. It was tiny compared to Wikinews. It tried to use SMW, but performance was not acceptable from a user perspective: it was always turned on, but it was too slow to be useful. Granted, that could have been due to issues with how SMW was set up, but it is nonetheless not a good sign.
> There was a fork of Wikinews called OpenGlobe. It was tiny compared to Wikinews. It tried to use SMW, but performance was not acceptable from a user perspective: it was always turned on, but it was too slow to be useful. Granted, that could have been due to issues with how SMW was set up, but it is nonetheless not a good sign.

Sounds to me like issues with the setup. Offhand, a few SMW instances I've used that do have good performance:
http://acawiki.org/
http://www.discoursedb.org/
The wiki of the month list gives a good idea of capabilities: http://semantic-mediawiki.org/wiki/Wiki_of_the_Month
(In reply to comment #14)
> Hi! What is the status of this? Would it be feasible to install SMW on Wikimedia projects?

I don't think this has ever changed. SMW isn't going to be installed on Wikimedia project wikis. Labsconsole is a notable exception that uses SMW. Wikidata is going to accomplish much of what people have wanted SMW for over the years (and feature requests for that go elsewhere). Re-closing this as WONTFIX.
All of that is mostly true, although, just to clarify, there are a few other Wikimedia wikis where it might make sense to use SMW directly - the most notable, in my opinion, is mediawiki.org. (That counts, right?)
Let's recap:

(In reply to comment #0)
> What about including SMW in the core system?

No, as discussed at length.

> How about installing it on "official" Wikipedia projects?

No as a rule, also as discussed at length. Hence WONTFIX is an appropriate resolution.

(In reply to comment #25)
> All of that is mostly true, although, just to clarify, there are a few other Wikimedia wikis where it might make sense to use SMW directly - the most notable, in my opinion, is mediawiki.org. (That counts, right?)

Yes, mediawiki.org counts. But this specific case is far from the generic request of this report, opened 6 years ago. Even in that case, the first thing would be to define a problem or an enhancement, and then perhaps propose SMW as a solution. Requesting the installation of an extension on mediawiki.org before defining a need for it most probably won't work. If you want to go ahead with this, please do it in a new report instead of reopening this one. Thank you.
If a new report gets opened, please post a link to it here.
Quim - yes, of course. I wasn't trying to make a larger point, just correcting an error.