Last modified: 2013-03-05 19:46:12 UTC

Wikimedia Bugzilla is closed!

Wikimedia migrated from Bugzilla to Phabricator. Bug reports are handled in Wikimedia Phabricator.
This static website is read-only and kept for historical purposes. It is not possible to log in, and links other than those displaying bug reports and their history may be broken. See T37404, the corresponding Phabricator task, for complete and up-to-date bug report information.
Bug 35404 - Leading space within [[ http://...]] gives redlink and avoids blacklist
Status: RESOLVED WONTFIX
Product: MediaWiki
Classification: Unclassified
Component: General/Unknown (Other open bugs)
Version: 1.19
Platform: All
OS: All
Priority: Low
Severity: minor
Target Milestone: ---
Assigned To: Nobody - You can work on this!
Depends on:
Blocks: SWMT
Reported: 2012-03-22 11:13 UTC by billinghurst
Modified: 2013-03-05 19:46 UTC
CC: 2 users

See Also:
Web browser: ---
Mobile Platform: ---
Assignee Huggle Beta Tester: ---


Attachments

Description billinghurst 2012-03-22 11:13:03 UTC
When an external link (http://) is used within a [[wikilink]] AND has a leading space, such as [[ http://www.anu.edu.au]], the link is created as an internal redlink to itself. Such a "link" also avoids the spam blacklist.

It would seem worthwhile to 'chomp' any leading spaces inside a [[wikilink]], especially where the text after the space is a URL.

Example:
https://meta.wikimedia.org/w/index.php?title=User:Billinghurst/sandbox&oldid=3585709
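The 'chomp' suggested above could be sketched roughly as follows. This is not MediaWiki's actual parser code; it is a minimal illustration, assuming the desired behaviour is to treat "[[ http://...]]" as the external-link syntax "[http://...]" rather than as an internal page title:

```python
import re

# Hypothetical sketch only: match [[ ... ]] whose first non-space
# content is a URL, and rewrite it to external-link syntax [url].
# Pattern and function names are illustrative, not from MediaWiki.
URL_IN_BRACKETS = re.compile(r'\[\[\s+((?:https?|ftp)://[^\]\s]+)\s*\]\]')

def chomp_wikilink_urls(wikitext: str) -> str:
    """Rewrite "[[ http://example.com]]" to "[http://example.com]"."""
    return URL_IN_BRACKETS.sub(r'[\1]', wikitext)
```

Under this sketch, `[[ http://www.anu.edu.au]]` becomes `[http://www.anu.edu.au]`, which would then be subject to the usual external-link checks, while ordinary links such as `[[Main Page]]` are left untouched.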
Comment 1 Mark A. Hershberger 2012-05-28 17:56:29 UTC
Lowering priority on high-priority bugs that have a low severity.
Comment 2 James Forrester 2013-03-05 19:46:12 UTC
This is a WONTFIX on two counts:

1. SpamBlacklist (which is a horrible hack) is for blocking external links. "[[ http://foo.com/]]" isn't an external link, even if it contains text that looks like one to a human, so it doesn't get caught. A better way in general of dealing with spammy links is a global AbuseFilter, as that isn't focussed on just one wiki. To use it in a filter directly, the code is written and will be enabled soon; a proper spam-fighting capability in AbuseFilter is a route we haven't yet explored, but it seems more sensible than fixing archaic code. I've created a bug for that as bug 45747.
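For illustration, a global AbuseFilter condition targeting this pattern might look something like the following. This is a hypothetical sketch, not an actual deployed filter; it assumes the standard AbuseFilter `added_lines` variable and `rlike` regex operator:

```
/* Hypothetical filter condition (sketch only): flag edits that add
   a [[wikilink]] whose target begins with whitespace and a URL */
added_lines rlike "\[\[\s+https?://"
```

A real filter would typically also restrict by namespace or user group and set an appropriate action (tag, warn, or disallow).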

2. We're trying really, /really/ hard not to add more hacks onto the PHP parser at this point, as we're looking forward to the point when Parsoid replaces it; as it is, the workload to create "bug-for-bug compatibility" with the PHP parser is extreme, and changes to how it works are something to avoid. Additionally, unexpected changes to corpus interpretation are generally not great - it's astonishing how many times our users deliberately use the edge cases of 'unexpected' behaviours to do something odd but intentional.

Sorry!


