Last modified: 2012-11-03 19:23:38 UTC

Wikimedia Bugzilla is closed!

Wikimedia has migrated from Bugzilla to Phabricator; bug reports are now handled in Wikimedia Phabricator.
This static website is read-only and kept for historical purposes. It is not possible to log in, and links other than those for displaying bug reports and their history may be broken. See T18325, the corresponding Phabricator task, for complete and up-to-date bug report information.
Bug 16325 - Blacklisted links should mean the page can't be saved
Status: NEW
Product: MediaWiki extensions
Classification: Unclassified
Component: Spam Blacklist
Version: unspecified
Hardware: All
OS: All
Importance: Low normal (2 votes)
Target Milestone: ---
Assigned To: Nobody - You can work on this!
Depends on:
Blocks: SWMT
Reported: 2008-11-13 03:51 UTC by Mike.lifeguard
Modified: 2012-11-03 19:23 UTC
CC: 5 users

See Also:
Web browser: ---
Mobile Platform: ---
Huggle Beta Tester: ---


Attachments

Description Mike.lifeguard 2008-11-13 03:51:13 UTC
Basically, this is a request to undo r34769 (bug 1505).

Allowing blacklisted links to remain in the page is bad for a few reasons:
 * Allows duplication of blacklisted links (bug 14114)
 * Whitelisting is the correct way to use blacklisted links, per bug 1505 comment 2
 * Shifts the burden of removing blacklisted links onto a small team of users; previously the load was distributed, since anyone trying to save a page containing the link couldn't do so

Concerns about reverting vandalism (bug 1505 comment 1) are valid, and should be addressed; see bug 15450 to make *rollback* (only) exempt.
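
To make the requested behaviour concrete, here is a minimal, self-contained sketch of a hard-fail save check. It is illustrative only: checkSave() and extractExternalLinks() are hypothetical stand-ins, not the Spam Blacklist extension's real API, and a real implementation would use the MediaWiki parser rather than a crude URL regex.

<?php
// Illustrative sketch only: checkSave() and extractExternalLinks() are
// hypothetical, not the Spam Blacklist extension's real API.

// Crude stand-in for the parser's external-link extraction.
function extractExternalLinks( string $wikitext ): array {
    preg_match_all( '!https?://[^\s\]<>"]+!i', $wikitext, $m );
    return $m[0];
}

// Refuse the save outright (the behaviour this bug requests) if any
// external link matches a blacklist regex. Returns the offending URL,
// or null if the save may proceed.
function checkSave( string $wikitext, array $blacklistRegexes ): ?string {
    foreach ( extractExternalLinks( $wikitext ) as $url ) {
        foreach ( $blacklistRegexes as $regex ) {
            if ( preg_match( $regex, $url ) ) {
                return $url; // hard-fail, as before r34769
            }
        }
    }
    return null;
}

$newText = 'See http://spam-example.com/page for details.';
$blocked = checkSave( $newText, [ '!\bspam-example\.com\b!i' ] );
if ( $blocked !== null ) {
    echo "Save refused: blacklisted link $blocked\n";
}

Under this scheme anyone trying to save a page containing the link is stopped at save time, which is what distributes the cleanup load.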
Comment 1 CBM 2008-11-13 03:54:13 UTC
One issue with this is that if page A transcludes page B, and B trips the spam filter, then saving A will fail. To make this less bad, the spam filter should be put deeper into the parser code, so that as soon as B is parsed the spam filter is checked against it. That would allow a more detailed error message - that page A cannot be saved because page B trips the spam filter with a certain link. 
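A rough sketch of that idea, purely illustrative: fetchPageText() and the $pages map are hypothetical stand-ins for the parser's template-expansion machinery, and a real implementation would also guard against transclusion loops.

<?php
// Sketch of comment 1's idea: check each transcluded page as it is
// expanded, so the error can name the page that tripped the filter.

$pages = [
    'A' => 'Intro text. {{B}}',
    'B' => 'Bad link: http://spam-example.com/x',
];

function fetchPageText( array $pages, string $title ): string {
    return $pages[$title] ?? '';
}

// Walk the transclusion tree; return [pageName, url] for the first
// blacklisted link found, or null if the whole tree is clean.
function findBlacklistedLink( array $pages, string $title, array $regexes ): ?array {
    $text = fetchPageText( $pages, $title );
    foreach ( $regexes as $regex ) {
        if ( preg_match( $regex, $text, $m ) ) {
            return [ $title, $m[0] ];
        }
    }
    // Recurse into {{...}} transclusions.
    preg_match_all( '/\{\{([^}|]+)\}\}/', $text, $m );
    foreach ( $m[1] as $sub ) {
        $hit = findBlacklistedLink( $pages, trim( $sub ), $regexes );
        if ( $hit !== null ) {
            return $hit;
        }
    }
    return null;
}

$hit = findBlacklistedLink( $pages, 'A', [ '!https?://spam-example\.com\S*!i' ] );
if ( $hit !== null ) {
    [ $page, $url ] = $hit;
    echo "Page A cannot be saved: transcluded page $page contains blacklisted link $url\n";
}

This yields the detailed error comment 1 asks for: the message names page B as the source of the blacklisted link, rather than just failing the save of page A.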
Comment 2 Mike.lifeguard 2008-11-13 03:55:38 UTC
(In reply to comment #1)
> One issue with this is that if page A transcludes page B, and B trips the spam
> filter, then saving A will fail. To make this less bad, the spam filter should
> be put deeper into the parser code, so that as soon as B is parsed the spam
> filter is checked against it. That would allow a more detailed error message -
> that page A cannot be saved because page B trips the spam filter with a certain
> link. 
> 

That would require a more thorough rewrite, I think, and should therefore be requested separately. This might conceivably be done at the same time as bug 4459.
Comment 3 p858snake 2011-09-12 05:48:36 UTC
Isn't this how it works now? I'm sure I've seen issues with bots on en.wiki because they fail when there are blacklisted links on the page.


