Bug 14092 - block new spam only
Status: RESOLVED DUPLICATE of bug 1505
Product: MediaWiki extensions
Classification: Unclassified
Component: Spam Blacklist
Version: unspecified
Hardware: All
OS: All
Importance: Normal enhancement with 2 votes
Target Milestone: ---
Assigned To: Nobody - You can work on this!
Depends on:
Blocks:
Reported: 2008-05-12 09:33 UTC by seth
Modified: 2008-05-13 23:29 UTC
CC: 1 user

See Also:
Web browser: ---
Mobile Platform: ---
Assignee Huggle Beta Tester: ---


Attachments

Description seth 2008-05-12 09:33:31 UTC
A new entry in the blacklist should not cause a spam-protection intervention on existing links. It should only block someone if they try to add a new (forbidden) link to a page.
I guess this could be solved technically by simply counting the occurrences of forbidden URLs before and after an edit of a page: if diff != 0, then block.

Background: when a regexp is added to the blacklist, all articles containing matching URLs are effectively blocked, and it would cost the blacklister too much time to delete/escape all existing links manually.
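
A minimal sketch of the counting idea above, in illustrative Python rather than the actual Spam Blacklist extension code; the helper names and the way the blacklist patterns are supplied are assumptions. It blocks only when the number of matches grows, since an edit that merely keeps (or removes) existing links should pass:

    import re

    def count_blacklisted_links(text, patterns):
        # Total occurrences of any blacklisted URL pattern in the wikitext.
        return sum(len(re.findall(p, text)) for p in patterns)

    def edit_adds_spam(old_text, new_text, patterns):
        # Compare the counts before and after the edit; only an increase
        # (i.e. a newly added forbidden link) triggers the spam protection.
        return (count_blacklisted_links(new_text, patterns) >
                count_blacklisted_links(old_text, patterns))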
Comment 1 Daniel Friesen 2008-05-12 13:56:06 UTC
Agreed. This would also solve most of the cases of another issue discussed in another bug, which suggested that bots should be able to be exempted because they don't always know how to handle spam URLs correctly in this situation.

Blocking new spam links would be beneficial. (Though we should probably find some way of warning the editor: "Hey, there are spam links on this page... Could you remove them for us so that editors don't run into issues during a revert?")

Also, it's not the number of URLs we should compare; rather, we can check that they are all still there. I suggest taking a look at the ProtectSection extension. It makes good use of the EditFilter hook: namely, it uses a regex to ensure that all the protected section tags remain in the page after the edit, so it is a good model for how to do this in Spam Blacklist.
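
As a rough sketch of the variant described here (checking that no new matching URLs appear, rather than comparing raw counts), again in illustrative Python rather than the ProtectSection or Spam Blacklist PHP code; the function name is hypothetical:

    import re

    def new_spam_links(old_text, new_text, patterns):
        # Collect the concrete URLs matched by the blacklist in each revision
        # and return only those not already present before the edit.
        def matches(text):
            found = set()
            for p in patterns:
                found.update(re.findall(p, text))
            return found
        return matches(new_text) - matches(old_text)

An EditFilter-style check would then reject the edit only when this set is non-empty, and could at the same time warn the editor about any pre-existing blacklisted links on the page.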
Comment 2 Brion Vibber 2008-05-13 23:29:36 UTC

*** This bug has been marked as a duplicate of bug 1505 ***
