Last modified: 2008-05-13 23:29:36 UTC

Wikimedia Bugzilla is closed!

Wikimedia has migrated from Bugzilla to Phabricator. Bug reports should be created and updated in Wikimedia Phabricator instead. Please create an account in Phabricator and add your Bugzilla email address to it.
Wikimedia Bugzilla is read-only. If you try to edit or create any bug report in Bugzilla you will be shown an intentional error message.
In order to access the Phabricator task corresponding to a Bugzilla report, just remove "static-" from its URL.
You can still run searches in Bugzilla or access your list of votes, but bug reports in Bugzilla will obviously not be up to date.
Bug 14092 - block new spam only
Status: RESOLVED DUPLICATE of bug 1505
Product: MediaWiki extensions
Classification: Unclassified
Component: Spam Blacklist
Hardware: All, OS: All
Importance: Normal enhancement with 2 votes
Assigned To: Nobody - You can work on this!
Depends on:
Reported: 2008-05-12 09:33 UTC by seth
Modified: 2008-05-13 23:29 UTC
CC: 1 user

See Also:
Web browser: ---
Mobile Platform: ---
Assignee Huggle Beta Tester: ---


Description seth 2008-05-12 09:33:31 UTC
A new entry in a blacklist should not cause a spam-protection intervention on existing links; it should only block someone who tries to add a new (forbidden) link to a page.
I guess this could be solved technically by simply counting the occurrences of forbidden URLs before and after the edit of a page: if the counts differ, block the edit.

Background: when a regexp is added to the blacklist, all articles containing matching URLs are effectively blocked, and it would cost the blacklister too much time to delete/escape all those links manually.
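The counting idea above can be sketched as follows. This is a minimal illustration, not Spam Blacklist's actual implementation: `should_block` and its arguments are hypothetical names, and the comparison uses "after > before" rather than a strict "diff != 0" so that removing an existing spam link does not trigger a block, which matches the stated intent.

```python
import re

# Hypothetical sketch of the reporter's suggestion: count matches of each
# blacklist pattern in the old and new page text, and block the edit only
# when an edit *adds* forbidden links (the count increases).
def should_block(blacklist_patterns, old_text, new_text):
    for pattern in blacklist_patterns:
        before = len(re.findall(pattern, old_text))
        after = len(re.findall(pattern, new_text))
        if after > before:
            return True  # a new forbidden link was introduced
    return False  # existing links are tolerated; removals are fine
```

Under this rule an edit that merely leaves pre-existing blacklisted links untouched passes, so articles are no longer "quasi-blocked" by a new blacklist entry.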
Comment 1 Daniel Friesen 2008-05-12 13:56:06 UTC
Agreed. This would also solve most of the cases of another issue, discussed in another bug, which suggested that bots should be able to be exempted because they don't always know how to handle spam URLs correctly in this case.

Blocking only new spam links would be beneficial. (Though we should probably find some way of warning the editor: "Hey, there are spam links on this page... Could you remove them for us so that editors don't run into issues during a revert?")

Also, rather than counting URLs, we can check that they are all still there. I suggest taking a look at the ProtectSection extension. It makes good use of the EditFilter; namely, it uses a regex to ensure that all the protected-section tags remain in the page after an edit, so it is a good model for how to do this in Spam Blacklist.
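The "check that they are all still there" variant can be sketched like this. Again a hypothetical illustration, not the ProtectSection or Spam Blacklist code: instead of counting, it tolerates any spam link already present and rejects only edits whose matches are not a subset of the old ones.

```python
import re

# Hypothetical sketch of the set-based variant: an edit is allowed only if
# every blacklist match in the new text was already present in the old text,
# i.e. the new matches are a subset of the old matches.
def edit_allowed(blacklist_patterns, old_text, new_text):
    for pattern in blacklist_patterns:
        old_matches = set(re.findall(pattern, old_text))
        new_matches = set(re.findall(pattern, new_text))
        if not new_matches <= old_matches:
            return False  # the edit introduces a spam link not present before
    return True
```

A design note: using sets means re-adding a second copy of an already-present spam URL would pass, whereas the count-based check in the description would treat that as new spam; which behavior is wanted is exactly the kind of detail this bug leaves open.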
Comment 2 Brion Vibber 2008-05-13 23:29:36 UTC

*** This bug has been marked as a duplicate of bug 1505 ***
