Last modified: 2008-05-12 20:27:50 UTC
There are some minor inconsistencies between the manual and the source code.
1. The script does a preparsing pass that greps all external URLs and separates them with "\n".
Since the regex s-modifier is not used, it wouldn't actually be necessary to avoid patterns like ".*", as the manual advises. Without /s, ".*" only matches up to the end of a line, doesn't it?
2. Actually the pattern !http://[a-z0-9\-.]*(line 1|line 2|...)!Si shown in the manual is not the right one, because "!" is not the delimiter character used by the script. (So escaping like \! isn't necessary to match a literal "!".)
3. The header of those blacklists always says "Every non-blank line is a regex fragment which will only match hosts inside URLs", but that isn't true: a fragment matches not only the host but the path, too.
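Point 1 can be illustrated with a quick sketch. Python is used here only for illustration (the extension itself is PHP, and the joined string below is a made-up example), but PCRE and Python's re module treat "." and the s-modifier (re.DOTALL) the same way:

```python
import re

# Hypothetical result of the preparsing step: external URLs joined by "\n"
joined = "http://first.example/page\nhttp://second.example/page"

# Without the s-modifier (re.DOTALL in Python), "." does not match "\n",
# so ".*" can only reach to the end of one line.
assert re.search(r"first.*second", joined) is None

# With the s-modifier, ".*" would cross line boundaries.
assert re.search(r"first.*second", joined, re.DOTALL) is not None
```

So as long as /s is left off, a ".*" in one blacklist line cannot leak into the next URL.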
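The combined pattern from point 2 can be sketched like this. This is a hedged Python approximation with invented example fragments; the real script is PHP, and Python has no PCRE-style delimiters, which is exactly why only the script's actual delimiter character would ever need escaping inside a fragment:

```python
import re

# Hypothetical blacklist fragments, one per non-blank line
fragments = [r"spamsite\.example", r"evil-ads\.example"]

# Combine into http://[a-z0-9\-.]*(line 1|line 2|...), case-insensitive,
# as the manual describes
pattern = re.compile(r"http://[a-z0-9\-.]*(" + "|".join(fragments) + r")",
                     re.IGNORECASE)

assert pattern.search("http://www.spamsite.example/page") is not None
assert pattern.search("http://clean.example/page") is None
```

In PCRE, only the delimiter character itself needs escaping inside the pattern, so with a "/" delimiter (for example) a literal "!" in a fragment needs no backslash.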
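Point 3 can be demonstrated directly: the [a-z0-9\-.]* prefix only consumes host characters, but a fragment itself can extend past the host into the path. A Python sketch with a hypothetical fragment:

```python
import re

# Hypothetical blacklist fragment that contains a path component
pattern = re.compile(r"http://[a-z0-9\-.]*(innocent\.example/spampage)",
                     re.IGNORECASE)

# The fragment matches into the URL path, not just the host,
# contradicting the "will only match hosts" claim in the header
assert pattern.search("http://innocent.example/spampage?x=1") is not None
assert pattern.search("http://innocent.example/other") is None
```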
The page is not protected -- do feel free to fix it directly rather than waiting for someone else! :)
OK, I fixed the first two bugs.
But I guess I can't fix the third one. It probably has nothing to do with the extension itself but is a mistake by the spam-list maintainers, so I'll mark this whole bug as fixed.