Last modified: 2013-09-27 11:32:00 UTC
If this isn't too big a server drain, I would suggest that when a link, for example
http://18.104.22.168, is included and is to be checked against the blacklist, a DNS
lookup is made and the resulting domain name is checked instead.
Any reason for the wontfix?
Perhaps you didn't understand my question, so I'll try to reformulate:
A function in the blacklist check that, when a link specifies only the IP address
instead of the domain name, will look up the domain name for the given IP address
and use that domain name for matching against the blacklist. This is needed
because, as of now, spammers can circumvent the blacklist by using the IP address
instead of the domain name.
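The proposed check can be sketched roughly as follows. This is only an illustration, not the actual SpamBlacklist extension code (which is PHP); the `BLACKLIST` patterns, function names, and the use of a reverse DNS (PTR) lookup via Python's standard library are all assumptions for the sake of the sketch.

```python
import re
import socket

# Hypothetical blacklist of domain patterns; the real list is a set of
# regexes maintained on-wiki, so these entries are illustrative only.
BLACKLIST = [re.compile(r"(^|\.)spam-example\.com$")]

def is_blacklisted(host: str) -> bool:
    """Check a hostname against the blacklist patterns."""
    return any(p.search(host) for p in BLACKLIST)

def check_link_host(host: str) -> bool:
    """If the link's host is a bare IP, reverse-resolve it to a domain
    name and check that instead, so IP links can't bypass the list."""
    try:
        socket.inet_aton(host)          # raises if not an IPv4 address
    except OSError:
        return is_blacklisted(host)     # ordinary domain name
    try:
        domain, _, _ = socket.gethostbyaddr(host)  # reverse DNS lookup
    except OSError:
        return False                    # no PTR record; nothing to match
    return is_blacklisted(domain)
```

Note that this depends on the IP having a PTR record that points back at the spammer's domain, which is often not the case; that limitation is part of why the reverse-lookup approach may be less reliable than it first appears.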
This would require a decent bit of work. It might be easiest for the sysops to just block the IP
addresses from being added as well.
Well, it's happened; I wonder how many cases there are that we have not noticed.
([http://meta.wikimedia.org/wiki/Talk:Spam_blacklist#3_proxy_sites see this]).
Another instance of it.
I'm going to look into doing a search of all current links on the blacklist, and
see whether any are currently being abused (as far as I can tell). Right now we
have almost no information on how bad this problem could be. If it's only a few
hundred we can deal with it, but if it happens to be much more, it might be quite
a bit of work just to check and maintain the IP addresses.
This would still be desirable, if technically feasible. We monitor additions better than we used to, but it is still sometimes a problem, and this would make our lives a bit easier. On the other hand, I would say it is not a terribly high priority for the users maintaining the blacklist, and therefore shouldn't be a high priority for coders.
Updated summary, since this isn't about blocking, but rather blacklisting. Also, this hopefully makes it clear what the chain is:
blacklisted domain -> hosted on some IP -> auto-blacklist all other domains on that IP
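That chain could be sketched as follows. Again, this is a hypothetical illustration, not existing code: the function names are invented, and the resolver is passed in as a parameter purely so the logic can be shown without live DNS.

```python
import socket

def ips_for(domain):
    """Forward-resolve a domain to its set of IPv4 addresses."""
    try:
        return {info[4][0]
                for info in socket.getaddrinfo(domain, None, socket.AF_INET)}
    except OSError:
        return set()

def shared_host_domains(blacklisted, candidates, resolve=ips_for):
    """Return candidate domains hosted on an IP that also hosts a
    blacklisted domain: the auto-blacklist chain described above."""
    bad_ips = set()
    for d in blacklisted:
        bad_ips |= resolve(d)            # blacklisted domain -> its IPs
    return [c for c in candidates if resolve(c) & bad_ips]
```

One caveat worth keeping in mind: shared hosting puts many unrelated sites on one IP, so auto-blacklisting every domain on a blacklisted domain's IP would likely need a whitelist or manual review step to avoid collateral damage.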