Last modified: 2012-07-30 06:19:18 UTC
A CAPTCHA has to be solved by new or logged-out users whenever they add a new external link. This is to prevent spamming, which is fine. But why require it for links to reliable websites, e.g. Wikimedia sites?
Now links to Wikimedia websites are approved without the CAPTCHA. So presumably there is a list (somewhere) of approved websites, and this can be extended?
If so, there needs to be a process whereby such sites are suggested, approved, and then added to the system.
I think the blacklist <http://meta.wikimedia.org/wiki/Spam_blacklist> is (or should be) involved here. Blacklisted sites can never be entered anyway. Maybe a whitelist of good sites exists somewhere?
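As a rough illustration of the whitelist idea discussed above, the check could amount to matching submitted URLs against a list of trusted-site patterns and skipping the CAPTCHA on a match. This is only a sketch; the pattern list and function names here are hypothetical, and the real behaviour is whatever MediaWiki actually implements:

```python
import re

# Hypothetical list of trusted-site patterns, analogous to an on-wiki
# whitelist page. The real list and matching rules live in MediaWiki.
WHITELIST_PATTERNS = [
    r"^https?://([a-z0-9-]+\.)?wikimedia\.org(/|$)",
    r"^https?://([a-z0-9-]+\.)?wikipedia\.org(/|$)",
]

def needs_captcha(url: str) -> bool:
    """Return True if adding this external link should trigger a CAPTCHA."""
    return not any(re.match(p, url) for p in WHITELIST_PATTERNS)
```

Under this sketch, a link to meta.wikimedia.org would pass without a CAPTCHA, while an unlisted site would still trigger one; extending the approved list is just a matter of adding a pattern.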
Changed summary to clarify.
It looks like this feature has been implemented since 2007, so this bug can be closed.