Last modified: 2012-07-30 06:19:18 UTC
A CAPTCHA has to be solved by new or logged-out users when they add a new external link. This is to prevent spamming, OK great. So why do it for websites that are clearly reliable? Examples:
media websites, e.g.
- bbc.co.uk
- nytimes.com
government websites
- .gov
- .gov.uk
- .mod.uk
misc, e.g.
- jstor.org
etc.
Links to Wikimedia websites are already approved without the CAPTCHA, so presumably there is a list (somewhere) of approved websites, and it can be extended? If so, there needs to be a process whereby such sites are suggested, approved, and then added to the system.
I think the blacklist <http://meta.wikimedia.org/wiki/Spam_blacklist> is (or should be) involved here. Blacklisted sites can never be added anyway. Maybe a whitelist of good sites exists somewhere?
Changed summary to clarify.
It looks like this feature has been implemented since 2007, so this bug can be closed: http://www.mediawiki.org/wiki/Extension:ConfirmEdit#URL_and_IP_whitelists
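For anyone landing here later: per that linked page, the whitelist is part of the wiki's configuration. A minimal sketch of what it might look like in LocalSettings.php, assuming the $wgCaptchaWhitelist regex setting described in the ConfirmEdit documentation (the domains and the exact pattern below are illustrative only, not the actual Wikimedia configuration):

  # Illustrative sketch only -- assumes ConfirmEdit's $wgCaptchaWhitelist setting.
  # External links matching this regex skip the add-URL CAPTCHA.
  $wgCaptchaWhitelist = '#^https?://([a-z0-9-]+\.)*(bbc\.co\.uk|nytimes\.com|jstor\.org)(/.*)?$#i';

Anything not matched still triggers the CAPTCHA; see the linked extension page for the exact form the whitelist accepts and whether it can also be maintained on-wiki rather than in configuration.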