Last modified: 2006-01-14 07:07:45 UTC
I've been running double-redirect cleanup projects for a while now, and it has become apparent that double redirects are
a systemic problem: between the previous dump and the current one, over 7,800 new double redirects appeared on
Wikipedia. This problem really should not be ignored any longer. To get the ball rolling I've started a study of the
problem at http://en.wikipedia.org/wiki/Wikipedia:Computer_help_desk/double_redirect_study - it may offer a
starting point for thinking about the problem.
It could be argued that double redirects should simply be fixed by a bot instead of having humans perform the
cleanup duties - while this is possible, it is definitely not a good solution, especially since double redirects can be so
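For what it's worth, the core of what such a bot would do is simple to sketch. The following is a minimal, hypothetical illustration, not MediaWiki code or any real bot: the wiki's redirects are modeled as a plain dict mapping redirect titles to their targets, and the function names (`resolve_target`, `fix_double_redirects`) are invented for this example. A real bot would read and write pages through the MediaWiki API instead.

```python
def resolve_target(redirects, title, max_hops=10):
    """Follow a redirect chain from `title` to its final target.

    Raises ValueError on a redirect loop or an over-long chain,
    which a real bot would flag for human review.
    """
    seen = set()
    while title in redirects:
        if title in seen or len(seen) >= max_hops:
            raise ValueError("redirect loop or chain too long at %r" % title)
        seen.add(title)
        title = redirects[title]
    return title


def fix_double_redirects(redirects):
    """Return a redirect table where every redirect points directly at
    its final target, so no redirect leads to another redirect."""
    return {src: resolve_target(redirects, dst)
            for src, dst in redirects.items()}


# Example: A -> B -> C becomes A -> C, B -> C.
fixed = fix_double_redirects({"A": "B", "B": "C"})
```

The hard part is not this retargeting step but deciding which target is *correct* - redirect chains often encode editorial intent that a blind rewrite would destroy, which is why a bot alone is a poor solution.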
How does this relate to MediaWiki?
I think the request is specifically asking for general views on what to do about the double-redirect problem.
More seriously, the suggestion at the linked wiki page is for the software to
proactively prevent users from creating double redirects in the first place. I'm
going to leave some comments on that page.
Why are double redirects created in the first place? Why do we have to manually
check for them? When a page is moved, the pages that redirect to it should be changed, too.
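To make the suggestion concrete, here is a rough sketch of what move handling could look like if the software retargeted incoming redirects itself. This is purely illustrative - the `move_page` function and the dict model of redirects are assumptions for the example, not MediaWiki's actual move code.

```python
def move_page(redirects, old_title, new_title):
    """Sketch: move a page and retarget its incoming redirects.

    `redirects` maps redirect titles to their targets. A move normally
    leaves a redirect old_title -> new_title behind; any redirect that
    already pointed at old_title would then become a double redirect,
    so we point those redirects at new_title directly.
    """
    for src, dst in redirects.items():
        if dst == old_title:
            redirects[src] = new_title
    # The move itself leaves a redirect at the old title.
    redirects[old_title] = new_title
    return redirects
```

With this behavior, moving a page could never create a double redirect in the first place, which is the proactive prevention discussed above.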
(In reply to comment #3)
> When a page is moved, the pages that redirect to it should be changed, too.
File a separate feature request for this.
Resolving as INVALID until this request is for something specific.
See also [[Wikipedia talk:Special:DoubleRedirects]].