Last modified: 2011-03-13 18:04:59 UTC

Wikimedia Bugzilla is closed!

Wikimedia migrated from Bugzilla to Phabricator. Bug reports are handled in Wikimedia Phabricator.
This static website is read-only and for historical purposes. It is not possible to log in; except for displaying bug reports and their history, links might be broken. See T7503, the corresponding Phabricator task, for complete and up-to-date bug report information.
Bug 5503 - Double redirects should work
Status: RESOLVED WONTFIX
Product: MediaWiki
Classification: Unclassified
Component: Redirects
Version: unspecified
Hardware: All
OS: All
Importance: Lowest enhancement (1 vote)
Target Milestone: ---
Assigned To: Nobody - You can work on this!
Duplicates: 10499
Depends on:
Blocks:

Reported: 2006-04-08 11:02 UTC by Pcb21
Modified: 2011-03-13 18:04 UTC
CC: 5 users

See Also:
Web browser: ---
Mobile Platform: ---
Assignee Huggle Beta Tester: ---


Attachments

Description Pcb21 2006-04-08 11:02:14 UTC
At the moment double redirects "don't work", i.e. if A redirects to B and B
redirects to C and you request A, you do not get page C.

However, to keep Wikipedia (and other wikis) logical and well-structured, they
should. There are absolutely tons of examples where A and B are synonyms (so one
should redirect to the other) and are a subtopic of C.

Currently we have to write A redir C and B redir C. But now someone comes along
and decides to expand the subtopic, and so undoes the B redirect and starts the
new article, but A still redirs to C. If we allowed double redirects, then having
A redir to B and B redir to C would enable us to future-proof against such expansion.

This may seem a trivial application but I have personally seen dozens of
occasions where bad redirects have arisen because of this, and think it would be
really useful.

On the flip side, there is the extra processing required on an ongoing basis (no
idea if this is "critical path" in terms of performance) and of course the
cycle-detection would have to be improved/implemented :/
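
A minimal sketch of the cycle detection mentioned above (hypothetical, not actual MediaWiki code), in PHP, assuming a getRedirectTarget( $title ) helper that looks a title up in the redirect table and returns its target, or null for a non-redirect:

  // Sketch only: follow a chain of redirects, stopping if a cycle appears.
  function resolveRedirectChain( $title ) {
      $seen = array( $title => true );
      while ( ( $target = getRedirectTarget( $title ) ) !== null ) {
          if ( isset( $seen[$target] ) ) {
              break; // cycle, e.g. A -> B -> A: show the last page reached
          }
          $seen[$target] = true;
          $title = $target;
      }
      return $title; // the page to actually display
  }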

thanks for listening,
Comment 1 Pcb21 2006-04-08 11:04:33 UTC
Felt I should give at least one concrete example:
In en.wikipedia both [[Rest Mass]] and [[Rest mass]] currently point to [[Mass
in special relativity]]. But if [[rest mass]] ever gets spun out, we would want
[[Rest Mass]] to point to [[Rest mass]].
Comment 2 lɛʁi לערי ריינהארט 2006-04-18 00:17:29 UTC
This seems to be a duplicate of
Bug 3747: Avoiding double redirects
Comment 3 Pcb21 2006-04-18 07:20:01 UTC
No. This is not a duplicate of that request. This is a request that double
redirects should work. That is some vague request about making it impossible to
create double redirects. That solution would actually make things worse in my
view, because "abnormal" redirects would be created "behind the backs" of users.
Comment 4 Rob Church 2006-05-14 17:50:22 UTC
This has been proposed before and WONTFIXed.
Comment 5 Aryeh Gregor (not reading bugmail, please e-mail directly) 2007-07-08 19:16:09 UTC
*** Bug 10499 has been marked as a duplicate of this bug. ***
Comment 6 (none) 2007-07-10 14:10:07 UTC
Another reason for double redirects to work: page B has many redirects. B is then moved to C. As a result, it takes a while until all the redirects are fixed. With software called AWB, 10 a minute seems like a maximum. This means that if there are 30 redirects, then for 3 minutes not all redirects work properly.
Comment 7 Pcb21 2007-07-10 14:53:33 UTC
It is a shame we never got a reason /why/ it is closed as WONTFIX, beyond it "has been proposed before"... If there is a technical reason why it is hard to implement well, then that is one thing, but if the rationale is that the current solution of bots "fixing" double redirects is a good solution, then I wonder if the developers would reflect on my example above and realize it is not a good solution, hence the repeated requests for a better one.
Comment 8 Aryeh Gregor (not reading bugmail, please e-mail directly) 2007-07-10 21:53:00 UTC
WONTFIX is not used (or not supposed to be used) just because something is hard to do.  In that case it's left open and unfixed.  I have never seen a satisfactory explanation for why we can't allow two or three consecutive redirects either, and would also like to hear why the current "solution" (bots moving them) is better than any fix at all in the software.  Brion, could you elaborate?
Comment 9 Brion Vibber 2007-07-11 14:53:36 UTC
The longer you allow the chain to get, the harder it will be to figure out how to clean it up when you need to. Hence, the requirement for immediate cleanup.
Comment 10 Pcb21 2007-07-11 15:43:04 UTC
I'm sorry, I must not be expressing my point clearly enough, because it seems to have been misunderstood again. Let me try one more time:

Sometimes there is no "clean up" to do when a double redirect arises. I.e. sometimes having a double redirect is the correct, most logical layout for a particular set of overlapping topics. 
E.g. if A and B are synonymous, for example [[Subset Topic]] and [[Subset topic]], but are currently redirecting to a part of a more encompassing topic, e.g. [[Supratopic]], then currently all the bots and clean up will force [[Subset topic]] and [[Subset Topic]] both to redirect to [[Supratopic]]. But it is much more logical for [[Subset Topic]] to redirect to [[Subset topic]], and indeed this is robust in the face of a new page on [[Subset topic]] being started.

Does that example make sense?

----

In any case I very much doubt that lengthy chains would arise - are there any natural examples of where a quadruple or even treble redirect makes sense? Seems rather theoretical...

----

My take is that this would be a very minor improvement to the software, but perhaps a big deal to implement. In which case I would understand the developers saying it's not worth the bother. However, that is different from the developers misunderstanding the rationale...
Comment 11 Brion Vibber 2007-07-11 16:59:09 UTC
And then you change one of them and you have inconsistency. Hence, cleanup work.
Comment 12 (none) 2007-07-11 20:35:04 UTC
And what about a situation where a page which already has many redirects needs to be moved? Allowing a second redirect level (not a third, just a second) means that the cleanup work doesn't need to be done between the page move and the first time someone wants to see the page.
Comment 13 Rob Church 2007-07-11 20:45:11 UTC
Please note that bug warring (reopening/reclosing en masse) is unproductive and disruptive.

It's clear that there's strong feeling here from some parties that the issue needs further discussion, so I propose that this move to a mailing list or some other discussion forum, perhaps on MediaWiki.org.

It should be obvious to all that digging in and stubbornly messing with this bug is not the answer by any means. I will consider briefly locking the BugZilla accounts of anyone who continues with this behaviour, though I would prefer to think that project contributors were capable of discussing issues and reaching an acceptable understanding in a more mature fashion.
Comment 14 Pcb21 2007-07-12 07:05:33 UTC
Developers tend to be busy and thus terse...
Comment 15 Purodha Blissenbach 2007-10-12 23:01:31 UTC
With the redirect table, imho, we have a very simple solution satisfying both standpoints, which want:
A: forcing immediate "cleanup to chain length 1"
B: possibly allowing a multi-hop logical redirect chain of at most N.

Introduce a variable:

$wgMaxRedirects = 1; 

in DefaultSettings.php, which is counted down to zero while redirects are honored, proceeding to immediate display when either a non-redirect is found or the count is exhausted.

As a side-effect $wgMaxRedirects = 0; can "disallow" redirects completely for a wiki.
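
A sketch of the proposed counting behaviour (hypothetical code, not an implementation), in PHP, reusing the assumed getRedirectTarget( $title ) lookup from the sketch in the description:

  // Sketch only: honour at most $wgMaxRedirects hops, then proceed to
  // immediate display of whatever page has been reached.
  function followRedirects( $title, $wgMaxRedirects ) {
      for ( $count = $wgMaxRedirects; $count > 0; $count-- ) {
          $target = getRedirectTarget( $title );
          if ( $target === null ) {
              break; // not a redirect: display this page
          }
          $title = $target;
      }
      return $title; // count exhausted or a non-redirect found
  }

With $wgMaxRedirects = 0 the loop never runs, matching the side-effect of disallowing redirects completely.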

Since I do not want to reopen this bug, bending its purpose to something too different, I made the proposal as a new one.
See bug 11644.


