Wikimedia Bugzilla is closed!

Wikimedia has migrated from Bugzilla to Phabricator; bug reports are now handled in Wikimedia Phabricator.
This static website is read-only and preserved for historical purposes. Logging in is not possible, and beyond displaying bug reports and their history, links may be broken. See T23302, the corresponding Phabricator task, for complete and up-to-date bug report information.
Bug 21302 - Send squid purges for extensions and, specifically, on curid URLs in DynamicPageList
Status: NEW
Product: MediaWiki
Classification: Unclassified
Component: Parser
Version: 1.16.x
Hardware: All
OS: All
Priority: Low
Severity: normal
Target Milestone: ---
Assigned To: Nobody - You can work on this!
Depends on:
Blocks: 20818
Reported: 2009-10-26 15:21 UTC by Platonides
Modified: 2014-09-24 00:05 UTC
CC: 8 users

See Also:
Web browser: ---
Mobile Platform: ---


Attachments
SquidUpdate, Title, DynamicPageList (3.59 KB, patch)
2009-10-26 15:21 UTC, Platonides

Description Platonides 2009-10-26 15:21:58 UTC
Created attachment 6722
SquidUpdate, Title, DynamicPageList

This code aims to fix the Squid caching issues from bug 20818.

First, SquidUpdate::newFromTitles() is changed to be based on getSquidURLs() instead of generating the URL by itself. This probably also fixes caching problems on wikis with language variants.
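
For context, a minimal sketch of what that change could look like, based on the 1.16-era SquidUpdate API (the actual patch is in attachment 6722; the details here are illustrative):

    // SquidUpdate.php (sketch): build the purge list from Title::getSquidURLs(),
    // which already includes the per-variant URLs, instead of composing a
    // single URL by hand with getInternalURL().
    static function newFromTitles( $titles, $urlArr = array() ) {
        global $wgMaxSquidPurgeTitles;
        $i = 0;
        foreach ( $titles as $title ) {
            if ( $i++ > $wgMaxSquidPurgeTitles ) {
                break;
            }
            $urlArr = array_merge( $urlArr, $title->getSquidURLs() );
        }
        return new SquidUpdate( $urlArr );
    }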

Second, a TitleSquidURLs hook is added to allow extensions to add extra article-related URLs which need purging.
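
Illustratively, the hook point would sit at the end of Title::getSquidURLs(), in the 1.16-era hook style (a sketch; the actual placement is in the attached patch):

    // Title.php, at the end of getSquidURLs() (sketch): after collecting the
    // canonical and variant URLs, let extensions append extra URLs to purge.
    wfRunHooks( 'TitleSquidURLs', array( $this, &$urls ) );
    return $urls;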

Third, DynamicPageList is modified to use the new hook to purge the ?curid URLs it generates.
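
A handler along these lines would do it (the function name efDPLSquidURLs is hypothetical; the signature matches the TitleSquidURLs hook described above):

    // Extension setup file:
    $wgHooks['TitleSquidURLs'][] = 'efDPLSquidURLs';

    // Append the ?curid form of the page URL to the purge list, so pages
    // reached through the page IDs that DPL hands out do not serve stale copies.
    function efDPLSquidURLs( $title, &$urls ) {
        $urls[] = $title->getInternalURL( 'curid=' . $title->getArticleID() );
        return true;
    }
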
Comment 1 Siebrand Mazeland 2010-01-09 18:49:33 UTC
Can you please revise the patch, as bug 20818 has been resolved? (Or do I have the relations the wrong way around?)
Comment 2 Platonides 2010-01-09 19:05:31 UTC
Bug 20818 was fixed by not using curids for Google.

Strictly speaking, curid URLs should always be purged.
Here the patch only does it when DPL is active, to decrease load.
If DPL is handing out curid links, those shouldn't lead to old pages.

Perhaps there should be a DPL option to choose whether to allow curid URLs or not?

Comment 3 Bawolff (Brian Wolff) 2010-01-10 05:44:36 UTC
Note: There is no longer any need for DPLs to output curids whatsoever. The only reason they were there in the first place was to have a number in the URLs. Since that can just as easily be served as a dummy parameter (which it is currently), DPL really does not need this anymore.
Comment 4 Platonides 2010-01-10 10:39:54 UTC
Right, but we may still want (or not) to fix the caching problem with curid URLs.
And there is also the issue of variant wikis.

Comment 5 Sumana Harihareswara 2012-01-10 22:33:54 UTC
Platonides, marking your patch as obsolete since it doesn't apply cleanly to trunk anymore. I'm cc'ing Gabriel Wicke in case he wants to consider this issue and your approach in the parser rewrite.
Comment 6 Sumana Harihareswara 2012-01-10 22:34:19 UTC
Comment on attachment 6722
SquidUpdate, Title, DynamicPageList

This patch no longer applies to trunk, per Rusty Burchfield's automated testing:
https://docs.google.com/spreadsheet/ccc?key=0Ah_71HHl7qa7dGtvSms3TGpHQU9NU2Y1VmNzUEUteWc
Comment 7 Bawolff (Brian Wolff) 2012-01-10 22:42:24 UTC
(In reply to comment #5)
> Platonides, marking your patch as obsolete since it doesn't apply cleanly to
> trunk anymore.  I'm cc'ing Gabriel Wicke in case he wants to consider this
> issue and your approach in the parser rewrite.

This isn't really a parser issue.
