
Bug 5334 - Robots.txt - problem on :fr with Google
Product: Wikimedia
Classification: Unclassified
Component: General/Unknown
Platform: All
OS: All
Priority: Normal
Severity: normal
Votes: 1
Target Milestone: ---
Assigned To: Nobody - You can work on this!
Depends on: ---
Reported: 2006-03-23 15:07 UTC by dake
Modified: 2006-03-30 03:24 UTC
CC: 0 users
See Also: ---
Web browser: ---
Mobile Platform: ---
Assignee Huggle Beta Tester: ---


Description dake 2006-03-23 15:07:49 UTC
Some people on :fr have complained that their names appear in Google via our
requests for deletion. The robots.txt seems to be correctly set (see Bug 4776),
but it doesn't appear to work.

# fr:
Disallow: /wiki/Wikip%C3%A9dia:Pages_%C3%A0_supprimer/

For example, search for "metametris" on Google: the third or fourth result
starts with the disallowed URL. Could there be a problem related to encoding?
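As a quick sanity check of the rule itself (a minimal sketch, not part of the report: the "User-agent: *" line and the example URLs are assumed here, since the report quotes only the Disallow line), Python's urllib.robotparser applies the same percent-encoding normalization to both the rule and the tested URL, so the encoded path does match:

```python
# Check whether URLs match the fr.wikipedia Disallow rule from Bug 4776.
from urllib.robotparser import RobotFileParser

rules = """
User-agent: *
Disallow: /wiki/Wikip%C3%A9dia:Pages_%C3%A0_supprimer/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# A percent-encoded URL under the disallowed prefix (hypothetical page name).
blocked = "https://fr.wikipedia.org/wiki/Wikip%C3%A9dia:Pages_%C3%A0_supprimer/Metametris"
# An unrelated page outside the prefix.
allowed = "https://fr.wikipedia.org/wiki/Accueil"

print(rp.can_fetch("*", blocked))  # expect False (disallowed)
print(rp.can_fetch("*", allowed))  # expect True
```

If both results come back as expected, the rule syntax and the percent-encoding are fine, which points the problem at the crawler's behavior rather than at robots.txt.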
Comment 1 Brion Vibber 2006-03-30 03:24:10 UTC
No summary appears for the page in the google results; this is consistent
with Google's treatment of robots.txt-disallowed pages.

Unfortunately you'll have to complain to Google if you don't like that; that's
just how they do things. The URL still appears in their index, it's just not
spidered and the content isn't available for search or cache.
