Last modified: 2011-05-07 19:19:32 UTC

Wikimedia Bugzilla is closed!

Wikimedia migrated from Bugzilla to Phabricator. Bug reports are handled in Wikimedia Phabricator.
This static website is read-only and kept for historical purposes. It is not possible to log in, and apart from displaying bug reports and their history, links might be broken. See T17676, the corresponding Phabricator task, for complete and up-to-date bug report information.
Bug 15676 - Duplicate image reuploading should be forbidden
Status: RESOLVED WONTFIX
Product: MediaWiki
Classification: Unclassified
Component: Uploading (Other open bugs)
Version: 1.15.x
Hardware/OS: All / All
Importance: Lowest enhancement with 3 votes
Target Milestone: ---
Assigned To: Nobody - You can work on this!
Duplicates: 28374
Depends on: 14925
Blocks:
Reported: 2008-09-21 16:29 UTC by Bryan Tong Minh
Modified: 2011-05-07 19:19 UTC
CC: 5 users

See Also:
Web browser: ---
Mobile Platform: ---
Assignee Huggle Beta Tester: ---



Description Bryan Tong Minh 2008-09-21 16:29:56 UTC
Uploading the same image over itself is just a waste of space, log entries and so on, and should therefore throw an error.

Should probably introduce a new return value for UploadBase::verifyUpload.
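A minimal sketch of what such a check could look like (in Python rather than MediaWiki's PHP; the sentinel value and function names are hypothetical, not actual UploadBase API): compare the hash of the incoming file against the current revision, and fall back to a byte-for-byte comparison to rule out the hash-collision concern raised in comment 1.

```python
import hashlib

# Hypothetical sentinel mirroring a new UploadBase::verifyUpload return value.
FILE_DUPLICATE_VERSION = "file-duplicate-version"

def sha1_of(data: bytes) -> str:
    """Return the hex SHA-1 digest of a file's contents."""
    return hashlib.sha1(data).hexdigest()

def verify_upload(new_file: bytes, current_revision):
    """Reject an upload that is byte-identical to the current revision.

    The hash comparison is a cheap first filter; the byte-for-byte check
    guards against the (unlikely) case of two different files sharing a
    hash, so a collision can never block a legitimate upload.
    """
    if current_revision is not None and sha1_of(new_file) == sha1_of(current_revision):
        if new_file == current_revision:  # rule out a hash collision
            return FILE_DUPLICATE_VERSION
    return "ok"
```

Note the design choice: because identical hashes trigger a full content comparison, a corrupted or colliding stored hash can cause at most a wasted comparison, never a false rejection.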
Comment 1 Mike.lifeguard 2008-09-22 22:05:52 UTC
See also bug 14171.

Are we sure that the hashes don't have errors? IIRC there was an issue where non-identical images had identical hashes; this was probably discovered at the time of the "massive image loss" so perhaps Tim knows more?
Comment 2 Adam Cuerden 2009-02-24 06:54:55 UTC
Don't forget that if thumbnail generation breaks, reuploading may be the only way to reset it. 
Comment 3 Mike.lifeguard 2009-02-24 21:26:36 UTC
(In reply to comment #2)
> Don't forget that if thumbnail generation breaks, reuploading may be the only
> way to reset it. 
> 

That shouldn't be (and AFAIK isn't) the case.
Comment 4 Bryan Tong Minh 2009-11-04 12:58:44 UTC
Are there any other reasons not to implement this? Else I will go ahead.
Comment 5 Adam Cuerden 2009-11-04 20:18:01 UTC
I see no good reason to go forwards with this, and think it should be scrapped. Possible problems include: 

* If an image gets corrupted, but a stored hash is the one being compared, the system breaks.
* Uploading the image again is, for people who aren't utter experts in Wiki work - and that includes me, an admin on commons - the only obvious way to regenerate thumbnails.
* There is no evidence of actual good that this will do.
* There is a small chance of two images having the same hash. As the number of uploads increases, the chance of this affecting someone does as well.
* The case for implementation is not established.

Please do not go forwards with this. 


There is no good reason to restrict this behaviour in the absence of evidence that it is causing any problem.
Comment 6 Rocket000 2009-11-05 21:40:40 UTC
It shouldn't be "forbidden" (at the very least admins should be able to override it if needed), but a little warning would be nice. I hate when upload logs are filled with identical versions because the uploader simply picked the wrong file name (e.g. the original version again instead of the new one). Or they, for some unexplained reason, revert to the same version multiple times in a row. It happens more than you'd think. Sure, user carelessness/inexperience is at fault here, but software can help.

Actually, I don't know why something like that isn't already in place for duplicates in general. One of my favorite features of Bryan's Flickr uploader is that it tells you if the file is already uploaded (it doesn't always work, but 95% of the time in my experience). There have been a couple of times I uploaded an image directly (after doing a reasonable search on Commons) only to find out afterward that we already have it, thanks to the automatically-generated duplicate link provided on the description page. It's like, oh, thanks for letting me know ''now''.

What we want to do is help prevent accidental duplicate uploads, not intentional ones. So even if the hashes get corrupted and the system fails, what, .1% of the time?, it's ok since a human makes the final call.
Comment 7 Bryan Tong Minh 2010-01-13 21:21:07 UTC
Seems not worth the hassle at all.
Comment 8 Krinkle 2010-07-20 23:25:15 UTC
I also see, every now and then, new users re-uploading with the intention of changing the description page.
They remember that when uploading there was this form, so they re-upload: fill in the entire Information template again, and then upload the same file again under the same name.

When a user re-uploads (either by 'new version' or by 'revert') and the file to be uploaded is the same as the last revision, it should either be blocked for non-admins or produce a warning that can be ignored (like warn abusefilters and forceeditsummary: the second time it goes through).

-
My 2c
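The warn-then-allow flow proposed in this comment could be sketched as follows (an illustrative Python sketch, not MediaWiki code; the function name, return strings, and the `warned_already` flag are all hypothetical):

```python
def handle_reupload(new_file: bytes, last_revision: bytes, *,
                    is_admin: bool, warned_already: bool) -> str:
    """Warn on an identical re-upload; let a repeated attempt through.

    Mirrors ignorable warnings like forceeditsummary: the first
    identical upload is stopped with a warning, while a second attempt
    (the user insisting) or an admin upload goes through.
    """
    if new_file != last_revision:
        return "accept"              # genuinely new content
    if is_admin or warned_already:
        return "accept"              # admin override, or user insisted
    return "warn-duplicate"          # first attempt: show ignorable warning
```

The `warned_already` flag would in practice come from the upload form re-submitting with the warning acknowledged, the same pattern the upload warnings for existing file names already use.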
Comment 9 Bryan Tong Minh 2011-04-22 18:35:24 UTC
*** Bug 28374 has been marked as a duplicate of this bug. ***


