#1
MD5 Checker
Hello!
It would be great if JDownloader could compare the RapidShare MD5 with the MD5 of the file on disk. Sometimes when one part has a CRC error, I have to download 3-7 parts until I find the right one...
#2
I'm not sure I understand your suggestion/wish. jD does check MD5 when this checksum is available on the host.
It would be nice if the user could add this MD5 checksum to the link (or refer to a .md5 file), just like a download password can be added. For the time being you need to use a separate program to check these files. See also:
- Manual md5 check (rapidshare)
- Feature #808: way to enter md5 or other hash manually
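As an aside, the manual workflow described above (hash the downloaded part, compare against a published checksum or an accompanying .md5 file) is small enough to sketch. This is a generic illustration, not jD code; the function names are my own:

```python
import hashlib

def md5_of_file(path, chunk_size=1 << 20):
    """Compute the MD5 hex digest of a file, reading in chunks so large
    downloads don't have to fit in memory."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def check_against_md5_file(file_path, md5_file_path):
    """Compare a downloaded part against the first entry of a .md5 file.
    A typical .md5 line looks like: '<32 hex chars>  filename'."""
    with open(md5_file_path) as f:
        expected = f.readline().split()[0].lower()
    return md5_of_file(file_path) == expected
```

This is essentially what the separate checker programs mentioned above do; the feature request is just to have it happen inside the downloader.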
#3
A few useful links.
Please consider adding these to the tracker, so they don't get lost.
1) The HTTP protocol already has support for MD5 headers - at least the protocol, if not the servers. See section 14.15 of RFC 2616 (Content-MD5).
2) There is also a proposal to standardize other checksums.
3) There are probably non-standardized approaches too, like Git, and the Gnutella and G2 p2p networks (which are based on HTTP with extensions). Such extra headers could be used automatically.
4) There is also the TigerTree hash, and probably other less frequently used hash trees. Support for those would be especially nice, since they pinpoint an error to a specific chunk of the file. Other checksums require you to drop the whole file and re-download it from scratch; tree checksums allow the file to be patched in place instead.
PS: Okay, hope the links were not wiped and you can see them :-)
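For reference, the Content-MD5 header from point 1 carries the base64-encoded binary MD5 digest of the message body (RFC 2616 section 14.15), so verifying it client-side is only a few lines. A generic sketch, not jD code:

```python
import base64
import hashlib

def verify_content_md5(body: bytes, header_value: str) -> bool:
    """Check a response body against an RFC 2616 Content-MD5 header,
    which carries the base64-encoded binary MD5 digest of the body.
    Note: base64 of the raw digest, not the usual 32-char hex string."""
    expected = base64.b64decode(header_value)
    return hashlib.md5(body).digest() == expected
```

The base64-of-binary encoding is worth noting because it trips people up: the header value looks nothing like the familiar hex digest even though it encodes the same 128 bits.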
#4
Most implementations of MD5/SHA checksums are done/gathered from hoster APIs. I'm not aware of any doing this as part of an HTTP header. Very few hosters provide checksums at all, and we can only add support for a hoster's checksums once we know the format provided.
__________________
raztoki @ jDownloader reporter/developer http://svn.jdownloader.org/users/170 Don't fight the system, use it to your advantage. :]
#5
Chicken-and-egg problem. Someone has to break the vicious circle :-)
Then all the hosters could be asked to add the standard header in addition to their non-standard APIs. Provided the hosters would tolerate auto-downloading tools at all.
#6
You're right. That's why RS, for instance, is in another league. They've always had it, and they really try hard to distinguish themselves from the crowd. They also support https. I don't see them disappearing as fast as other hosts, since they're technologically more advanced.
#7
I think many servers already have it, disguised as the ETag header. It only remains to be made explicit :-)
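Worth noting: ETags are opaque by specification, so treating one as an MD5 is only a heuristic - some servers do happen to emit the bare MD5 hex digest, many emit something else entirely (inode/mtime hashes, for example). A rough sniff test, with a helper name of my own invention, might look like:

```python
import re

def etag_looks_like_md5(etag: str):
    """Return the hex digest if an ETag is plausibly a bare MD5, else None.
    ETags are opaque by spec, so this is only a heuristic: strip the
    surrounding quotes and any weak-validator 'W/' prefix, then accept
    exactly 32 hexadecimal characters."""
    value = etag.strip()
    if value.startswith("W/"):
        value = value[2:]
    value = value.strip('"')
    if re.fullmatch(r"[0-9a-fA-F]{32}", value):
        return value.lower()
    return None
```

Even when the shape matches, there is no guarantee the server actually used MD5 of the body, which is exactly why making it explicit would help.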
#8
Rapidshare is done via their API; it's not in any HTTP headers and doesn't comply with the HTTP RFC mentioned above. I wouldn't say they are advanced by any means, other than that they have a working/robust API, unlike fileserve *cough cough*.
@arioch1: maybe they do, but if it's not in clear text or easily distinguishable, it's harder to support; the whole point of standardising is to make it the same/easier. Currently the checksum functions are done within the plugin, but you could add HTTP header support at the core; you would just have to set its hash value for file<>db.
The other problem with header-based checks is that they would require you to start downloading (do the captchas etc.), because the final GET for the file provides the required checksum, not the steps prior. This would totally suck if you're a free user: you would use up a slot on the hoster if there's a queue/wait system (though wait times can be avoided by reconnecting).
Like most RFCs, it takes many years for them to be adopted widespread. It's more a problem of httpds and business owners/operators not setting up/adopting these functions, for whatever reason, and not necessarily the download/web clients in this case. I'm still not aware of any hosters using headers to provide checksums.
This method of comparing files is best as a preventive measure for features like adding a pile of links into JD when you're not sure if the data is the same; when mirror support comes into play, it would prevent data corruption. But until hosters freely give up this information I don't see how this will work well as a feature like the OP was asking.
Most things have (positive and) negative sides; one of these is: most host providers do not provide this info, as it helps copyright owners prove that x data is the same as y or previous data. This allows them to issue takedown notices with more ease and speed, if you're thinking about the typical 'file hosters' and their use (my own generalisation).
Last edited by raztoki; 30.10.2011 at 16:37. Reason: spellin + refined ending
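The core-level header support sketched in the post above - map checksum-bearing response headers to a hash algorithm, then verify once the download completes - could look roughly like this. Header names follow RFC 2616 (Content-MD5) and RFC 3230 (Digest); the function names are hypothetical, not anything in jD:

```python
import base64
import hashlib

def checksum_from_headers(headers: dict):
    """Return (algorithm, hex_digest) found in response headers, or None.
    Content-MD5 (RFC 2616) is base64 of the binary MD5 digest; the Digest
    header (RFC 3230) is a comma-separated list like 'SHA-256=<base64>'.
    Few (if any) one-click hosters actually send either."""
    if "Content-MD5" in headers:
        return "md5", base64.b64decode(headers["Content-MD5"]).hex()
    for token in headers.get("Digest", "").split(","):
        name, _, value = token.strip().partition("=")
        if name.lower() == "sha-256" and value:
            return "sha256", base64.b64decode(value).hex()
    return None

def verify_download(data: bytes, headers: dict) -> bool:
    """Verify completed download bytes against any header checksum.
    Returns True when no checksum was offered (nothing to check)."""
    found = checksum_from_headers(headers)
    if found is None:
        return True
    algorithm, expected_hex = found
    return hashlib.new(algorithm, data).hexdigest() == expected_hex
```

As raztoki notes, the catch is that these headers arrive with the final GET, so this only helps for post-download integrity checking, not for deciding whether to download in the first place.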
#9
A HEAD request could suffice to get the hash, though some servers do not support HEAD at all.
> wait times can be avoided by reconnecting
Only on dynamic external IPs, which is not always the case :-)
> It's more a problem of httpds and business owner/operators not setting up/adopting these functions
Functions that no client yet benefits from. Chicken and egg. They sometimes make ad hoc solutions, like publishing checksums on the page or in a corresponding .md5 file, so they did wish to provide hashes. But they really do not invest time in IIS/HTTPD... Chicken and egg.
> Doesn't see this method for the use of comparing been very good
This topic is not about comparing mirrors. It's about data integrity. Though TigerTree would definitely be better than "hit or miss" whole-file hashes.
> it would require you to start downloading
So that is not a problem at all, since the checksum would be needed and used after the download completes :-)
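The HEAD-based probe suggested above - ask for the headers only, look for a checksum, tolerate servers that reject HEAD - can be sketched as follows. Purely illustrative; as noted in this thread, real hosters rarely send Content-MD5:

```python
from urllib.request import Request, urlopen

def head_checksum(url: str, timeout: float = 10.0):
    """Issue a HEAD request and return the raw Content-MD5 header value,
    or None if the server omits it or rejects HEAD entirely (some servers
    answer HEAD with an error, which we treat the same as 'no checksum')."""
    request = Request(url, method="HEAD")
    try:
        with urlopen(request, timeout=timeout) as response:
            return response.headers.get("Content-MD5")
    except OSError:
        return None
```

The upside over a GET is that no download slot or captcha is consumed; the downside is exactly the one mentioned: HEAD on the landing page tells you nothing about the file behind the captcha-protected final link.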
#10
Not sure if you noticed, but I refined my comment, as it wasn't clear what I meant from your quoted text.
JD already supports this feature: checksums are supported on hosters when possible. I was talking more in terms of the OP's request, to use it as a comparison mechanism pre-download. Currently JD supports md5 and sha1/2; you can't compare these hash sums against one another, which is another problem with the OP's request. Hopefully the RFCs will clear that up and settle on one checksum as the standard. I still say it's easier to adopt functions once the support at the server level is standardised; its adoption within a (non-)industry or sector is slightly less important.
#11
> checksums are supported on hosters when possible
But they are not supported if they are specified outside those hosters, for example just published on a page with download links. The user can also be a source of checksums.
> in terms of the OP request, to use it as a comparing mechanism pre download.
Yes, that's what I missed. But even then, when the only choice left is to re-download anyway, and a slot would be used anyway, it still makes sense to query the hash from the HTTP GET. And the user can be asked if he really-really wants it (and understands what it is all about :-) ).
> and only use one checksum as it's standard.
Why? An application can check for different checksums. Shareaza supports 3 checksums, and for each there are three states: unknown/doubtful/authoritative. The medium state usually occurs when a file is searched for by one checksum, and some of the peers report that the file with this checksum also possesses those other checksums. Until it is downloaded, this cannot be proved, but it is better accounted for than completely ignored. The checksum type should be clearly stated, like in magnet URIs, and then the application can respect supported hash functions and ignore unsupported ones.
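The magnet-URI style of stating checksums mentioned above - several xt (exact topic) parameters, each naming its hash function - is easy to parse, so an application can pick the functions it supports and ignore the rest. A rough sketch (the helper name is my own):

```python
from urllib.parse import urlparse, parse_qs

def hashes_from_magnet(magnet: str) -> dict:
    """Collect the exact-topic (xt) hashes from a magnet URI, keyed by
    hash name (e.g. 'sha1', 'tree:tiger'), so a client can use the
    functions it supports and ignore unsupported ones."""
    query = parse_qs(urlparse(magnet).query)
    hashes = {}
    for xt in query.get("xt", []):
        parts = xt.split(":")  # e.g. 'urn:tree:tiger:<base32 digest>'
        if len(parts) >= 3 and parts[0] == "urn":
            hashes[":".join(parts[1:-1])] = parts[-1]
    return hashes
```

This is exactly the "clearly stated" property argued for above: because each hash names its own function, multiple checksums can coexist without ambiguity.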
#12
Not in the manner you're wanting, but yes, it's still implemented, and we make the effort of using checksums for supported hosters when they provide documentation of their APIs or publish checksums on the website. As I've said, I'm unaware of any hosters providing any form of checksum in HTTP headers. The current implementation of checksumming is automatic, for the sole purpose of a CRC of the local copy vs the remote copy. This process has nothing to do with the user under its current design/constraints. Our next major version is still undergoing heavy redesign; maybe the properties window might display the checksum value. Time will tell.
Keep in mind the OP's idea wasn't declined by me (it was before my time on the forums). I meant that the problem with checksums is, say, hoster 1 has sha2 and hoster 2 has md5 == problem, you cannot compare. The OP's request wouldn't tolerate that, but if all probable hashes are given, then no problem.
Last edited by raztoki; 30.10.2011 at 20:22. Reason: reorder, cut/paste
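The incompatibility described above - hoster 1 publishes sha2, hoster 2 publishes md5, so the two can never be compared directly - suggests storing hashes per algorithm and comparing only where the algorithms overlap. A minimal sketch (names are mine, not jD's):

```python
def compatible_match(hashes_a: dict, hashes_b: dict):
    """Compare two files' hash dicts of the form {algorithm: hexdigest}.
    Returns True/False when they share at least one algorithm, or None
    when there is no common algorithm (the sha2-vs-md5 case): different
    hash functions over the same data produce unrelated digests, so
    'no common algorithm' means 'cannot compare', not 'mismatch'."""
    common = set(hashes_a) & set(hashes_b)
    if not common:
        return None
    return all(hashes_a[a].lower() == hashes_b[a].lower() for a in common)
```

The three-way result mirrors the point made earlier in the thread: an application should distinguish "proven same", "proven different", and "unknown" rather than collapsing the last two.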