#1
Package at 100%, but parts are missing
Hey there,
I found no real bug tracker for JD2, so I will post the bug here: since files sometimes fail to be found on the first try, it has happened to me more than once that a package was marked as 100% complete but wasn't. One or more parts were completely ignored because JD2 had automatically disabled them or something. Since a download that made it into the download list is very likely to be available (or at least was when the link collector passed it on), I would recommend a very short retry interval of about 2 minutes or so (by default), and disabling the link only after the second (by default) "file missing" report from the hoster. And of course the package should not reach 100% once a part has been marked as "down". Thank you for your attention. Gr33tZ Rn
#2
No answer at all?
#3
Hi,
I have never experienced such an error. Maybe the parts were simply never added to the linkgrabber. GreeZ pspzockerscene
__________________
JD Supporter, Plugin Dev. & Community Manager
Erste Schritte & Tutorials || JDownloader 2 Setup Download
#4
They were found by the link grabber and added to the list.
And if I reset them, they download without any problem. They are not really down.
#5
Well I need to know how to reproduce this...
GreeZ psp
__________________
JD Supporter, Plugin Dev. & Community Manager
Erste Schritte & Tutorials || JDownloader 2 Setup Download
#6
There is no real way, I think.
Maybe if you add a file that is not from a one-click hoster, e.g. from an FTP server you control, along with other files in a package, and after adding them to the download list you delete one or two of them from the server. I think that could work. -- Edit: Tried it. The above reproduces the problem. Last edited by regsnerven; 14.04.2013 at 19:42.
#7
Still no news to this?
#8
regsnerven
If the archive is divided into parts, note that the last part usually has a different (smaller) size. If the last part is the same size as the others, further parts may be missing. You can also check the checksums (if the file is very large, some users post checksums to compare against). It is also possible that the uploader sent damaged volumes to the server; despite attempts to re-download, the file can still be bad.
#9
I think you did not get the point of this bug report.
If you have a package containing part links, or any kind of downloadable files, and one of those files is down, the package shows a progress of 100% once all files except the down one are downloaded. That is the problem here, because that is basically a wrong progress state.
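To illustrate the disagreement in this thread: JDownloader's actual implementation is Java and not shown here, so the following is only a minimal Python sketch with hypothetical field names. It contrasts the reported behaviour (disabled/offline parts drop out of the progress calculation) with what the reporter expects (every part counts toward 100%).

```python
from dataclasses import dataclass

@dataclass
class Link:
    size: int      # total bytes of this part
    loaded: int    # bytes downloaded so far
    online: bool   # False if the hoster reported "file not found"
    enabled: bool  # False once JD has disabled the link

def progress_all_enabled(links):
    """Progress over enabled, online links only: a disabled/offline
    part simply vanishes from the calculation."""
    active = [l for l in links if l.enabled and l.online]
    total = sum(l.size for l in active)
    return 100.0 * sum(l.loaded for l in active) / total if total else 100.0

def progress_all_links(links):
    """Progress over every link in the package: a missing part
    keeps the package below 100%."""
    total = sum(l.size for l in links)
    return 100.0 * sum(l.loaded for l in links) / total if total else 100.0

package = [
    Link(size=100, loaded=100, online=True,  enabled=True),
    Link(size=100, loaded=100, online=True,  enabled=True),
    Link(size=100, loaded=0,   online=False, enabled=False),  # the "down" part
]
print(progress_all_enabled(package))  # 100.0, the package looks finished
print(progress_all_links(package))    # ~66.7, the missing part still counts
```

The whole thread boils down to which of these two definitions a package's progress bar should use.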
#10
Still nothing on this one?
#11
Usually you have to extract all the parts; the most important ones are the first part and the last (if parts are missing, for example because you downloaded an incomplete set after one part was removed from the server, extraction will fail).
#12
It's a bit hard to comment, or even reproduce, if you don't give us example links that result in the pattern you experienced. It would also be useful if you took screenshots comparing the linkgrabber and download tabs to demonstrate this pattern.
__________________
raztoki @ jDownloader reporter/developer http://svn.jdownloader.org/users/170 Don't fight the system, use it to your advantage. :]
#13
It does not matter whether the files to download are archives or other files.
Here is a pictured way to reproduce the problem (screenshots were attached to the original post):
This is the whole problem. As mentioned, it would be nice if JD tried to redownload the files after some time, because netload and premiumize.me in combination often produce false "down" messages. I hope this is enough now, because I do not know what more I should explain^^ If there are any further questions, please ask. Gr33tZ Rn Last edited by raztoki; 07.07.2013 at 16:19. Reason: placed second image in IMG tag vs URL
#14
Did you mean delete the file from the hoster, or from JD?
I had a user the other day who could not remove items from the download tab, due to what I assume was some form of corruption. I think I renamed cfg/ to cfg.old/ and started JD again, and everything was fine once more. If you deleted from the hoster, JD behaves in the manner we intend. JD disables files for various reasons: file not found, expired retry count, fatal events, etc. This allows the queue to continue processing without getting stuck in loops. It also permanently frees the slot from that download; if it retried every minute or so, it would hog a slot that another download could use, reducing overall download speed and increasing the time required to finish the queue. raztoki
__________________
raztoki @ jDownloader reporter/developer http://svn.jdownloader.org/users/170 Don't fight the system, use it to your advantage. :]
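The disable policy raztoki describes (count the hoster's "file not found" reports, then disable the link so it stops occupying a download slot) can be sketched roughly as follows. This is not JDownloader's code; the dictionary keys, the response strings, and the default of two reports are all hypothetical, chosen to match the thread's discussion.

```python
def handle_hoster_response(link, response, max_not_found=2):
    """Hypothetical error handling in the spirit of the description above:
    count 'file not found' reports and disable the link once the budget
    is spent, so the queue keeps moving and the slot is freed."""
    if response == "FILE_NOT_FOUND":
        link["not_found_count"] = link.get("not_found_count", 0) + 1
        if link["not_found_count"] >= max_not_found:
            link["enabled"] = False   # slot is freed, no more attempts
            link["status"] = "offline"
        else:
            link["status"] = "retry"  # will be attempted again later
    elif response == "OK":
        link["status"] = "finished"
    return link

link = {"enabled": True}
handle_hoster_response(link, "FILE_NOT_FOUND")  # first report: marked for retry
handle_hoster_response(link, "FILE_NOT_FOUND")  # second report: disabled
print(link["status"])  # prints "offline"
```

The trade-off the sketch makes visible: a hard disable frees resources for the rest of the queue, but a false "file not found" (as reported with netload via premiumize.me in this thread) permanently parks a link that was actually downloadable.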
#15
I'm quite sure you mean from the hoster. If you want to add more to the conversation, provide working example links (both live and deleted) so it can be reproduced on our end. We are quite positive JD behaves in the manner we intend: if a link is detected as offline, then it's not downloadable, so why wait for it indefinitely? We show 100% progress because all files in the package are done (in the sense of downloadable items). Offline files are not downloadable.
__________________
raztoki @ jDownloader reporter/developer http://svn.jdownloader.org/users/170 Don't fight the system, use it to your advantage. :]
#16
As I said at step number three in my "guide", I delete the file from the FTP server. So yes, I delete the file from the hoster.
In my opinion, a PACKAGE's download progress is 100% complete only if ALL the files in it are downloaded successfully and at 100%. Otherwise the package is of course completely processed, but not completely downloaded. I cannot provide any links, because the FTP scenario is just an example to reproduce this problem. The problem mainly occurs when I download files from netload via premiumize.me. Those files are not down, because if I reset them, they download without any problems. I am pretty sure I have hit this problem without premiumize.me between me and netload, and also with other OCHs like uploaded.net. I do not want JD to make infinite retries on a file, but, as you already do for files that are temporarily unavailable, you could disable them temporarily and retry after, for example, 30 seconds. There would be one retry, so what is the harm? The package is broken in most cases anyway, because one part is missing if it is really down. Do you get my point?
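The reporter's proposal (park the link temporarily and allow a single delayed retry instead of disabling it on the first "file not found") could be sketched like this. Again a hypothetical Python illustration, not JDownloader code; the 30-second delay and single retry are the defaults suggested in this thread.

```python
import time

RETRY_DELAY = 30  # seconds, the reporter's suggested default
MAX_RETRIES = 1   # one extra attempt, as proposed

def schedule_retry(link, now=None):
    """Hypothetical single-retry policy: on 'file not found', park the
    link with a timestamp instead of disabling it outright; only after
    the retry budget is exhausted is it marked offline."""
    now = time.time() if now is None else now
    if link.get("retries", 0) < MAX_RETRIES:
        link["retries"] = link.get("retries", 0) + 1
        link["retry_at"] = now + RETRY_DELAY  # temporarily parked
        link["status"] = "waiting"
    else:
        link["enabled"] = False               # now it really counts as down
        link["status"] = "offline"
    return link

def due_for_retry(link, now):
    """The download loop would poll this instead of holding a slot."""
    return link.get("status") == "waiting" and now >= link["retry_at"]

link = {"enabled": True}
schedule_retry(link, now=0)     # first failure: wait 30s, then retry once
print(due_for_retry(link, 31))  # prints True
schedule_retry(link, now=31)    # second failure: give up for real
print(link["status"])           # prints "offline"
```

Because the parked link only carries a timestamp, it does not occupy a download slot while waiting, which addresses raztoki's slot-hogging objection from earlier in the thread.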
#17
This problem also occurs when you use a multi-one-click hoster, Alldebrid in my case. Especially with Netload, parts are often marked as "file not found" and get disabled even though the files exist on the server. If you have all your packages minimized, JDownloader sets the package to Finished once everything but the disabled "file not found" parts is downloaded.
#18
Thanks for supporting this bug report and giving another point of view on the whole 'intended behavior' thing.
#19
Multihosters have some handling problems because the current framework isn't designed with multihosters in mind, so on some multihoster errors we remove the given host from the supported-host array. When you use a multihoster you will experience a multiplication factor of issues, as it is effectively a service in the middle over which you have little to no control: the multihoster has plugins of its own to maintain, and it can introduce socket issues receiving from the hoster or sending to you. Even with better error/link-status handling for multihosters in mind (we do have a ticket for it), that will just make our lives easier when making plugins and give users a better multihoster experience; it won't stop downloads from being disabled for whatever reason. I've said it before and I'll say it again: you must accept these introduced issues from the multihoster. If you don't want so many of them, I suggest you buy a standard premium account on the hosters you use most. These are the pros and cons of multihoster vs. normal hoster accounts.
Thanks for your report. raztoki
__________________
raztoki @ jDownloader reporter/developer http://svn.jdownloader.org/users/170 Don't fight the system, use it to your advantage. :] Last edited by raztoki; 09.07.2013 at 22:41. Reason: by = buy
#20
With my thank-you, I was talking to selftitled ^^
But take it as yours too, no problem; feedback is always appreciated. I know multihosters can cause these issues, but that's not really what I want changed. Forget about retries for down files; I have no problem with resetting one or two downloads from time to time. My real problem is that a package is NOT complete if files in that package are down. That is my point, and I want you as the dev team to think about it. I, as a normal user, would not expect a package to be 100% complete if parts are missing. And it is of course really annoying if you have like 15 packages and have to recheck them all over again after they have "finished" downloading, just to make sure they are complete.
#21
Well, I gathered it was addressed to both of us, considering selftitled brought outside help, and you mentioned intended behaviour to me, which he never stated.
By the way, JD2 has a 'right click context > other > resume' feature which resets the link status/state without the need to reset data. I don't think the package behaviour will be changing; it will always be based on the isDownloadable link status, and an offline file will never be downloadable. As additional advice for finding troubled downloads quickly, use the 'failed downloads' filter. raztoki
__________________
raztoki @ jDownloader reporter/developer http://svn.jdownloader.org/users/170 Don't fight the system, use it to your advantage. :]
#22
I have no problem with clicking resume on parts occasionally; it only bothers me with Netload, but that's no issue with JD.
But while you mention resume: to properly resume a part in this scenario, you have to click resume (right-click context), enable it, and then click "force download" for some reason, which is a little cumbersome to be honest. So in other words, "resume" seems to have no use for multihosters (edit: except that you get a weird "file not found" error with a red cross while your part is unfinished). Maybe it's just Alldebrid. Last edited by selftitled; 11.07.2013 at 18:42.