#1
[EventScripter] Suspend downloads while Linkgrabber is busy
Hello,
I sometimes download from hosters with a download limit, which I get around by using the reconnect feature to get a new IP. To avoid LinkGrabber jobs failing because of having no network connectivity during a reconnect (takes ~15 seconds), I would like to suspend all downloads (so no reconnect would trigger either) for as long as the LinkGrabber is busy. I saw that the EventScripter has an event "New Crawler Job" in which I could stop all downloads, but there seems to be no event for "all crawling jobs finished" that could be used to re-enable the downloads again afterwards. Are there any options available to make JD behave like this? Cheers and thanks in advance!
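In case it helps to visualize the idea: here is a rough, untested sketch of what such a "New Crawler Job" script could look like. `callAPI`, `stopDownloads`, `startDownloads` and `sleep` are assumed to be EventScripter builtins; the mock stubs at the top exist only so the snippet runs standalone for illustration and would not appear in a real script.

```javascript
// --- Mock stand-ins for EventScripter builtins (illustration only; ---
// --- in JDownloader these are provided by the Event Scripter).      ---
let crawlTicks = 3;                 // pretend the crawler stays busy for 3 polls
let downloadsRunning = true;

function callAPI(iface, method) {
    if (iface === "linkcrawler" && method === "isCrawling") return crawlTicks-- > 0;
    return false;                   // "linkgrabberv2"/"isCollecting" is idle in this mock
}
function stopDownloads()  { downloadsRunning = false; }
function startDownloads() { downloadsRunning = true; }
function sleep(ms) { /* no-op in the mock; JD's sleep() blocks for ms */ }

// --- The actual script logic (hypothetical trigger: "New Crawler Job") ---
stopDownloads();                    // pause downloads so no reconnect fires mid-crawl
while (callAPI("linkcrawler", "isCrawling") ||
       callAPI("linkgrabberv2", "isCollecting")) {
    sleep(10 * 1000);               // poll the crawler/collector state every 10 seconds
}
startDownloads();                   // crawling finished, resume downloads
console.log("downloads running again:", downloadsRunning);
```

This is only a sketch of the approach under the assumptions above; as the rest of the thread shows, the supported way to solve the original problem is to make the reconnect itself wait, rather than to pause downloads.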
#2
I was just about to say: EventScripter.
Unfortunately I'm not that familiar with it, so you'll need to wait for an answer from either Jiaz or mgpai. -psp- EDIT: Okay, I might have misunderstood you in the first place - you want to avoid reconnects interrupting the LinkGrabber?
__________________
JD Supporter, Plugin Dev. & Community Manager
Erste Schritte & Tutorials || JDownloader 2 Setup Download
Last edited by pspzockerscene; 06.01.2021 at 16:52.
#3
Yeah, that's the ultimate goal. When my downloads are busy, JD will reconnect at random times whenever the download limit is reached. During the reconnect there will be no network for some time, which would affect the LinkGrabber if it happens to be busy crawling for links (I guess?).
Both things run automatically (crawljobs being added from a script, reconnects whenever JD decides they're needed).
#4
To be honest, I've always thought JD would already handle this automatically, but I was wrong. I guess maybe it was ignored because most crawljobs are processed quickly, or at least most of them were back then ... but I get your point.
If you add a large number of URLs and/or e.g. let JD crawl a complete Twitter profile, this can lead to issues. Maybe a setting for this in the Reconnect settings would even make sense ... -psp-
Last edited by pspzockerscene; 06.01.2021 at 17:17.
#5
I must admit it's more a theoretical problem I thought of, nothing that I have actively seen happening yet.
A setting in the LinkGrabber or in the Reconnect settings might work too, yeah. Or an event in the EventScripter to signal that the LinkGrabber is not busy anymore.
#6
Quote:
#7
These were not even mentioned in the EventScripter help; I had never seen this other API before.
I will check it out tomorrow for sure. Thanks for the hint!
#8
At the moment there is no out-of-the-box support for making a reconnect wait for the linkcrawler to finish.
__________________
JD-Dev & Server-Admin
#9
@mikk88: I'm sure mgpai can help you with this, or check out existing scripts here: https://board.jdownloader.org/showthread.php?t=70525
#10
Code:
/*
    Wait while linkcrawler/linkcollector is active
    Trigger: Before a Reconnect
*/
while (callAPI("linkcrawler", "isCrawling") || callAPI("linkgrabberv2", "isCollecting")) {
    sleep(10 * 1000); // check state every 10 seconds
}
#11
Wow, thanks mgpai!
That looks easier than I thought, very cool.
#12
@mgpai: Thanks for the small/easy/great solution/workaround!