JDownloader Community - Appwork GmbH
 

  #1  
Old 28.03.2019, 16:18
pbm2
Junior Loader
Join Date: Mar 2019
Posts: 12

Please add a variable for a customized extract path in the Folderwatch plugin

Currently there is no variable that can be used inside the .crawljob files to set a custom extraction path.
  #2  
Old 28.03.2019, 19:32
Jiaz
JD Manager
Join Date: Mar 2009
Location: Germany
Posts: 63,432

That's currently not possible, because at the moment there is no way to 'forward' this type of additional information.

__________________
JD-Dev & Server-Admin
  #3  
Old 28.03.2019, 19:33
Jiaz
JD Manager
Join Date: Mar 2009
Location: Germany
Posts: 63,432

In the meantime you could 'hack' this by setting the extraction folder as the comment and, with the help of an EventScripter script, reading the comment and customizing the extraction folder once the links are in the download list.
See https://board.jdownloader.org/showthread.php?t=70525
Ask mgpai for help, he's our Scriptmaster of the universe
  #4  
Old 10.04.2019, 11:49
pbm2
Junior Loader
Join Date: Mar 2019
Posts: 12

The problem is that I have multiple downloads active that all need different extraction subfolders. If I understand correctly, the 'hack' of setting the extraction folder as a comment will change the extraction folder for all active downloads?

Anyway, I hope you can find some free time to add this variable.
  #5  
Old 10.04.2019, 18:56
Jiaz
JD Manager
Join Date: Mar 2009
Location: Germany
Posts: 63,432

Hmm, maybe I don't understand. My idea was to use the comment field in the crawljob and then make use of EventScripter to read the comment of all links from that crawljob and customize the extraction destination.
The script would only change the extraction destination according to the comment of the link, which should be different for each crawljob.

Or did I misunderstand you? Or you me?
  #6  
Old 26.04.2019, 14:27
pbm2
Junior Loader
Join Date: Mar 2019
Posts: 12

I was wrong. I was thinking this would change the global destination path, but you made it clear it will be done for each crawljob.
  #7  
Old 26.04.2019, 17:17
Jiaz
JD Manager
Join Date: Mar 2009
Location: Germany
Posts: 63,432

Thanks for the feedback. So, did you try to contact mgpai for help with a solution via EventScripter?
  #8  
Old 07.05.2019, 00:32
pbm2
Junior Loader
Join Date: Mar 2019
Posts: 12

Yes, I contacted him.

Quote:
Only Jiaz can add that 'key' to the folderwatch plugins. You will have to use the eventscripter alternative suggested by Jiaz. You can add the folder as 'comment' in crawljob or append it to the url (e.g. url#xf=c:\downloads\extracted) and use a script to set it as extraction folder.
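For illustration, a minimal .crawljob file using the comment approach that mgpai describes might look like this. The URL and paths below are placeholders, not values from this thread, and the exact set of supported keys should be checked against the FolderWatch documentation:

Code:
```
text=https://example.com/archive.rar
downloadFolder=C:\Downloads
comment=C:\Downloads\extracted
autoStart=TRUE
```

An EventScripter script can then read the comment from the added link and use it as the extraction destination.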
  #9  
Old 07.05.2019, 09:34
Jiaz
JD Manager
Join Date: Mar 2009
Location: Germany
Posts: 63,432

I'm sorry but at the moment I don't find time to work on this
  #10  
Old 03.06.2019, 07:22
pbm2
Junior Loader
Join Date: Mar 2019
Posts: 12

Hi Jiaz, sorry to bother you again. I created a simple script as you suggested, which should fetch the comment from the .crawljob files and then set the extraction folder:

Code:
var myString = package.getComment();
archive.setExtractToFolder(myString);
But there are multiple problems.

1st problem: When adding 'comment=sometest' to the crawljob file, the comment is simply not read; the variable is just ignored, no matter what I put into it, even after restarting JDownloader. Nothing is ever displayed in the 'Comment' column of the Downloads tab in JDownloader. I also tested with this script:

Code:
var myString = package.getComment();
writeFile(JD_HOME + "/log.txt", JSON.stringify(myString) + "\r\n", true);
It simply writes 'null' to log.txt.

2nd problem: there is simply no 'archive.setExtractToFolder()' method in JDownloader. There is only archive.getExtractToFolder() (https://svn.jdownloader.org/issues/85038)

3rd problem: the 'archive....Extract...()' methods can only be used with the 'Any extraction' trigger, while package.getComment() can only be used with the 'Package finished' trigger.

Last edited by pbm2; 03.06.2019 at 08:07.
  #11  
Old 03.06.2019, 09:37
Jiaz
JD Manager
Join Date: Mar 2009
Location: Germany
Posts: 63,432

1.) The comment is assigned to the individual links. archive.getDownloadLinks()[0].getComment() should work.
2.) Sorry, my bad. I will try to add it when I find time. In the meantime, maybe you could just use different download locations and let JDownloader extract into the same folder (the default)?

3.) It depends on which package variable you're trying to access. *Package finished* does have a globally defined package variable. For your case you will have to access the archive and its individual links.
  #12  
Old 05.06.2019, 10:20
mgpai
Script Master
Join Date: Sep 2013
Posts: 544

Quote:
Originally Posted by Jiaz
In meantime you could 'hack' this by setting the extraction folder as comment and with help of Eventscripter script, read the comment and then customize the extraction folder once the links are in download list
Script:
Code:
// Parse/Set Extraction Folder from Link Comment
// Trigger: A Download Started

var archive = link.getArchive();
var comment = link.getComment();

if (archive && comment) {
    var currentFolder = archive.getExtractToFolder();
    var newFolder = comment.match(/^(?:[a-zA-Z]{1}:)?(?:\\|\/).+/) || currentFolder;
    if (currentFolder != newFolder) {
        var linkId = archive.getDownloadLinks()[0].UUID;
        var archiveId = callAPI("extraction", "getArchiveInfo", [linkId], [])[0].archiveId;
        callAPI("extraction", "setArchiveSettings", archiveId, {
            "extractPath": newFolder.toString()
        })
    }
}
  #13  
Old 05.06.2019, 11:34
Jiaz
JD Manager
Join Date: Mar 2009
Location: Germany
Posts: 63,432

mgpai, our hero, always finds a workaround for missing methods. *thanks!*
  #14  
Old 05.06.2019, 17:28
mgpai
Script Master
Join Date: Sep 2013
Posts: 544

@Jiaz: It's a pleasure.
  #15  
Old 06.06.2019, 00:03
pbm2
Junior Loader
Join Date: Mar 2019
Posts: 12

Thank you very much! It works perfectly.
  #16  
Old 06.06.2019, 03:21
pbm2
Junior Loader
Join Date: Mar 2019
Posts: 12

One more question:

I want to execute a script when all JDownloader activity (downloading/extraction) is finished. Basically exactly the same as the JDshutdown extension, but I want to run a custom script:

Code:
callAsync(null, "cmd", "/c %windir%\\sysnative\\qprocess|findstr \"process1.exe process2.exe\"||shutdown -s -f")
The problem is that I can't find an event trigger for this. 'Download Controller Stopped' executes the script even if extraction is still going on. The other triggers are individual to every download/package.

Last edited by pbm2; 06.06.2019 at 03:45.
  #17  
Old 06.06.2019, 10:11
Jiaz
JD Manager
Join Date: Mar 2009
Location: Germany
Posts: 63,432

@pbm2: You can use the interval trigger and, for example, check every 60 seconds for download/extraction activity; then you can call your script. Don't forget to also shut down JDownloader after initiating the system shutdown. I'm sure mgpai can help you with this.
  #18  
Old 06.06.2019, 12:55
mgpai
Script Master
Join Date: Sep 2013
Posts: 544

Quote:
Originally Posted by pbm2
'Download Controller Stopped' does execute the script if extraction is still going on.
You can prevent it by using the following loop before executing your call:
Code:
while (callAPI("extraction", "getQueue").length) sleep(60000);
  #19  
Old 06.06.2019, 13:25
Jiaz
JD Manager
Join Date: Mar 2009
Location: Germany
Posts: 63,432

@mgpai: I'm always fascinated by HOW you solve these matters. I would never have thought of such a simple and yet working idea.
  #20  
Old 07.06.2019, 18:42
pbm2
Junior Loader
Join Date: Mar 2019
Posts: 12

I made 2 scripts, but I'm not sure which one is better. Does && in JDownloader mean the same as && in cmd (execute the second command only if the first was successful)? Or is && the same as & in JDownloader? I didn't notice a difference between using && and &.

Code:
if (isDownloadControllerIdle() && !callAPI("extraction", "getQueue").length && !callAPI("linkcrawler", "isCrawling")) callAsync(null, "cmd", "/c %windir%\\sysnative\\qprocess|findstr \"process1.exe process2.exe\"||shutdown -s -f")

Code:
if (isDownloadControllerIdle()) {
    if (!callAPI("extraction", "getQueue").length) {
        if (!callAPI("linkcrawler", "isCrawling")) {
            callAsync(null, "cmd", "/c %windir%\\sysnative\\qprocess|findstr \"process1.exe process2.exe\"||shutdown -s -f")
        }
    }
}