JDownloader Community - Appwork GmbH
 

#1 - 22.03.2022, 13:31
michael88 (Ultra Loader, Join Date: May 2017, Posts: 45)

Crawljob - Dealing with duplicate links?

[{
  "enabled": "TRUE",
  "packageName": "my_package",
  "autoConfirm": "TRUE",
  "autoStart": "FALSE",
  "extractAfterDownload": "FALSE",
  "downloadFolder": "/home/username/HD",
  "overwritePackagizerEnabled": false,
  "text": "bunch of links",
  "addOfflineLink": false,
  "forcedStart": "TRUE"
}]

I have this crawljob that adds links successfully, but there are a couple of problems.

1. There is an option to ignore offline links, but when duplicate links are found, JDownloader prompts the user for an action. Is there a crawljob value that sets a default for this action?

2. Even after setting a downloadFolder, unless overwritePackagizerEnabled is set to true, the folder gets overwritten by the default download location from the General tab. Is this by design, and is there a way to set the downloadFolder without disabling the Packagizer? I set custom filenames through it.
#2 - 22.03.2022, 14:50
Jiaz (JD Manager, Join Date: Mar 2009, Location: Germany, Posts: 79,554)

1.) See Settings -> Advanced Settings ->
LinkgrabberSettings.defaultonaddeddupeslinksaction
and
LinkgrabberSettings.handledupesonconfirmlatestselection

2.) Currently that's not supported, but there is an open ticket for this issue.

Please note that you can also use dynamic tags within the crawljob file's downloadFolder.
So you want this download folder to be applied AND your custom rules that work on the filename, right?
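As a sketch of those dynamic tags, a crawljob downloadFolder could embed the package name, assuming `<jd:packagename>` as the tag (the base path here is just an example):

```json
[{
  "enabled": "TRUE",
  "packageName": "my_package",
  "downloadFolder": "/home/username/HD/<jd:packagename>",
  "text": "bunch of links"
}]
```

The tag is resolved when the links are processed, so each crawljob's package lands in its own subfolder without a separate Packagizer rule.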
__________________
JD-Dev & Server-Admin

Last edited by Jiaz; 22.03.2022 at 14:52.
#3 - 23.03.2022, 05:15
michael88 (Ultra Loader, Join Date: May 2017, Posts: 45)

1.) This works, but since it sets the default for the whole program, it is less than ideal. Once enabled, the crawljob works, but if I want to add duplicate links afterwards for other jobs, that is not possible without changing the setting again. An option for that particular job, like the offline-links value, would be great.

2.) I set overwritePackagizerEnabled to true and am still getting my custom filenames, so that's all good now.
#4 - 23.03.2022, 10:48
Jiaz (JD Manager, Join Date: Mar 2009, Location: Germany, Posts: 79,554)

@michael88:
2.) With overwritePackagizerEnabled, of course only the values you specify/customize are overwritten. For example, Packagizer rules that modify the package name will still work fine when your crawljob does not customize it.

1.) I'm sorry, but there is no support yet for customized behaviour depending on *how* the links were added.
1.1.) As a possible workaround, you could use an additional EventScripter script that automatically removes duplicates from the Linkgrabber when they come from a crawljob. See https://board.jdownloader.org/showthread.php?t=70525 and ask in that thread for help. Such a script should be very easy to write.
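If the crawljob file is generated by an external script anyway, another option (a different technique than the EventScripter route above) is to deduplicate on the generating side, before the links ever reach JDownloader. A minimal sketch; the history file name is hypothetical:

```python
# Deduplicate links before writing them into a crawljob's "text" field.
# "seen_links.txt" is a hypothetical history file of previously submitted URLs.
from pathlib import Path

def dedupe_links(links, history_file="seen_links.txt"):
    """Return only links not present in the history file, and record them."""
    path = Path(history_file)
    seen = set(path.read_text().split()) if path.exists() else set()
    fresh = [link for link in links if link not in seen]
    if fresh:
        with path.open("a") as fh:
            fh.write("\n".join(fresh) + "\n")
    return fresh
```

This keeps JDownloader's global duplicate setting untouched, since duplicates are filtered out before the crawljob is written.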
#5 - 23.03.2022, 11:10
mgpai (Script Master, Join Date: Sep 2013, Posts: 1,553)

Quote:
Originally Posted by michael88
1.) ...An option for that particular job, like the offline value would be great.
You can create a Packagizer rule that disables 'Auto confirm' for dupes, based on the source URL. It is also possible to filter them out using a Linkfilter rule with the same conditions.
#6 - 23.03.2022, 11:35
Jiaz (JD Manager, Join Date: Mar 2009, Location: Germany, Posts: 79,554)

@mgpai: Nice workaround, and yeah, I didn't think of using the Linkfilter!
You know the cool features better than I do.
#7 - 24.03.2022, 04:13
michael88 (Ultra Loader, Join Date: May 2017, Posts: 45)

Quote:
Originally Posted by mgpai
You can create a packagizer rule to disable 'auto confirm' for 'dupes' based on 'source url'. It is also possible to filter them using linkfilter rule with same conditions.
I set a custom package name and used that as a matching condition, and also set the 'link is already in download list' condition to true. Then I set 'Auto confirm' to disabled, but that doesn't do anything; I am still prompted by the dialog box for dealing with duplicates.

I was able to set up a view with the same conditions and that works, but not the Packagizer or filter rule.

Is there something else I need to set for the Packagizer and filter, given that the same conditions work for views?
Also, thanks for the help.

Last edited by michael88; 24.03.2022 at 04:15. Reason: Incomplete
#8 - 24.03.2022, 10:07
Jiaz (JD Manager, Join Date: Mar 2009, Location: Germany, Posts: 79,554)

@michael88:
You have to remove
Quote:
"autoConfirm": "TRUE",
else it will *overwrite* your Packagizer rules, because you
Quote:
set overwritePackagizerEnabled to true
Create two Packagizer rules: one that disables 'Auto confirm' for duplicated links, and another that enables 'Auto confirm' for non-duplicated links. The second rule is not required if 'Auto confirm' is enabled by default.
#9 - 25.03.2022, 11:00
michael88 (Ultra Loader, Join Date: May 2017, Posts: 45)

That just worked wonderfully, thanks for all the help!!
#10 - 25.03.2022, 11:52
Jiaz (JD Manager, Join Date: Mar 2009, Location: Germany, Posts: 79,554)

@michael88: Thanks for the feedback! Does it work as expected/wished?
#11 - 25.03.2022, 14:06
michael88 (Ultra Loader, Join Date: May 2017, Posts: 45)

Quote:
Originally Posted by Jiaz
@michael88: Thanks for the feedback! Does it work as expected/wished?
Yeah, it works exactly as I wished, and it saves me a bunch of time, as I can now have a Python script write the crawljobs rather than adding links by hand.

As always, thanks for the very helpful and quick solutions. Cheers
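A Python script like the one mentioned above can be as simple as dumping a list with one dict to a `.crawljob` file in JDownloader's folderwatch directory. A sketch under these assumptions: the watch directory path is supplied by the caller, folderwatch is enabled in JDownloader, and `autoConfirm` is deliberately omitted so the Packagizer rules from this thread decide per-link:

```python
import json
from pathlib import Path

def write_crawljob(links, package, folder, watch_dir):
    """Write a JSON-style .crawljob file into the folderwatch directory."""
    job = [{
        "enabled": "TRUE",
        "packageName": package,
        # "autoConfirm" is left out on purpose, so the Packagizer
        # dupe rules can decide whether each link is auto-confirmed.
        "autoStart": "FALSE",
        "extractAfterDownload": "FALSE",
        "downloadFolder": folder,
        "overwritePackagizerEnabled": False,
        "text": "\n".join(links),       # one link per line
        "addOfflineLink": False,
    }]
    target = Path(watch_dir) / f"{package}.crawljob"
    target.write_text(json.dumps(job, indent=2))
    return target
```

JDownloader picks the file up on its next folderwatch scan and processes it like the hand-written crawljob at the top of this thread.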
#12 - 25.03.2022, 14:09
Jiaz (JD Manager, Join Date: Mar 2009, Location: Germany, Posts: 79,554)

You're welcome. All thanks go to mgpai for his idea with the Packagizer.