#1
Code:
[{
    "enabled": "TRUE",
    "packageName": "my_package",
    "autoConfirm": "TRUE",
    "autoStart": "FALSE",
    "extractAfterDownload": "FALSE",
    "downloadFolder": "/home/username/HD",
    "overwritePackagizerEnabled": false,
    "text": "bunch of links",
    "addOfflineLink": false,
    "forcedStart": "TRUE"
}]

I have this crawljob and it works to add links, but there are a few problems.

1. There is an option to ignore offline links, but if there are duplicate links the user is prompted what to do. Is there a value that sets a default for that action?
2. Even after setting a downloadFolder, it gets overwritten by the default download location from the General tab unless overwritePackagizerEnabled is set to true. Is this by design, and is there a way to set the downloadFolder without disabling the packagizer, since I set custom filenames through it?
#2
1.) See Settings -> Advanced Settings -> LinkgrabberSettings.defaultonaddeddupeslinksaction and LinkgrabberSettings.handledupesonconfirmlatestselection.
2.) That is currently not supported, but there is an open ticket for this issue. Note that you can also use dynamic tags within the crawljob's downloadFolder. So you want this download folder to be applied AND custom rules that work on the filename, right?
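For example, something along these lines should keep the crawljob's folder while your filename rules still run. This is only a minimal sketch: <jd:packagename> is the packagizer variable for the package name, and it is an assumption here that it is one of the dynamic tags accepted inside a crawljob's downloadFolder.

Code:
[{
    "enabled": "TRUE",
    "packageName": "my_package",
    "text": "bunch of links",
    "overwritePackagizerEnabled": true,
    "downloadFolder": "/home/username/HD/<jd:packagename>"
}]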
__________________
JD-Dev & Server-Admin
#3
1.) This works, but since it sets the default for the whole program it is less than ideal. Once enabled, the crawljob works, but if I want to add duplicate links afterwards for other jobs, that is not possible without changing the setting again. An option for that particular job, like the offline value, would be great.
2.) I set overwritePackagizerEnabled to true and am somehow still getting custom filenames, so that's all good now.
#4
@michael88:
2.) overwritePackagizerEnabled: of course, only specified/customized values are overwritten. For example, packagizer rules that modify the package name will still work fine when your crawljob does not customize it.
1.) I'm sorry, but there is no support yet for customized behaviour depending on *how* the links were added.
1.1.) Possible workaround: use an additional EventScripter script that automatically removes duplicates from the linkgrabber when they come from a crawljob. See https://board.jdownloader.org/showthread.php?t=70525 and ask in that thread for help. Such a script should be very easy to do.
__________________
JD-Dev & Server-Admin
#5
You can create a packagizer rule to disable 'auto confirm' for 'dupes' based on 'source url'. It is also possible to filter them using a linkfilter rule with the same conditions.
#6
@mgpai: Nice workaround. You know the cool features better than I do!
__________________
JD-Dev & Server-Admin
#7
I was able to set up a view with the same conditions and that works, but not the packagizer or filter rule. Is there something else I need to set for the packagizer and the filter, given that the same conditions work for a view? Also, thanks for the help.
#8
@michael88:
You have to remove:
Quote:
Quote:
__________________
JD-Dev & Server-Admin
#9
That just worked wonderfully, thanks for all the help!!
#10
@michael88: Thanks for the feedback
__________________
JD-Dev & Server-Admin
#11
Yes, it works exactly as I wished and saves me a bunch of time, since I can now have a Python script write the crawljobs rather than adding links by hand.
As always, thanks for the very helpful and quick solutions. Cheers
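Roughly, such a script only has to dump a JSON array like the one in post #1 into JDownloader's folderwatch directory. A minimal sketch, assuming the FolderWatch extension is enabled and watching ~/jd2/folderwatch; the watch path and the helper name are only illustrative, not something this thread confirms:

Code:
# Writes a JDownloader .crawljob file for the FolderWatch extension.
# Assumption: FolderWatch is enabled and watches WATCH_DIR.
import json
from pathlib import Path

WATCH_DIR = Path.home() / "jd2" / "folderwatch"   # assumed watch folder

def write_crawljob(name: str, links: list[str], folder: str) -> Path:
    # Field names mirror the crawljob shown in post #1.
    job = [{
        "enabled": "TRUE",
        "packageName": name,
        "autoConfirm": "TRUE",
        "autoStart": "FALSE",
        "extractAfterDownload": "FALSE",
        "downloadFolder": folder,
        "overwritePackagizerEnabled": True,
        "text": "\n".join(links),   # one link per line
        "addOfflineLink": False,
    }]
    WATCH_DIR.mkdir(parents=True, exist_ok=True)
    path = WATCH_DIR / f"{name}.crawljob"
    path.write_text(json.dumps(job, indent=4), encoding="utf-8")
    return path

if __name__ == "__main__":
    write_crawljob("my_package",
                   ["https://example.com/file1", "https://example.com/file2"],
                   "/home/username/HD")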
#12
You're welcome
__________________
JD-Dev & Server-Admin