JDownloader Community - Appwork GmbH
 

 
 
  #1  
22.03.2022, 13:31
michael88
DSL Light User
 
Join Date: May 2017
Posts: 34
Crawljob - Dealing with duplicate links?

[{
"enabled": "TRUE",
"packageName": "my_package",
"autoConfirm": "TRUE",
"autoStart": "FALSE",
"extractAfterDownload": "FALSE",
"downloadFolder": "/home/username/HD",
"overwritePackagizerEnabled": false,
"text": "bunch of links",
"addOfflineLink": false,
"forcedStart": "TRUE"
}]

I use the crawljob above to add links. It works, but there are a few problems.

1. There is an option to ignore offline links (addOfflineLink in the job above), but when duplicate links are added, JDownloader prompts the user for what to do. Is there a crawljob value that sets a default action for duplicates?

2. Even after setting a downloadFolder, if overwritePackagizerEnabled is not set to true, the folder gets overwritten by the default download location from the General tab. Is this by design, and is there a way to set the downloadFolder without disabling the packagizer, since I set custom filenames through it? (See the sketch below.)
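
For reference, here is a sketch of the variant I tried for problem 2: the same job with overwritePackagizerEnabled set to true. With this the downloadFolder sticks, but as far as I can tell my packagizer filename rules no longer apply, which is exactly the trade-off I'm asking about. "bunch of links" and the folder path are placeholders for my real values:

[{
"enabled": "TRUE",
"packageName": "my_package",
"autoConfirm": "TRUE",
"autoStart": "FALSE",
"extractAfterDownload": "FALSE",
"downloadFolder": "/home/username/HD",
"overwritePackagizerEnabled": true,
"text": "bunch of links",
"addOfflineLink": false,
"forcedStart": "TRUE"
}]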