#1
How to set up JDownloader on a NAS to download folders and their subfolders?
Hi,
I installed JDownloader on my NAS and use the web interface to download, but I cannot figure out how to set it up to download a full path (main folder and subfolders) and recreate the same structure on my NAS. For example:

website.com/folder1 (contains files and folders)
website.com/folder1/sub1 (contains files and folders)
website.com/folder1/sub1/sub1_sub (contains files and folders)
...
website.com/folder1/sub2 (contains files and folders)
website.com/folder1/sub2_sub (contains files and folders)
...

I want exactly the same structure created inside the folder that I specify on my NAS. Thanks
#2
Hi,
do you have website-specific URLs and want to use the structure of the URL as the file path in JD, or do you have cloud-service URLs, e.g. from Google Drive or MEGA.nz? For the second type, JD should auto-set the path by default; the first type might be tricky. Do you have working example URLs for me? You can also send them to me via PM.
-psp-
EDIT: Your post also kind of implies that you already got this working in a "normal" JDownloader with GUI - do I understand that correctly?
__________________
JD Supporter, Plugin Dev. & Community Manager
Erste Schritte & Tutorials || JDownloader 2 Setup Download

Last edited by pspzockerscene; 25.05.2020 at 17:08.
#3
When adding links, please pay attention to the checkbox *Override packagizer*. This checkbox disables the packagizer rules!
__________________
JD-Dev & Server-Admin
#4
First of all, I did not try this on a "normal" JDownloader. I have a URL for a private server that contains files and folders; the folders contain further files and folders, and so on. What I do now is copy each folder's link separately and download only the files from there. What I want is something like your first option: give JD the main URL and have it download everything from all subfolders to my NAS, keeping the same structure if possible. As I mentioned above, it's a private server and access is restricted to specific IPs, so I cannot provide any working links.
#5
Maybe this is easier with wget --mirror, as it's designed to do this type of task out of the box?
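As a reference for what that suggestion looks like in practice, here is a minimal Python sketch that assembles a wget mirror command (the destination path and URL are assumptions for illustration, not from the thread):

```python
import subprocess  # only needed if you actually run the command

# GNU wget's --mirror is shorthand for -r -N -l inf --no-remove-listing;
# --no-parent keeps the crawl from climbing above the starting folder.
cmd = ["wget", "--mirror", "--no-parent",
       "-P", "/volume1/downloads",        # assumed NAS destination folder
       "http://website.com/folder1/"]     # assumed example start URL

print(" ".join(cmd))  # inspect first; run with subprocess.run(cmd)
```

This preserves the server's folder hierarchy under the `-P` destination, which is exactly the behavior being asked for.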
__________________
raztoki @ jDownloader reporter/developer http://svn.jdownloader.org/users/170
Don't fight the system, use it to your advantage. :]
#6
To make this work in JDownloader, you need to create a custom packagizer rule that parses the folder information from the URL and sets the download directory accordingly.
We can check and help, but we need example links.
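To illustrate what such a rule has to compute, here is a small Python sketch of mapping a file URL to a local directory that mirrors the server's folder path. The URLs and the NAS path are hypothetical, and a real packagizer rule is configured through JD's settings, not written in Python; this only shows the path logic:

```python
from urllib.parse import urlparse
import posixpath

def download_dir_for(url: str, base: str, dest_root: str) -> str:
    """Map a file URL under `base` to a local directory that mirrors
    the server's folder structure (illustration of the rule's logic)."""
    rel = urlparse(url).path
    base_path = urlparse(base).path
    if rel.startswith(base_path):
        rel = rel[len(base_path):]          # strip the common prefix
    folder = posixpath.dirname(rel).strip("/")
    return posixpath.join(dest_root, folder) if folder else dest_root

# A file two levels deep keeps its two-level folder path:
print(download_dir_for("http://website.com/folder1/sub1/file.bin",
                       "http://website.com/",
                       "/volume1/downloads"))
# -> /volume1/downloads/folder1/sub1
```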
#7
You also need a LinkCrawler rule to tell JDownloader how to *traverse* the links.
I would use a tool like wget that is designed for this task.
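A LinkCrawler rule essentially tells JD which links on a page to follow. As a rough stand-in for that idea, here is a Python sketch of extracting the links from a directory-listing page (the HTML and URLs are made up; JD's actual rules are defined in its settings, not in Python):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collect href targets from an HTML directory listing,
    resolved against the page URL (toy stand-in for a crawl rule)."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                # skip sort-order links like "?C=N;O=D" seen in listings
                if name == "href" and value and not value.startswith("?"):
                    self.links.append(urljoin(self.base_url, value))

html = '<a href="sub1/">sub1/</a> <a href="file.bin">file.bin</a>'
p = LinkExtractor("http://website.com/folder1/")
p.feed(html)
print(p.links)  # one subfolder link and one file link, both absolute
```

Links ending in "/" would then be treated as subfolders to crawl further, everything else as files to download.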
#8
I installed Super Shell on the NAS, but its wget command supports neither the recursive nor the mirror parameters.
Running wget from a PC is not a solution; the only remaining option is JD. By saying "We can check/help but need example links", do you mean real working links, or just example links to build the rule from?
#9
Something real, because we need to check and test our rules and help. This is nothing you can just build out of nothing without being able to actually test it.