JDownloader Community - Appwork GmbH
 

  #1  
Old 19.01.2020, 15:45
djmakinera
Banned
 
Join Date: May 2010
Location: Poland
Posts: 8,387
Import URL from file

Where's the option?


JD2 -> File -> Import -> URL from file (example: text file)
  #2  
Old 20.01.2020, 17:57
pspzockerscene
Community Manager
 
Join Date: Mar 2009
Location: Deutschland
Posts: 70,921
Default

Which option?

File --> Load linkcontainer

-psp-
__________________
JD Supporter, Plugin Dev. & Community Manager

Erste Schritte & Tutorials || JDownloader 2 Setup Download
Spoiler:

A user's JD crashes and the first thing to ask is:
Quote:
Originally Posted by Jiaz View Post
Do you have Nero installed?
  #3  
Old 21.01.2020, 04:45
djmakinera
Banned
 
Join Date: May 2010
Location: Poland
Posts: 8,387
Default

There is no TXT extension in the file-type options, so I chose "All Files".
I then chose a text file with links, and nothing happened.

That's why I asked about it.
  #4  
Old 21.01.2020, 17:37
thecoder2012
Official 9kw.eu Support
 
Join Date: Feb 2013
Location: Internet
Posts: 1,324
Default

Quote:
Originally Posted by djmakinera View Post
Where's the option?
There is no option in JDownloader.

Quote:
Originally Posted by djmakinera View Post
That's why I asked about it.
As a workaround, you can try an EventScripter script with a button.
__________________
Join 9kw.eu Captcha Service now and let your JD continue downloads while you sleep.
  #5  
Old 21.01.2020, 18:08
pspzockerscene
Community Manager
 
Join Date: Mar 2009
Location: Deutschland
Posts: 70,921
Default

Thx thecoder2012.

-psp-
EDIT

You could also use any script you like and then pass the URLs to JDownloader via the MyJDownloader remote API.
Last edited by pspzockerscene; 21.01.2020 at 18:10.
  #6  
Old 21.01.2020, 18:12
mgpai
Script Master
 
Join Date: Sep 2013
Posts: 1,533
Default

If the text file contains only URLs (one per line), it can be loaded via "Load linkcontainer" by changing its extension to 'crawljob', provided the 'folderwatch' extension is installed and enabled. It can also be added by placing it in the folder specified in the 'folderwatch' settings.
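A minimal sketch of that rename approach in Python. The paths and URLs here are hypothetical stand-ins; the real watch folder is whatever is configured in the FolderWatch settings.

```python
# Sketch: take an existing plain-text URL list (one URL per line) and
# give it the .crawljob extension so FolderWatch picks it up.
import tempfile
from pathlib import Path

# Stand-in for the folder configured in the FolderWatch settings.
watch_dir = Path(tempfile.mkdtemp())

# An existing plain-text link list, one URL per line (made-up URLs).
txt_file = watch_dir / "links.txt"
txt_file.write_text(
    "https://example.com/Gallery_01/index.htm\n"
    "https://example.com/Gallery_02/index.htm\n",
    encoding="utf-8",
)

# Renaming is enough: FolderWatch watches for the .crawljob extension
# and will parse and then delete the file once it appears.
crawljob = txt_file.rename(txt_file.with_suffix(".crawljob"))
```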
  #7  
Old 21.01.2020, 20:34
djmakinera
Banned
 
Join Date: May 2010
Location: Poland
Posts: 8,387
Default

Excuse me, but I have yet to see a tool that doesn't make it easy for every user to load links from a text file.
Every other download manager has this option under "File -> Import".
  #8  
Old 25.02.2020, 00:11
ticedoff8
Modem User
 
Join Date: Feb 2020
Location: USA, California
Posts: 4
Default

This is my 1st question on the site, so I hope you will forgive me if it is not a great one.
I am on a Win10 64-bit workstation running the latest release of JD2.
I have installed and enabled the 'folderwatch' extension.
Now I have a file with 50 fully qualified URLs (one per line), and the file extension is <filename>.crawljob.

Each line is simple and looks like this:
**External links are only visible to Support Staff**
With the Gallery_XX incrementing from 01 to 50.
There is no other text in the file other than 50 lines of unique URLs.

I click "File" => "Load Linkcontainer" and select the linkfile.crawljob.
Then nothing happens.

The file was created on my Linux system with a for/do loop and then edited on the Win10 system with `notepad++`, so there should be no special characters.

I also changed the ["folderwatch"] value to ["k:\\"] (note: there are two backslashes after the drive letter - the forum editor removes the 2nd one when I post) to see if it would use the K: drive for the .crawljob files. There were no warnings, but it didn't help.

I also restarted JD2 between each change (just in case).

Suggestions?

Last edited by ticedoff8; 25.02.2020 at 00:15.
  #9  
Old 25.02.2020, 00:18
pspzockerscene
Community Manager
 
Join Date: Mar 2009
Location: Deutschland
Posts: 70,921
Default

Does JD have a plugin that officially supports your URLs?
It looks as if you want JD to deep-parse every single one of your URLs.
Does every URL lead to more URLs, e.g. an image gallery?

Best would be a real-life example ...

-psp-
  #10  
Old 25.02.2020, 00:55
ticedoff8
Modem User
 
Join Date: Feb 2020
Location: USA, California
Posts: 4
Default

Quote:
Does JD have a plugin that officially supports your URLs?
Not that I know of.
Quote:
It looks as if you want JD to deep-parse every single one of your URLs.
That makes sense.
Quote:
Does every URL lead to more URLs, e.g. an image gallery?
Yes.
Each "Gallery_XX"/index.htm brings up a gallery of thumbnail-sized pictures, with each picture representing a full-size .JPG.

When I open any one of the URLs in a typical browser, it brings up the image gallery page with X number of thumbnail-sized pictures (5 to 10 per gallery).
If I paste the same URL directly into LinkGrabber, it shows the same number of files, their actual full sizes (not the thumbnail sizes), and the unique file names, all in the "Various Files" package (not sure if "package" is the correct term).
When I click "Start Downloads", the package moves to the "Downloads" tab and all the files start downloading (3 at a time).

I was hoping that the linkfile.crawljob file with 1 URL per line would do the same as I was doing manually.

Last edited by ticedoff8; 25.02.2020 at 01:03.
  #11  
Old 25.02.2020, 00:58
pspzockerscene
Community Manager
 
Join Date: Mar 2009
Location: Deutschland
Posts: 70,921
Default

By default, JD can only auto-handle content it has plugins for AND direct URLs, which means all of your URLs will probably get ignored.
Additionally, you will need to add a DEEPDECRYPT link crawler rule (see forum search) that matches the URL structure of your gallery URLs.

You can do this first and test it (e.g. JD should crawl these URLs when clipboard observation is enabled), and then work on the crawljob stuff.
... so you want the crawljob to repeatedly add these gallery URLs as the content behind them changes?

-psp-
EDIT

Also please keep in mind that without having real life test URLs, I can point you to the right direction but I will not be able to really help you.
  #12  
Old 25.02.2020, 01:28
ticedoff8
Modem User
 
Join Date: Feb 2020
Location: USA, California
Posts: 4
Default

Quote:
... so you want the crawljob to repeatedly add these gallery URLs as the content behind them changes?
The existing galleries are static. The moderator adds a new gallery when posting new content (currently galleries \Gallery_01\index.htm through Gallery_654\index.htm).

The only difference between the galleries is the gallery number; the rest of every URL is identical.
At first, I hoped I could use a for/do loop to insert the number into the URL string where it belongs.
Or I could use some kind of 'wildcard' in place of the numeric section, but "*" is not a valid DNS character.
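The for/do-loop idea can be sketched like this (the base URL is a made-up placeholder, since the real one is hidden by the forum):

```python
# Sketch: generate the gallery URLs with a zero-padded counter and write
# them, one per line, into a .crawljob file for FolderWatch.
import tempfile
from pathlib import Path

# Hypothetical URL pattern; only the gallery number varies.
BASE = "https://example.com/Sets/Galleries/Gallery_{n:02d}/index.htm"

urls = [BASE.format(n=n) for n in range(1, 51)]  # Gallery_01 .. Gallery_50

# Write the list where FolderWatch can find it (temp dir as a stand-in).
crawljob = Path(tempfile.mkdtemp()) / "linkfile.crawljob"
crawljob.write_text("\n".join(urls) + "\n", encoding="utf-8")
```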

IRL URL (I think this will get edited down to "**External links are only visible to Support Staff**"):
**External links are only visible to Support Staff**

EDITED:
On a side note: if my linkfile.crawljob is on the K:\ drive, it "magically" disappears after about 10 seconds, so
I assume the Folder Watch parameter ["K:\"] is working. I would guess the expected behaviour is that LinkGrabber finds the linkfile.crawljob file, parses it, and then deletes it to avoid re-adding the same files over and over.
Except, in my case, LinkGrabber doesn't like the format of the file but deletes it anyway.

Last edited by ticedoff8; 25.02.2020 at 01:43.
  #13  
Old 25.02.2020, 03:48
pspzockerscene
Community Manager
 
Join Date: Mar 2009
Location: Deutschland
Posts: 70,921
Default

See Settings --> Advanced Settings --> Type in "Link crawler rules" --> Use this rule:
Replace the word "CENSORED" with the name of the website you posted in your last post.
Code:
[ {
  "enabled" : true,
  "updateCookies" : true,
  "logging" : false,
  "maxDecryptDepth" : 1,
  "id" : 1422443765154,
  "name" : "CENSORED.com example rule",
  "pattern" : "https?://(www\\.)?CENSORED\\.com/Sets/Galleries/Gallery_[0-9]+/index_high\\.htm",
  "rule" : "DEEPDECRYPT",
  "packageNamePattern" : "<font size=\\+3><b>([^>]+)</b></font>",
  "passwordPattern" : null,
  "formPattern" : null,
  "deepPattern" : "\"([^\"]+\\.jpg)",
  "rewriteReplaceWith" : null
} ]
Test it by enabling clipboard detection --> JD should then automatically find the pictures once you copy a fitting URL.

This rule is already quite good, but you could improve it further, e.g. so that it no longer grabs the thumbnails.
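The rule's deepPattern can be sanity-checked outside JD with a plain regex engine. Run over a made-up snippet of gallery HTML, it pulls out every quoted .jpg URL, including the thumbnail, which is exactly the improvement mentioned above:

```python
# Apply the rule's deepPattern to a hypothetical gallery HTML snippet.
import re

html = (
    '<a href="https://example.com/Sets/Galleries/Gallery_01/pic_001.jpg">'
    '<img src="https://example.com/Sets/Galleries/Gallery_01/thumbs/pic_001.jpg">'
)

# Same regex as the "deepPattern" field in the rule above.
deep_pattern = r'"([^"]+\.jpg)'
matches = re.findall(deep_pattern, html)
# matches holds both the full-size image URL and its thumbnail.
```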

-psp-
  #14  
Old 25.02.2020, 10:28
ticedoff8
Modem User
 
Join Date: Feb 2020
Location: USA, California
Posts: 4
Default

I think that worked like a charm.
There are 2 "LinkCrawler: Link Crawler Rules" entries listed, but I think the 1st one simply enables the 2nd, and the 2nd is where the rule text is entered.

Once that was done, I changed the extension of linkfile.txt to linkfile.crawljob, and 10 seconds later LinkGrabber started creating packages whose names match the Gallery_XX value of each URL, with all the expected full-size .JPG filenames listed and ready for download.

I will play around a little more, but that looks like it did the trick.
Thanks.
  #15  
Old 25.02.2020, 14:43
pspzockerscene
Community Manager
 
Join Date: Mar 2009
Location: Deutschland
Posts: 70,921
Default

Thanks for your feedback

-psp-
Provided By AppWork GmbH | Privacy | Imprint
Powered by vBulletin® Version 3.8.10 Beta 1
Copyright ©2000 - 2024, Jelsoft Enterprises Ltd.