#1
How to download all images into folders, one level down from a webpage
Still loving JD2. Best download software EVER, by far.
I'm sure this question has been discussed before, but I don't know the terminology to search for an answer. Here's a webpage with a large number of photos. Each photo is a link to another page with a series of photos related to the top "link" photo. Is there a way to download all the photos into separate folders without manually capturing each link? I would love a way to just select-all on the top page and have that generate a folder for each of the embedded "under" links.

Basically, I'm not looking to download the images on this page; I'm looking to download all the images on the pages that these images link to. And I want JDownloader 2 to create folders for each page, so that all I have to do is select-all and copy-selected-links on this page. One or two clicks, then a few minutes of JD2 generating folders and links, and I'm done.

Actually, I think the photos I want are not one level down but two. The link below is the top page. Those images all link to thumbnails one page down, and those thumbnails link to the full-resolution individual images that I'm after.

There are about 580 images on this page. I want JD2 to create 580 folders, each named automatically after the 580 webpages, and each containing the 12 to 20 links to full-res images. How is this accomplished?

**External links are only visible to Support Staff**

Last edited by Jiaz; 14.04.2021 at 15:05.
#2
Hi,
this is easily possible via LinkCrawler rule(s). You can find a lot of examples in our forum. Just check back here if you're stuck and we can help you create working rules. Keep in mind that automatically getting all results from this search query (your example) won't work, but adding e.g. such links, finding all images, and putting them into one package with a meaningful name is easily possible: ..com/galleries/this-is-an-example-12345678/ -psp-
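For reference, a LinkCrawler rule is a JSON object entered in JDownloader's advanced settings. A minimal DEEPDECRYPT sketch might look like the following; the hostname, URL pattern, and rule name are placeholders for illustration, not the real site's:

```json
[
  {
    "enabled": true,
    "name": "example gallery rule",
    "type": "DEEPDECRYPT",
    "pattern": "https?://(?:www\\.)?example\\.com/galleries/[a-z0-9\\-]+-\\d+/",
    "maxDecryptDepth": 1,
    "packageNamePattern": "<title>(.*?)</title>"
  }
]
```

`pattern` decides which pasted URLs the rule fires on, and `packageNamePattern` pulls a package name out of the page HTML; see JDownloader's knowledge base for the full field list.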
__________________
JD Supporter, Plugin Dev. & Community Manager
Erste Schritte & Tutorials || JDownloader 2 Setup Download
Last edited by Jiaz; 14.04.2021 at 15:05. Reason: Forgot to link support article
#3
https://support.jdownloader.org/Know...kcrawler-rules
https://support.jdownloader.org/Know...le-deepdecrypt
__________________
JD-Dev & Server-Admin
#4
You just need a LinkCrawler rule for the index/search site; the individual links will be fixed with the next update.
__________________
JD-Dev & Server-Admin
#5
Hello guys. Thank you for the replies. Always so helpful!
I spent an hour or so trying to make my own LinkCrawler rule, but no luck. I'm just not experienced enough with the terminology; I couldn't even understand the tutorial! I did find where to insert the rule in the advanced settings, but the changes I made to the sample rule did not work.

If either of you wants to help, this is the page of master thumbnails: **External links are only visible to Support Staff**

Behind each of these images is a page with twenty or so large thumbnails, and behind each of those is the full-res image. So I guess I'm looking to search two levels down into this page for image links. The goal is to have only the full-res image links loaded into JD2. If they would appear in folders named after each page, that would be even better, but I am presuming from pspzockerscene's comment that that is not possible. Not a big problem; I can package the image links myself after they are all in JD2 if necessary. But just getting 550+ pages' worth of links into JD2 using LinkCrawler rules is still over my head. Any help would be appreciated.
#6
The website uses pagination in a way that is not supported by LinkCrawler rules.
I would recommend just scrolling in the browser until all links are visible, then selecting all (CTRL+A) and copying to the clipboard (CTRL+C). As an alternative, you could try the EventScripter (see https://board.jdownloader.org/showthread.php?t=70525) with a small script that does the pagination and crawls all the links for you. Ask mgpai for help with this.
__________________
JD-Dev & Server-Admin
#7
I appreciate you trying. Thanks.