JDownloader Community - Appwork GmbH
 

  #1  
Old 17.01.2020, 12:40
ravidhyena ravidhyena is offline
Modem User
 
Join Date: Jan 2020
Posts: 3
Default More complex link crawling help

Hi,
I am not even sure if this is possible with the program.
There is a webpage containing 18 links to 18 sub-pages.
The 18 sub-pages I am interested in all end in a number, e.g. www.*&*$(*.com/235346.
There are obviously lots of other links on this page that I do not want to crawl.
Each of the 18 sub-pages contains a Google Drive link to a file I wish to download.

Is there a way of making the program look at each of the 18 sub-pages and then download those files? (This is a website with many main pages on which I wish to repeat this process.)

Many thanks for any pointers
  #2  
Old 17.01.2020, 15:15
raztoki's Avatar
raztoki raztoki is offline
English Supporter
 
Join Date: Apr 2010
Location: Australia
Posts: 16,524
Default

What do the base links that lead into the sub-pages look like?
If the two kinds of link are different from one another, it makes things easier:
you could easily do it with two LinkCrawler rules in that case — one for the main page, which returns the sub-links, then a second rule that returns the external links (Google Drive etc.).
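A minimal sketch of what that pair of rules could look like, entered under Settings → Advanced Settings → LinkCrawler: LinkCrawler Rules. The domain and regex patterns are placeholders for illustration, since the original site wasn't given; the first rule crawls the main page for numeric sub-page links, and the second crawls each sub-page for Google Drive links:

```json
[
  {
    "enabled": true,
    "name": "main page -> numeric sub-pages (example.com is a placeholder)",
    "pattern": "https?://(?:www\\.)?example\\.com/$",
    "rule": "DEEPDECRYPT",
    "deepPattern": "(https?://(?:www\\.)?example\\.com/\\d+)"
  },
  {
    "enabled": true,
    "name": "sub-page -> Google Drive links",
    "pattern": "https?://(?:www\\.)?example\\.com/\\d+",
    "rule": "DEEPDECRYPT",
    "deepPattern": "(https?://drive\\.google\\.com/[^\"'<]+)"
  }
]
```

The `pattern` field decides which URLs a rule applies to, and `deepPattern` is the regex used to extract links from the fetched page's HTML; anchoring the first rule's `pattern` to the main page keeps it from firing on every link it finds.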

If you want to do it via a decrypter plugin, you can do that too; it just requires a bit more time setting up the IDE and workspace.


raztoki
__________________
raztoki @ jDownloader reporter/developer
http://svn.jdownloader.org/users/170

Don't fight the system, use it to your advantage. :]

Last edited by raztoki; 17.01.2020 at 15:20.
  #3  
Old 20.01.2020, 10:49
ravidhyena ravidhyena is offline
Modem User
 
Join Date: Jan 2020
Posts: 3
Default

Thanks. Sorry, I didn't know there was a reply!
The base links are quite different and don't end in a number. I am actually happy to copy the base link into JDownloader's clipboard manually.
So I just need it to look at the 18 links with the number at the end and grab the Google Drive links from those 18 pages.
Any tutorials or resources on how I might do that? I find the LinkCrawler confusing; I can't even see how to set the 'depth' of crawling!
Thanks
  #4  
Old 20.01.2020, 14:46
raztoki's Avatar
raztoki raztoki is offline
English Supporter
 
Join Date: Apr 2010
Location: Australia
Posts: 16,524
Default

There are no tutorials; I did make a video, though I never published it.
We have a getting-started guide for setting up the IDE; after that you can study existing (simple) site plugins to see how things work and flow.

LinkCrawler rules are for simple tasks. A decrypter plugin is far more powerful; it just requires some time to get things set up and to understand how everything works.
__________________
raztoki @ jDownloader reporter/developer
http://svn.jdownloader.org/users/170

Don't fight the system, use it to your advantage. :]
  #5  
Old 22.01.2020, 12:54
ravidhyena ravidhyena is offline
Modem User
 
Join Date: Jan 2020
Posts: 3
Default

Thanks for the advice.
Sadly this looks way too complicated for a novice like me.
Back to manual clicking, or maybe macro writing, for me.
  #6  
Old 22.01.2020, 13:13
tony2long's Avatar
tony2long tony2long is offline
English Supporter
 
Join Date: Jun 2009
Posts: 6,296
Default

If you give a real example, maybe we can give a real solution — provided everything is in the source HTML and not hidden.
__________________
FAQ: How to upload a Log