JDownloader Community - Appwork GmbH
 

#1
28.12.2021, 06:49
JTK
Super Loader
Join Date: Jun 2021
Posts: 26

twitter LinkCrawler: option to merge packages and stop when reaching already-downloaded files

I have two questions about using the LinkCrawler with already-downloaded URLs (e.g. crawling the same Twitter user's profile again).

1.
Can it be set to stop crawling once it reaches already-downloaded files instead of continuing, so that only new files are picked up, which would be faster?

I fear the website will flag me if I crawl lots of URLs, so for now I add one URL at a time, hit abort as soon as I see it reach already-downloaded files, then repeat the same process with the next URL, and so on.

I understand that it keeps crawling so that files missed on the first pass can be picked up on a later pass, so an option to turn this on and off would be great. Or maybe just an option to select how many files to crawl.

2.
Can a new package from the LinkCrawler merge with an already-downloaded package of the same name? If so, what happens when the destination folders of the two differ? Would the package stay in the old folder or move to the new one?
Example:
package name: test, destination folder: /test/
package name: test, destination folder: /test2021/
#2
28.12.2021, 15:16
Jiaz
JD Manager
Join Date: Mar 2009
Location: Germany
Posts: 79,560

@JTK: Unfortunately that's not possible, because neither the LinkCrawler nor the plugins support this.
Which host/site exactly do you have in mind? Maybe we can add special plugin support, e.g. to crawl only the last x items/posts or the last y days.
__________________
JD-Dev & Server-Admin
#3
28.12.2021, 15:17
Jiaz

Quote:
Originally Posted by JTK View Post
Can new package from the LinkCrawler with the same name merge with the already download package?
That's also not yet supported, and if we add optional support for it, it will only merge packages whose name and destination folder are both the same.
#4
28.12.2021, 15:18
Jiaz

Maybe you can describe your use case / what you're trying to achieve? Then we can understand it better and maybe find a better solution.
#5
29.12.2021, 11:35
JTK

Quote:
Originally Posted by Jiaz View Post
@JTK: unfortunately that's not possible because neither LinkCrawler nor Plugins do support this.
what host/site exactly do you have in mind? maybe we can add special plugin support to eg crawl only last x items/posts or last y time/days.
I'm using it with twitter.com.

Example:
I let it crawl to completion the first time. After that I crawl every 1-4 weeks, and some users have fewer than 50 new files in that span.

So when I crawl for updates, I wait until the counter for new files (which doesn't count already-downloaded files) stops, let it run a few ticks more, and then abort the crawl.

An option to crawl only the last x items/posts or the last y days, or both, like you said, would be great. :)
#6
29.12.2021, 11:56
Jiaz

@JTK: Thanks for the feedback, understood. I've created a ticket for this and will discuss it with pspzockerscene when he returns from his holidays next year.
#7
10.01.2022, 15:13
pspzockerscene
Community Manager
Join Date: Mar 2009
Location: Germany
Posts: 71,117

I've added the first "abort on user condition" feature to our twitter.com plugin.
Usage:
When adding user profiles such as "twitter.com/exampleUsername", simply append e.g. "?maxitems=50" to make the crawl process stop after 50 tweets.
Keep in mind that a tweet can contain multiple items, so you will most likely still get more than 50 items!

I will also add "limit by date" support once I find the time.
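As an illustration only, here is a tiny sketch of how the new parameter could be appended to a batch of profile URLs before pasting them into the LinkGrabber. The `?maxitems=` parameter name is from this post; the helper function and profile names are made-up examples:

```python
# Sketch: append the plugin's "?maxitems" abort parameter to Twitter
# profile URLs before pasting them into JDownloader's LinkGrabber.
# Only the parameter name comes from the post above; the helper and
# profile names are hypothetical examples.

def with_max_items(profile_url: str, max_items: int) -> str:
    """Return the profile URL with the crawl limit appended as a query parameter."""
    return f"{profile_url}?maxitems={max_items}"

profiles = [
    "twitter.com/exampleUsername1",
    "twitter.com/exampleUsername2",
]

for url in profiles:
    print(with_max_items(url, 50))
```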

-psp-
__________________
JD Supporter, Plugin Dev. & Community Manager

Erste Schritte & Tutorials || JDownloader 2 Setup Download
Spoiler:

A user's JD crashes and the first thing to ask is:
Quote:
Originally Posted by Jiaz View Post
Do you have Nero installed?
Reply With Quote
#8
17.01.2022, 17:43
pspzockerscene

I've added a 2nd abort condition parameter, "max_date".

I've created a dedicated help article regarding this plugin's advanced features.

Please wait for the next CORE update!

Are you waiting for a recently announced bugfix or new feature? Updates are not always released immediately! Please read our Update FAQ!


-psp-
#9
21.01.2022, 08:37
JTK

Quote:
Originally Posted by pspzockerscene View Post
I've added the first "abort on user condition" feature to our twitter.com plugin. [...]
Quote:
Originally Posted by pspzockerscene View Post
I've added a 2nd abort condition parameter "max_date". [...]
Thank you.
I have a question: when crawling multiple users at the same time, do I have to add or change the date on each line?

e.g.
twitter.com/exampleUsername1?max_date=2022-01-10
twitter.com/exampleUsername2?max_date=2022-01-10
twitter.com/exampleUsername3?max_date=2022-01-10

Or can I just add the date at the end once for all items, like

twitter.com/exampleUsername1
twitter.com/exampleUsername2
twitter.com/exampleUsername3
?max_date=2022-01-10

That would be a lot easier, as I could change just one line when updating dozens of users weekly.
#10
21.01.2022, 13:53
pspzockerscene

Quote:
Originally Posted by JTK View Post
In case of crawling multiple user at the same time do I have to add or change the date for each line?
Yes, you have to apply your desired end date to each line.
You can easily do this with a powerful text editor such as Notepad++ (google "how to apply text to each line notepad++").

Quote:
Originally Posted by JTK View Post
which would be a lot easier as I can just change 1 line when updating dozens of user weekly.
That is not possible, because that is not how URLs work, and it is also not how our parser works.
To save time, do it as described at the beginning of this post:
keep a text document of all profile URLs you want to crawl frequently, then add the new desired date or do a "replace all" on the old date.

Please keep in mind that CORE updates are still pending, so you won't be able to use this functionality yet!
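For readers who prefer scripting over an editor, a minimal sketch of the same idea: apply one shared `?max_date=` value to every profile URL in a list, so the date only has to change in one place. The parameter name comes from the posts above; the helper, profile names, and date are hypothetical examples:

```python
# Sketch: apply the same "?max_date" end date to every profile URL.
# Only the "?max_date=" parameter name comes from the posts above;
# the helper, profile names, and date are example values.

def apply_max_date(urls, max_date):
    """Append ?max_date=<date> to each profile URL."""
    return [f"{u}?max_date={max_date}" for u in urls]

profiles = [
    "twitter.com/exampleUsername1",
    "twitter.com/exampleUsername2",
    "twitter.com/exampleUsername3",
]

# Change the date here once when updating dozens of users weekly:
for line in apply_max_date(profiles, "2022-01-10"):
    print(line)
```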

-psp-
#11
14.02.2022, 17:24
Fabrice1973
Modem User
Join Date: Feb 2022
Posts: 2

Limiting the number of downloads found (twitter.com)

Hi all,

Is there any way to limit the number of downloads found in the LinkGrabber?

I want to download pictures from various Twitter accounts but am only interested in the recent posts.

Or to abort the search after X seconds per URL?

Thanks in advance,

Fab
#12
15.02.2022, 16:27
pspzockerscene

Merged similar twitter threads.

@Fabrice1973
Please read this thread and decide whether the added, but not yet released, features will be enough for you.

-psp-
#13
15.02.2022, 18:20
Fabrice1973

Thank you @pspzockerscene !

That will do for me.
#14
16.02.2022, 20:30
pspzockerscene

I'm ready for your feedback on this once our next CORE updates go out.

-psp-
#15
25.02.2022, 16:18
pspzockerscene

CORE updates have been released!
Please update your JDownloader and report any issues you find asap.

-psp-