JDownloader Community - Appwork GmbH
 

#1 | 18.09.2022, 01:33
I3ordo (Mega Loader | Join Date: Mar 2022 | Posts: 65)

Adding links from the clipboard via cmd line or something else...

Hi!
Wondering: is it possible to paste the clipboard contents (one download link per line) and make JD accept and start those downloads through a Windows .bat file? For example: c:\path\jdownloader.exe -getclipboard -startdownloads &

Alternatively, can we command JD to grab the clipboard and start downloads on startup via the Event Scripter?

Longer story:
I am trying to fully automate my daily download procedure via Windows Task Scheduler.

First, I got help from a friend: his PowerShell script goes to a specified website and copies all the necessary links to the clipboard. Then I manually activate JD, press Ctrl+V, hit yes, and hit "Start all downloads".

We could not figure out how to make JD receive the clipboard (which contains those links), answer the LinkGrabber prompt with "yes", and start those downloads.

The clipboard's contents are like this:
somesite.com/some_article_01
somesite.com/some_article_02
somesite.com/some_article_03...
#2 | 18.09.2022, 15:30
Jiaz (JD Manager | Join Date: Mar 2009 | Location: Germany | Posts: 79,339)

@I3ordo: Please check Folderwatch/Crawljobs, see https://support.jdownloader.org/Know...ch-basic-usage
That way you can create crawljob files that contain links and customize things like the package name or download directory. You can then add those crawljob files manually or place them in a specific folder, and JDownloader will pick them up automatically; a sketch of such a file follows below.
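For illustration, a minimal sketch of a crawljob file in its JSON flavor. The field names follow the Folderwatch knowledge-base article linked above; the URL, package name, and folder are placeholders to adapt to your own setup:

Code:
[ {
  "text"           : "https://somesite.com/some_article_01",
  "packageName"    : "daily batch",
  "downloadFolder" : "C:\\Downloads\\daily",
  "autoStart"      : true,
  "autoConfirm"    : true,
  "enabled"        : true
} ]
Save this with a .crawljob extension into the watched folder and Folderwatch should import it.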

Why query/paste the clipboard at all? JDownloader automatically picks up links once they are copied to the clipboard, so just let JDownloader run in the background and tell it to auto-add/auto-start new links.
__________________
JD-Dev & Server-Admin
#3 | 18.09.2022, 15:33
Jiaz (JD Manager)

Quote: Originally Posted by I3ordo
Longer story:
I am trying to fully automate my daily download procedure via Windows Task Scheduler.

First, I got help from a friend: his PowerShell script goes to a specified website and copies all the necessary links to the clipboard. Then I manually activate JD, press Ctrl+V, hit yes, and hit "Start all downloads".
This is the best use case for crawljob files: your script checks the website for changes/new links, then creates a crawljob file, and JDownloader's Folderwatch extension will do the rest (a sketch of that follows below).
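To make that concrete, a hedged sketch of how the PowerShell script posted later in this thread could write a .crawljob file instead of filling the clipboard. The watch-folder path and the autoStart/autoConfirm fields are assumptions to adapt:

Code:
# Assumes $export.href holds the collected links (see the script below)
# and that Folderwatch is configured to watch this folder.
$jobs = $export.href | ForEach-Object {
    @{ text = $_; autoStart = $true; autoConfirm = $true }
}
ConvertTo-Json -InputObject @($jobs) |
    Set-Content "C:\temp\Jdownloader\folderwatch\daily.crawljob"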
__________________
JD-Dev & Server-Admin
#4 | 18.09.2022, 15:35
Jiaz (JD Manager)

Quote: Originally Posted by I3ordo
We could not figure out how to make JD receive the clipboard (which contains those links), answer the LinkGrabber prompt with "yes", and start those downloads.

The clipboard's contents are like this:
somesite.com/some_article_01
somesite.com/some_article_02
somesite.com/some_article_03...
Most likely because those links are not supported by any plugin and require deep-decrypt mode. You can easily solve this by creating LinkCrawler rule(s), see https://support.jdownloader.org/Know...kcrawler-rules
That way you teach JDownloader how to handle unsupported URLs and which content you're interested in; a minimal skeleton of such a rule follows below.
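For orientation, a hedged skeleton of such a DEEPDECRYPT rule. The field names match the full rules posted later in this thread; the pattern itself is a placeholder built from the example links above:

Code:
[ {
  "enabled"         : true,
  "name"            : "example: deep-crawl somesite.com articles",
  "pattern"         : "https?://(www\\.)?somesite\\.com/some_article_.+",
  "rule"            : "DEEPDECRYPT",
  "maxDecryptDepth" : 1
} ]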
__________________
JD-Dev & Server-Admin
#5 | 18.09.2022, 15:36
Jiaz (JD Manager)

In case you need further help with this or have questions, just ask.
__________________
JD-Dev & Server-Admin
#6 | 18.09.2022, 18:31
I3ordo (Mega Loader)

Well, I already have this PowerShell code that gets the links:
Code:
$Pages = 3          # page number at which the crawl stops
#$Pages = Read-Host "What page number would you like this to stop at?"  # asks on every run

# Keep the first line for a fixed value, or comment it out and uncomment
# the Read-Host line to be asked on every run.

$export  = @()
$PageInt = 1
while ($PageInt -le $Pages) {
    $link  = "**External links are only visible to Support Staff**" + $PageInt
    $links = (Invoke-WebRequest -Uri $link).Links | Select-Object href
    $export += $links | Where-Object {
        $_.href -like "**External links are only visible to Support Staff**" -and
        $_.href -notlike "**External links are only visible to Support Staff**" -and
        $_.href -ne "**External links are only visible to Support Staff**" -and
        $_.href -ne "**External links are only visible to Support Staff**" -and
        $_.href -notlike "*#more*" -and
        $_.href -notlike "*#respond*" -and
        $_.href -notlike "*#comments*"
    } | Select-Object href
    $PageInt++
}
Set-Clipboard $export.href

C:\temp\Jdownloader\Jdownloader2.exe   # change this path to where your JDownloader2.exe sits
After that the links are on the clipboard; I just need something in JD to start deep-crawling them and autostart the downloads.

I already paid some $ for the code above; I could pay a bit more, but I can't find anyone offering JD coding services on Fiverr yet...
#7 | 19.09.2022, 11:33
Jiaz (JD Manager)

@I3ordo: Please see here for a LinkCrawler rule for your links:
https://board.jdownloader.org/showth...t=down3dmodels
You just have to add those rules to your JDownloader and it will be able to pick up those links from the clipboard. Just let JDownloader run in the background. It's important that JDownloader is running before you add something to the clipboard, otherwise it will not be picked up.
__________________
JD-Dev & Server-Admin
#8 | 19.09.2022, 11:33
Jiaz (JD Manager)

Quote: Originally Posted by I3ordo
I already paid some $ for the code above; I could pay a bit more, but I can't find anyone offering JD coding services on Fiverr yet...
I hope not too much $, because you could have easily achieved the same thing natively in JDownloader via LinkCrawler rules.
__________________
JD-Dev & Server-Admin
#9 | 20.09.2022, 17:50
I3ordo (Mega Loader)

Well, I was stuck at this part:

"2. Ask for help in our EventScripter thread:
Ask for a script that will auto-add a specified URL every X hours, and also ask what you can do to avoid duplicates when doing that."...
#10 | 20.09.2022, 17:53
Jiaz (JD Manager)

@I3ordo: Your script should do fine once you have the LinkCrawler rule in place, so JDownloader knows how to handle those URLs.
Have you checked the content of the clipboard after execution of the script, to confirm it matches your expectations? A quick way to check is sketched below.
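As a purely illustrative check (Get-Clipboard ships with PowerShell 5+):

Code:
# Run the collector script first, then dump exactly what landed on the
# clipboard, one entry per line.
$lines = Get-Clipboard
"Clipboard holds {0} line(s):" -f $lines.Count
$lines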
__________________
JD-Dev & Server-Admin
#11 | 23.09.2022, 02:33
I3ordo (Mega Loader)

Hi, I generally try to squeeze in some JD knowledge during the breaks I take when I am not modeling and rendering stuff, and I am sure you have a lot going on too.
Anyway, I just wanted to say I appreciate it greatly. Thanks for always responding to my questions.

The PowerShell script I ordered on Fiverr works as expected: the clipboard is populated with article links, one per line. Then all I have to do is manually add the links from the clipboard to the LinkGrabber (Ctrl+V) and have JD autostart the downloads. Archive files and images are then automatically renamed and assigned to a separate folder by JD's Packagizer rules.

At the moment, I just want JD to grab the clipboard links into the LinkGrabber by itself; that's why I asked for a command line or a JD script that grabs the clipboard and initiates the downloads without human interaction.

#12 | 23.09.2022, 15:59
I3ordo (Mega Loader)

A short video explanation of the current situation:
**External links are only visible to Support Staff**
#13 | 24.09.2022, 02:07
pspzockerscene (Community Manager | Join Date: Mar 2009 | Location: Deutschland | Posts: 71,140)

I'm sorry, I haven't watched your video 100%, but referring to the end where you say "I can't code" while you have the forum thread containing the LinkCrawler rules Jiaz mentioned open:
You do not need to be able to code in order to use those rules, but you do need those rules so that JD can automatically handle those URLs when they get copied to your clipboard.

1. Quickly read our support article about LinkCrawler rules.
2. Go to this thread, grab those two rules, add them to JD (see the pointer below on where the rules go), and afterwards JD should handle the links shown in your video whenever you copy them and clipboard observation is active.
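As a hedged pointer for step 2: LinkCrawler rules live in Settings -> Advanced Settings under the property LinkCrawler.linkcrawlerrules, whose value is a single JSON array; both rules from the linked thread go into that one array, schematically:

Code:
[ { ...rule 1 from the linked thread... }, { ...rule 2 from the linked thread... } ]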

Also, ignore everything that was posted about the "EventScripter" in the linked thread:
That user wanted JD to auto-re-scan a specific link every X hours to find new content. This is not what you want, so you can ignore all the info we gave him in reply to that specific question.
__________________
JD Supporter, Plugin Dev. & Community Manager

#14 | 25.09.2022, 14:11
Jiaz (JD Manager)

Quote: Originally Posted by pspzockerscene
Also, ignore everything that was posted about the "EventScripter" in the linked thread:
That user wanted JD to auto-re-scan a specific link every X hours to find new content. This is not what you want, so you can ignore all the info we gave him in reply to that specific question.
@pspzockerscene: it IS the same user @I3ordo
__________________
JD-Dev & Server-Admin
#15 | 25.09.2022, 14:14
Jiaz (JD Manager)

Quote: Originally Posted by I3ordo
(Ctrl+V) and have JD autostart the downloads.
You have to Ctrl+V (paste) because JDownloader doesn't support these URLs out of the box, but we already provided you with working LinkCrawler rules here: https://board.jdownloader.org/showpo...69&postcount=6


Quote: Originally Posted by I3ordo
At the moment, I just want JD to grab the clipboard links into the LinkGrabber by itself; that's why I asked for a command line or a JD script that grabs the clipboard and initiates the downloads without human interaction.
No need for any command line or anything else. You just have to set up the already provided LinkCrawler rules, and JDownloader will be able to automatically pick up and process those links from the clipboard.
__________________
JD-Dev & Server-Admin
#16 | 25.09.2022, 14:59
I3ordo (Mega Loader)

Quote: Originally Posted by Jiaz
@pspzockerscene: it IS the same user @I3ordo

Yep, that was me.

Also, I pasted the LinkCrawler rule and already had clipboard monitoring active...


That rule:
Code:
[
 {
  "cookies"            : null,
  "deepPattern"        : "<link>(https?://[^<]+)</link>",
  "formPattern"        : null,
  "id"                 : 1664062694270,
  "maxDecryptDepth"    : 1,
  "name"               : "crawl all single item URLs from 'down3dmodels.com/feed/'",
  "packageNamePattern" : null,
  "passwordPattern"    : null,
  "pattern"            : "**External links are only visible to Support Staff**,
  "rewriteReplaceWith" : null,
  "rule"               : "DEEPDECRYPT",
  "enabled"            : true,
  "logging"            : false,
  "updateCookies"      : true
 },
 {
  "cookies"            : [],
  "deepPattern"        : ">(https?://(?!down3dmodels\\.com/)[^<]+)</",
  "formPattern"        : null,
  "id"                 : 1664062694271,
  "maxDecryptDepth"    : 1,
  "name"               : "crawl all URLs inside all URLs from 'down3dmodels.com' except 'down3dmodels.com/feed/'",
  "packageNamePattern" : "<title>(.*?)</title>",
  "passwordPattern"    : null,
  "pattern"            : "**External links are only visible to Support Staff**,
  "rewriteReplaceWith" : null,
  "rule"               : "DEEPDECRYPT",
  "enabled"            : true,
  "logging"            : false,
  "updateCookies"      : true
 }
]
It grabs only the rar/archive files but leaves out the image of each article, so it's not the same as having JD Ctrl+V the clipboard.
My only existing Packagizer rule is the one that only works with archive files and images,
and I do the additional filtering, like host and file-type filtering, in the LinkGrabber window (only archives from rapidgator, hoster set as down3dmodels, no facebook or nitroflare links, etc.).

Can the crawler rule be modified to also include the article images? I don't see anything in pspzockerscene's rule that includes rar files and excludes everything else, it seems.

#17 | 26.09.2022, 17:00
pspzockerscene (Community Manager)

Quote: Originally Posted by Jiaz
@pspzockerscene: it IS the same user @I3ordo

Ohh, I see. Lack of concentration was clearly there :D

Quote: Originally Posted by I3ordo
It grabs only the rar/archive files but leaves out the image of each article, so it's not the same as having JD Ctrl+V the clipboard.

Sure it's not. With DEEPDECRYPT LinkCrawler rules you explicitly state which part of the page you want to crawl content from.
I've modified your rule and extended the regular expression to include imgur.com URLs.
Please keep in mind that this will fail if the article image is hosted on another website; it is of course possible to make the rule more dynamic if needed.
Please also keep in mind that I recommend learning regular expressions so you can work on such rules yourself in the future.
Our support article links tools you can use to test/learn regular expressions:
https://support.jdownloader.org/Know...kcrawler-rules

Quote: Originally Posted by I3ordo
Can the crawler rule be modified to also include the article images? I don't see anything in pspzockerscene's rule that includes rar files and excludes everything else, it seems.

You haven't understood how those rules work yet.
Rules of type DEEPDECRYPT look for certain things inside the website's HTML code.
What to look for is up to you, and regular expressions are used to find/filter those things.

Modified LinkCrawler rule(s):
Code:
[
  {
    "deepPattern": "<link>(https?://[^<]+)</link>",
    "maxDecryptDepth": 1,
    "name": "crawl all single item URLs from 'down3dmodels.com/feed/'",
    "packageNamePattern": null,
    "pattern": "**External links are only visible to Support Staff**,
    "rewriteReplaceWith": null,
    "rule": "DEEPDECRYPT",
    "enabled": true,
    "logging": false,
    "updateCookies": true
  },
  {
    "cookies": [],
    "deepPattern": "((https?://i\\.imgur\\.com/[A-Za-z0-9]+\\.[A-Za-z]+)|>(https?://(?!down3dmodels\\.com/)[^<]+)</)",
    "maxDecryptDepth": 1,
    "name": "crawl all URLs inside all URLs from 'down3dmodels.com' except 'down3dmodels.com/feed/'",
    "packageNamePattern": "<title>(.*?)</title>",
    "pattern": "**External links are only visible to Support Staff**,
    "rewriteReplaceWith": null,
    "rule": "DEEPDECRYPT",
    "enabled": true,
    "logging": false,
    "updateCookies": true
  }
]
Rule as plaintext:
pastebin.com/raw/SHAH8dhn
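As a hedged aside on testing such patterns yourself: you can dry-run a deepPattern against a saved copy of an article page in PowerShell before putting it into a rule. The file path and pattern below are just examples:

Code:
# Save an article page as HTML, then list everything the pattern matches.
$html    = Get-Content -Raw "C:\temp\article.html"
$pattern = 'https?://i\.imgur\.com/[A-Za-z0-9]+\.[A-Za-z]+'
[regex]::Matches($html, $pattern) | ForEach-Object { $_.Value }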
__________________
JD Supporter, Plugin Dev. & Community Manager

#18 | 27.09.2022, 06:35
I3ordo (Mega Loader)

@pspzockerscene

Hey, this is great. Now, seeing the added rule, I can make sense of how to build one myself; I got familiar with regex (with the help of regexr.com while working on the Packagizer rule).

Now I can see that the downloads are grabbed and waiting to be confirmed.

Then I revisited my Packagizer rule for d3d and ticked the "auto confirm" option.

Now it is fully automated. Super happy now!

Checking the rule that you modified: it just initiates the deep-crawling process upon a regex match, and the other crucial part is done by my Packagizer rule and filtering...

Funny how things become much more evident once the solution arrives.

I was spoon-fed again, but it was an educational spoon in the end!

Thank you very much, I am thrilled to have turned this repetitive task into an automation. @pspzockerscene @Jiaz
#19 | 27.09.2022, 15:10
pspzockerscene (Community Manager)

Thanks for your feedback, I'm glad it's working for you now.

I'll mark this thread as "Solved" now, though it will remain open in case you have further questions.
__________________
JD Supporter, Plugin Dev. & Community Manager

#20 | 27.09.2022, 15:59
Jiaz (JD Manager)

@I3ordo: Thanks for your feedback! And sorry for the long road and all the ping-pong.
__________________
JD-Dev & Server-Admin