#2201
Script for exporting the result of 'Copy information' to a CSV file?
When you export data using 'Copy information' in the context menu, you can define what you want to copy to the clipboard.
E.g.: {type};{name};{filesize};{path};{url}
Note: Using TAB instead of ';' as separator can be helpful when you are dealing with file or folder names that contain ';'. You can then paste the clipboard contents e.g. into Excel.
Jiaz recommended that I ask here whether a script for that has already been written or could be written. And, as mentioned above, the separator should be TAB, because ',' and also ';' can be part of file or folder names, which would mess up the Excel sheet.
#2202
As I am really bad at programming ...
Dunno if someone already posted this here ... but does anyone have an EventScripter script to disable the CRC check for any non-archive files? So CRC only for zipped/rared files, where a CRC error would prevent unpacking? Thx
#2203
@StefanM: In case you want to experiment by yourself, see
https://board.jdownloader.org/showpo...&postcount=102
https://board.jdownloader.org/showpo...&postcount=794
https://board.jdownloader.org/showpo...6&postcount=19
__________________
JD-Dev & Server-Admin |
#2204
@Mydgard: Why do you want to disable CRC for non-archives? Why disable it at all?
I'm sorry, but currently it's not possible to disable it on a *per file* basis.
__________________
JD-Dev & Server-Admin |
#2205
Quote:
And I want to do that because my download HDD is at 100% while CRC-checking all those Rapidgator links ...
#2206
Code:
/*
    Remove hashinfo from non-archive files
    Trigger: Packagizer Hook
*/
!link.crawledLink.archive && link.setProperty("HASHINFO", null);
Code:
/*
    Disable hashcheck
    Trigger: Packagizer Hook
*/
state == "BEFORE" && linkcheckDone && !link.crawledLink.archive && link.setProperty("ALLOW_HASHCHECK", false);
Last edited by mgpai; 04.06.2022 at 15:04. Reason: Script replaced.
#2207
@Mydgard: Thanks for the feedback. He is right, I had to check the source. There is a property, "ALLOW_HASHCHECK", that can be set to false so that no hash check is done on the file.
Better wait for mgpai to help with a script that sets this property, e.g. a script that sets it after the link check. But it's okay that the HDD is working / has high IO load during the check, because the whole file has to be read for checking. An HDD is meant to be used and is not designed to have as few reads as possible.
__________________
JD-Dev & Server-Admin |
#2208
@mgpai: Don't change the HASHINFO property, as it might get updated/set again during normal plugin work. Better set the named property instead, see my previous post.
__________________
JD-Dev & Server-Admin |
#2209
@mgpai / Jiaz:
Thx, tried this:
Code:
/*
    Remove hashinfo from non-archive files
    Trigger: Packagizer Hook
*/
state == "BEFORE" && linkcheckDone && !link.crawledLink.archive && link.setProperty("ALLOW_HASHCHECK", false);
The trigger is "Hook für Paketverwalter" (Packagizer Hook). Started an RG link ... after the download a CRC check still occurs ... So either the script isn't working, I made a mistake, or the new script only works for newly added RG links? That would be bad, as I have a huge amount of RG links in JD2. Thx
#2210
@Mydgard: The rule is only applied by the *Packagizer*, i.e. while new links are being added. In case you want to modify existing links in the list, you will need a different script that loops through all links and sets this property for all non-archive links. Please wait for mgpai for further help with this.
__________________
JD-Dev & Server-Admin |
#2211
Ah okay, thx. Helpful for new links as well, yes, but mainly needed for existing links ...
#2212
@Mydgard: I'm sure mgpai will be able to write a wonderful script for you
__________________
JD-Dev & Server-Admin |
#2213
Quote:
Code:
/*
    Disable hashcheck for non-archive files
    Trigger : None
*/
getAllDownloadLinks().forEach(function(link) {
    !link.finished && link.hashInfo && !link.archive && link.setProperty("ALLOW_HASHCHECK", false);
})
getAllCrawledLinks().forEach(function(link) {
    link.hashInfo && !link.archive && link.setProperty("ALLOW_HASHCHECK", false);
})
#2214
@mgpai: Maybe we add a new trigger type (manual) with a *run* button next to it? Your opinion on this?
Or all *manual* scripts are automatically available in the right-click context menu: a new type "context menu entry", so it automatically adds itself to the context menu?
__________________
JD-Dev & Server-Admin Last edited by Jiaz; 04.06.2022 at 15:50. |
#2215
Quote:
This thread has several scripts which can be used to format/export data. @Jiaz has listed some of them. You can also use scripts to perform dupe checks from within JD instead of exporting and comparing the data manually.
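For example, a minimal sketch (Trigger: None) that reports download list entries sharing the same file name and size. Note that matching on name + size is only an assumption here; the condition may need to be adjusted for your list.
Code:
/*
    Find possible duplicates in the download list (sketch)
    Trigger : None
*/
var seen = {};
var dupes = [];
getAllDownloadLinks().forEach(function(link) {
    // key is name + size; adjust if this is too strict or too loose
    var key = link.name + "_" + link.bytesTotal;
    if (seen[key]) {
        dupes.push(link.name);
    } else {
        seen[key] = true;
    }
});
alert(dupes.length + " possible duplicate(s) found:\r\n" + dupes.join("\r\n"));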
#2216
@mgpai: Will I get an end message or something similar? I currently have 10877 links in JD2 ...
Not sure how long it will take to run?
#2217
A button outside of the editor may not be needed, as such scripts are (mostly) run only once, at the time they are added.
#2218
Quote:
I just wanted to point out to other readers here that, if they do it that way, they should prefer using TABs. And of course, any script should use TABs as well. This is what I wanted to say.
#2219
Quote:
You can use this to get the count of links for which hashcheck has been disabled:
Code:
/*
    Show hashcheck disabled count
    Trigger : None
*/
var count = 0;
getAllDownloadLinks().forEach(function(link) {
    link.getProperty("ALLOW_HASHCHECK") == false && count++;
})
getAllCrawledLinks().forEach(function(link) {
    link.getProperty("ALLOW_HASHCHECK") == false && count++;
})
alert("Hash check is disabled for " + count + " links");
#2220
It's clear now.
#2221
Quote:
You can use this to get the count of links for which hashcheck has been disabled
4657 links ... Tested with a small file, no CRC check occurred ... so it worked. Thanx again. Last edited by Mydgard; 04.06.2022 at 17:08.
#2222
I have tested it only on a list with a few links, so I am not sure how long it will take for 10k+ links. How many of those are unfinished links? If in doubt, I can modify the script to display a message at the end. You will have to restart JD (to abort the current script, if it is still running), then add the new script and run it again.
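In case it is needed, a sketch of that modification (same conditions and property as the script above, just with a counter and a message at the end):
Code:
/*
    Disable hashcheck for non-archive files, with a message when done (sketch)
    Trigger : None
*/
var count = 0;
getAllDownloadLinks().forEach(function(link) {
    if (!link.finished && link.hashInfo && !link.archive) {
        link.setProperty("ALLOW_HASHCHECK", false);
        count++;
    }
})
getAllCrawledLinks().forEach(function(link) {
    if (link.hashInfo && !link.archive) {
        link.setProperty("ALLOW_HASHCHECK", false);
        count++;
    }
})
alert("Done. Hash check disabled for " + count + " links.");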
#2223
#2224
Code:
/*
    Export link info
    Trigger : Downloadlist contextmenu button pressed
    Customize downloadlist context menu > Add a new "Eventscripter Trigger" button > Rename it to "Export link info"
*/
if (name == "Export link info") {
    var file = JD_HOME + "/" + Date.now() + ".tsv";
    var data = [];
    dlSelection.links.forEach(function(link) {
        data.push([getPath(link.downloadPath).extension, link.name, link.bytesLoaded, link.downloadPath, link.url].join("\t"));
    })
    if (data.length) {
        writeFile(file, data.join("\r\n"), true);
    }
}
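A note on the above: the file is written to the JDownloader installation folder (JD_HOME) with a .tsv extension and TAB-separated columns, so it can be opened directly in Excel or LibreOffice with one column per field, matching the TAB separator suggested earlier in this thread.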
#2225
@mgpai: All unfinished ... why should I keep finished links in JD2? Before you ask: yes, all those links together are around 22.91 TB, space I don't currently have ... some of them are somewhat older. Approx. 17.5 TB are RG links ...
Last edited by Mydgard; 04.06.2022 at 17:36.
#2226
Asked because many users leave finished links in the list to know if a file has been downloaded before and avoid downloading it again. There are users who have millions (not exaggerating) of finished links in the download list.
#2227
Okay? One page ago you told someone he could use a dupe check on existing links; I tried to google that, but ... even though the dupe check in the LinkGrabber is active, I have dupes in my list ... several times I downloaded a file and somewhere in the list it shows "mirror is loading", exact same name, same file size ... dunno why JD2 didn't show the new link in red to tell me it is already in the list ...
Any chance to give me a ready-made script to find those dupes?
#2228
Hi mgpai,
I'm running my JD on a remote VPS as a download station. I SSH into the VPS and upload the downloaded files to a transfer site called oshi.at using the command
Code:
curl -T /path/to/file **External links are only visible to Support Staff**
Can an EventScripter script run this automatically after a download finishes? Thank you.
#2229
@slcf2003: You should also explain what should happen afterwards, for example what to do with the response; most likely it will contain a download URL?
Yes you can, see/search this thread, there are many examples of how to execute an external tool/script.
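For example, a rough sketch along those lines (Trigger: "A Download Stopped"). It assumes curl is available on the VPS and that https://oshi.at is the upload endpoint, and it leaves open what to do with the response:
Code:
/*
    Upload a finished download to oshi.at via curl (rough sketch)
    Trigger : A Download Stopped
*/
if (link.finished) {
    // callSync runs the external command and returns its console output
    var response = callSync("curl", "-T", link.downloadPath, "https://oshi.at");
    // the response should contain the download URL - decide what to do with it, e.g. just log it
    log(response);
}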
__________________
JD-Dev & Server-Admin |
#2230
Quote:
Quote:
I'm sure mgpai can help with this, but you will have to decide on the condition, because the filename alone is unsafe. You also have to decide if you only want to check against safe or unsafe filename/filesize. Safe = JDownloader will use that filename/filesize for downloading (e.g. information provided by an API), while unsafe means that the filename or size can still change. Maybe you should try to do a TeamViewer session with mgpai (or us) and explain what you want to achieve, so he/we can take a look at the *situation* and provide better help!
__________________
JD-Dev & Server-Admin Last edited by Jiaz; 05.06.2022 at 15:15. |
#2231
Quote:
#2232
Parse a URL from a board that requires login
Hi! I am trying to make a script to let JD search for direct links on a board where you need to be logged in to view those direct links, but I can't find an example ... Could you help me please? Thanks
#2234
@Tomrich: Close JDownloader / kill the running process, and then the same command line can be used.
Quote:
else contact us via support@jdownloader.org and we can take a look at it together
__________________
JD-Dev & Server-Admin |
#2235
Script Request - Split files larger than x_size and move them to a folder
1. Download files (extensions are usually .rar, .zip, .exe, .dmg, .apk)
2. Download finished (extraction of .rar or .zip is not required)
3. If the file size is smaller than 999 MB, move it to folder A
4. Else, split the file into parts of a certain size, usually 999 MB parts (Size999.part1.rar, Size999.part2.rar)
5. Move the split files to folder A
Last edited by xefeg; 09.06.2022 at 09:18.
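A rough sketch of how such a script could look, not a finished solution: it assumes Windows, the 'rar' command line tool available in PATH, and uses "D:\FolderA" only as a placeholder target folder. Also note the warning later in the thread that splitting parts of multipart archives will likely break manual extraction.
Code:
/*
    Split finished files larger than 999 MB and move them to folder A (rough sketch)
    Trigger : A Download Stopped
*/
var limit = 999 * 1024 * 1024;   // 999 MB in bytes
var folderA = "D:\\FolderA";     // placeholder target folder
if (link.finished) {
    var path = link.downloadPath;
    if (link.bytesLoaded < limit) {
        // small file: just move it (Windows command; use "mv" on Linux/macOS)
        callSync("cmd", "/c", "move", "/Y", path, folderA);
    } else {
        // large file: store it into 999 MB .partN.rar volumes inside folder A, then remove the original
        callSync("rar", "a", "-v999m", "-m0", "-ep", folderA + "\\" + link.name + ".rar", path);
        callSync("cmd", "/c", "del", path);
    }
}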
#2236
Script - Move downloaded files from specific folder to folder A
Using the Packagizer, I sort my files based on the host they were downloaded from. Here is what I need:
1. Download from Google.com
2. Folder for Google.com is Folder A (set via Packagizer)
3. Move files from Folder A to Folder B
Solution is here, by mgpai: https://board.jdownloader.org/showpo...6&postcount=35
Last edited by xefeg; 09.06.2022 at 12:52. Reason: Request filled and link to the script
#2237
Script to add (random) delays between requests and/or downloads
@mgpai or anyone who can help
NEW REQUEST: Adding (random) delays between requests and/or downloads.
Many webpages detect too many requests/downloads in a short period of time and then block the user's IP for some time. That's why with some download managers the user can define delays between requests/downloads. If possible, a script for this would really be appreciated. A possible URS would look like this:
A) Timer definition (optional!): The user can define a fixed delay in seconds. Optional (not that important): the user can define a random delay between X and Y seconds. Note: To keep things simple, a script with a fixed delay, e.g. 2 seconds, would be fine with me.
B) (Random) delay while grabbing links: Link grabbing must be reduced to only one request at a time. After a request has been sent by JD, the 'programmed' timer will start. Once the timer has run out, the next request will be sent.
C) (Random) delay while downloading: Max. simultaneous downloads should work as usual. Let's assume the limit was set to 3: JD should start with the first download. At the same time the 'programmed' timer will start. Once the timer has run out, the second download will be started. Again the 'programmed' timer will start. Once the timer has run out, the third download will be started. Once again, the 'programmed' timer will start. If one of those three downloads has finished, a new download will not be started before the timer has run out.
PS: I filed this as a feature request first, but it was declined. At the same time pspzockerscene recommended asking for such a script here.
#2238
Quote:
A.) You can use the sleep command in scripts.
B.) @mgpai: You could use the Packagizer hook here to add a sleep/wait in the linkcrawler.
C.) @StefanM: You can add a sleep/wait before/after a download starts/finishes, but only on download/link level, NOT on request level.
@mgpai: synchronous script with dummy script
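To illustrate B.), a very rough sketch of a fixed delay in the Packagizer hook. The 2 second value is just an example, and it may need to run as a synchronous script to actually slow the crawler down; mgpai can confirm the details:
Code:
/*
    Fixed delay while links are being added (rough sketch)
    Trigger : Packagizer Hook
*/
if (state == "BEFORE") {
    sleep(2000); // wait 2 seconds; use a random value between X and Y seconds if preferred
}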
__________________
JD-Dev & Server-Admin Last edited by pspzockerscene; 09.06.2022 at 16:28. Reason: Fixed typo |
#2239
Why the need to split them? Please note that this will most likely break (manual) extraction of multipart archives when a single part is larger than the limit and gets split. For example: the first file might become Size999.part1.part1.rar, Size999.part1.part2.rar and the second file Size999.part2.rar.
__________________
JD-Dev & Server-Admin |
#2240
Quote: