@mgpai thanks a lot, but I have already started working on it as two scripts: one for finished downloads, where I use the "A Download Stopped" trigger and check "myDownloadLink.isFinished();", and another for new links using the "A new link has been added" trigger. It is much more optimised, performance-friendly and faster than trying to cram everything into a single script with lots of API calls, though I already feel bad that I have to go through the same array twice xD.
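Roughly, the two scripts look like this (just a sketch of the structure, not the final code; the helper functions and most property names here are placeholders for whatever the EventScripter really exposes):

```javascript
// Script 1 – trigger: "A Download Stopped"
// Append the URL of every finished download to the history file(s).
if (link.isFinished()) {                              // 'link' is the download the event fired for
    var url = normalizeUrl(link.getContentURL());     // getContentURL() is an assumption on my side
    appendUrlToHistory(url);                          // hypothetical helper: write to the current history file
}

// Script 2 – trigger: "A new link has been added"
// Skip the link if its URL is already recorded in the history.
var url = normalizeUrl(crawledLink.getContentURL());  // 'crawledLink' is a placeholder name too
if (isUrlInHistory(url)) {                            // hypothetical helper: search the history file(s)
    crawledLink.setEnabled(false);                    // or however duplicates end up being flagged
}
```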
I am busy with a lot of things and my daily rhythm does not leave me much free time, so the script(s) progress slowly, but they are still one of my priorities.
By the way, how many URLs per file should I put by default to optimise both memory and performance, considering that if the URL is found in the first file(s) it pays off to have "many" small files, but it hurts if the download turns out to be in the last one (also knowing that I will first compare against the downloads already in JD)?
Also, my current code removes the "http(s)://" and the "www." from the link when they are present, because I thought there could be cases where one file was downloaded through "http" and another through "https", and since some websites accept both with and without "www", it should also reduce the number of different URLs pointing to the same address (see the snippet below). Is that a good idea, or should I drop it or make it optional?
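For reference, the stripping itself is nothing more than this (a small sketch of the normalizeUrl helper used above; plain JavaScript, nothing EventScripter-specific):

```javascript
// Strip the scheme and a leading "www." so that http/https and
// www/non-www variants of the same address compare as equal.
function normalizeUrl(url) {
    return url.replace(/^https?:\/\//i, "")   // drop "http://" or "https://"
              .replace(/^www\./i, "");        // drop a leading "www."
}

// e.g. "https://www.example.com/file.zip" and "http://example.com/file.zip"
// both become "example.com/file.zip"
```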