Thread: [Developer Feedback required] Too Many Links
  #3  
27.04.2021, 02:08
TomArrow
JD Alpha
 
Join Date: Sep 2017
Posts: 21

After years of using jDownloader2, and even after a CPU upgrade, I have to say that this software needs a serious refactoring, in my opinion.

jDownloader should never run out of memory just because one has too many links. There is no need to keep all the links in memory. Indexed databases like MySQL can handle millions of entries without breaking a sweat, but this software becomes unusable with a few tens of thousands.

Yesterday I accidentally copied some strange link into the clipboard. Today I woke up to a completely unresponsive jDownloader2 eating 99% of my CPU and sitting at its own RAM limit. It would not respond to any of my commands. I did your trick and increased the RAM limit to 8 GB, and now it eats about 7.5 GB of memory and is still eating a *lot* of CPU.

Now I can see that it crawled about 10 thousand new links overnight. 10 thousand! And that was enough to completely kill the software.

And when I tried to copy those new links into a new package, BAM, it completely froze the entire software again. This is such a joke.

Over the years I've had to regularly clear the history almost completely just to keep the software functioning, which of course is a garbage solution, because then it can no longer recognize what was already downloaded, as OP pointed out.

I once peered into the save files and it's all zipped JSON. Of course there's no way to quickly access individual entries in huge zipped JSON files; they have to be decompressed and parsed in full every time. No wonder it's slow as hell.

Please, for the love of god, consider revamping this whole concept of how things are saved. It is, and has been for years, the one big blemish on this software.

I don't know the source code, but I don't see a reason why all the entries couldn't be stored in an indexed database and read on demand. Some form of SQL is an established solution that should do the trick. All it needs is an index to find things, and a well-designed index doesn't eat up much memory.
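Just to illustrate the idea, here is a rough sketch of what I mean. The table and column names are made up by me and have nothing to do with JDownloader's actual data model; it just assumes an embedded SQLite database accessed over JDBC (the org.xerial sqlite-jdbc driver on the classpath):

[CODE]
import java.sql.*;

public class LinkIndexSketch {
    public static void main(String[] args) throws SQLException {
        // One file-based database instead of zipped JSON blobs.
        try (Connection db = DriverManager.getConnection("jdbc:sqlite:linkcollector.db")) {
            try (Statement st = db.createStatement()) {
                st.execute("CREATE TABLE IF NOT EXISTS links ("
                        + " id INTEGER PRIMARY KEY,"
                        + " url TEXT NOT NULL,"
                        + " package_id INTEGER,"
                        + " status TEXT)");
                // The index is what keeps lookups fast even with millions of rows;
                // only the index pages a query touches need to be in memory.
                st.execute("CREATE UNIQUE INDEX IF NOT EXISTS idx_links_url ON links(url)");
            }

            // A duplicate check ("was this already downloaded?") becomes a single
            // indexed lookup instead of loading the whole link list into RAM.
            try (PreparedStatement ps =
                    db.prepareStatement("SELECT status FROM links WHERE url = ?")) {
                ps.setString(1, "https://example.com/some/file.bin");
                try (ResultSet rs = ps.executeQuery()) {
                    System.out.println(rs.next() ? "known: " + rs.getString(1) : "new link");
                }
            }
        }
    }
}
[/CODE]

SQLite in particular is embeddable, so it wouldn't even require a separate database server, and the link list could grow far beyond what fits in RAM.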

It's just not acceptable for software whose only job is to run 24/7 in the background and download files to eat up a lot of CPU, sometimes 100%, and incredible amounts of RAM, making the entire computer sluggish. And mind you, even with some tricks pulled to make it at least responsive again, you often end up in situations like changing the download folder of a single link and then waiting a minute for the GUI to unfreeze. This can't be the way...

After writing all this, I am *still* struggling to get rid of the new links, having now increased the memory limit to 16 GB, which is all the RAM I have.