JDownloader Community - Appwork GmbH
  #1  
16.10.2020, 20:26
JamConnor
Modem User
 
Join Date: Oct 2020
Posts: 1
Too Many Links

Hello, I use JDownloader 2 to download files in bulk for my website. I have accumulated about 784k completed links in the Downloads tab. I don't want to delete them because, if I add the same links to the LinkGrabber again, it tells me that I have already downloaded them and I can then delete those duplicates.

The problem is that JDownloader has become very hard to use. It runs at 100% CPU and uses most of my memory whenever I try to download new files.

Things I've tried:
- Moving all the completed links to one package named Old
- Disabling completed links
- Increasing GeneralSettings: Max Buffer Size to 2048
- Increasing GraphicalUserInterfaceSettings: Downloads Table Refresh Interval to 3000

With these changes the program is functional again, but it still uses about 85% CPU and half of my memory.

Last edited by JamConnor; 16.10.2020 at 20:30.
  #2  
18.10.2020, 10:16
Jiaz
JD Manager
 
Join Date: Mar 2009
Location: Germany
Posts: 79,286

A high number of links combined with high CPU usage sounds like JDownloader is running out of memory and spending more and more time on memory management (garbage collection). Can you provide a screenshot of the About dialog? It shows details about memory usage.

Do not increase "Max Buffer Size": that setting affects downloading only and just puts even more pressure on memory! Leave it at the default.

You can increase
DownloadController.minimumsavedelay
DownloadController.maximumsavedelay
in the Advanced Settings to reduce how often the download list is written to disk. JDownloader waits at least "minimumsavedelay" after the last change to the list before saving, and forces a save after at most "maximumsavedelay".
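For example (the values are in milliseconds; the numbers below are only an illustration, not recommended defaults, pick what fits your usage):

Code:
DownloadController.minimumsavedelay = 60000     (wait at least 60 s after the last list change)
DownloadController.maximumsavedelay = 600000    (but never postpone a save longer than 10 min)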

Check the About screenshot: if the first and middle memory numbers are close to the last number, then your JDownloader/Java is running out of memory. In that case you can raise the available memory manually. Close JD,
open the .vmoptions file in your JDownloader folder, remove any existing -Xmx and -Xms lines,
and add
-Xmx2g
to allow a maximum of 2 GB of RAM. With that many links I would recommend at least
-Xmx4g
to allow a maximum of 4 GB of RAM.
Please note that this only works with 64-bit Java, see https://jdownloader.org/jdownloader2
If you are still on a 32-bit installation: back up the complete cfg folder, reinstall JDownloader with the 64-bit version, and then restore the backup, replacing the new cfg folder completely with the old one.
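For reference, the three memory numbers should roughly correspond to what Java itself reports at runtime. A tiny standalone sketch of those values (a simplified illustration, not JDownloader's actual code):

Code:
public class MemCheck {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long used  = (rt.totalMemory() - rt.freeMemory()) >> 20; // heap currently in use (MB)
        long total = rt.totalMemory() >> 20;                     // heap allocated by the JVM (MB)
        long max   = rt.maxMemory() >> 20;                       // hard limit set via -Xmx (MB)
        System.out.println("used=" + used + " MB, total=" + total + " MB, max=" + max + " MB");
        // When used and total sit close to max, the JVM has no headroom left
        // and spends more and more time on garbage collection.
    }
}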
__________________
JD-Dev & Server-Admin
  #3  
27.04.2021, 03:08
TomArrow
Super Loader
 
Join Date: Sep 2017
Posts: 27

After years of using JDownloader 2, and even after a CPU upgrade, I have to say that this software needs some serious refactoring, in my opinion.

JDownloader should never run out of memory just because one has too many links. There is no need to keep all the links in memory. Indexed databases like MySQL handle millions of entries without breaking a sweat, but this software becomes unusable with a few tens of thousands.

Yesterday I accidentally copied some strange link into the clipboard. Today I woke up to a completely unresponsive JDownloader 2 eating 99% of my CPU and sitting at its own RAM limit. It would not respond to any of my commands. I did your trick and raised the limit to 8 GB, and now it eats about 7.5 GB of memory and is still eating a *lot* of CPU.

Now I can see that it crawled about 10 thousand new links overnight. 10 thousand! And that was enough to completely kill the software.

And when I then tried to move those new links into a new package, BAM, it completely froze the entire software again. This is such a joke.

Over the years I've regularly had to clear the history almost completely just to keep the software functioning. Which is of course a garbage solution, because then it can no longer recognize what was already downloaded, as the OP pointed out.

I once peered into the list files and it's all zipped JSON. Of course there's no way to quickly random-access huge zipped JSON files. No wonder it's slow as hell.
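If you want to peek yourself, something like this will do (the path is just an example; look in your JDownloader cfg folder for the actual downloadList zip, the exact file name varies):

Code:
import java.util.zip.ZipFile;

public class PeekList {
    public static void main(String[] args) throws Exception {
        // Adjust the path to your own cfg folder / file name.
        try (ZipFile zip = new ZipFile("cfg/downloadList.zip")) {
            zip.stream().forEach(entry ->
                System.out.println(entry.getName() + "  " + entry.getSize() + " bytes"));
        }
    }
}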

Please, for the love of god, consider revamping this whole concept of how things are saved. It is, and has been for years, the one big blemish on this software.

I don't know the source code, but I don't see a reason why all the entries couldn't be stored in an indexed database and read on demand. Some form of SQL is an established solution that should do the trick. All it needs is an index to find things, and an index doesn't eat much memory because it's cleverly designed for exactly this.
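Something along these lines is all I mean (just a sketch: the schema and file names are made up, and it assumes the sqlite-jdbc driver is on the classpath):

Code:
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.Statement;

public class LinkHistory {
    public static void main(String[] args) throws Exception {
        try (Connection db = DriverManager.getConnection("jdbc:sqlite:history.db")) {
            try (Statement st = db.createStatement()) {
                st.execute("CREATE TABLE IF NOT EXISTS links ("
                        + "url TEXT PRIMARY KEY, package TEXT, finished_at INTEGER)");
                st.execute("CREATE INDEX IF NOT EXISTS idx_package ON links(package)");
            }
            // "Did I download this already?" becomes a single indexed lookup,
            // whether the table holds 10k rows or 10M.
            try (PreparedStatement q =
                    db.prepareStatement("SELECT 1 FROM links WHERE url = ?")) {
                q.setString(1, "https://example.com/file.bin");
                try (ResultSet rs = q.executeQuery()) {
                    System.out.println(rs.next() ? "already downloaded" : "new link");
                }
            }
            // And repackaging a million links would be a single statement:
            // UPDATE links SET package = 'New' WHERE package = 'Old';
        }
    }
}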

It's just not acceptable for a piece of software whose only task is to run 24/7 in the background and download files to eat up a lot of CPU, sometimes 100%, and incredible amounts of RAM, making the entire computer sluggish. And mind you, even with some tricks pulled to make it at least responsive again, you often end up in situations like changing the download folder of a single link and then waiting a minute for the GUI to unfreeze. This can't be the way...

After writing all this, I am *still* struggling to get rid of the new links, having now increased the memory limit to 16 GB, which is all the RAM I have.
  #4  
27.04.2021, 03:47
TomArrow
Super Loader
 
Join Date: Sep 2017
Posts: 27

Addendum: after another half an hour, I've given up trying to move the new links into a new package. The RAM usage spiked to 16 GB and then it died again.

To be completely fair, it wasn't 10k new links but 10k new packages; it was actually about a million links. So maybe it wasn't quite as bad as I said, but still...

Get this into your head: moving about 1M links into a new package used up 8 GB of additional memory and around 3/4 of my CPU for half an hour, and then it bloody died... In a relational SQL database this would be a single UPDATE statement taking a fraction of a second.

This software seems like a kingdom built on top of a sand castle.
  #5  
30.04.2022, 22:43
I3ordo
Mega Loader
 
Join Date: Mar 2022
Posts: 65

I was concerned about this as well and came here to check what that script's name was. I think it takes the completed links, stores them somewhere, and then cleans the downloaded files off the download list while still retaining the history...
  #6  
02.05.2022, 16:45
Jiaz
JD Manager
 
Join Date: Mar 2009
Location: Germany
Posts: 79,286

@I3ordo: Better ask about the history script over here: https://board.jdownloader.org/showthread.php?t=70525
__________________
JD-Dev & Server-Admin
  #7  
03.05.2022, 15:02
Student im ersten Jahr
Black Hole
 
Join Date: Nov 2020
Posts: 281

A slightly different question: where is the "low priority" option for downloading, so that JDownloader does not interfere with the internet speed of other applications, in particular the browser?
  #8  
03.05.2022, 15:22
Jiaz
JD Manager
 
Join Date: Mar 2009
Location: Germany
Posts: 79,286

@Student im ersten Jahr: Such an option/feature does not exist. You have to set a speed limit instead.
__________________
JD-Dev & Server-Admin