JDownloader Community - Appwork GmbH
 

  #1  
Old 12.07.2012, 00:15
13mh13 is offline
Linkgrabbing Monster
 
Join Date: Sep 2011
Posts: 85
JD's VM Size & CPU use too high!!

These issues may have been previously discussed -- I didn't exhaustively search the archives of this forum, so if there are topical threads, just throw some links my way ... thanks.
Anyway ...

When JD (my version: 0.9.581, with all the updates) is running, my two separate PC systems -- XP-Pro and Win 7 based -- show massive CPU activity (80-90%), and VM Size is 483MB.
That makes both of these PCs very slow and "draggy". Any way around this? Are the latest versions of JD more efficient?

Last edited by Jiaz; 12.07.2012 at 12:40.
  #2  
Old 12.07.2012, 06:12
raztoki is offline
English Supporter
 
Join Date: Apr 2010
Location: Australia
Posts: 16,196

A typical fresh install will use less than 100 MB of RAM to load and run with a few hundred links. I run about 6000 links in an old nightly test version and it consumes about 150 MB of RAM. Once you start hitting that 512 MB limit you've reached the upper limit of the reserved heap space, and you may need to allocate more by starting JDownloader from the command line or a shortcut (note that the -Xmx flag must come before -jar, or it gets passed to JD instead of the JVM):
javaw -Xmx1024m -jar jdownloader.jar

When you reach that upper limit, the program and the OS start swapping like mad to keep the program running, because you're out of your memory allocation.
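If you want to confirm the -Xmx flag actually took effect, a tiny standalone check (not part of JD; just a sketch you compile yourself) can print the JVM's max heap:

```java
// HeapCheck.java -- minimal sketch: prints the maximum heap the JVM will use.
// Compile with `javac HeapCheck.java`, then run e.g. `java -Xmx1024m HeapCheck`
// and verify the reported value matches the flag you passed.
public class HeapCheck {
    public static void main(String[] args) {
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MB");
    }
}
```

Run it with and without the flag to see the difference between your JVM's default heap cap and the raised one.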

The latest version of JD (JDownloader 2) has had extensive rewrites: it consumes less memory, and rewrites to other components require fewer CPU cycles. Typically, when you're eating that much RAM it's because you have thousands of links (pending or completed downloads) in the download list -- each entry requires a memory allocation. To reduce memory use, prune your download list often by removing completed downloads. Don't use it as a 'history' of what you've been downloading, as that doesn't work well for your memory requirements.

As for CPU load: JD shouldn't chew that many CPU cycles unless you are constantly downloading while also using extraction add-ons. If you want fewer CPU cycles, my recommendation is to turn off all auto-extraction procedures along with checksumming -- but that would then mean you need to do these tasks yourself!

raztoki
__________________
raztoki @ jDownloader reporter/developer
http://svn.jdownloader.org/users/170

Don't fight the system, use it to your advantage. :]
  #3  
Old 12.07.2012, 07:24
13mh13 is offline
Linkgrabbing Monster
 
Join Date: Sep 2011
Posts: 85

Quote:
Originally Posted by raztoki View Post
A typical fresh install will use less than 100 MB of RAM to load and run with a few hundred links. [...] To reduce memory use, prune your download list often by removing completed downloads.

raztoki
Yes -- several thousand links

Many don't work since the MU take-down (a lot of file hosts left us then!). I want to delete them from JD, but I still want the list (= record) of them (along with their URL, package/filename, status, etc.). Can I squeeze an XLS or similar list out of JD and then get rid of them from JD?
  #4  
Old 12.07.2012, 08:31
raztoki is offline
English Supporter
 
Join Date: Apr 2010
Location: Australia
Posts: 16,196

Simple answer is no: a raw text list can't be exported, nor can it be in JD2. You can make a DLC, which saves the link info and passwords in an encrypted container. Alternatively, you can back up jdownloader/config/, then start deleting the old stuff, and revert to the backed-up config whenever you need the history. That's the only way to keep a history without suffering the issues of a large download queue.
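To sketch that backup step, here's a minimal standalone Java helper (the jdownloader/config path is only an example -- adjust it to your own install; this is not a JD feature):

```java
// ConfigBackup.java -- hedged sketch: copies a config directory tree to a
// timestamped backup folder before you prune the download list.
import java.io.IOException;
import java.nio.file.*;
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

public class ConfigBackup {
    // Recursively copy every file and directory under src into dst.
    static void copyTree(Path src, Path dst) throws IOException {
        try (var paths = Files.walk(src)) {
            for (Path p : (Iterable<Path>) paths::iterator) {
                Path target = dst.resolve(src.relativize(p).toString());
                if (Files.isDirectory(p)) {
                    Files.createDirectories(target);
                } else {
                    Files.createDirectories(target.getParent());
                    Files.copy(p, target, StandardCopyOption.REPLACE_EXISTING);
                }
            }
        }
    }

    public static void main(String[] args) throws IOException {
        Path src = Paths.get("jdownloader", "config"); // example path; adjust to your install
        if (!Files.isDirectory(src)) {
            System.out.println("No " + src + " directory here; adjust the path.");
            return;
        }
        String stamp = LocalDateTime.now()
                .format(DateTimeFormatter.ofPattern("yyyyMMdd-HHmmss"));
        Path dst = Paths.get("jdownloader", "config-backup-" + stamp);
        copyTree(src, dst);
        System.out.println("Backed up to " + dst);
    }
}
```

Run it (with JD closed) before deleting entries; to restore the history later, copy the backup folder back over jdownloader/config/.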
__________________
raztoki @ jDownloader reporter/developer
http://svn.jdownloader.org/users/170

Don't fight the system, use it to your advantage. :]
  #5  
Old 14.07.2012, 14:25
13mh13 is offline
Linkgrabbing Monster
 
Join Date: Sep 2011
Posts: 85

Quote:
Originally Posted by raztoki View Post
Simple answer is no: a raw text list can't be exported, nor can it be in JD2. [...]
I assume database.script is the 'main' file in jdownloader/config/ that contains the links ... and there's no way to edit or display it in a text editor?

BTW my database.script is ~14MB. Is that "too large"?
  #6  
Old 14.07.2012, 14:37
13mh13 is offline
Linkgrabbing Monster
 
Join Date: Sep 2011
Posts: 85

Okay, I deleted most of the links (down to a few hundred), but that only reduced database.script to 9.4MB -- and the CPU and VM issues are virtually unaffected.
  #7  
Old 16.07.2012, 03:24
13mh13 is offline
Linkgrabbing Monster
 
Join Date: Sep 2011
Posts: 85

I just updated JD (last updated in May), and even with most of those old links deleted, my VM size is up over 1GB ... the exact opposite of what one would expect.

JD "support" or "development" team ... any clues?
  #8  
Old 16.07.2012, 13:28
Jiaz is offline
JD Manager
 
Join Date: Mar 2009
Location: Germany
Posts: 65,456

A VM size of more than 512/640 MB (the default Xmx parameter) is a sign of a memory leak inside the firewall/AV; that's not an unusual issue with some firewall/AV software.
What exactly is the problem? JD may use memory up to its max heap limit, and that's okay.

If you want to use several thousand links, please use the pre-beta of JD2:
http://board.jdownloader.org/forumdisplay.php?f=50

The old stable/nightly builds are not meant for so many links!
__________________
JD-Dev & Server-Admin