JDownloader Community - Appwork GmbH
 

  #1  
Old 03.09.2011, 03:52
globus999 is offline
Mega Loader

Join Date: May 2011
Posts: 61
Memory Leak?

I have been pushing JD a tad hard, but by no means harder than heavy users. For example, right now I have 5500 links queued. I am running Max Conn 5 and Max Dls 5. I have all kinds of links in it.
My system is a tad old, but still OK:

Windows 2000 SP4 Server
RAM: 2 GB
Swap file: 1.5 GB (fixed size)

There are no other active apps running in the background, and JD is the only Java app.

What I noticed is that at start-up JD consumes RAM and virtual memory at the same rate, and then it goes ballistic with the VM and swallows it all (1.7 GB according to Windows, which is strange). At that point, MemTurbo kicks in trying to release RAM, and JD (presumably) runs out of VM and crashes.

What is strange is that JD is not releasing VM. If it did, there would be no problem.

Any ideas?

Last edited by Jiaz; 05.09.2011 at 10:19.
  #2  
Old 03.09.2011, 04:55
tony2long is offline
English Supporter

Join Date: Jun 2009
Posts: 6,507

What is your Java version? I think Java should handle this.
  #3  
Old 03.09.2011, 08:02
raztoki is offline
English Supporter

Join Date: Apr 2010
Location: Australia
Posts: 17,614

You can try starting JDownloader with more headroom.

The default setting is 'java -Xmx512m -jar JDownloader.jar';
try 'java -Xmx1024m -jar JDownloader.jar',
or even up to 2048 depending on your system (but since your system only has 2 GB, maybe not, hehe).

This allows JD to work with larger download queues.

Each entry in the queue takes up memory, so remove any completed packages/downloads from the queue, as this also reduces memory requirements. Maybe save half of your queue (as a DLC), remove it, and add it back later once you have finished the first half. That will also reduce your memory requirement.
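As a quick sanity check that a larger `-Xmx` actually took effect, a minimal Java sketch can print the heap limits the JVM is running with (the class name `HeapCheck` is my own illustration, not part of JDownloader):

```java
public class HeapCheck {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long mb = 1024L * 1024L;
        // maxMemory() reflects the -Xmx setting (approximately)
        System.out.println("max heap:  " + rt.maxMemory() / mb + " MB");
        // totalMemory() is the heap the JVM has currently committed
        System.out.println("committed: " + rt.totalMemory() / mb + " MB");
        // used = committed minus the free part of the committed heap
        System.out.println("used:      " + (rt.totalMemory() - rt.freeMemory()) / mb + " MB");
    }
}
```

Run it with the same `-Xmx` flag you pass to JDownloader; the reported max heap should move accordingly.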
__________________
raztoki @ jDownloader reporter/developer
http://svn.jdownloader.org/users/170

Don't fight the system, use it to your advantage. :]
  #4  
Old 03.09.2011, 14:56
globus999 is offline
Mega Loader

Join Date: May 2011
Posts: 61

Quote:
Originally Posted by tony2long View Post
What is your Java version? I think Java should handle this.
Hi, Java version:

Version 6 Update 22
build 1.6.0_22-b04
  #5  
Old 03.09.2011, 15:00
globus999 is offline
Mega Loader

Join Date: May 2011
Posts: 61

Quote:
Originally Posted by raztoki View Post
You can try starting JDownloader with more headroom.

The default setting is 'java -Xmx512m -jar JDownloader.jar';
try 'java -Xmx1024m -jar JDownloader.jar',
or even up to 2048 depending on your system (but since your system only has 2 GB, maybe not, hehe).

This allows JD to work with larger download queues.

Each entry in the queue takes up memory, so remove any completed packages/downloads from the queue, as this also reduces memory requirements. Maybe save half of your queue (as a DLC), remove it, and add it back later once you have finished the first half. That will also reduce your memory requirement.
Tx!
I'll try expanding RAM usage and see how it goes.
Wrt removal: I do so religiously, but the VM does not shrink no matter how many completed files I remove.
Wrt DLCing: it's a pain in the neck because it ruins my "downloading style"; besides, it was working OK up to a few weeks ago with a larger queue.
  #6  
Old 03.09.2011, 15:11
raztoki is offline
English Supporter

Join Date: Apr 2010
Location: Australia
Posts: 17,614

Get faster internet and/or buy premium accounts :>

zoom zoom
  #7  
Old 03.09.2011, 21:35
globus999 is offline
Mega Loader

Join Date: May 2011
Posts: 61

Quote:
Originally Posted by raztoki View Post
Get faster internet and/or buy premium accounts :>

zoom zoom
Huh?
What have bigger bandwidth and/or premium accounts got to do with it?
It is the number of active downloads (in parallel or in sequence) that's killing JD. So, let's assume I get more bandwidth... what's the end result? I still need to download 5500 links, so it does nothing to solve my problem.
Now, let's assume I buy 10 or 15 premium accounts (assuming I don't want to eat for a month or so); I still need to download 5500 links, which, again, does bupkus to solve my problem.
  #8  
Old 03.09.2011, 21:59
raztoki is offline
English Supporter

Join Date: Apr 2010
Location: Australia
Posts: 17,614

The faster your link is, or your ability to download (say, with premium accounts), the quicker your download list clears. That was the only correlation I was trying to make with the large queue.

You stated previously that you were using multiple chunks + concurrent downloads, but those settings would not necessarily cause any harm if your connection can handle (5 chunks x 5 downloads) = 25 connections.

Try logging to file by running 'windows_createlog.bat' from within the jD install folder.

Upload your logs to http://jdownloader.org/pastebin/ and then post the URLs here.
  #9  
Old 03.09.2011, 22:54
globus999 is offline
Mega Loader

Join Date: May 2011
Posts: 61

Quote:
Originally Posted by raztoki View Post
The faster your link is, or your ability to download (say, with premium accounts), the quicker your download list clears. That was the only correlation I was trying to make with the large queue.
Yes, I understood. However, I still can't see the correlation. What's eating VM is the size and number of downloads, which do not change however fast or slow my bandwidth is. In other words, if I am running 25 connections with a total size of, let's say, 100 MB max before JD flushes to hdd, it is irrelevant whether I manage to do so in 1 second or in 5 minutes: JD will still use 100 MB of VM. The problem arises if, after finishing a download and flushing to hdd, the corresponding VM is not released. Sorry, I don't see the correlation.

Quote:
Originally Posted by raztoki View Post
You stated previously that you were using multiple chunks + concurrent downloads, but those settings would not necessarily cause any harm if your connection can handle (5 chunks x 5 downloads) = 25 connections.
It can. I have had a much, much higher connection load on other P2P clients (torrent and eMule, for example) without a hitch; the number of connections in some test cases was higher than 1000, without a hiccup.

Quote:
Originally Posted by raztoki View Post
Try logging to file by running 'windows_createlog.bat' from within the jD install folder. Upload your logs to http://jdownloader.org/pastebin/ and then post the URLs here.
Will do, although it will take some time. Right now, with all the experimenting with memory, all the logs are clean.

Tx! for all the help.
  #10  
Old 04.09.2011, 08:39
raztoki is offline
English Supporter

Join Date: Apr 2010
Location: Australia
Posts: 17,614

Sorry, this is where I'll disagree with you. Unless you're adding links to JD faster than they can download, most of the time the queue should finish faster than you can add to it (that's what I find, though I hardly download these days, and I only have 3 Mbit). The correlation is: if the link is faster than you can queue data to download, the download queue should never be 5000+ links long to start with, or only for a very short period. Since your computer is very old, maybe you should look at 'Settings > Basic > Download and Connection :: Remove finished downloads: immediately', thus freeing up memory as the download queue progresses. JD has the same memory requirements for a queued list vs. a completed list, so the sooner you clear out completed downloads the better. The coming version (v1) will address memory issues within the table view better, along with a new database storage system.

Now, I run an old system too, also with 2048 MB (PC3200 DDR SDRAM) of RAM; that said, I haven't experienced any trouble with memory (once, when linkgrabbing 5000 YouTube videos). My current download tab has 519 packages / 2027 links, all complete, using 113 MB of RAM. Maybe your issues are around connections; I've seen the logger fill up memory, especially when associated with socket connection problems. When you upload your logfile, this will give me a better indication.

Now, to the P2P: it's highly dependent on what bandwidth peers/seeders can supply you per connection. Generally, as this is done from a range of connection speeds, on average they do not necessarily provide the same download speed per connection as, say, FTP/HTTP (given that you have premium accounts). JD shouldn't need anywhere near the number of connections that P2P does to produce the same bandwidth transfer in KiB/sec, as ISPs tend to QoS P2P more than HTTP. But that said, I'm sure it's possible to transfer faster than JD, assuming you are connected only to fast seeds/peers.
Last edited by raztoki; 04.09.2011 at 08:43.
  #11  
Old 04.09.2011, 11:14
remi
Guest

Posts: n/a

@globus999

Update 22 is not the best version of Java. I would remove it and replace it with JRE (1.)6 update 21. See "**External links are only visible to Support Staff**". Also use JavaRa before installing the new version, to be sure that all remaining bits of Java are removed.

Note also that the JRE will only reclaim memory when it is needed by other programs. If you don't run any other programs, then the memory will increase until it reaches the maximum that you set at jD's startup.
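remi's point can be seen in a minimal sketch (the class name `GcDemo` is mine, purely illustrative): even after every reference is dropped and a collection is requested, the heap the JVM has committed from the OS does not necessarily shrink.

```java
public class GcDemo {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        final int MB = 1024 * 1024;
        byte[][] blocks = new byte[100][];
        for (int i = 0; i < blocks.length; i++) {
            blocks[i] = new byte[MB]; // allocate 100 MB in 1 MB chunks
        }
        System.out.println("committed after alloc: " + rt.totalMemory() / MB + " MB");
        blocks = null; // drop every reference to the allocated arrays
        System.gc();   // request (not force) a garbage collection
        // The committed heap often stays high: the JVM tends to keep the
        // pages reserved for itself rather than returning them to the OS.
        System.out.println("committed after gc:    " + rt.totalMemory() / MB + " MB");
    }
}
```

So a Java process whose footprint grows toward `-Xmx` and stays there is not by itself proof of a leak; steady growth of *used* heap that survives collection is.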
  #12  
Old 04.09.2011, 14:44
globus999 is offline
Mega Loader

Join Date: May 2011
Posts: 61

Hi, I guess we'll just have to disagree. Please see my notes below.

Quote:
Originally Posted by raztoki View Post
Sorry, this is where I'll disagree with you. Unless you're adding links to JD faster than they can download, most of the time the queue should finish faster than you can add to it (that's what I find, though I hardly download these days, and I only have 3 Mbit). The correlation is: if the link is faster than you can queue data to download, the download queue should never be 5000+ links long to start with, or only for a very short period.
Not in my case, since my total size is about 600 GB. It would probably require a 1 Gbit or 10 Gbit connection to achieve the above. I am running on 6M/800K (down/up). However, the important bit is that I have been using JD for over a year now and never had this problem before, and I was pushing hard previously.

Quote:
Originally Posted by raztoki View Post
Since your computer is very old, maybe you should look at 'Settings > Basic > Download and Connection :: Remove finished downloads: immediately', thus freeing up memory as the download queue progresses. JD has the same memory requirements for a queued list vs. a completed list, so the sooner you clear out completed downloads the better. The coming version (v1) will address memory issues within the table view better, along with a new database storage system.
Yes, I understand that. You are also missing the fact that in-progress download info gets stored in RAM and VM until flushed to hdd. However, please note that the key point is that whether JD removes a completed link or I do it manually, it should still release VM, since I am not adding links (i.e. the total number of links is decreasing). I am watching JD's VM consumption in real time when I remove completed links, and it makes no difference.

Quote:
Originally Posted by raztoki View Post
Now, I run an old system too, also with 2048 MB (PC3200 DDR SDRAM) of RAM; that said, I haven't experienced any trouble with memory (once, when linkgrabbing 5000 YouTube videos). My current download tab has 519 packages / 2027 links, all complete, using 113 MB of RAM. Maybe your issues are around connections; I've seen the logger fill up memory, especially when associated with socket connection problems. When you upload your logfile, this will give me a better indication.
Will upload, although I am still running memory tests. Not sure if it will be of any use.


Quote:
Originally Posted by raztoki View Post
Now, to the P2P: it's highly dependent on what bandwidth peers/seeders can supply you per connection. Generally, as this is done from a range of connection speeds, on average they do not necessarily provide the same download speed per connection as, say, FTP/HTTP (given that you have premium accounts). JD shouldn't need anywhere near the number of connections that P2P does to produce the same bandwidth transfer in KiB/sec, as ISPs tend to QoS P2P more than HTTP. But that said, I'm sure it's possible to transfer faster than JD, assuming you are connected only to fast seeds/peers.
Yes, but JD is also dependent upon the per-hoster bandwidth that they provide, and since I am using non-premium, they vary from 40 KB/s to 6 Mb/s (i.e. they saturate my connection). I agree that JD uses a heck of a lot fewer connections than P2P. But that was my point: my P2P tests were as bad as I could make them, and I still never had these VM issues. By definition, JD should use less VM than a badly configured P2P app, yet it consumes more. That's why I was wondering if there is a memory leak in JD that perhaps only shows up when utilization is quite high for long periods of time.

Oh, btw, I am running JD 24x7 for several days at a time (sometimes more than a week without a reboot). I don't think there are too many people doing this. This may also be a contributing factor to the VM issue.
  #13  
Old 04.09.2011, 15:05
globus999 is offline
Mega Loader

Join Date: May 2011
Posts: 61

Log files, as requested:

http://jdownloader.org/pastebin/51781

http://jdownloader.org/pastebin/51782

Not sure if they will be of any use, since I am testing different memory settings.
FYI, it would seem that setting RAM to 1024 makes no difference.
For example, in the latest test I was running 4600 links (750 GB), and JD's original memory consumption was:

RAM: 115 MB
VM: 125 MB

After one day of running, without adding new links and while cleaning up the old ones, JD was using:

RAM: 310 MB
VM: 325 MB

(The numbers for RAM and VM consumption fluctuate a little bit, but they always increase.)

It takes a few days of downloading for JD to consume all the VM, so I'll wait until it crashes to submit the latest log.
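To make this kind of before/after comparison repeatable, a small sampling loop can log used heap at fixed intervals. The sketch below (a hypothetical `MemLogger` class, not part of JD) measures its own JVM; for an external process like JD you would watch Task Manager instead, but the pattern of sampling at intervals and looking for monotonic growth is the same:

```java
import java.text.SimpleDateFormat;
import java.util.Date;

public class MemLogger {
    public static void main(String[] args) throws InterruptedException {
        Runtime rt = Runtime.getRuntime();
        final long MB = 1024L * 1024L;
        SimpleDateFormat fmt = new SimpleDateFormat("HH:mm:ss");
        for (int i = 0; i < 3; i++) {   // in a real test, loop until the crash
            long used = (rt.totalMemory() - rt.freeMemory()) / MB;
            System.out.println(fmt.format(new Date()) + "  used heap: " + used + " MB");
            Thread.sleep(1000);         // sample every second; use minutes for a multi-day run
        }
    }
}
```

A steadily rising "used" figure that never drops after removing queue entries would support the leak hypothesis; a sawtooth pattern would not.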
  #14  
Old 04.09.2011, 15:08
globus999 is offline
Mega Loader

Join Date: May 2011
Posts: 61

Quote:
Originally Posted by remi View Post
@globus999

Update 22 is not the best version of Java. I would remove it and replace it with JRE (1.)6 update 21. See "**External links are only visible to Support Staff**". Also use JavaRa before installing the new version, to be sure that all remaining bits of Java are removed.

Note also that the JRE will only reclaim memory when it is needed by other programs. If you don't run any other programs, then the memory will increase until it reaches the maximum that you set at jD's startup.
Tx!

So 21 is better? OK, I guess I can give it a try as soon as I figure out how not to lose all my links. It may take a while to complete all the started downloads.

What about 27? Any recommendations?

Wrt memory utilization: yes, I understand. The problem is that it goes well beyond the configured maximum.
  #15  
Old 04.09.2011, 15:19
remi
Guest

Posts: n/a

You've experienced a memory leak, and it might be caused by update 22. Update 21 is stable, and your links are safe.

If you upgrade to a version higher than 25, you might lose your links. First make backups, or use jD's versioning feature (Ctrl-B).
  #16  
Old 04.09.2011, 19:10
raztoki is offline
English Supporter

Join Date: Apr 2010
Location: Australia
Posts: 17,614

I understood everything you stated previously; I just disagree with most of it, regarding the correlation between bandwidth and queue, and even some of the memory-related statements.

A 100 Mbit connection at a conservative ~10 MiB/sec * 86400 (seconds/day) = 864000 MiB/day, or ~843.75 GiB/day; your queue would be cleared in less than one day. I get about 1 GB/hour on 3 Mbit, so you must get ~2 GB/hour; therefore, flat out, your current queue would take ~12.5 days to download (without adding any more content).
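For reference, the arithmetic above can be sketched like this (figures taken from the thread; `QueueEta` is just an illustrative name). At the theoretical 6 Mbit/s line rate the 600 GB queue needs roughly 9.5 days, and at the observed ~2 GB/hour real-world throughput it lands near the ~12.5-day estimate:

```java
public class QueueEta {
    public static void main(String[] args) {
        double queueGB = 600.0;            // total queue size mentioned in the thread
        double lineMbit = 6.0;             // globus999's 6 Mbit/s downlink
        double mbPerSec = lineMbit / 8.0;  // 0.75 MB/s at theoretical line rate
        double seconds = queueGB * 1024.0 / mbPerSec;
        double days = seconds / 86400.0;   // 86400 seconds per day
        System.out.printf("~%.1f days to clear the queue at line rate%n", days);
        // prints: ~9.5 days to clear the queue at line rate
    }
}
```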

JD should be releasing memory once entries are removed from the download list. As long as JD removes data from its memory allocations (regardless of where it has been stored, RAM or disk), it should be cleared. As I understand it, JD currently keeps the info for a link in memory whether it is at 0%, 5%, or 100%; the only way to clear those from memory is to remove them from the list. Hence my comments. The coming version of JD has a new format and handles memory allocations differently; you should be able to store many more packages/links without the overhead. If it was your normal practice to store that many in your queue and you never came upon these problems until recently, you have to wonder what changed. We have not pushed any updates into stable or beta since ~December last year, other than plugin updates; maybe there's something running in a loop?


I see an hr_error in one of the logs, which I assume was caused by your memory tool?

Try the different JRE and see if it makes any difference, as remi recommended.

It might also be worthwhile starting with clean configs:
Finish your already commenced downloads.
Create a DLC of all other queued files.
Rename jdownloader/config/ to jdownloader/config.old/
Run jdupdate.jar (it creates new configs and makes sure all of JD matches the CRC).
Reconfigure JD.
Import the DLC.
See how you go from there. You can always revert by renaming your old folder back again.
Last edited by raztoki; 04.09.2011 at 19:12.
  #17  
Old 05.09.2011, 10:19
Jiaz is offline
JD Manager

Join Date: Mar 2009
Location: Germany
Posts: 79,532

Wait for the next major update, which uses a lot less CPU/RAM. Many memleaks have been fixed, and so on.
You will notice the difference, for sure.
__________________
JD-Dev & Server-Admin
  #18  
Old 05.09.2011, 21:58
globus999 is offline
Mega Loader

Join Date: May 2011
Posts: 61

Thank you kindly to all.
I will try the different suggestions and see what happens.
Looking forward to the next version.
  #19  
Old 07.09.2011, 20:58
globus999 is offline
Mega Loader

Join Date: May 2011
Posts: 61

FYI, well, it's official: nothing works.
Any combination/permutation of the following yields the same result: a memory leak.

Java 21 or 27
A brand-new config
512 or 1024 MB RAM

And yes, I did take the precaution of running JavaRa before re-installing 21 or 27. Just for the heck of it, I also refreshed all W2K Server critical files (sfc /purgecache).

The only thing that works is to shut down and restart JD every 24 hours or so.
  #20  
Old 07.09.2011, 21:15
raztoki is offline
English Supporter

Join Date: Apr 2010
Location: Australia
Posts: 17,614

Maybe provide one of those logfiles? You have yet to provide anything useful, as the previous logs were only from testing. If you start windows_createlog.bat, it will log to a file, which you can then upload via http://jdownloader.org/pastebin/

Cheers
  #21  
Old 08.09.2011, 10:06
Jiaz is offline
JD Manager

Join Date: Mar 2009
Location: Germany
Posts: 79,532

What memleak? JD is set to use a max of 512 MB heap. If your Java process uses more than ca. 600 MB, then it's a memleak inside your firewall/AV. Nothing unusual; it has happened before as well.
  #22  
Old 08.09.2011, 13:06
globus999 is offline
Mega Loader

Join Date: May 2011
Posts: 61

Quote:
Originally Posted by Jiaz View Post
What memleak? JD is set to use a max of 512 MB heap. If your Java process uses more than ca. 600 MB, then it's a memleak inside your firewall/AV. Nothing unusual; it has happened before as well.
I am puzzled: what firewall? I am using W2K SP4 Server and no third-party firewall. Wrt AV, I have been using *the same* McAfee (up to date, of course) for almost 5 years now, and it is rock solid... albeit a tad overzealous.
  #23  
Old 08.09.2011, 13:08
globus999 is offline
Mega Loader

Join Date: May 2011
Posts: 61

Quote:
Originally Posted by raztoki View Post
Maybe provide one of those logfiles? You have yet to provide anything useful, as the previous logs were only from testing. If you start windows_createlog.bat, it will log to a file, which you can then upload via http://jdownloader.org/pastebin/

Cheers
I will. I did not bother running JD to the crash point while doing tests, since it takes a few days. All the confirmation I needed for testing was that the total memory:

1 - Kept growing
2 - Did not diminish upon removing a *significant* number of files from the queue.
  #24  
Old 08.09.2011, 14:36
raztoki is offline
English Supporter

Join Date: Apr 2010
Location: Australia
Posts: 17,614

Well, without a real indication (useful logs) of what might be causing the problem, we may as well close this thread. You haven't provided us with the information necessary to diagnose it. We have already given you the possible causes/solutions, which are educated guesses at best, based on the conversation within this thread. The only other aid we can use is TeamViewer, but that's probably pointless too, as it takes a long time for the problem to occur.
  #25  
Old 08.09.2011, 18:29
globus999 is offline
Mega Loader

Join Date: May 2011
Posts: 61

Quote:
Originally Posted by raztoki View Post
Well, without a real indication (useful logs) of what might be causing the problem, we may as well close this thread. You haven't provided us with the information necessary to diagnose it. We have already given you the possible causes/solutions, which are educated guesses at best, based on the conversation within this thread. The only other aid we can use is TeamViewer, but that's probably pointless too, as it takes a long time for the problem to occur.
Yes, and I appreciate the effort *very* much. As I said before, I'll post the log when it collapses. Meantime, as a diagnostic aid, memory size, Java version(s), and config corruption can be discarded as sources of the error, which is a step in the right direction.