#1
With several thousand links in the "Downloads" or "Linkgrabber" sections (or both), the program becomes extremely unresponsive. Is this solvable, or is there another way to keep control over link collections so I don't download from the same URLs several times? I normally store all downloaded links in JD2, but it is now so slow that it is unusable.
If I have a list of 1000 URLs and find 1000 new URLs I want to download, I need to know how many of them are duplicates, so that I don't end up downloading 2000 files when I have already downloaded 1000. Something like "remove duplicate lines", but for URLs, so it shows me which links are already in the list.
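Outside JDownloader, the duplicate check described above can be done with a short script. A minimal sketch (the function name and the example.com URLs are placeholders, not anything JDownloader provides):

```python
def split_new_and_duplicates(existing_urls, candidate_urls):
    """Split candidate_urls into (new, duplicates) against existing_urls,
    preserving the candidates' original order."""
    seen = set(existing_urls)  # set membership test is O(1) per URL
    new, dupes = [], []
    for url in candidate_urls:
        (dupes if url in seen else new).append(url)
    return new, dupes

existing = ["https://example.com/a", "https://example.com/b"]
candidates = ["https://example.com/b", "https://example.com/c"]
new, dupes = split_new_and_duplicates(existing, candidates)
# new  -> ["https://example.com/c"]   (only this needs downloading)
# dupes -> ["https://example.com/b"]  (already in the list)
```

With the existing list exported to a text file (one URL per line), only the genuinely new URLs would then need to be pasted back into the Linkgrabber.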
#2
Hi,
1. Please read: https://support.jdownloader.org/Know...loader-is-slow
2. Please post a screenshot of: Help -> About JDownloader
__________________
JD Supporter, Plugin Dev. & Community Manager
Erste Schritte & Tutorials || JDownloader 2 Setup Download |
#3
Quote:
#4
See:
https://support.jdownloader.org/Know...loader-is-slow -> search for "JDownloader freezes or is slow due to JVM running out of memory" -> the solution is listed there.
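For reference, the usual fix for that JVM memory symptom is raising the Java maximum heap. A hedged sketch (the exact file location and a sensible value depend on your install; check the linked article for the authoritative steps) is adding an `-Xmx` line to `JDownloader2.vmoptions` in the JDownloader install folder:

```
# JDownloader2.vmoptions -- raise the JVM maximum heap
# (2g is an example value, not a recommendation from the article)
-Xmx2g
```

JDownloader needs to be restarted for the setting to take effect.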
#5
Thanks! I will try this.