#1
Is it possible to improve the speed of adding one or more packages to the download list?
P.S. The current JDownloader installation is on a fast SSD with 500K-1M links in the list, and -Xmx7G is set. Adding one or more packages takes roughly 1 to 1.5 minutes.
#2
Speed depends heavily on the number of links in the package.
__________________
JD-Dev & Server-Admin
#3
Quote:
TEST:
1000 links (1 package) = fast add
7000+ links (1 package) = slow add
Links (6 + 11 + 18 + 14 + 12) (5 packages) = slow add
#4
You were talking about adding packages, NOT about crawling!
Crawling depends heavily on the plugin and its limits. If this is about the VK plugin, then we can stop right here!
__________________
JD-Dev & Server-Admin
#5
I mean adding packages from the [Linkgrabber] tab to the [Download] tab; I expressed myself very clearly.
#6
And how many links/packages are in your Linkgrabber/Download list?
If there are many, then I have already linked the correct ticket.
__________________
JD-Dev & Server-Admin
#7
Above 200k-1000k (Linkgrabber list).
The problem only affects the Linkgrabber list (it does not apply to the Download list).
#8
The Linkgrabber list has to maintain a duplicate map, so a very long list = more time required.
That's exactly what the linked ticket is about.
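The point about the duplicate map can be illustrated with a minimal sketch (class and method names are mine, not JDownloader's actual code): each added link costs one constant-time map lookup, but the map itself grows with the list, so building, saving and reloading it takes longer the larger the Linkgrabber list is.

```java
import java.util.HashMap;
import java.util.Map;

// Minimal sketch of a crawler-side duplicate map keyed by URL.
// Names are illustrative; this is not JDownloader's implementation.
public class DupeMap {
    private final Map<String, Long> seen = new HashMap<>();

    // Returns true if the link was new and was added to the map.
    public boolean addIfNew(String url) {
        return seen.putIfAbsent(url, System.currentTimeMillis()) == null;
    }

    public int size() {
        return seen.size();
    }

    public static void main(String[] args) {
        DupeMap map = new DupeMap();
        System.out.println(map.addIfNew("https://example.com/a")); // true: new link
        System.out.println(map.addIfNew("https://example.com/a")); // false: duplicate
        System.out.println(map.size());                            // 1
    }
}
```

Lookups stay O(1), but memory and (de)serialization cost scale linearly with the number of links, which matches the "very long list = more time required" behaviour described above.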
__________________
JD-Dev & Server-Admin
#9
This problem is always present, even after a fresh installation, and even if the list contains only one link and one package (in the Linkgrabber), a VERY small list! It occurs only with the VK domain.
Last edited by Jiaz; 31.10.2016 at 13:30.
#10
The stacktrace shows that the zip file is empty and was not written correctly. It is either a filesystem error, or you killed JDownloader during shutdown. And again, this has NOTHING to do with the thread topic!
__________________
JD-Dev & Server-Admin
#11
Quote:
Every run or restart of JD2 gives the same error. I checked it on various system configurations:
1. Acer laptop (XP/Vista)
2. Desktop PC (Win 7)
A friend of mine also checked and said he gets the same error. The error occurs only with the VK domain and ONLY in a new version/build of JD2. In all the old versions (I tested many different installations, e.g. from 2013 and 2014) there is no error!!!
#12
You said:
1. Either a filesystem error
2. Or JDownloader was killed during shutdown

I tested the domain
Code:
zippyshare.com
and various other domains:
b) Cutting off the power supply (!) = no error!!!

I tested the domain
Code:
vk.com
a) Correct exit or restart = ALWAYS an error
b) Without power off = ALWAYS an error

So what's going on here? Where is the internal error in the new version of JD2?
#13
Please provide a single VK link for testing.
And PLEASE post in the correct thread! This is the wrong thread!
__________________
JD-Dev & Server-Admin
#14
This error:
Empty/Invalid Zip:C:\JDownloader2\cfg\linkcollector1416.zip|Size:0
signals that the zip file is empty. When there are issues during saving, the file is deleted and there are traces in the logfile.
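As a rough illustration of what a check behind such a message amounts to (the method and file name here are hypothetical, not JDownloader's actual code), a zero-byte file can be rejected before even attempting to open it as a zip:

```java
import java.io.File;
import java.io.IOException;
import java.util.zip.ZipFile;

// Hypothetical sanity check: a 0-byte file can never be a valid zip,
// and a truncated archive fails to open as a ZipFile.
public class ZipCheck {
    public static boolean looksValid(File f) {
        if (!f.isFile() || f.length() == 0) {
            return false; // matches the "Empty/Invalid Zip ... Size:0" case
        }
        try (ZipFile zip = new ZipFile(f)) {
            return zip.entries().hasMoreElements();
        } catch (IOException e) {
            return false; // truncated or corrupt archive
        }
    }

    public static void main(String[] args) throws IOException {
        File empty = File.createTempFile("linkcollector", ".zip");
        System.out.println(looksValid(empty)); // false: the temp file is 0 bytes
        empty.delete();
    }
}
```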
__________________
JD-Dev & Server-Admin
#15
I found an old topic from August 2016, so you could move this there:
https://board.jdownloader.org/showpo...3&postcount=38
Repeating a restart or run 1000 times generates 1000 errors in the log:
1 run or restart = 1 error found
1000 runs or restarts = 1000 errors found
Only the number of the damaged archive changes (a different number each time).
#16
Works perfectly fine for me.
I can offer you a teamviewer session about this. Just send me ID and PW to support@jdownloader.org
__________________
JD-Dev & Server-Admin
#17
Jiaz - now I know what causes this error.
These startup parameters cause the "Empty/Invalid Zip" error:
Code:
"C:\Program Files\Java\jre1.8.0_112\bin\javaw.exe" -Xdiag -Xms2G -Xmx7G -jar C:\Jdownloader2\JDownloader.jar
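When the -Xmx value is in question, it is easy to confirm how much heap the JVM actually received. Runtime.maxMemory() is a standard java.lang API; the printed value roughly reflects the -Xmx flag:

```java
// Prints the maximum heap the running JVM will attempt to use,
// which (approximately) reflects the -Xmx startup flag.
public class HeapInfo {
    public static void main(String[] args) {
        long maxMb = Runtime.getRuntime().maxMemory() / (1024 * 1024);
        System.out.println("Max heap: " + maxMb + " MB");
    }
}
```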
#18
The zip file is created during saving, not during loading.
There is nothing to fix in JDownloader. It simply fails to save the list because it runs out of memory!
__________________
JD-Dev & Server-Admin
#19
I will reduce memory usage during saving of the list. Wait for the next core update.
__________________
JD-Dev & Server-Admin
#20
Memory usage during link-list saving has been reduced; please check again with the next core update.
__________________
JD-Dev & Server-Admin
#21
True, while JD2 is closing, the javaw.exe process uses much more RAM.
The application (the javaw.exe process) runs for a long time before it closes (probably saving).
#22
Quote:
Code:
[jd.controlling.linkcollector.LinkCollector$16(run)] CollectorList found: 10564/1157667
java.lang.OutOfMemoryError: Java heap space
    at java.io.BufferedOutputStream.<init>(Unknown Source)
    at jd.controlling.linkcollector.LinkCollector$23.run(LinkCollector.java:2315)
    at jd.controlling.linkcollector.LinkCollector$23.run(LinkCollector.java:2249)
    at org.appwork.utils.event.queue.QueueAction.start(QueueAction.java:202)
    at org.appwork.utils.event.queue.Queue.startItem(Queue.java:491)
    at org.appwork.utils.event.queue.Queue.runQueue(Queue.java:425)
    at org.appwork.utils.event.queue.QueueThread.run(QueueThread.java:64)
#23
Thanks for the logfile. Now I know what you mean. Please update with the next core update, in a few minutes.
__________________
JD-Dev & Server-Admin
#24
BUG: The new update (31-10-16) causes the loss of 1,000,000+ links
Quote:
1 million - 1.4 million links. The update causes damage to the archives!
Previous size of "LinkCollector": 657 MB
Now: 1.9 MB (!!!) zip.backup, 9 MB linkcollector.zip
In the previous version of JD2 (for example, dated 30-10-16) the archive was NEVER damaged, even if the power supply was forcibly cut off (so it was better!). The old archive is renamed with a larger number, but in the future it gets damaged again and again. A very serious bug!
#25
Serious bug in JD2 (31.10.2016 - 1.11.2016)
Previous version: (...) (...) - 30.10.2016 works.
In previous logs this entry never appeared; now I find it all the time. This causes damage to and removal of proper archives (even old archives!!!), and loss of links all the time.
Code:
------------------------Thread: 95:Log.L.log-----------------------
--ID:95TS:1478004548042-01.11.16 13:49:08 - [org.appwork.shutdown.ShutdownController$3(run)] -> Exit Now: Code: 0
------------------------Thread: 21:Log.L.log-----------------------
--ID:21TS:1478004548421-01.11.16 13:49:08 - [] -> java.net.SocketException: socket closed
    at java.net.TwoStacksPlainSocketImpl.socketAccept(Native Method)
    at java.net.AbstractPlainSocketImpl.accept(Unknown Source)
    at java.net.PlainSocketImpl.accept(Unknown Source)
    at java.net.ServerSocket.implAccept(Unknown Source)
    at java.net.ServerSocket.accept(Unknown Source)
    at org.appwork.utils.singleapp.SingleAppInstance$1.run(SingleAppInstance.java:364)
    at java.lang.Thread.run(Unknown Source)
#26
Not confirmed - your screenshot doesn't prove anything.
How do we know that you didn't delete them yourself? Here is my list - before (21:47) and after (21:51) the latest core update: nothing was lost after updating; the filesize changed after I moved 2 packages into Downloads.
Last edited by raztoki; 01.11.2016 at 23:41.
#27
editestowy -
1. You misread my previous post!!! LinkCollector!!! (NOT(!) the DownloadList.)
2. Did you test with 1,000,000 - 1,400,000 links??? 600-700 MB! Certainly not! So I do not understand your answer!

Jiaz updated the core (it was supposed to reduce memory use when creating a backup), which caused damage to the archive, e.g. after a long termination of the javaw.exe process (more than a 60-second timeout! SocketException error). I do not like the new update
https://board.jdownloader.org/showpo...5&postcount=19
https://board.jdownloader.org/showpo...6&postcount=20
because it brings catastrophic consequences. I'm going back to the old version. I'm sorry! I'm telling the truth.

To reproduce:
a) 1+ million links (600-700 MB)
b) Restart JD2
WARNING: Unexpected end of archive 0000 (broken) (1 MB)
#28
New update - unexpected bad results
Why does JD2 damage the archive after a restart (practically every time)? Note (!) it also removes X old archives. Closing the application takes several minutes. Perhaps it may work for 100,000 - 200,000, but it does not work correctly for 1,400,000 (!) collector links.
#29
Please provide a complete logfile. Nothing changed, and your link loss has the same cause as your other bug report about those empty/zero-size files. It is the exact same cause.
The stacktrace you've provided has nothing to do with this.
__________________
JD-Dev & Server-Admin
#30
Please stop telling lies. The only difference is that instead of writing everything to memory first and then doing one big write to disk (which caused the high memory usage for you), the list is now written with a smaller buffer. So in fact it is the same issue you suffer from all the time: the shutdown does not leave enough time to write the list, and instead of a zero-size file (before) you now have a half-finished file.
__________________
JD-Dev & Server-Admin
#31
If I'm lying, please withdraw this update. I repeat once again: please remove this update. I do not need an update which causes damage (the loss of links). Thanks.
Previously the LinkCollector archive was never damaged (only the backup was 0 bytes). That's all from my side! I do not have the strength to prove everything every time (this is my personal opinion/testing). Sorry for everything. I'm just experiencing problems, so I write about them. You do not believe me. Excuse me :(
#32
Quote:
Note: a very successful re-START of JD2 (not to be confused with launching (running) JD2!!!)
It is an unfair suspicion if someone does not have a million links to test with :(
#34
02.11.16 10.50.48 <--> 02.11.16 10.51.03 jdlog://3681881887641/
#35
New update
I set -Xmx11G. Is that too little for 1 million links? Why does it damage the zip archive (XXX.zip and XXX.zip.backup) and lose links after a restart? Please answer.
The old version with -Xmx6G or 7G supported 1+ million links; the new version does NOT - it corrupts the archive when JD2 is restarted.
#36
How to always auto-recover links after a run or restart of JD2?
I mean more than 10,000 packages and millions of links.
#37
This is still the same error as before.
Before, you had zero-byte files; now you can see that JDownloader does not have enough time to save the complete list. Do you close JD normally, or do you shut down the computer?
__________________
JD-Dev & Server-Admin
#38
Quote:
The question is why the file is broken/not finished. So do you close JDownloader normally, or do you kill it (e.g. on computer shutdown)?
__________________
JD-Dev & Server-Admin
#39
Quote:
It has the same cause as the 0-byte files. Now JDownloader is able to start writing the files, but for an unknown reason JDownloader is killed before it finishes saving the list.
__________________
JD-Dev & Server-Admin
#40
Before the update:
Write 500 MByte to memory and then begin writing to disk -> result: 0-byte file written, broken list.
Now:
Write directly to disk with a smaller buffer (1 MByte max) -> result: file size depends on how long JDownloader runs before it gets killed; broken list.
HOW do you do it? Please give the exact steps! This problem only happens for you, and we need to find out why. I've just tested with 500k links and 10k packages; all fine.
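The before/after behaviour described above can be sketched as follows (a simplified illustration assuming a plain-text serialization; method names are mine, not JDownloader's actual serializer):

```java
import java.io.BufferedOutputStream;
import java.io.ByteArrayOutputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Arrays;

// Sketch of the two saving strategies. Before: the whole list is
// serialized into RAM and written in one go (peak memory scales with
// list size, risking OutOfMemoryError). After: the list is streamed
// through a small fixed buffer (constant memory, but a kill mid-write
// leaves a half-finished file instead of a zero-byte one).
public class ListSaver {

    // Old behaviour: build everything in memory first.
    static void saveAllInMemory(Path target, Iterable<String> links) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        for (String link : links) {
            buf.write((link + "\n").getBytes(StandardCharsets.UTF_8));
        }
        Files.write(target, buf.toByteArray()); // one big write
    }

    // New behaviour: stream directly to disk with a 1 MB buffer.
    static void saveStreaming(Path target, Iterable<String> links) throws IOException {
        try (OutputStream out = new BufferedOutputStream(
                new FileOutputStream(target.toFile()), 1024 * 1024)) {
            for (String link : links) {
                out.write((link + "\n").getBytes(StandardCharsets.UTF_8));
            }
        } // close() flushes; killing the JVM before this point truncates the file
    }

    public static void main(String[] args) throws IOException {
        Path tmp = Files.createTempFile("linklist", ".txt");
        saveStreaming(tmp, Arrays.asList("https://example.com/a", "https://example.com/b"));
        System.out.println(Files.size(tmp) > 0); // true
        Files.deleteIfExists(tmp);
    }
}
```

Either way, an interrupted shutdown produces a broken list; the streaming variant only changes what the broken file looks like.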
__________________
JD-Dev & Server-Admin
Last edited by Jiaz; 02.11.2016 at 14:42.