08.10.2021, 09:07
EquiNox
Wind Gust
Join Date: Sep 2021
Posts: 40

Ok, thanks for the very detailed explanation, psp!

I now fully understand the meaning of all those settings!

Originally Posted by pspzockerscene View Post
All you need to do is to disable:
...and restart JD.
Yes, that's exactly what I have done.

Originally Posted by pspzockerscene View Post
Now I've read your post completely, but at the moment there is no point in going into detail on your test results.
When adding many items, a lot can go wrong, such as hitting one of Flickr's rate-limits, so it's better to test with a single link. Whenever you experience "missing items", we can check it and see whether we need to slow down our crawler to avoid hitting such rate-limits... though this is just an assumption.
Until now I've never hit any of their rate-limits during testing.
The solution for this -> getting all albums with the full number of items (with the duplicate check deactivated the way we just discussed) is absolutely easy (in theory :-p). I only need the entire list of all album links...

...and copy-paste them one by one (always waiting until the previous album has been completely parsed!), and I get the desired result immediately, because every album is then allowed to contain any item in the stream, even if it is already part of another album. So would it be possible to automate this? Just curious. As said, this happens only rarely, and I do not want to slow down album parsing just for such rare occasions.

In a batch script, a one-liner would do it for me; album-links.txt contains the links of all albums with their IDs, as in the code box above:
@for /f %%j in (album-links.txt) do echo Copy to clipboard: %%j && timeout /t 5 && echo %%j | clip
Copy a link to the clipboard, wait 5 seconds, then copy the next one. But after testing, this may still have been too fast for JD; I don't know. At least not all items and albums showed up in the LinkGrabber after one pass done this way. Maybe I should slow it down a little more.
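If the fixed 5-second wait turns out to be too fast, the same pacing idea can be sketched in Python with a configurable delay. This is only an illustration, not anything JDownloader provides: the file name album-links.txt is taken from the batch example above, the 10-second delay is a guess, and the script assumes the Windows "clip" command is available on the PATH.

```python
# Hypothetical sketch: feed album links to the clipboard one at a time,
# pausing between them so JDownloader's crawler can keep up.
import subprocess
import time


def read_links(path):
    """Return the non-empty, stripped lines of the link file."""
    with open(path, encoding="utf-8") as f:
        return [line.strip() for line in f if line.strip()]


def copy_to_clipboard(link):
    """Pipe a single link to the Windows 'clip' command."""
    subprocess.run("clip", input=link.encode("utf-8"), check=True)


def feed_links(links, delay_seconds=10, copy=copy_to_clipboard):
    """Copy each link to the clipboard, waiting between links."""
    for i, link in enumerate(links, start=1):
        print(f"[{i}/{len(links)}] copying to clipboard: {link}")
        copy(link)
        time.sleep(delay_seconds)


# Usage (Windows, with album-links.txt in the current directory):
# feed_links(read_links("album-links.txt"), delay_seconds=10)
```

Separating the pacing logic (`feed_links`) from the clipboard call makes it easy to raise the delay, or to swap in a different `copy` function on non-Windows systems.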

Originally Posted by pspzockerscene View Post
Without replying to all of your further tests, I'll try to add an explanation:
First of all, to simplify testing, I'm using a URL to a single album here:
--> Contains 37 items
Reason for this:
Adding thousands of items just to test the dupes behavior simply takes too much time, so it doesn't make sense.

Keep in mind that the first two dupes settings need a JDownloader restart to become active!

Enabled = Do not allow duplicated items in linkgrabber
--> If you add the above album twice, you will only get all items once.
Disabled = allow duplicated items in linkgrabber
--> If you add the above album twice, you get a single package containing 74 URLs. If you add it again and again, that package will get bigger each time.
That stream is also a good example because it has a lot of items that are part of more than one album. But a single album does not help to test the issue; you need at least a few albums that overlap, so that some items are part of more than one album, maybe the "Origami" albums.

Originally Posted by pspzockerscene View Post
Originally Posted by EquiNox View Post
Ok, this is probably not entirely correct. When albums have the same name (which is allowed on Flickr), they are simply merged into one package in the LinkGrabber. That could also have happened here.
Please provide some examples for that.
**External links are only visible to Support Staff**
Exactly the same album name, but now, thanks to your great package-name change, this issue is fixed and each of them can get its own package!
Thanks for the big update; it's much appreciated!
The same goes for the Menu Manager change, which I still need to test.

Thank you both again very much, psp and Jiaz!