JDownloader Community - Appwork GmbH
 

  #1  
Old 24.04.2023, 12:15
fuknuckle fuknuckle is offline
Black Hole
 
Join Date: Aug 2015
Posts: 253
Default Reddit plugin issues and Imgur TOS change

So Imgur is shutting down on May 15th, and I am trying to download as many subreddits and user profiles as I can before Imgur joins GeoCities, MySpace, Tumblr, Digg, Photobucket and a whole shit ton of other now-dead but influential services from the earlier days of the internet.

Anyway, I tried scraping just 3 subs and JD came back with far fewer results than it should have. It does this with user profiles as well.

Now I am not sure how the Reddit crawler works or whether it is using the Reddit API, but in my tests this is what I found.

For r/Squirting, JD returned 1192 items, but when I copied in the URLs I manually gathered, which included 3480 reddit post URLs, I got a total of 5710 items.

For r/maturewomen, JD returned 1603 items, but when I copied in the URLs I manually gathered, which included 3412 reddit post URLs, I got a total of 5960 items.

For r/Feet, JD returned 1116 items, but when I copied in the URLs I manually gathered, which included 4372 reddit post URLs, I got a total of 6058 items.

As for user profiles, these are the users I tested.

For u/NadyaBasinger, JD returned 959 items initially, but when I copied in the URLs I manually gathered, JD increased the total to 1276 items out of 1361 URLs.

For u/mideastrose, JD returned 932 items initially, but when I copied in the URLs I manually gathered, JD increased the total to 2236 items out of 2287 URLs.

For u/tightpixienurse, JD returned 346 items initially, but when I copied in the URLs I manually gathered, JD would not change the total item count. I gathered 789 URLs.

For u/AlwaysAtPlay, JD returned 657 items initially, but when I copied in the URLs I manually gathered, JD increased the total to 977 items out of 1462 URLs.

So what's with these discrepancies? Maybe some of the posts are crossposts and JD isn't indicating that to the user?

Like I said, I don't know how the Reddit plugin works, but it clearly isn't grabbing everything it can. So what I am doing is using a legacy version of Firefox with an extension called Copy All Links: I open every page of the default "Hot" view as far as it will let me go (1000 posts), then do the same for Top All Time, Top Year and Top Month, then repeat for Controversial. I then use the extension to copy all links from all tabs, paste them into Notepad++ and start sorting and filtering out the garbage and duplicate links. This lets me grab every possible link, but it is time consuming, and JD should be doing a much better job of it.
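For reference, the manual routine above maps onto reddit's public JSON listings roughly as follows. This is only a sketch under assumptions (the public /r/<sub>/<listing>/.json endpoints, a made-up User-Agent, and whatever rate limits plus the roughly 1000-post-per-listing cap reddit applies); it is not something JDownloader does.
Code:
# Sketch: collect post permalinks from several sort views of a subreddit and
# deduplicate them, following each listing's 'after' token until it runs out.
import time
import requests

BASE = "https://www.reddit.com"
HEADERS = {"User-Agent": "archive-sketch/0.1"}  # placeholder UA

def collect(subreddit, listing, extra=None, max_pages=10):
    urls, after = set(), None
    for _ in range(max_pages):
        params = {"limit": 100, **(extra or {})}
        if after:
            params["after"] = after
        r = requests.get(f"{BASE}/r/{subreddit}/{listing}/.json",
                         params=params, headers=HEADERS, timeout=30)
        r.raise_for_status()
        data = r.json()["data"]
        urls.update(BASE + child["data"]["permalink"] for child in data["children"])
        after = data.get("after")
        if not after:      # no next-page token -> listing exhausted
            break
        time.sleep(2)      # be polite, avoid rate limits
    return urls

all_urls = set()
for listing, extra in [("hot", None), ("new", None),
                       ("top", {"t": "all"}), ("controversial", {"t": "all"})]:
    all_urls |= collect("Squirting", listing, extra)
print(len(all_urls), "unique post URLs")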

There are hundreds of NSFW subs and probably tens of thousands of user accounts with years' worth of content that is about to be purged, and I want to archive as much as I can, but there isn't much time!

PLEASE fix the Reddit plugin ASAP so it actually grabs EVERYTHING and goes back further than 1000 posts with the API (if that's possible), and please do so quickly. I imagine all these pics and videos are going to be around 15-20TB, and I only have 3 weeks to grab what I can.
  #2  
Old 24.04.2023, 12:20
notice notice is offline
JD Supporter
 
Join Date: Mar 2023
Posts: 505
Default

Quote:
Originally Posted by fuknuckle View Post
So Imgur is shutting down on May 15th
I can only find information about them doing a purge of explicit content, nothing about them shutting down?
  #3  
Old 24.04.2023, 12:33
pspzockerscene's Avatar
pspzockerscene pspzockerscene is offline
Community Manager
 
Join Date: Mar 2009
Location: Deutschland
Posts: 71,076
Default

Quote:
Originally Posted by fuknuckle View Post
Anyway, I tried scraping just 3 subs and JD came back with far fewer results than it should have. It does this with user profiles as well.
...
so what's with these discrepancies? Maybe some of the posts are crossposts and JD isn't indicating that to the user?
By default, our reddit crawler only returns the first page of added subreddits and reddit profiles, as crawling everything would return a lot of results most users won't want by default and would cause a lot of HTTP requests.
See Settings -> Plugins -> reddit.com -> Set the crawler limits for profiles and subreddits to "-1" to make JD crawl everything.

Quote:
Originally Posted by fuknuckle View Post
PLEASE fix the Reddit plugin ASAP so it actually grabs EVERYTHING...
As always:
Please provide example URLs and logs.
(Yes, I do see the subreddits you've mentioned, but I'm not going to test all of them separately. So far it looks good here.)
In this case, debug logs are needed!

Please post your log-ID here | bitte poste deine Log-ID hier.

-psp-
__________________
JD Supporter, Plugin Dev. & Community Manager

Erste Schritte & Tutorials || JDownloader 2 Setup Download
Spoiler:

A user's JD crashes and the first thing to ask is:
Quote:
Originally Posted by Jiaz View Post
Do you have Nero installed?

Last edited by pspzockerscene; 24.04.2023 at 12:36. Reason: +1 sentence
  #4  
Old 24.04.2023, 12:34
raztoki's Avatar
raztoki raztoki is offline
English Supporter
 
Join Date: Apr 2010
Location: Australia
Posts: 17,614
Default

I think he is exaggerating; they are not shutting down. All the profiles he mentioned are adult-content related, and that content would be removed under the new terms of service.
__________________
raztoki @ jDownloader reporter/developer
http://svn.jdownloader.org/users/170

Don't fight the system, use it to your advantage. :]
  #5  
Old 24.04.2023, 21:44
fuknuckle fuknuckle is offline
Black Hole
 
Join Date: Aug 2015
Posts: 253
Default

Quote:
Originally Posted by notice View Post
I can only find information about them doing purge for explicit content. Nothing about them shutting down?
Quote:
Originally Posted by raztoki View Post
Think he is over exaggerating, they are not shutting down. All the profiles he mentioned are adult content related which content would be removed under the new terms of service.
I stand by every word I said. They are removing all NSFW content and all content uploaded without an account, which together is probably 90% of what's on Imgur. They are essentially committing suicide, so yes, they are shutting down.
  #6  
Old 25.04.2023, 11:02
notice notice is offline
JD Supporter
 
Join Date: Mar 2023
Posts: 505
Default

Quote:
Originally Posted by fuknuckle View Post
I stand by every word I have said. They are removing all NSFW content and all content uploaded without an account which together is probably 90% of what's on Imgur. They are essentially committing suicide so yes, they are shutting down.
From this point of view, I totally agree!
  #7  
Old 26.04.2023, 14:56
pspzockerscene's Avatar
pspzockerscene pspzockerscene is offline
Community Manager
 
Join Date: Mar 2009
Location: Deutschland
Posts: 71,076
Default

@fuknuckle
I've looked into this again specifically in our reddit crawler.
Checking "/r/Squirting/", I get ~1184 results.
284 of those are imgur.com items.

Looking into the logs, I can see the stop condition:
Code:
INFO [ jd.plugins.decrypter.RedditComCrawler(crawlPagination) ] -> Stopping because: nextPageToken is not given -> Looks like we've reached the last page: 7
URL of supposedly last page:
reddit.com/r/Squirting/.json?limit=100&after=t3_11u0zgg

That looks correct so far, so either there is some serverside bug, we're using that API in a wrong way, or that specific API is limited somehow.
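For anyone who wants to check that stop condition outside of JD, here is a minimal sketch (assuming the public .json listing endpoint quoted above and a placeholder User-Agent) which simply follows the 'after' token until the API stops returning one:
Code:
# Sketch: walk /r/Squirting/.json with limit=100, following the 'after' token.
# The loop ends at the same point the crawler logs as "nextPageToken is not given".
import requests

HEADERS = {"User-Agent": "pagination-check/0.1"}  # placeholder UA

after, page = None, 0
while True:
    params = {"limit": 100}
    if after:
        params["after"] = after
    r = requests.get("https://www.reddit.com/r/Squirting/.json",
                     params=params, headers=HEADERS, timeout=30)
    r.raise_for_status()
    data = r.json()["data"]
    page += 1
    print(f"page {page}: {len(data['children'])} items, after={data.get('after')}")
    after = data.get("after")
    if not after:
        break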
__________________
JD Supporter, Plugin Dev. & Community Manager
  #8  
Old 26.04.2023, 17:02
notice notice is offline
JD Supporter
 
Join Date: Mar 2023
Posts: 505
Default

Quote:
Originally Posted by fuknuckle View Post
For u/tightpixienurse, JD returned 346 items initially, but when I copied in the URLs I manually gathered, JD would not change the total item count. I gathered 789 URLs.
Results of the plugin: Crawled page 18 | Found unique items so far: 415(dupes:1412)| Walked through items so far: 1793
The Reddit API reports 1793 entries.
Of the 415 unique items, there are 346 supported ones.

Quote:
Originally Posted by fuknuckle View Post
For u/AlwaysAtPlay, JD returned 657 items initially, but when I copied in the URLs I manually gathered, JD increased the total to 977 items out of 1462 URLs.
Results of the plugin: Crawled page 15 | Found unique items so far: 768(dupes:1409)| Walked through items so far: 1412
The Reddit API reports 1412 entries.
Of the 768 unique items, there are 673 supported ones.

Without having access to your *manually gathered* links, it's hard to tell what is going on. For example, a manually gathered link might be similar to one in JDownloader but with a different query string, in which case duplicate handling fails. Also, do you copy them via the clipboard or paste them into JDownloader directly? I'm asking because by pasting you tell JDownloader to deep-decrypt unsupported/unknown links, and thus it finds more than it normally would.
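To illustrate that pitfall, a minimal sketch (the post URLs are made-up examples): two links pointing to the same post but differing only in query string will not match as plain strings unless they are normalized first.
Code:
# Sketch: naive string comparison vs. comparison after normalization.
from urllib.parse import urlsplit

def normalize(url):
    parts = urlsplit(url.strip())
    # drop query/fragment, lower-case the host, strip the trailing slash
    return f"{parts.scheme}://{parts.netloc.lower()}{parts.path.rstrip('/')}"

a = "https://www.reddit.com/r/example/comments/abc123/some_post/?utm_source=share"
b = "https://www.reddit.com/r/example/comments/abc123/some_post/"
print(a == b)                        # False -> naive dupe check keeps both
print(normalize(a) == normalize(b))  # True  -> treated as one item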

You may send us your list to support@jdownloader.org and we can do further checks, but at the moment I can't see/reproduce any problem with the plugin.

Last edited by notice; 26.04.2023 at 17:15.
  #9  
Old 26.04.2023, 22:45
fuknuckle fuknuckle is offline
Black Hole
 
Join Date: Aug 2015
Posts: 253
Default

Quote:
Originally Posted by pspzockerscene View Post
Our reddit crawler does by default only return the first page of added subreddits and reddit profiles as this would otherwise return a lot of results which a lot of users most likely won't want by default and it would cause a lot of http requests.
See Settings -> Plugins -> reddit.com -> Set crawler limits for profiles and subreddits to "-1" to make JD crawl everything.
Okay, so I decided to import fresh batches of links from 2 new subs and 2 new users. I am also running into other issues. I will be providing individual logs for each process/issue so you guys don't have a single massive log to scour through. And yes, my settings were and still are set to -1.

First up is r/Milfie. JD returned 705 images, but when I copied in the URLs I manually gathered (4,213 reddit post URLs, along with 7,198 mostly direct Reddit, Imgur and Redgifs links), I only ended up with a total of 5,622 images.

26.04.23 00.02.48 <--> 26.04.23 00.54.38 jdlog://7594311370661/

Second is r/PLASTT. JD returned 974 images, but when I copied in the URLs I manually gathered (3,460 reddit post URLs, along with 6,084 mostly direct Reddit, Imgur and Redgifs links), I only ended up with a total of 4,592 images.

26.04.23 00.57.59 <--> 26.04.23 01.27.35 jdlog://8594311370661/

Now for the two example user profiles.

First up is u/maynardpoindexter. JD returned 1,315 images, but when I copied in the URLs I manually gathered (1,273 reddit post URLs, along with 1,584 mostly direct Reddit and Imgur links), I only ended up with a total of 1,590 images.

26.04.23 02.12.31 <--> 26.04.23 02.28.41 jdlog://0694311370661/

Second is u/Crystal_Sunshine_. JD returned 987 images, but when I copied in the URLs I manually gathered (3,551 reddit post URLs, along with 3,560 mostly direct Reddit links), I only ended up with a total of 3,574 images.

26.04.23 02.31.34 <--> 26.04.23 03.00.30 jdlog://1694311370661/

I realized after doing the two subreddits that the lists I ended up with still had thousands of duplicates in them.

So here are some other issues I have. I have a bunch of Redgifs/Gfycat links that I added a while ago and never downloaded. Most of them are still online, but JD is giving me the "file not found" error even after I confirmed in the browser that they are still online. 2 of them even downloaded after first being reported as 404.

**External links are only visible to Support Staff**

26.04.23 03.41.55 <--> 26.04.23 03.43.36 jdlog://3694311370661/

So one of the issues I am having is that the reddit plugin doesn't seem to let me download very many files simultaneously, just like the imagefap plugin was doing. I have JD set to 20, but it is only downloading about 3-5 at a time, even when forced.

I will cover this problem with another log in a bit, but the specific issue is that JD tried endlessly, for more than 8 hours, to download some Reddit-hosted videos that I had added a while ago and that have since been removed. JD shows no error for them and has no way to skip them, and since I am being restricted to so few simultaneous DLs, literally nothing else downloaded.

26.04.23 03.49.43 <--> 26.04.23 03.51.04 jdlog://4694311370661/

So another major issue preventing me from downloading is that many of the images and videos from Imgur that I added months ago give me server errors when I try to download them, but if I load the troublesome DLs in my browser and then try to download them in JD, they'll suddenly succeed. All of the downloads that I just added give me this problem:

https://i.imgur.com/1S5NrAx.mp4
https://i.imgur.com/3zw1boS.jpg
https://i.imgur.com/4M9hBWe.mp4
https://i.imgur.com/8P35XZ0.mp4

26.04.23 04.24.54 <--> 26.04.23 04.29.08 jdlog://5694311370661/

Lastly is the issue I mentioned earlier with only 3-5 downloads running at a time. Normally with Reddit this is plenty fast for newly added downloads, and there are no issues with Imgur, but once you start to encounter errors on Imgur, like hitting the client request limit or removed files, JD starts acting weird and treats Reddit/Imgur/Redgifs as a single host, and nothing ever gets downloaded because JD endlessly retries the same files over and over again.

This happened two days in a row, and each day the logs were over 25GB and almost nothing had downloaded. The only way to fix it is to monitor JD and disable any links or packages that JD doesn't want to skip, which would be much less of an issue if I could run 20-40 simultaneous downloads and JD wasn't treating these 3 separate hosts as one.

26.04.23 13.25.59 <--> 26.04.23 13.40.09 jdlog://7794311370661/

Anyway, sorry for taking so long to respond. Hope this helps.
  #10  
Old 26.04.2023, 22:59
fuknuckle fuknuckle is offline
Black Hole
 
Join Date: Aug 2015
Posts: 253
Default

Quote:
Originally Posted by pspzockerscene View Post
@fuknuckle
I've looked into this again specifically in our reddit crawler.
Checking "/r/Squirting/", I get ~1184 results.
284 of those are imgur.com items.

Looking into the logs, I can see the stop condition:
Code:
INFO [ jd.plugins.decrypter.RedditComCrawler(crawlPagination) ] -> Stopping because: nextPageToken is not given -> Looks like we've reached the last page: 7
URL of supposedly last page:
reddit.com/r/Squirting/.json?limit=100&after=t3_11u0zgg

Looks true so far so either there is some serverside bug or we're using that API in a wrong way or that specific API is limited somehow.

By default, without an account, Reddit displays 25 items per page and will only let you browse back 1000 posts for any given view (hot, new, rising, top (1 day, 1 week, 1 month, 1 year, all time), controversial (1 day, 1 week, 1 month, 1 year, all time), gilded), so you should be getting about 40 pages at most per view. But that is only in the browser. I don't know if it is possible to view posts further back than 1000 with the API, but if it is, then I would like to be able to crawl an entire sub's history, even if it has been around for 10 years.

So it does seem like your plugin is implemented incorrectly if you're only getting 7 pages.
  #11  
Old 26.04.2023, 23:49
fuknuckle fuknuckle is offline
Black Hole
 
Join Date: Aug 2015
Posts: 253
Default

Quote:
Originally Posted by notice View Post
Without having access to your *manually gathered* links, it's hard to tell what is going on. For example your manually gathered link might be similiar to one in JDownloader but with different query, and then duplicate handling fails.Also do you copy them into clipboard or paste into JDownloader? I'm asking because by pasting you tell JDownloader to deepdecrypt unsupported/unknown links and thus finding more than normally.

You may send us your list to support@jdownloader.org and we can do further checks but at the moment I can't see/reproduce any problem with the plugin.
I can't compose emails from my email service at the moment (long story), so I tried Pastebin, but it was being difficult, so I hope these Mega links work for you.

**External links are only visible to Support Staff**

**External links are only visible to Support Staff**
  #12  
Old 27.04.2023, 02:03
pspzockerscene's Avatar
pspzockerscene pspzockerscene is offline
Community Manager
 
Join Date: Mar 2009
Location: Deutschland
Posts: 71,076
Default

@fuknuckle

Regarding post #9:
No need to provide so many examples.
If we manage to fix it "for one reddit link", it will most likely work for all links.
You don't need to invest so much time providing multiple examples of subreddit/profile crawling.

It's fine to just use one for testing - I will use "/r/Squirting/" for now.
It would be good if you could provide some single links which you think JD is supposed to grab when adding "/r/Squirting/" but which are not grabbed at this moment.

Also, regarding the reddit crawler, you can look into your own logs and search for the text "Stopping because:" to easily find the place and the reason why JD thinks a specific subreddit or profile has been crawled completely.
Please also keep in mind that we do not have plugins for _all_ websites which might be used for hosting media linked on reddit, though the main ones (imgur, reddit itself, redgifs.com, gfycat.com) are supported.
This could be another reason why JD didn't find all the items you expect it to find.

Quote:
Originally Posted by fuknuckle View Post
So here are some other issues I have. I have a bunch of Redgifs/Gfycat links that I added a while ago and never downloaded. Most of them are still online, but JD is giving me the "file not found" error even after I confirmed in the browser that they are still online. 2 of them even downloaded after first being reported as 404.
Looks like this issue only affects older items which were added before a bigger internal plugin change a while ago.
EDIT: The issue affects all redgifs.com items added until 2022-12-19.
Other team members need to look into this to decide whether it is possible to "rescue" them to make them downloadable.
However, you might also just copy the original links from your downloadlist, save them, delete the old entries from your downloadlist and re-add them; then you will be able to download them just fine.

Quote:
Originally Posted by fuknuckle View Post
so one of the issues I am having is that the reddit plugin doesn't seem to be letting me download very many files simultaneously just like what the imagefap plugin was doing. I have JD set to 20 but it is only downloading about 3-5 at a time even when forced
Looking at our reddit host plugin, there is currently no limit in place.
Please provide specific links which can be used to reproduce this issue -> Not links to a subreddit but a list of e.g. "i.redd.it" or "v.redd.it" links.

Quote:
Originally Posted by fuknuckle View Post
I will cover this problem with another log in a bit but the real issue specifically is that JD tried endlessly for more than 8 hours to download some Reddit hosted videos that I had added to JD a while ago that got removed but JD has not error for it or any way to skip it and since I am being restricted to so few consecutive DLs, literally nothing else downloaded
It would be good if, next time, you could include the error message you see in JDownloader for this case in your post.
Either way I found it in your log - here is what happened:
- Reddit returned http status 403 for such offline .mpd URLs
- Usually 403 is not an offline indicator but more something like "Content temporarily not available" or "You can't access this content now/without account" --> Our plugin handled this as "Server error 403" with a retry later on.
--> I've updated the check so that .mpd links with response 403 will result in the error "File not found"/offline status instead (roughly as sketched below).
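A rough sketch of that changed decision (not the actual plugin code, just the behaviour described above):
Code:
# Sketch: how a 403 on a reddit .mpd URL is classified after the change.
def classify_reddit_response(url, http_status):
    if http_status == 403 and url.endswith(".mpd"):
        return "FILE_NOT_FOUND"  # treated as offline, no endless retries
    if http_status == 403:
        return "RETRY_LATER"     # temporary block / content needs an account
    return "OK"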

Quote:
Originally Posted by fuknuckle View Post
So another major issue preventing me from downloading is that many of the images and videos from Imgur that I added months ago give me server errors when I try to download them, but if I load the troublesome DLs in my browser and then try to download them in JD, they'll suddenly succeed. All of the downloads that I just added give me this problem.
I can't reproduce this here.
According to your log, imgur returned error 400 for a request which in the same form for the same links is working fine here.
Snippet of your log:
Code:
----------------Request-------------------------
GET /CENSORED.mp4 HTTP/1.1
Host: i.imgur.com
User-Agent: Mozilla/5.0 (Windows NT 10.0; WOW64; rv:76.0) Gecko/20100101 Firefox/76.0
Accept: video/*;q=0.9,*/*;q=0.5
Accept-Language: de,en-gb;q=0.7,en;q=0.3
Accept-Encoding: gzip, deflate, br
Cache-Control: no-cache
Referer: https://api.imgur.com/3/image/CENSORED
Authorization: Client-ID CENSORED(your personal clientID)

----------------Response Information------------
Connection-Time: 14ms
Request-Time: 77ms
----------------Response------------------------
HTTP/1.1 400 Bad Request
My assumption is that you've entered an invalid Client-ID in the imgur.com plugin settings, and/or the credentials you've entered got blocked/limited by imgur, or you've revoked access inside your imgur account after adding the API data to JDownloader.
Possible solutions include:
1. Check and renew your API credentials in the imgur.com plugin settings (a quick way to test them outside of JD is sketched below).
or:
2. Disable this setting in the imgur.com plugin settings: "Use API in anonymous mode?".
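A minimal sketch for testing your imgur API credentials outside of JD, using the /3/image endpoint and Client-ID header visible in the log snippet above (YOUR_CLIENT_ID is a placeholder; the image hash is one of the items from your post):
Code:
# Sketch: ask the imgur API for one of the failing items with your own Client-ID.
import requests

CLIENT_ID = "YOUR_CLIENT_ID"   # same value as entered in the imgur plugin settings
IMAGE_HASH = "1S5NrAx"         # from i.imgur.com/1S5NrAx.mp4 in the post above

r = requests.get(f"https://api.imgur.com/3/image/{IMAGE_HASH}",
                 headers={"Authorization": f"Client-ID {CLIENT_ID}"},
                 timeout=30)
print(r.status_code)           # 200 = credentials accepted; 400/401/403 = check the Client-ID
print(r.json())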

Quote:
Originally Posted by fuknuckle View Post
JD starts acting weird and treats Reddit/Imgur/Redgifs as a single host
Nope, that is impossible.

Quote:
Originally Posted by fuknuckle View Post
this happened two days in a row and each day, the logs were over 25GB and almost nothing had downloaded. The only way to fix it is to simply monitor JD and disable any links or packages that JD isn't wanting to skip which would be much less of an issue if I could download at 20-40 sequential downloads and JD wasn't treating these 3 separate hosts as one.

26.04.23 13.25.59 <--> 26.04.23 13.40.09 jdlog://7794311370661/
Summary of what I can see in that log:
- gfycat.com links ending up with "Server error 404" instead of offline status -> Fixed with next CORE-update
- i.redd.it links with server response 404 being marked as offline -> Working as intended
- Over 1000 errors with "Rate limit reached" -> Working as intended -> What else do you think JD should do whenever that error happens? There is no way around this -> As said before, maybe disable API usage


Quote:
Originally Posted by fuknuckle View Post
I don't know if it is possible to view posts further back than 1000 with the API
I don't know it either and I'm not planning to do a deep-dive here.
Also reddit has multiple APIs.
We're using the simple public json API.
Also, yes, we can get more than 25 items per page (we're getting 100), though that is unrelated to the possibly existing "go back max 1000 items" limit.
I know I'm repeating myself but you can check it yourself - here is the first page of "/r/Squirting" via API:
reddit.com/r/Squirting/.json?limit=100
You can copy that json and paste it e.g. into webtool "jsoneditoronline.org" to view it in a more human readable way.

Quote:
Originally Posted by fuknuckle View Post
So it does seem like your plugin is implemented incorrectly if you're only getting 7 pages
Please re-read my last post.
I'll also briefly explain:
Via API there is no simple "page 1, 2, 3":
Every page contains 1-2 tokens which allow you to access the next/previous page.
For "/r/Squirting" the last token is called "t3_11u0zgg" and that is page 7.
You can verify this here:
reddit.com/r/Squirting/.json?limit=100&after=t3_11u0zgg
I'm not super familiar with the technical details of the different reddit APIs so I recommend finding someone in the reddit community who has better knowledge.
I also don't know if that API behaves differently when browser login cookies are present and if it then, e.g. returns more/all items.

I also want to mention again that while we'll be doing our best to help you here, for specific tasks like crawling entire/complex reddit profiles/subreddits, sometimes it is better/easier to use tools/scripts that are built to do exactly that.
Especially for reddit and imgur.com archiving, you should be able to find plenty of such open source projects on github.com.
__________________
JD Supporter, Plugin Dev. & Community Manager

Last edited by pspzockerscene; 27.04.2023 at 03:50. Reason: Added more information
  #13  
Old 27.04.2023, 04:03
pspzockerscene's Avatar
pspzockerscene pspzockerscene is offline
Community Manager
 
Join Date: Mar 2009
Location: Deutschland
Posts: 71,076
Default

Quote:
Originally Posted by fuknuckle View Post
I can't compose emails from my email service at the moment (long story), so I tried Pastebin, but it was being difficult, so I hope these Mega links work for you.

http...mega.nz...
My last reply for this night:

Looking at your provided lists of example links, this is what I can see so far:
1. Some links are unsupported and/or go to dead websites anyway, e.g.:
awesomeporntube.com/video.php?videoId=12345
--> Item is unsupported and website is down -> You won't see any item in JD when adding this.
If it gets returned by other crawlers such as reddit.com, items like this also won't ever show up in the linkgrabber as there is no plugin available to handle it.
Same for:
hedjaz.com
And:
codioli.com
-> If links look like direct URLs (the URL contains a common file extension), they can be picked up by our directHTTP plugins, which is the case here: codioli.com/tolo/frtuiklou/Busty-and-cute.jpg

2. I can also see at least one unsupported reddit URL-type:
redditmedia.com/mediaembed/bla

Also just as a sidenote:
If you do add huge amounts of links "manually", that can lead to a huge amount of http requests being made by JD.
E.g. both of your .txt documents together contain at least 16 000 URLs.
This can then:
- Take a long time to get processed
- Cause websites to block you, or make you run into rate-limiting issues faster than normal

To speed this up tremendously and also for testing purposes, you might want to disable the linkcheck temporarily.
See Settings -> Advanced Settings ->
Code:
LinkCollector.dolinkcheck
--> Disable that.
Keep in mind that this may have some unwanted side-effects.
__________________
JD Supporter, Plugin Dev. & Community Manager
  #14  
Old 27.04.2023, 15:24
pspzockerscene's Avatar
pspzockerscene pspzockerscene is offline
Community Manager
 
Join Date: Mar 2009
Location: Deutschland
Posts: 71,076
Default

Added simple crawler plugin for:
redditmedia.com/mediaembed/bla

Please wait for the next CORE-Update!

Are you waiting for recently announced changes to get released?
Updates do not necessarily get released immediately!
Please read our Update FAQ!


-psp-
__________________
JD Supporter, Plugin Dev. & Community Manager
  #15  
Old 28.04.2023, 08:36
fuknuckle fuknuckle is offline
Black Hole
 
Join Date: Aug 2015
Posts: 253
Default

Quote:
Originally Posted by pspzockerscene View Post
It's fine to just use one for testing - I will use "/r/Squirting/" for now.
It would be good if you could provide some single links which you think JD is supposed to grab when adding "/r/Squirting/" but which are not grabbed at this moment.
I'll do you one better. I ran the Reddit crawler on r/Squirting again and got 1169 links. Then, in my browser, I just went to "new" and went back as far as it would let me go, which is about 2 months. I collected all the links, copied them into JD and got a bunch more that JD didn't get. Rather than give you all those links, I'll just provide you with the logs again and the linklist. Everything JD got is at the top in "_r_squirting" and everything else I collected that JD missed is below.

**External links are only visible to Support Staff**

27.04.23 18.53.03 <--> 27.04.23 21.20.42 jdlog://0005311370661/

Quote:
Also, regarding the reddit crawler, you can look into your own logs and search for the text "Stopping because:" to easily find the place- and cause why JD thinks that a specific subreddit or profile has been crawled completely.
How does that help me?

I am relying on JD to do in minutes what would take me hours, or to do in hours what would take me weeks or months, and I am relying on it to do the job properly and completely. Searching through the logs does not help me in any way.

Quote:
Looks like this issue only affects older items which were added before a bigger internal plugin change a longer time ago.
EDIT: Issue affects all redgifs.com items added until 2022-12-19.
Other team-members need to look into this to decide if it is possible to "rescue" them to make them downloadable.
However you might also just want to copy the original links from your downloadlist, save the links, delete the old entries in your downloadlist and re-add them and you will be able to download them just fine.
I mean, if you can fix these issues and fix the Reddit, Imgur and Gfycat issues, then yeah, I'll just re-add them, which will be easier.

Quote:
Looking at our reddit host plugin, there is currently no limit in place.
Please provide specific links which can be used to reproduce this issue -> Not links to a subreddit but a list of e.g. "i.redd.it" or "v.redd.it" links.
Well, I never said there was a limit; I said JD is acting as if there is one. I was watching JD NOT download, on average, more than 5 images at a time even though there were multiple active hosts, and JD would actively NOT skip ahead. So if there were some problematic Imgur links, JD would not continue on and download the enabled Reddit or Redgifs links. I witnessed this time and time again.

This should be in one of the logs I posted.

And just like with the linklist, I'm going to provide you guys with my entire downloadlist so you can play around with that and (hopefully) see for yourselves what I am talking about.

**External links are only visible to Support Staff**

And just in case it matters, here is a copy of the instance of JD that I was using. Just know this zip doesn't have the linklist or downloadlist since I provided those separately. It also doesn't have the accounts file, accounts usage rules or proxy list.

**External links are only visible to Support Staff**

Quote:
Nope that is impossible.
You say that but I am just telling you what I have witnessed. See my comment above.

Quote:
It would be good if you could include the errormessage you can see in JDownloader for this case in your post next time.
That's because with Reddit there was no error message, which is why I provided none. It would start, then stop, then start, then stop, endlessly, for more than 8 hours. That's probably why my log files were so massive.

Quote:
I can't reproduce this here.
According to your log, imgur returned error 400 for a request which in the same form for the same links is working fine here.

My assumption is that you've entered an invalid ClientID in the imgur.com plugin settings and/or the data you've entered got blocked/limited by imgur or you've revoked access inside your imgur account after adding API data to JDownloader.
Possible solutions include:
1. Check and renew your API credentials in the imgur.com plugin settings.
or:
2. Disable this setting in the imgur.com plugin settings: "Use API in anonymous mode?".
So I did those things, and the Imgur links that were giving me problems now download, but it's weird. The vast majority of Imgur links download just fine, but some will only download in JD if I first load the link in my browser. The API and my personal Imgur API credentials work just fine for most Imgur links.

Would it be possible to add a setting to the Imgur plugin to have JD fall back to NOT using the API, but only for links that fail/throw errors while the API/account is enabled?

Quote:
Summary of what I can see in that log:
- gfycat.com links ending up with "Server error 404" instead of offline status -> Fixed with next CORE-update
- i.redd.it links with server response 404 being marked as offline -> Working as intended
- Over 1000 errors with "Rate limit reached" -> Working as intended -> What else do you think JD should do whenever that error happens? There is no way around this -> As said before, maybe disable API usage
Rate limit for what? Imgur? I know that there is a daily API-client request limit. As for Reddit, is there a rate limit for Reddit? That's news to me. If that is the case, JD isn't communicating it. All I see are offline links and errored links, links in a perpetual start/stop cycle, some other BS, or links with no status at all.

I know what it looks like when hoster links say "you've hit your limit", but of these hosters I have only seen that for Imgur, and not on the links but on the account page.

Quote:
I don't know it either and I'm not planning to do a deep-dive here.
Also reddit has multiple APIs.
We're using the simple public json API.
Have you guys thought about trying another API or maybe writing your own API for Reddit?

Quote:
Every page contains 1-2 tokens which allow you to access the next/previous page.
For "/r/Squirting" the last token is called "t3_11u0zgg" and that is page 7.
You can verify this here:
reddit.com/r/Squirting/.json?limit=100&after=t3_11u0zgg
I understand what you're talking about. But if your set limit is 100, then you are still missing about 300 posts, or 3 pages, and what about crawling Top, Controversial, etc.? Have you tried setting the limit to 25 so you get fewer posts per page but hopefully more pages returned?

Quote:
I also want to mention again that while we'll be doing our best to help you here, for specific tasks like crawling entire/complex reddit profiles/subreddits, sometimes it is better/easier to use tools/scripts that are built to do exactly that.
Especially for reddit and imgur.com archiving, you should be able to find plenty of such open source projects on github.com.
Yeah, I get what you're saying, but this is one of the reasons people use JD. Besides, most purpose-built "tools" on GitHub are Python scripts and whatnot, and the vast majority of people (myself included) don't know how to run those. I don't want to deal with janky CLI BS. That's why I DESPISE Linux with a passion and why I much prefer GUI applications like JD for every possible thing. Also, we're not talking about some obscure website here; Reddit is one of the largest and most popular platforms in the world.
  #16  
Old 28.04.2023, 13:22
pspzockerscene's Avatar
pspzockerscene pspzockerscene is offline
Community Manager
 
Join Date: Mar 2009
Location: Deutschland
Posts: 71,076
Default

Quote:
Originally Posted by fuknuckle View Post
I'll just provide you with the logs again and the linklist. everything JD got is at the tops in "_r_squirting" and everything else I collected that JD missed is below.
That won't help me atm.
Also, you're still using the current public version of the reddit plugin (the changes I made aren't live yet!), so if it is helpful at all, you can provide new logs after the release of our next set of CORE-updates. If you want to test the changes earlier, you can get the dev build here:
https://support.jdownloader.org/Know...up-ide-eclipse

Once I find the time, I will create a script which generates a .txt list of all reddit post URLs that JD should currently find.
This should make it way easier to find out what is missing and why.


Quote:
Originally Posted by fuknuckle View Post
How does that help me?
Using the information I provided about the API we're using, you can look into it yourself, e.g. to find out whether it is limited somehow.
There are probably also subreddits about the reddit API which contain more information.
From my point of view as a JDownloader developer, our reddit plugin is just one of many plugins we have, and I can't put all of my daily JDownloader time into that one plugin.

Quote:
Originally Posted by fuknuckle View Post
I mean, if you can fix these issues and fix the Reddit, Imgur and Gyfcat issues than yeah I'll just re-add them which will be easier.
Re-adding links is, if at all (as explained), only needed for those redgifs.com URLs (old ones are still labeled with host "gfycat.com" in your JDownloader).
Also, that problem is still being worked on, so if you have a little more time, you might be able to just download the existing items in your list.

Quote:
Originally Posted by fuknuckle View Post
I am relying on JD to do in minutes what would take me hours or to do in hours what would take me weeks/months and I am relying on JD to do it properly and completely. Searching through the logs does not help me in any way.
As explained, JD is not a special-purpose tool for downloading from reddit.
We will continue to work on it, but if you're planning to download these complete subreddits before imgur removes all of the adult content: I'm sorry, but that won't pressure me into working harder on the reddit plugin atm.

Quote:
Originally Posted by fuknuckle View Post
Well I never said there was a limit, I said JD is acting as if there is and I was watching JD NOT download, on average, more than 5 images at a time even though there are multiple active hosts and JD would actively NOT skip ahead.
Can you always reproduce this or does it only happen sometimes?
Maybe reddit has serverside limits of max X requests per Y milliseconds which would explain such an issue.

Quote:
Originally Posted by fuknuckle View Post
You say that but I am just telling you what I have witnessed. See my comment above.
I'm not saying I don't believe the behavior you saw, but those are coincidences which are not related to each other.

Quote:
Originally Posted by fuknuckle View Post
that's because with Reddit, there were no error message which is why I provided none. It would start then stop then start then stop endlessly for more than 8 hours. That's probably why my log files were so massive.
I could see them in your logs, and as long as you have the packages expanded and the "Status" column visible, you should see the error messages.

Quote:
Originally Posted by fuknuckle View Post
Would it be possible to add a setting to Imgur to have JD fallback to NOT use the API but only for links that fail/throw errors when the API/account is enabled?
I'm not planning on doing this, since it would make everything more complicated and the API settings were designed to be an "either website or API" thing.
Why are you using the imgur API in the first place?

Quote:
Originally Posted by fuknuckle View Post
Rate limit for what? Imgur? I know that there is a daily API-client request limit.
Yes, it was actually the daily API limit.
Serverside, though, they return HTTP response 429, which is typically used for rate limits:
developer.mozilla.org/en-US/docs/Web/HTTP/Status/429
Quote of the imgur API response from your log 7794311370661:
Code:
{"data":{"error":"Daily client requests exceeded","request":"\/3\/image\/CENSORED","method":"GET"},"success":false,"status":429}
I will update the wording of that error message for the next update, but technically this can and will happen again.
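A minimal sketch for checking how much of that daily quota is left, assuming the /3/credits endpoint listed on apidocs.imgur.com (YOUR_CLIENT_ID is a placeholder):
Code:
# Sketch: query the imgur API credits endpoint with your own Client-ID.
import requests

r = requests.get("https://api.imgur.com/3/credits",
                 headers={"Authorization": "Client-ID YOUR_CLIENT_ID"},
                 timeout=30)
data = r.json().get("data", {})
print("client remaining:", data.get("ClientRemaining"))
print("user remaining:", data.get("UserRemaining"))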


Quote:
Originally Posted by fuknuckle View Post
As for Reddit, is there a rate limit for Reddit?
Not in our plugins, and not that I know of, but a lot of websites do have some kind of rate limits as protection against misuse of their APIs and against overloading by e.g. buggy software applications doing a lot of requests in a short time.

Quote:
Originally Posted by fuknuckle View Post
All I see are offline links and errored links, links in a perpetual start/stop cycle
I've explained everything I could find in your logs to you.
I didn't find any links which were marked as offline due to a rate-limit. If that was the case, that would be wrong!

Quote:
Originally Posted by fuknuckle View Post
some other BS or links with no status at all.
For every status there is a reason, so if you have some links with an unknown status, please provide a specific log for that and I'll try to find the cause.

Quote:
Originally Posted by fuknuckle View Post
Have you guys thought about trying another API
No. We're not planning to implement the use of another reddit API as this would basically mean that most parts of our reddit plugins would have to be re-written from scratch.


Quote:
Originally Posted by fuknuckle View Post
or maybe writing your own API for Reddit?
That is impossible:
Looks like you haven't understood the concept of APIs(?)
The API runs serverside on the reddit servers, so reddit is in control of it, and only reddit can provide/"write" its APIs, see:
en.wikipedia.org/wiki/API

Quote:
Originally Posted by fuknuckle View Post
I understand what your talking about. But if your set limit is 100 then you are still missing about 300 posts
Looks like you haven't understood what I meant:
With the parameters I mentioned, you can define the "max items per page".
If there are 300 items in total and you set this to 100, you will get all of the items spread over 3 pages (300 divided by 100). If you set it to 25, you will still get all of the items, but spread over 12 pages (300 divided by 25).

All I said was that, if there is a limit for that specific API which we are using, that would explain some of the problems you're having.

So far I haven't seen you use other tools to at least compare how they perform in comparison to JDownloader. I've also seen other tools using the same reddit API, so either they will have similar issues (missing items) or our reddit crawler has some problem.

Quote:
Originally Posted by fuknuckle View Post
Yeah, I get what you're saying but this is one of the reasons why people use JD and besides, most purpose built "tools" on github are like python scripts and whatnot and the vast majority of people (myself included) don't know how to run those.
I'm sorry, but that's not a strong argument. With the amount of time you're currently spending on tests with JDownloader, you could easily google a tutorial on how to run a command-line script, and I guarantee you it's not super hard; you don't need to be a programmer to be able to do that!

Quote:
Originally Posted by fuknuckle View Post
That's why I DESPISE Linux...
Sounds like you just don't like the "terminal part" of Linux, but you don't even need to use the terminal once when using Linux, so there might be a fundamental misunderstanding here.
...though I admit I'm a Windows user myself, for other reasons.

Quote:
Originally Posted by fuknuckle View Post
Reddit is one of the largest and most popular platforms in the world.
Yes but in the end, our reddit plugin is just one of many plugins we have and there is a limited amount of time available to spend on each.
__________________
JD Supporter, Plugin Dev. & Community Manager
  #17  
Old 28.04.2023, 15:21
pspzockerscene's Avatar
pspzockerscene pspzockerscene is offline
Community Manager
 
Join Date: Mar 2009
Location: Deutschland
Posts: 71,076
Default

EDIT Sidenote:
We're having a holiday in Germany (Monday, so it's gonna be a long weekend), so I'll be offline until Tuesday now.
Have a nice weekend!

Again regarding imgur.com:
Based on your feedback, I made the following changes:
  • Cosmetic: Added a debug plugin setting which lets you visually see the remaining account API requests as a dummy "traffic left" value
  • Updated the wording of error messages when API request limits are reached - they are no longer called "Rate limits" -> The error message from the imgur API is also preferred if one is available
EDIT2

Technically speaking, those limits are called "Rate Limits" and you can find them listed here:
apidocs.imgur.com
-> Search for "Rate Limits"

Are you waiting for recently announced changes to get released?
Updates do not necessarily get released immediately!
Please read our Update FAQ!


-psp-
__________________
JD Supporter, Plugin Dev. & Community Manager

Last edited by pspzockerscene; 28.04.2023 at 17:24. Reason: Added "EDIT2"
  #18  
Old 28.04.2023, 22:58
fuknuckle fuknuckle is offline
Black Hole
 
Join Date: Aug 2015
Posts: 253
Default

Quote:
Originally Posted by pspzockerscene View Post
That won't help me atm.
Also, you're still using the current public version of the reddit plugin (the changes I made aren't live yet!), so if it is helpful at all, you can provide new logs after the release of our next set of CORE-updates. If you want to test changes
Alright, cool. I guess I'll wait for the next core update.

Quote:
Once I find the time, I will create a script which will create a list of all reddit post URLs which JD should find atm. as .txt file.
This should make it way easier to find out what is missing and why.
Interesting... I look forward to seeing what it does. There are many, many subs I want to start archiving regularly, and not just porn.

Quote:
but if you're planning to download these complete subreddit before imgur is removing all of the adult content: I'm sorry but that won't pressure me to work harder on that reddit plugin atm.
It would be nice, but with a little over two weeks to go before the great puss purge of '23, I have very little hope of using JD to back up TBs of images before then, especially with the rate limits in place.

My only hope is the efforts of the madlads at
**External links are only visible to Support Staff**

Quote:
Can you always reproduce this or does it only happen sometimes?
Maybe reddit has serverside limits of max X requests per Y milliseconds which would explain such an issue.

I'm not saying I don't believe behavior XY you saw but that's coincidences which are not related to each other.
Yes, I can always reproduce this, which is why I provided you my linklist, my downloadlist and a copy of the entire instance of JD I was using, so you could (hopefully) see it for yourself.

Quote:
I'm not planning on doing this since it would make everything more complicated and the API settings were designed to be a "either website or API" thing.
Why are you using the imgur API in the first place?
Because some time ago, when I tried adding images or videos from Imgur to JD, JD would say they were offline even though they weren't. This happened with the majority of images on Imgur. I think it was you guys on this forum who suggested using the API as a workaround.

Quote:
Yes it was actually the daily API limit.
Alright, I will no longer use the API.

Quote:
That is impossible:
Looks like you haven't understood the concept of APIs(?)
The API is serverside running on the reddit servers so reddit is in control of it and only reddit can provide/"write" APIs
I do actually understand what an API is, but in Reddit's case, as you said, there are multiple APIs, and some of them, like the Pushshift API (**External links are only visible to Support Staff**), were created not by Reddit but by the community. I just thought that maybe you guys could also write your own API for Reddit.

Quote:
Looks like you haven't understood what I ment:
With the parameters I mentioned, you can define the "max items per page".
If there are 300 items in total and you set this to 100, you will get all of the items spreead on 3 (300 divided by 100) pages. If you set it to 25, you will still get all of the items but spread over 12 pages (300 deivided by 25).

All I said was that, if there is a limit for that specific API which we are using, that would explain some of the problems you're having.
I do understand, and what I am saying is that maybe the issue is the parameters you are using. Maybe if you used the same parameters as the browser default with no account, you would get different results. Maybe you are only getting 700/1000 posts because you are asking for 100 posts per page instead of 25. It was just an idea. But again, for any given sub, you should be getting thousands of results when you also crawl top (all time, 1 year), gilded, controversial, etc.

Quote:
So far I haven't seen you using other tools to at least compare how they're performing in comparison to JDownloader. I've also seen other tools using the same reddit API so either they will be having similar issues (missing items) or our reddit crawler is having some problems.
Because all those other tools are scripts and other command-line BS with no GUI. Point me to an actual Reddit downloader that is a real program with a real GUI and I'll give it a try, but it's not the 1970s, and I ain't going to try to manage the archival of 1000+ subs and many thousands of user accounts and all those directories with some barbaric, old-timey, text-based command-line interface.

Quote:
I'm sorry but that's not a big argument. With the amount of time you're spending atm for tests with JDownloader, you could easily google a tutorial on how to run a commandline script and I gurantee you it's not super hard and you don't need to be a programmer to be able to do that!
🤮 🤮 🤮 It's not about "difficulty"

Quote:
Sounds like you just don't like the "terminal part of Linux" but you don't even need to use the terminal once when using Linux so there might be a fundamental misunderstanding here.
...though I admit I'm a Windows user myself but for other reasons.
Just about everything I have seen on Linux requires the terminal. Even installing a program on Ubuntu is still an overly complicated, multistep process involving throwing commands at the terminal and pointing it to a repository.

Whereas on Windows and Mac you have... INSTALLERS... OMG, what a wonderful concept! Having the entire program and all its components contained in a single file with an automated graphical setup... WOW, such a revolutionary, game-changing innovation... 🙄 One of these days, Linux is going to catch up to Windows 95.
  #19  
Old 29.04.2023, 04:19
fuknuckle fuknuckle is offline
Black Hole
 
Join Date: Aug 2015
Posts: 253
Default

So I found **External links are only visible to Support Staff** this post on r/datahoarder that might interest you.

Quote:
Thanks to the help of a fellow anonymous Redditor I've released a new version of **External links are only visible to Support Staff**RedditScrape. This new version now uses the push shift API to gather gigantic levels of data for you to download. This means we no longer need to provide any form of reddit credentials.

While the previous version was hard capped at 1,000 posts using the Reddit API, this new version has no limits at all, other than what resources and disk space you have.

For example, if you're brave enough to try and scrape something like gonewild, you'll find it takes DAYS just to get all of the data back from push shift. The JSON text alone is over 9 gigs (3.3 million posts) and climbing.

Running this is now a two step process, but results in a substantially larger set of media from your favorite subs.

Instructions can be found here. I hope I've fixed a few of the problems that people had with the first iteration along the way.
So, I mentioned the Pushshift API in my previous post, and this script uses it and is able to grab 3.3+ million posts from r/gonewild. Maybe you should look into incorporating the Pushshift API into your plugin, or at least take a **External links are only visible to Support Staff** look at it to see how it works and think about modifying your Reddit plugin.
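For reference, the kind of Pushshift query that script relies on looks roughly like this (endpoint and parameters as the Pushshift project documented them; availability and completeness of the service are not guaranteed):
Code:
# Sketch: page backwards through a subreddit's history via Pushshift,
# using the created_utc timestamp of the oldest post seen so far.
import time
import requests

API = "https://api.pushshift.io/reddit/search/submission/"
before, total = None, 0
while True:
    params = {"subreddit": "gonewild", "size": 100,
              "sort": "desc", "sort_type": "created_utc"}
    if before:
        params["before"] = before
    batch = requests.get(API, params=params, timeout=60).json().get("data", [])
    if not batch:
        break                          # reached the oldest indexed post
    total += len(batch)
    before = batch[-1]["created_utc"]  # continue from the oldest post seen
    time.sleep(1)
print(total, "submissions indexed by Pushshift")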

Now before you tell me "tHaT wOuLd ReQuIrE a CoMpLeTe ReWrItE"... so? Isn't that what JDownloader is for? you know, DOWNLOADING!?
  #20  
Old 03.05.2023, 03:57
pspzockerscene's Avatar
pspzockerscene pspzockerscene is offline
Community Manager
 
Join Date: Mar 2009
Location: Deutschland
Posts: 71,076
Default

Quote:
Originally Posted by fuknuckle View Post
Because some time ago when I tried adding any images or videos from Imgur to JD, JD would say they were offline even though they weren't. This happened with the majority of images on Imgur. I think it was you guys on this forum who suggested to use the API as a workaround.
Could be, though we never identified the main issue, so maybe we should work on that rather than leave you using the API as a workaround.
The most common reason to use the imgur API is when you're trying to download content which can only be viewed/downloaded while logged in to an imgur account.

Quote:
Originally Posted by fuknuckle View Post
alright, I will no longer use the API
In case you still want/need to use it, enable the setting "Use API in anonymous mode? To be able to use the API you will have to add your own API credentials below otherwise this will render the imgur plugin useless!".

Quote:
Originally Posted by fuknuckle View Post
I do actually understand what an API is but in Reddit's case, as you said, there are multiple APIs and some of those like the Pushshift API github.com/pushshift/api were created not by Reddit but by the community. I just thought that maybe you guys could also write your own API for Reddit
From what I understand, that is an unofficial reddit API built on top of the existing API.
So yes, it's a new API, but in the end it might not be anything more than a proxy.
And by the way, it says "To be continued (Currently under active development) ...", but the last change was over 6 years ago. I don't know if that's good or bad, though.

Quote:
Originally Posted by fuknuckle View Post
I just thought that maybe you guys could also write your own API for Reddit
Building a custom reddit API just so it can then be used in JDownloader still doesn't make any sense. I'm giving up on explaining this one - maybe you can just take my word for it, or wait for other users to confirm that what I wrote is no BS.

Quote:
Originally Posted by fuknuckle View Post
I do understand and what I am saying is that maybe the issue is the parameters you are using. maybe if you used the same parameters as is default in a browser with no account, you might get different results.
I don't think so, but then again: you can verify this on your own.
Change the parameters and check whether what you get differs from what you got before.
You don't even need to be a programmer to do that.
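
As a rough, hypothetical example (this is not our plugin code, and r/datahoarder is only a placeholder), you can fetch the same listing the public website uses as plain JSON and compare it against what JD adds:

Code:
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Sketch only: request one page of a subreddit listing the way a logged-out
// browser would (reddit serves any listing as JSON when ".json" is appended).
// "limit" defaults to 25 and can be raised up to 100 per page.
public class RedditListingCheck {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        String url = "https://www.reddit.com/r/datahoarder/hot.json?limit=100";
        HttpRequest request = HttpRequest.newBuilder(URI.create(url))
                // reddit tends to throttle generic user agents, so set a descriptive one
                .header("User-Agent", "listing-check/0.1 (manual comparison)")
                .GET().build();
        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
        // Compare the entries in data.children against the items JD collected.
        System.out.println(response.body());
    }
}
Change the query parameters (sort, limit, and so on) and see whether the returned posts differ.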

Quote:
Originally Posted by fuknuckle View Post
Point me to an actual Reddit downloader that is an actual real program with an actual GUI
I'm afraid most website-crawling projects are GUI-less scripts.
Adding a GUI would at least double the amount of work that has to go into such a project.

Quote:
Originally Posted by fuknuckle View Post
and I'll give it a try but it's not the 1970s
I'm sorry but using a terminal is timeless - I think you misunderstood something there.

Quote:
Originally Posted by fuknuckle View Post
and I ain't going to try and manage the archival of 1000+ subs and many thousands of users accounts and all those directories with some barbaric old timey text based command line interface.
Well, if all it takes is creating a text document, adding all of the links you want to download and running one command... If I were you and eager to archive those subreddits, I would maybe just believe what I'm being told by a developer and use those other projects.
If I wanted to download complete subreddits, I'd most likely not even use JDownloader myself.

Quote:
Originally Posted by fuknuckle View Post
Just about everything I have seen on Linux requires the terminal. even installing a program on Ubuntu is still an overly complicated multistep process involving throwing commands at the terminal and pointing it to a repository.
The "complicated" part is your opinion.
For others that's the fastest way.

Quote:
Originally Posted by fuknuckle View Post
Maybe you should look into incorporating the Pushshift API into your plugin or at least take a look at it to see how it works and think about modifying your Reddit plugin.
I've looked into it and decided: I'm not going to implement it.
If you want, I can create a ticket for you though and you can re-ask about the progress in 10 years.

Quote:
Originally Posted by fuknuckle View Post
Now before you tell me "tHaT wOuLd ReQuIrE a CoMpLeTe ReWrItE"... so? Isn't that what JDownloader is for? you know, DOWNLOADING!?
Thanks for that nice comment.
And yes, JDownloader is made for downloading, but it has never been intended to be a perfect archiving tool - and apparently that's what you're looking for here.

All I can see here is one single user begging us to implement whatever he wants to have.

In the future, I won't bother putting so much time into answering your threads, especially this one.

Coming back to the topic, this is where we are:
- Old redgifs.com items in your download list being displayed as "gfycat.com" -> wait for a team member to address this, or delete and re-add those items to be able to download them
- Incomplete/missing reddit items: Wait for me to come back to this
Reply With Quote
  #21  
Old 03.05.2023, 09:56
pspzockerscene's Avatar
pspzockerscene pspzockerscene is offline
Community Manager
 
Join Date: Mar 2009
Location: Deutschland
Posts: 71,076
Default

Again about possibly missing items:
Link used for testing: reddit.com/r/squirting/
Results (with .txt downloads enabled): 1218
Results without text items: 1217
imgur.com: 265
Number of posts the crawler walks through: 659

Posts the crawler walks through, as text/JSON:
Spoiler:

Code:
[
 "/r/squirting/comments/nzb4t4/just_leaving_this_here_for_anyone_curious_about/",
 "/r/squirting/comments/12tizsf/attention_imgur_is_removing_all_nudity_and/",
 "/r/squirting/comments/1361cte/oh_like/",
 "/r/squirting/comments/135wtq3/turn_me_into_a_waterfall/",
 "/r/squirting/comments/135zwaq/squirting_so_hard_from_anal/",
 "/r/squirting/comments/135xgxn/dont_let_a_single_drop_of_my_creamyy_squirt_go_to/",
 "/r/squirting/comments/1364t05/this_is_the_best/",
 "/r/squirting/comments/135kq8b/mouthwatering_every_time/",
 "/r/squirting/comments/136b18f/mommy_loves_to_squirt_daily/",
 "/r/squirting/comments/1366ir0/anal_squirting/",
 "/r/squirting/comments/136b83z/this_orgasm_was_caught_on_camera_and_it_was_epic/",
 "/r/squirting/comments/1364enq/the_start_of_a_huge_wave/",
 "/r/squirting/comments/136bxp6/i_ended_up_in_a_puddle/",
 "/r/squirting/comments/1360v42/his_moans_make_me_squirt/",
 "/r/squirting/comments/1352zk9/could_i_rain_my_sweet_pussy_juice_all_over_you/",
 "/r/squirting/comments/135r07b/stuffing_both_holes_has_her_squirting_streams/",
 "/r/squirting/comments/135u8nn/sometimes_my_pussy_is_so_wet/",
 "/r/squirting/comments/1358ysk/drink_up/",
 "/r/squirting/comments/135yf2k/my_tiny_pussy_gripping_his_massive_cock_until_i/",
 "/r/squirting/comments/135ifc7/i_make_a_mess_but_it_feels_sooo_good/",
 "/r/squirting/comments/136cnrs/i_love_to_squirt_on_hard_things/",
 "/r/squirting/comments/135ly99/the_clit_play_always_gets_me_need_help_cleaning_up/",
 "/r/squirting/comments/135kq3p/i_was_so_turned_on_i_stood_up_and_soaked_him/",
 "/r/squirting/comments/134vk4h/the_housekeeper_was_probably_wondering_why_my/",
 "/r/squirting/comments/135b5im/i_made_a_huge_mess_in_my_chair/",
 "/r/squirting/comments/135usul/ready_for_you/",
 "/r/squirting/comments/135e0o7/just_a_tiny_squirt/",
 "/r/squirting/comments/134m100/waterfalls_you_should_go_chasing/",
 "/r/squirting/comments/1353ijq/another_angle_of_the_yoni_masseuse_making_me/",
 "/r/squirting/comments/134bt7r/get_a_squirter_like_me_youll_never_be_dehydrated/",
 "/r/squirting/comments/134phsw/amazing_orgasm/",
 "/r/squirting/comments/13512dl/say_ahhh_open_wide/",
 "/r/squirting/comments/134yf1s/the_way_i_use_to_cum_f34/",
 "/r/squirting/comments/134u2yj/sound_on/",
 "/r/squirting/comments/134tqd0/right_in_your_face/",
 "/r/squirting/comments/1344a28/pussy_eaters_open_your_mouth/",
 "/r/squirting/comments/134mg83/cumming_so_hard_my_pussy_explodes/",
 "/r/squirting/comments/134olxb/theres_a_constant_trickle_of_squirt_dripping_down/",
 "/r/squirting/comments/134jtsb/that_smile_when_i_squirt_though/",
 "/r/squirting/comments/1342w8f/for_my_birthday_i_made_myself_cum/",
 "/r/squirting/comments/13434za/cant_help_but_squirt_while_imagining_squirting/",
 "/r/squirting/comments/134kzwg/huge_dildo_making_me_gush_and_squirt/",
 "/r/squirting/comments/134oi56/if_only_my_aim_was_this_good_in_cod/",
 "/r/squirting/comments/1346de0/infinity_squirt_with_cream/",
 "/r/squirting/comments/1349i2e/she_sure_likes_to_squirt_on_herself_caw_caw/",
 "/r/squirting/comments/1342rgk/it_feels_so_good_to_be_back_home_with_my_toys/",
 "/r/squirting/comments/1348fem/my_new_toy_makes_me_so_wet/",
 "/r/squirting/comments/1349z03/tonights_weather_will_consist_of_light_to_heavy/",
 "/r/squirting/comments/133hhsu/to_squirt_all_over_you_like_this_would_be_such_a/",
 "/r/squirting/comments/1341edr/finishing_and_my_pussy_still_squirting/",
 "/r/squirting/comments/133eb8y/i_get_so_horny_knowing_your_stroking_your_cockk/",
 "/r/squirting/comments/133u3hk/shouldve_put_a_towel_down/",
 "/r/squirting/comments/1347uoi/open_wide/",
 "/r/squirting/comments/133twwm/karlee_greys_wet_hairy_pussy/",
 "/r/squirting/comments/133ta13/a_helping_hand/",
 "/r/squirting/comments/133kfvo/fisting_turns_her_into_a_fountain/",
 "/r/squirting/comments/133lsyp/some_fun_made_a_mess/",
 "/r/squirting/comments/1331y6t/making_me_squirt_while_eating_my_ass_and_using_a/",
 "/r/squirting/comments/1337aye/my_pussy_loves_to_squirt/",
 "/r/squirting/comments/132r42b/accidentally_squirted_on_the_airplane/",
 "/r/squirting/comments/133fzhm/this_pussy_is_explosive/",
 "/r/squirting/comments/133he1j/ill_squirt_your_eye_out_hehe/",
 "/r/squirting/comments/1335ta5/like_a_waterfall/",
 "/r/squirting/comments/132r5e8/oh_god_it_felt_like_it_would_go_on_forever/",
 "/r/squirting/comments/132z49w/squirt_on_the_public_beach/",
 "/r/squirting/comments/1330mfy/up_to_no_good_scissoring_in_our_matching_panties/",
 "/r/squirting/comments/132zjcw/had_to_get_myself_off_after_hearing_my_uni/",
 "/r/squirting/comments/132rf91/it_never_stopped_cumming/",
 "/r/squirting/comments/1338bah/anal_squirting/",
 "/r/squirting/comments/132wulq/i_love_grabbing_a_hold_of_his_cock_and_using_the/",
 "/r/squirting/comments/132c2in/i_love_washing_my_mirrors_with_my_squirt/",
 "/r/squirting/comments/13370k7/i_just_kept_going/",
 "/r/squirting/comments/132qvfl/legs_were_shaking_after_this_one/",
 "/r/squirting/comments/132luba/i_somehow_made_the_view_even_better/",
 "/r/squirting/comments/132rhsb/omfg_this_thing_can_make_me_orgasms_for_as_long/",
 "/r/squirting/comments/132lajh/let_me_sit_right_on_top_of_your_face/",
 "/r/squirting/comments/131qtra/what_a_view/",
 "/r/squirting/comments/131prmw/i_need_help_cleaning_up/",
 "/r/squirting/comments/131n9tr/almost_made_me_stand_up_because_of_how_intense_it/",
 "/r/squirting/comments/131nb0t/if_only_you_could_see_the_puddle_i_made_on_my_bed/",
 "/r/squirting/comments/132aptk/squirting_is_my_favourite_hobbie/",
 "/r/squirting/comments/131tewa/im_here_just_to_show_my_garden_routine/",
 "/r/squirting/comments/1326dtv/squirting_on_your_cockpov_51f/",
 "/r/squirting/comments/13257qt/god_i_love_doing_this/",
 "/r/squirting/comments/1321pzt/want_some_gamer_girl_squirt/",
 "/r/squirting/comments/131w1hs/the_need_to_cum_but_having_to_be_desperately/",
 "/r/squirting/comments/1318de4/my_legs_are_shaking_and_i_cant_stop_i_knew_i_had/",
 "/r/squirting/comments/131ox70/violet_summers/",
 "/r/squirting/comments/131s429/i_want_this_all_over_your_face/",
 "/r/squirting/comments/131kt9b/this_is_the_beginning_of_the_fountain/",
 "/r/squirting/comments/131oae0/soaked_and_ready/",
 "/r/squirting/comments/130tiry/my_biggest_squirt_gush_yet/",
 "/r/squirting/comments/131d809/i_pushed_my_toy_out_of_my_pussy_from_cumming_and/",
 "/r/squirting/comments/131ngcs/my_pussy_made_bell_of_a_puddle_i_even_had_a_taste/",
 "/r/squirting/comments/1313ye7/this_could_be_in_your_face/",
 "/r/squirting/comments/131fn7u/finger_me_just_right_and_ill_completely_drench_you/",
 "/r/squirting/comments/1318wjq/making_mess_outside/",
 "/r/squirting/comments/130rmr5/your_pov_when_i_sit_on_your_face/",
 "/r/squirting/comments/13143ep/my_pussy_squirting_slowmotion/",
 "/r/squirting/comments/130jn55/now_imagine_this_but_all_over_your_face/",
 "/r/squirting/comments/131839a/daddys_fingers_are_amazing_at_making_my_pussy_leak/",
 "/r/squirting/comments/130el32/squirt_on_the_hotel_pool_lounger/",
 "/r/squirting/comments/130kuz4/this_wand_always_gets_my_pussy_flowing/",
 "/r/squirting/comments/1304emd/i_love_cumming_squirting_for_youu/",
 "/r/squirting/comments/1306805/i_made_a_little_puddle/",
 "/r/squirting/comments/12zkifr/to_make_sure_my_uni_housemates_cant_hear_me_i/",
 "/r/squirting/comments/130a73c/place_your_mouth_right_here/",
 "/r/squirting/comments/130aowc/i_love_to_explode_for_you/",
 "/r/squirting/comments/1307qmc/anna_thorn_at_it_again/",
 "/r/squirting/comments/13066y0/your_cock_hits_all_the_right_places/",
 "/r/squirting/comments/12zfg6j/lowkey_concerned_that_i_wont_be_satisfied_with/",
 "/r/squirting/comments/12zisc0/could_it_be_that_i_can_jump_like_this_in_your/",
 "/r/squirting/comments/12z1oqs/i_need_a_guy_whos_willing_to_make_me_squirt_daily/",
 "/r/squirting/comments/12zdu0q/where_is_your_face/",
 "/r/squirting/comments/12zgkft/gave_myself_a_shower/",
 "/r/squirting/comments/12z2csg/i_want_you_to_have_all_of_my_squirt/",
 "/r/squirting/comments/12z9m7m/i_want_this_all_over_your_face/",
 "/r/squirting/comments/12yn9ur/so_likeing_horny_i_soaked_my_panties_with_my/",
 "/r/squirting/comments/12ylm1t/i_have_very_intense_orgasms/",
 "/r/squirting/comments/12zb1h3/that_little_squirt/",
 "/r/squirting/comments/12zg1ug/imagine_me_doing_this_while_your_cock_is_inside_me/",
 "/r/squirting/comments/12z76eb/lets_take_turns_draining_each_other/",
 "/r/squirting/comments/12z280l/the_smile_on_my_face_while_i_squirt_hehehe/",
 "/r/squirting/comments/12yoyqu/having_an_orgasm_whilst_trying_to_be_quiet_is/",
 "/r/squirting/comments/12z32kz/frozen_in_time/",
 "/r/squirting/comments/12yud1k/he_gave_me_a_quivering_orgasm_and_made_me_squirt/",
 "/r/squirting/comments/12yt9d0/this_yoni_masseuse_was_great_at_making_me_squirt/",
 "/r/squirting/comments/12ydk0i/have_i_mentioned_im_a_squirter/",
 "/r/squirting/comments/12ywg58/i_love_to_make_a_mess/",
 "/r/squirting/comments/12yidpb/when_he_gets_you_with_his_tongue_dont_let_him_go/",
 "/r/squirting/comments/12y6yyb/watching_my_pussy_get_closer_to_o_makes_me_cum_so/",
 "/r/squirting/comments/12xyaue/i_need_a_guy_who_makes_me_squirt_daily/",
 "/r/squirting/comments/12yuc2y/slippery_when_wet/",
 "/r/squirting/comments/12yjhq6/your_cock_always_makes_me_gush/",
 "/r/squirting/comments/12yit7x/my_leg_trembles_as_i_make_myself_squirt/",
 "/r/squirting/comments/12yjivr/tiffany_watson_exploding_and_sucking/",
 "/r/squirting/comments/12yjuoo/i_want_you_to_taste_it/",
 "/r/squirting/comments/12yfb1t/i_need_you_down_on_your_knees_mouth_open_taking/",
 "/r/squirting/comments/12yhb94/ive_never_squirted_so_hard_in_my_life_i/",
 "/r/squirting/comments/12xrcor/i_always_get_so_creamy_right_after_i_squirt/",
 "/r/squirting/comments/12y9xos/time_to_change_the_bedsheets/",
 "/r/squirting/comments/12ygzc7/ooops_i_made_a_mess/",
 "/r/squirting/comments/12xudqh/i_love_to_squirt_for_you_oc/",
 "/r/squirting/comments/12yay2j/i_love_when_my_orgasms_keep_going/",
 "/r/squirting/comments/12y4l8d/id_love_to_make_a_mess_all_over_your_cock/",
 "/r/squirting/comments/12xf6ua/my_secret_tiny_cave_behind_the_waterfalls/",
 "/r/squirting/comments/12xziop/my_pussy_with_white_cream_on_big_dildo/",
 "/r/squirting/comments/12xxixq/i_just_drank_my_own_juices_lets_share_next_time/",
 "/r/squirting/comments/12xpee0/alot_of_squirt_doesnt_hurt/",
 "/r/squirting/comments/12xfhpl/shes_doing_this_kind_of_thing_all_the_time/",
 "/r/squirting/comments/12xxr92/a_squirting_compilation/",
 "/r/squirting/comments/12wyv9m/i_wannna_rideee_you_until_i_squirt_uncontrollably/",
 "/r/squirting/comments/12xzv68/still_dripping_after_i_squirted_all_over_the/",
 "/r/squirting/comments/12xemgo/i_guess_ill_get_another_gaming_chair/",
 "/r/squirting/comments/12xom84/oktokuro/",
 "/r/squirting/comments/12xb3b0/kenzie_reeves_cant_stop/",
 "/r/squirting/comments/12xnqvu/my_pussy_close_up_as_the_multiple_orgasm_makes_me/",
 "/r/squirting/comments/12xfl10/my_squirt_actually_took_this_photo_whilst_i_was/",
 "/r/squirting/comments/12wq15l/im_just_surprised_the_pipes_can_handle_this_water/",
 "/r/squirting/comments/12xfi30/open_your_mouth_and_lick_daddy/",
 "/r/squirting/comments/12x0cf6/i_want_you_to_taste_it/",
 "/r/squirting/comments/12xelct/ups_i_ruined_my_gaming_chair/",
 "/r/squirting/comments/12wwdec/clean_up_my_mess_with_your_tongue/",
 "/r/squirting/comments/12wv8z0/my_tiny_pinay_pussy_squirting_while_likeed_by_bwc/",
 "/r/squirting/comments/12wsdz9/a_little_squirt_wont_hurt/",
 "/r/squirting/comments/12x2gqd/it_takes_a_village/",
 "/r/squirting/comments/12wody5/adriana_chechik_makes_anna_bell_peaks_squirt/",
 "/r/squirting/comments/12wlvap/abella_danger_leaking_like_crazy/",
 "/r/squirting/comments/12w35rg/this_one_nearly_broke_my_phone_but_was_worth_it/",
 "/r/squirting/comments/12vhk6p/dont_let_a_single_drop_of_my_squirt_go_to_waste/",
 "/r/squirting/comments/12vlmb3/i_cant_help_but_squirt_when_he_eats_it/",
 "/r/squirting/comments/12vm166/real_men_love_it_messy/",
 "/r/squirting/comments/12v6qdq/time_to_clean_the_windows/",
 "/r/squirting/comments/12vx7ro/with_all_the_eye_candy_around_me_i_couldnt_help/",
 "/r/squirting/comments/12vpk1g/letting_go_just_feels_so_good/",
 "/r/squirting/comments/12vt8kf/paige_and_skylar_have_some_wet_fun/",
 "/r/squirting/comments/12vo29n/anal_fisting_squirt/",
 "/r/squirting/comments/12vuzwl/i_wish_i_was_soaking_you_with_this_pussy/",
 "/r/squirting/comments/12vois2/spreading_my_ass_while_i_squirt_pov_rev_cowgirl/",
 "/r/squirting/comments/12v2mzd/this_was_just_the_start/",
 "/r/squirting/comments/12v3z8l/when_i_say_lets_have_fun_i_mean_the_kind_you/",
 "/r/squirting/comments/12vobbk/its_earth_day_heres_a_reminder_to_water_your/",
 "/r/squirting/comments/12v5kxn/i_dressed_up_to_squirt_on_my_floor/",
 "/r/squirting/comments/12vffza/lil_ass_squirt/",
 "/r/squirting/comments/12vaydn/i_cant_help_but_be_loud_when_cumming/",
 "/r/squirting/comments/12vffre/queen_t_flowing/",
 "/r/squirting/comments/12ungtx/i_have_a_superpoweri_squirt_like_a_waterfall/",
 "/r/squirting/comments/12v2hz8/clean_up_isle_3/",
 "/r/squirting/comments/12url0j/i_love_turning_my_pussy_into_a_waterpark_for_you/",
 "/r/squirting/comments/12vb2ar/likeed_until_she_squirts/",
 "/r/squirting/comments/12uslkq/the_buttplug_makes_me_squirt_even_harder/",
 "/r/squirting/comments/12uul1s/vibrator_dildo_butt_plug/",
 "/r/squirting/comments/12u6tm5/had_to_leave_the_pool_party_early_so_i_could_cum/",
 "/r/squirting/comments/12uspyh/car_wash/",
 "/r/squirting/comments/12uxcgf/abella_danger/",
 "/r/squirting/comments/12uz6ln/hitting_all_the_spots/",
 "/r/squirting/comments/12ty2v6/my_orgasm_is_so_intense_it_feels_like_theres_an/",
 "/r/squirting/comments/12ur8dz/creamy_solo_squirting/",
 "/r/squirting/comments/12up1a6/the_bed_was_absolutely_soaked_after_this_one_dont/",
 "/r/squirting/comments/12ulx2z/pressure_shot/",
 "/r/squirting/comments/12ubrju/i_really_want_you_to_fill_me_with_your_cream/",
 "/r/squirting/comments/12tldmq/i_love_squirting_for_you/",
 "/r/squirting/comments/12try32/ready_to_drink_every_drop/",
 "/r/squirting/comments/12twxpn/i_cant_start_my_day_without_it/",
 "/r/squirting/comments/12txm62/this_is_what_a_pussy_looks_like_after_its_been/",
 "/r/squirting/comments/12uaj1z/exploding_into_a_couple_of_nice_squirting_orgasms/",
 "/r/squirting/comments/12tye7o/wet_and_wild/",
 "/r/squirting/comments/12tsxh7/prepare_your_tongue/",
 "/r/squirting/comments/12t9acc/its_not_a_good_morning_until_i_finish/",
 "/r/squirting/comments/12tsa7p/a_fountain/",
 "/r/squirting/comments/12tdcp7/my_fingers_make_the_pussy_a_fountain/",
 "/r/squirting/comments/12tecyi/i_love_the_release/",
 "/r/squirting/comments/12tlisi/when_it_rains_it_likeing_pours_baby/",
 "/r/squirting/comments/12swx7k/my_future_bride/",
 "/r/squirting/comments/12t0jbb/squirming_on_that_tentacle_until_i_squirt/",
 "/r/squirting/comments/12tdeij/wife_cums_and_squirts_all_over_the_porch/",
 "/r/squirting/comments/12svdgx/open_wide_im_aiming_for_your_mouth/",
 "/r/squirting/comments/12tdze6/anna_thorn/",
 "/r/squirting/comments/12tcz4f/katie_kush_drinking_from_the_coco_spring/",
 "/r/squirting/comments/12t5ybe/watch_my_pants_get_wetter_and_wetter_hehe/",
 "/r/squirting/comments/12sytf9/what_a_beautiful_mess_i_made/",
 "/r/squirting/comments/12sc1p1/riding_your_face_makes_me_squirt_so_much/",
 "/r/squirting/comments/12tggtn/yes_im_a_squirter/",
 "/r/squirting/comments/12sbw4t/the_second_squirt_felt_better_than_the_first/",
 "/r/squirting/comments/12rw0tp/i_love_washing_my_mirrors_clean_with_my_squirt/",
 "/r/squirting/comments/12rzxxb/queen/",
 "/r/squirting/comments/12slcl3/her_toes_curl_so_hard_when_she_squirts/",
 "/r/squirting/comments/12rqu1t/right_at_the_camera/",
 "/r/squirting/comments/12rqxzt/close_your_eyes_open_your_mouth_and_let_me_soak/",
 "/r/squirting/comments/12rtiom/lips_gripping_seahorse_until_my_pussy_cums/",
 "/r/squirting/comments/12rx27k/feeding_the_need/",
 "/r/squirting/comments/12rwd9u/really_need_to_invest_in_waterproof_sheets/",
 "/r/squirting/comments/12rn8az/its_all_yours/",
 "/r/squirting/comments/12ru1vv/can_someone_help_clean_up_this_mess/",
 "/r/squirting/comments/12r4tqw/i_wish_i_could_soak_you_with_my_pussy_juice/",
 "/r/squirting/comments/12qtcxf/the_second_squirt_felt_better_than_the_first/",
 "/r/squirting/comments/12qzkk9/once_i_start_squirting_i_just_cant_stop/",
 "/r/squirting/comments/12r8fg6/extra_juicy_milfy_pussy/",
 "/r/squirting/comments/12r73ts/pov_for_this_gif/",
 "/r/squirting/comments/12ri7yw/i_wish_i_had_someone_to_squirt_on/",
 "/r/squirting/comments/12qp902/im_not_being_accommodating_i_crave_that_cock_deep/",
 "/r/squirting/comments/12rahvv/i_want_you_to_taste_every_drop_of_my_pussy_juice/",
 "/r/squirting/comments/12rbbte/first_time_posting_me_squirting/",
 "/r/squirting/comments/12r44xr/i_havent_squirted_in_a_year/",
 "/r/squirting/comments/12r2fr2/it_just_wouldnt_stop/",
 "/r/squirting/comments/12qmonz/professional_garden_sprinkler/",
 "/r/squirting/comments/12qjq87/my_pussy_made_it_rain/",
 "/r/squirting/comments/12q5cjf/smack_my_pussy_and_im_a_super_soaker/",
 "/r/squirting/comments/12pp1ry/i_love_squirting_for_you/",
 "/r/squirting/comments/12q0gbd/in_this_position_is_better/",
 "/r/squirting/comments/12piduz/my_likesaw_basically_turns_my_pussy_into_a_power/",
 "/r/squirting/comments/12pj2a3/one_day_ill_find_someone_who_will_let_me_squirt/",
 "/r/squirting/comments/12pfx74/make_your_monday_better_with_a_splash_of_squirt/",
 "/r/squirting/comments/12pxco3/my_creamy_pussy_squirting_for_you/",
 "/r/squirting/comments/12ps9je/sorry_if_you_get_soaked_youre_in_the_splash_zone/",
 "/r/squirting/comments/12pcb7y/i_think_he_understood_the_assignment/",
 "/r/squirting/comments/12opynh/i_think_you_should_make_me_squirt_on_your_face/",
 "/r/squirting/comments/12ppymf/my_holes_get_so_tight_when_i_squirt_it_is_hard_to/",
 "/r/squirting/comments/12pbqt2/my_gfs_fingers_make_me_squirt_drippingg/",
 "/r/squirting/comments/12pcgo0/i_want_to_do_this_on_your_cock_instead/",
 "/r/squirting/comments/12osiiw/in_case_you_were_wondering_your_boss_has/",
 "/r/squirting/comments/12q2dy4/down_the_wood/",
 "/r/squirting/comments/12osxlj/squirting_in_my_panties_is_fun/",
 "/r/squirting/comments/12op3pa/i_should_warn_you_when_youre_with_me_things_get/",
 "/r/squirting/comments/12oa1io/i_love_squirting_for_you/",
 "/r/squirting/comments/12of138/the_second_squirt_is_always_more_intense_than_the/",
 "/r/squirting/comments/12pbn63/bottle_squirt_gape/",
 "/r/squirting/comments/12oikij/giving_his_face_a_good_soaking/",
 "/r/squirting/comments/12ounp5/explosive_squirting_in_your_face/",
 "/r/squirting/comments/12oof6d/perfect_squirting_pussy/",
 "/r/squirting/comments/12o4d9k/like_popping_open_a_bottle/",
 "/r/squirting/comments/12ojrs4/all_rhoades_lead_to_lana/",
 "/r/squirting/comments/12okgky/melting_cotton_candy/",
 "/r/squirting/comments/12nldhi/i_wish_i_was_squirting_right_into_a_mouth/",
 "/r/squirting/comments/12njihe/i_love_the_sound_of_my_gushes/",
 "/r/squirting/comments/12o8j4z/cant_help_but_squirt_when_he_cums/",
 "/r/squirting/comments/12ops18/i_drank_my_squirt_to_find_out_what_people_taste/",
 "/r/squirting/comments/12nuryh/soaking_my_feet_with_pussys_fountains/",
 "/r/squirting/comments/12nba0o/i_love_squirting_over_over_for_you/",
 "/r/squirting/comments/12nzfok/its_gonna_be_a_wet_weekend/",
 "/r/squirting/comments/12mtv9z/one_touch_and_youll_be_hooked/",
 "/r/squirting/comments/12ni71v/he_loves_to_make_me_squirt_again_and_again/",
 "/r/squirting/comments/12n09r6/a_mini_squirt_appears/",
 "/r/squirting/comments/12n0aqd/i_love_when_my_girlfriend_makes_me_squirt/",
 "/r/squirting/comments/12n0dyk/creaming_and_squirting_is_a_good_sign_of_an/",
 "/r/squirting/comments/12ngv51/so_glad_i_have_squirting_orgasms_they_are_just/",
 "/r/squirting/comments/12m1pn5/so_much_squirt_dripping_down_my_legs/",
 "/r/squirting/comments/12mpovz/place_your_face_right_under_here/",
 "/r/squirting/comments/12m8qey/i_know_how_to_do_me_right/",
 "/r/squirting/comments/12m7786/making_her_wet/",
 "/r/squirting/comments/12mefdd/i_am_addicted_to_making_a_squirty_mess/",
 "/r/squirting/comments/12m61xj/sofa_got_wet_but_so_did_i/",
 "/r/squirting/comments/12lryr4/an_up_close_view/",
 "/r/squirting/comments/12lu0w2/you_can_tell_i_wasnt_satisfied_with_how_quickly_i/",
 "/r/squirting/comments/12l7zgk/my_type_a_guy_thirsty/",
 "/r/squirting/comments/12lt5b9/bouncing_my_ass_hard_on_a_thick_cock_until_my/",
 "/r/squirting/comments/12lv0pu/fisting_squirting_and_the_thick_end_of_a_bottle/",
 "/r/squirting/comments/12l98p3/i_love_waterworks/",
 "/r/squirting/comments/12lon2d/my_fluids_are_falling/",
 "/r/squirting/comments/12l52kq/gushing_squirt_everywhere/",
 "/r/squirting/comments/12li9b1/competition_distance_intense_squirt_in_reverse/",
 "/r/squirting/comments/12l2f9p/squirting_on_his_cock/",
 "/r/squirting/comments/12lqgf3/sometimes_i_squirt_a_little/",
 "/r/squirting/comments/12kltue/any_possible_future_bf_will_need_to_catch_this/",
 "/r/squirting/comments/12ks2jp/who_said_only_guys_can_give_facials/",
 "/r/squirting/comments/12kl1vf/bunny_butt_in_whilst_i_squirt/",
 "/r/squirting/comments/12ku14q/i_wet_the_sofa_whoops/",
 "/r/squirting/comments/12kplw7/i_love_making_a_wet_mess_of_myself/",
 "/r/squirting/comments/12kq474/good_girls_gush_through_their_leggings/",
 "/r/squirting/comments/12kyjb2/thursday_squirting_fun_for_you/",
 "/r/squirting/comments/12jwvyp/i_love_cummin_squirting_for_you/",
 "/r/squirting/comments/12kvif3/water_works/",
 "/r/squirting/comments/12k4o26/when_its_raining_outside_ive_gotta_make_it_rain/",
 "/r/squirting/comments/12kpwl8/thirsty_thursday/",
 "/r/squirting/comments/12kkzhv/lets_squirt_juices/",
 "/r/squirting/comments/12k0lcq/3_fingers_feels_soo_good/",
 "/r/squirting/comments/12kjity/love_this_toy_x/",
 "/r/squirting/comments/12jom23/im_calling_it_squirt_art_this_is_squirt_art/",
 "/r/squirting/comments/12jukc9/i_wish_this_was_your_cock_making_my_pussy_wet/",
 "/r/squirting/comments/12jgvit/it_was_a_good_idea_to_protect_the_machine/",
 "/r/squirting/comments/12k1t2i/i_think_i_wet_the_table/",
 "/r/squirting/comments/12jy9xh/soaked_my_phone/",
 "/r/squirting/comments/12k20ps/she_has_a_fan/",
 "/r/squirting/comments/12k22je/what_teachers_do_after_work/",
 "/r/squirting/comments/12jzcfv/squirting_all_over_you_open_your_mouth_daddy_awh/",
 "/r/squirting/comments/12k0s1r/i_will_have_to_bathe_for_the_second_time_today/",
 "/r/squirting/comments/12jt3ei/in_your_face/",
 "/r/squirting/comments/12jk7hw/a_little_spill_here_a_little_splatter_there_all/",
 "/r/squirting/comments/12j1ecx/squirted_on_my_face/",
 "/r/squirting/comments/12jy3s0/the_forbidden_juice/",
 "/r/squirting/comments/12jbnkb/squirt_fest/",
 "/r/squirting/comments/12jhvvk/creampie_squirting_and_my_pussy_up_close_warning/",
 "/r/squirting/comments/12jg7wx/he_was_teaching_me_stickshift_and_i_squirted/",
 "/r/squirting/comments/12jeiwa/his_monster_cock_destroying_my_tight_cunt_while/",
 "/r/squirting/comments/12jjb0t/omg_i_am_squirted_again_on_my_chair/",
 "/r/squirting/comments/12itont/i_want_to_fill_your_mouth_with_all_my_pussy_juice/",
 "/r/squirting/comments/12j90wv/absolutely_massive_cytheria_style_hard_squirt/",
 "/r/squirting/comments/12ihwp5/your_pov_when_dating_a_squirter/",
 "/r/squirting/comments/12ijbga/others_might_see_a_mess_but_you_and_i_see_a/",
 "/r/squirting/comments/12ikdmg/he_made_me_squirt_then_i_milked_out_his_pre_cum/",
 "/r/squirting/comments/12j0qf0/everything_is_better_down_where_its_wetter/",
 "/r/squirting/comments/12jb1wr/soaking_it_all/",
 "/r/squirting/comments/12iv36z/your_face_is_next/",
 "/r/squirting/comments/12j0cr8/hope_everyone_had_a_good_easter_like_i_did_with/",
 "/r/squirting/comments/12ijjh8/fisting_my_pussy_until_it_sprays_in_submission/",
 "/r/squirting/comments/12hv305/i_love_squirting_for_you/",
 "/r/squirting/comments/12i5g4s/i_even_surprised_myself_with_that_one/",
 "/r/squirting/comments/12iaygb/i_love_to_get_wet_before_bed/",
 "/r/squirting/comments/12hjwiq/restrained_and_made_to_squirt/",
 "/r/squirting/comments/12hdsd2/i_love_squirting_for_you/",
 "/r/squirting/comments/12htmce/so_you_can_feel_it_on_your_face/",
 "/r/squirting/comments/12hiayv/squirting_is_pure_bliss_i_didnt_want_it_to_end/",
 "/r/squirting/comments/12hxewr/always_fun_making_my_panties_wet/",
 "/r/squirting/comments/12ibvtz/destroy_my_little_cunt_and_shell_squirt_all_over/",
 "/r/squirting/comments/12i00j9/huge_multiple_shaking_squirting/",
 "/r/squirting/comments/12gxd2v/catch_the_drips_in_your_mouth/",
 "/r/squirting/comments/12hgfl7/my_pussy_didnt_have_a_chancei_came_likeing_hard/",
 "/r/squirting/comments/12hbdnt/gripping_his_huge_cock_while_he_makes_me_squirt_3/",
 "/r/squirting/comments/12ha0fo/take_a_picture_midstream_like_yea/",
 "/r/squirting/comments/12h6jvs/so_whos_thirsty/",
 "/r/squirting/comments/12gsizo/my_pussy_exploded_o/",
 "/r/squirting/comments/12gknbm/close_up_of_my_shaved_pussy_squirting/",
 "/r/squirting/comments/12gvmio/the_waters_so_wild_dont_let_it_sweep_you_off_your/",
 "/r/squirting/comments/12gmucl/my_pussy_always_gets_so_creamy_before_i_squirt/",
 "/r/squirting/comments/12gxgag/anyone_want_a_drink/",
 "/r/squirting/comments/12gugv3/all_over_my_shoes_and_myself/",
 "/r/squirting/comments/12gpfhj/pov_i_am_squirting_on_you/",
 "/r/squirting/comments/12g2u8v/the_forecast_is_showing_heavy_rain_today/",
 "/r/squirting/comments/12gn1cu/tight_pussy_fat_cock_pounding_squirt_waterfall/",
 "/r/squirting/comments/12gizlz/so_much_more_squirt_in_this_sessioncumcumcum/",
 "/r/squirting/comments/12gmc6b/jesse_andrews_squirting/",
 "/r/squirting/comments/12gipfb/riding_his_big_thick_cock_hard_and_deep_til_i/",
 "/r/squirting/comments/12gc0k4/catching_up_with_an_old_friend_ff/",
 "/r/squirting/comments/12g8lqz/my_pussy_wouldnt_stop_spraying/",
 "/r/squirting/comments/12fwop3/replace_my_toy_let_me_squirt_all_over_your_cock/",
 "/r/squirting/comments/12ful74/i_can_squirt_several_times_at_a_time_3/",
 "/r/squirting/comments/12fji2h/squirting_with_my_friends/",
 "/r/squirting/comments/12fwmg7/squirting_is_my_fav_way_to_cum/",
 "/r/squirting/comments/12g1lxi/mop_the_ceiling/",
 "/r/squirting/comments/12fq4hl/i_squirted_so_much_i_made_a_huge_puddle/",
 "/r/squirting/comments/12g019o/god_damn_i_squirt_so_hard/",
 "/r/squirting/comments/12fmtai/a_cucumber_a_day_keeps_the_doctor_away_but_wont/",
 "/r/squirting/comments/12fk659/dont_just_sip_take_a_gulp_of_my_juice/",
 "/r/squirting/comments/12fxgqz/like_me_better/",
 "/r/squirting/comments/12gac27/soaking_in_this_alone_time/",
 "/r/squirting/comments/12g00ko/he_made_me_squirt_everywhere/",
 "/r/squirting/comments/12fso3k/you_know_just_hanging_out/",
 "/r/squirting/comments/12foprn/i_made_such_a_mess_hehe/",
 "/r/squirting/comments/12fj5vu/this_is_why_dick_does_to_me/",
 "/r/squirting/comments/12evao6/i_know_you_are_tryna_hit_my_wap/",
 "/r/squirting/comments/12fofqt/my_pussy_just_wouldnt_stop_cumming_even_as_i/",
 "/r/squirting/comments/12f6j7w/when_you_work_from_home_lunch_is_a_whole_lot/",
 "/r/squirting/comments/12ei8bp/i_wish_i_could_soak_you_with_all_my_pussy_juice/",
 "/r/squirting/comments/12f73pj/punch_and_squirt/",
 "/r/squirting/comments/12eg7x1/nothing_soothes_a_dry_mouth_like_a_squirty_pussy/",
 "/r/squirting/comments/12eseyk/a_squirting_orgasm_is_hands_down_the_best_i_love/",
 "/r/squirting/comments/12f5ocs/i_so_wish_i_could_spray_this_into_your_mouth/",
 "/r/squirting/comments/12ez8tq/april_showers/",
 "/r/squirting/comments/12ewkwg/little_soak/",
 "/r/squirting/comments/12ekgcz/lets_have_a_cute_candlelit_date_and_you_can_watch/",
 "/r/squirting/comments/12ef5u1/feed_me_squirt/",
 "/r/squirting/comments/12ei2ov/avery_christy_kenzie_anne/",
 "/r/squirting/comments/12ecfta/i_wish_more_guys_were_into_getting_squirted_on/",
 "/r/squirting/comments/12dvysc/its_rare_when_he_makes_you_squirt_with_his_tongue/",
 "/r/squirting/comments/12dmox8/squirting_from_an_ass_pounding/",
 "/r/squirting/comments/12eaqg1/i_accidentally_squirt_all_over_my_chair/",
 "/r/squirting/comments/12dge49/ripped_leggings_and_squirting/",
 "/r/squirting/comments/12dh5uw/still_accepting_bf_applications_if_you_can_make/",
 "/r/squirting/comments/12di5w0/i_need_a_bf_thats_into_squirting/",
 "/r/squirting/comments/12djkkw/squirting_with_cum_on_my_face_new_kink_unlocked/",
 "/r/squirting/comments/12dmz50/if_only_the_food_pyramid_knew_what_it_was_missing/",
 "/r/squirting/comments/12dis5f/like_that_pretty_little_pussy/",
 "/r/squirting/comments/12czv6q/lick_my_pussy_and_ill_leave_your_face_soaking_wet/",
 "/r/squirting/comments/12dlmcl/wetter_than_a_river_oc/",
 "/r/squirting/comments/12dj51b/i_love_when_he_makes_me_squirt_with_just_his/",
 "/r/squirting/comments/12crl7m/the_thought_of_me_making_a_mess_in_your_office/",
 "/r/squirting/comments/12cxmfv/only_thing_im_missing_is_your_tongue_to_lick_me/",
 "/r/squirting/comments/12cgzcc/i_wish_i_could_gush_in_your_mouth/",
 "/r/squirting/comments/12d3nbt/the_biggest_puddle_ive_ever_made/",
 "/r/squirting/comments/12cuumq/the_landscapers_got_a_wet_show/",
 "/r/squirting/comments/12d4mk1/pop/",
 "/r/squirting/comments/12cqrcw/i_always_make_squirt_fountain_when_i_think_about/",
 "/r/squirting/comments/12d2e3j/was_so_horny_tonight_wasnt_expecting_to_squirt/",
 "/r/squirting/comments/12cfu4x/synchronized_squirting/",
 "/r/squirting/comments/12d16e2/intense_bodyshaking_squirting_orgasm/",
 "/r/squirting/comments/12d2ps4/i_get_so_happy_when_my_pussy_surprises_me_with/",
 "/r/squirting/comments/12cov8r/playing_a_little_before_my_roomie_arrives/",
 "/r/squirting/comments/12cr0mv/open_your_mouth_let_me_cover_you_in_my_juices/",
 "/r/squirting/comments/12cn4jw/cowgirl_impaled_until_squirt/",
 "/r/squirting/comments/12c169y/squirted_so_hard_my_eyes_rolled_back/",
 "/r/squirting/comments/12ch566/intense_orgasam/",
 "/r/squirting/comments/12cl88f/omfg_do_not_get_in_front_of_of_my_pussy_unless/",
 "/r/squirting/comments/12ci7gc/messy_bed_happy_lady/",
 "/r/squirting/comments/12c3z27/omfg_i_squirted_on_my_face/",
 "/r/squirting/comments/12cgxzb/its_been_sooooo_long/",
 "/r/squirting/comments/12bfpy1/my_ideal_bf_is_one_whod_catch_my_squirt_with_his/",
 "/r/squirting/comments/12bx8sa/uncontrollable_standing_squirt/",
 "/r/squirting/comments/12caza2/i_love_making_a_big_mess_on_my_floor/",
 "/r/squirting/comments/12c6gom/look_how_messy_baby/",
 "/r/squirting/comments/12c55ua/squirting_hard_in_your_face_from_reverse_cowgirl/",
 "/r/squirting/comments/12c8jty/likeing_my_ass_made_me_squirt_so_hard/",
 "/r/squirting/comments/12bfpi4/help_from_a_friend_on_the_lawn/",
 "/r/squirting/comments/12bteet/i_wish_more_men_were_into_squirting/",
 "/r/squirting/comments/12c38o3/this_was_so_intense/",
 "/r/squirting/comments/12c11sk/partypussy/",
 "/r/squirting/comments/12bin1g/squirting_is_my_gift_from_the_heavens_and_im/",
 "/r/squirting/comments/12bsxoz/i_bet_you_cant_make_me_orgasm_harder/",
 "/r/squirting/comments/12bamjz/let_me_drain_your_balls/",
 "/r/squirting/comments/12bq0or/wait_until_the_end_to_see_me_forced_to_squirt_on/",
 "/r/squirting/comments/12bhnho/you_can_hear_me_moan_through_the_phone/",
 "/r/squirting/comments/12blt52/soaked_my_bed_body_suit_and_floor/",
 "/r/squirting/comments/12br64u/things_got_a_littlewet/",
 "/r/squirting/comments/12b9k2z/squirting_hard_for_you/",
 "/r/squirting/comments/12asmxd/fresh_juice_for_you_just_need_to_open_your_mouth/",
 "/r/squirting/comments/12bc8a1/id_prefer_to_be_squirting_on_the_real_thing/",
 "/r/squirting/comments/12b5jqi/ill_drink_your_juice_if_you_drink_mineee/",
 "/r/squirting/comments/12aglgd/some_shower_time_fun_i_had_to_stay_extra_quiet/",
 "/r/squirting/comments/12an3jg/if_we_date_a_waterpark_is_included_in_the_package/",
 "/r/squirting/comments/12b29g2/speaking_pussy/",
 "/r/squirting/comments/129zdzk/my_type_a_guy_who_likes_my_squirt_on_his_face/",
 "/r/squirting/comments/12an0cw/theres_nothing_like_rubbing_your_pussy_fast_and/",
 "/r/squirting/comments/12ae839/she_keeps_on_squirting_in_her_mouth/",
 "/r/squirting/comments/12ale4y/you_asked_for_the_video_here_it_is_i_felt_so/",
 "/r/squirting/comments/12ak73z/dont_move_i_want_to_practice_my_aim_using_your/",
 "/r/squirting/comments/12aevbe/a_good_cock_and_strong_vibrator_invariably_mean/",
 "/r/squirting/comments/12aezww/i_masturbate_way_to_much/",
 "/r/squirting/comments/12abi2m/squirting_upside_down/",
 "/r/squirting/comments/12a8wje/just_a_little_cute_one/",
 "/r/squirting/comments/129relf/if_you_drink_my_juice_i_will_drink_yours/",
 "/r/squirting/comments/129wibv/squirting_in_the_car/",
 "/r/squirting/comments/129liyv/watching_myself_in_the_mirror_while_i_squirt/",
 "/r/squirting/comments/12955dx/squirting_in_stockings/",
 "/r/squirting/comments/129peuk/sweet_stream/",
 "/r/squirting/comments/129lzqm/were_gonna_need_a_clean_up_on_aisle_three/",
 "/r/squirting/comments/129hncx/caught_in_action/",
 "/r/squirting/comments/129mqrs/well_that_felt_amazing/",
 "/r/squirting/comments/129mdjx/spanking_and_squirting/",
 "/r/squirting/comments/1290xeq/my_squirt_shot_across_the_room/",
 "/r/squirting/comments/1298ger/my_puffy_pussy_is_explosive/",
 "/r/squirting/comments/128zt7t/making_myself_so_wet_for_you/",
 "/r/squirting/comments/1297k9x/i_wish_all_men_loved_being_squirted_on/",
 "/r/squirting/comments/128se0y/pov_i_squirt_on_your_face/",
 "/r/squirting/comments/129drvz/i_wish_men_liked_to_be_squirted_on/",
 "/r/squirting/comments/129df1b/live_cam_gets_wet_n_wild/",
 "/r/squirting/comments/128u7ak/i_squirt_pretty_much_every_time_i_cum_now/",
 "/r/squirting/comments/128y7fy/i_squirt_on_someone_squirting_on_me/",
 "/r/squirting/comments/128zhk7/oops_i_made_a_bit_of_a_mess_on_my_bed/",
 "/r/squirting/comments/128jtey/he_loves_making_it_happen/",
 "/r/squirting/comments/128zlx8/i_didnt_think_i_was_gonna_squirt_everywhere/",
 "/r/squirting/comments/128kdk2/my_pussy_pranked_me_for_april_fools_def_was_not/",
 "/r/squirting/comments/128jj1j/tiffany_watson/",
 "/r/squirting/comments/128lo4w/oh_my_my_pussy_needed_to_get_this_wet/",
 "/r/squirting/comments/127t6a7/i_want_to_fill_your_mouth_with_all_my_pussy_juice/",
 "/r/squirting/comments/128k3q2/i_need_a_real_cock_to_squirt_on/",
 "/r/squirting/comments/128lw0q/if_you_watch_closely_you_can_see_squirt_dripping/",
 "/r/squirting/comments/127wom0/my_phone_needs_good_water_protection_falls_under/",
 "/r/squirting/comments/127ufqu/i_want_to_squirt_pussy_in_your_face_3/",
 "/r/squirting/comments/1288zne/i_only_squirt_from_anal/",
 "/r/squirting/comments/127klip/soaked_my_steering_wheel/",
 "/r/squirting/comments/127gdis/perhaps_i_should_come_with_a_beware_of_drowning/",
 "/r/squirting/comments/127leen/making_my_feet_wet/",
 "/r/squirting/comments/127hg7g/he_gets_it/",
 "/r/squirting/comments/127lc30/office_sluts_squirt_too/",
 "/r/squirting/comments/127007h/lick_my_pussy_and_ill_leave_your_face_soaking_wet/",
 "/r/squirting/comments/127ea2u/she_cant_get_enough_of_her_squirt/",
 "/r/squirting/comments/127agjg/fountain_of_squirt/",
 "/r/squirting/comments/127egv5/squirting_volcano_grip_tight_to_the_bones/",
 "/r/squirting/comments/1278ryt/squirting_waterfalls_off_of_my_couch/",
 "/r/squirting/comments/126kh04/such_an_intense_orgasm_left_me_breathless/",
 "/r/squirting/comments/126wifw/i_cant_control_myself/",
 "/r/squirting/comments/1271s6c/im_a_squirter_and_i_love_that/",
 "/r/squirting/comments/126plum/im_actually_worried_that_ill_never_again_be/",
 "/r/squirting/comments/1270byi/open_your_mouth_please/",
 "/r/squirting/comments/126lkv1/squirting_explosion/",
 "/r/squirting/comments/1276v0o/in_a_puddle_of_squirt/",
 "/r/squirting/comments/12736tr/the_tap_would_do_the_job_to_clean_myself_but_id/",
 "/r/squirting/comments/126wyra/sexy_as_like_squirting_vid_slows_down_just_in/",
 "/r/squirting/comments/125xi6w/i_wish_i_could_soak_you_with_my_pussy_juice/",
 "/r/squirting/comments/125t5lj/drenched_my_socks/",
 "/r/squirting/comments/126a3d6/wish_i_had_something_real_to_squirt_on/",
 "/r/squirting/comments/125qmb4/made_a_mess_in_my_panties/",
 "/r/squirting/comments/125w09g/i_wont_stop_until_the_bed_is_soaked/",
 "/r/squirting/comments/125mq7g/a_squirty_pussy_is_a_happy_pussy/",
 "/r/squirting/comments/125y5o3/if_you_drink_my_juice_i_will_drink_yours/",
 "/r/squirting/comments/1267cnt/making_a_mess/",
 "/r/squirting/comments/125tkls/i_just_love_being_an_oiled_up_squirty_mess/",
 "/r/squirting/comments/125p6zt/i_wanted_him_to_hold_me_while_i_made_myself_squirt/",
 "/r/squirting/comments/125kisl/when_hes_as_excited_for_you_to_squirt/",
 "/r/squirting/comments/125oduh/when_my_pussy_is_so_wet_he_cant_swallow_it_all/",
 "/r/squirting/comments/125l2u1/helping_her_friend_cum/",
 "/r/squirting/comments/124yy4w/i_need_a_friend_willing_to_clean_my_squirt/",
 "/r/squirting/comments/125djpu/once_unlocked_i_can_squirt_on_command_hehe/",
 "/r/squirting/comments/125c25j/drain_it/",
 "/r/squirting/comments/125ld1o/i_return_now_we_can_play/",
 "/r/squirting/comments/124wgom/my_creamy_pussy_make_squirt_again/",
 "/r/squirting/comments/124ybey/its_gonna_rain/",
 "/r/squirting/comments/124p2yz/let_me_introduce_you_to_my_milf_waterfall/",
 "/r/squirting/comments/124o6f9/forget_alcohol_you_can_get_drunk_on_me_tonight/",
 "/r/squirting/comments/124v0r7/squirting_on_his_cock_while_he_likes_me/",
 "/r/squirting/comments/124i98f/creamy_squirt/",
 "/r/squirting/comments/12410sp/i_wish_i_could_soak_you_with_my_pussy_juice/",
 "/r/squirting/comments/12530us/squirting_drowning_marathon/",
 "/r/squirting/comments/124kpm2/ive_only_just_found_out_i_can_squirt_and_thus_is/",
 "/r/squirting/comments/124dfwx/imagine_if_that_was_your_dick/",
 "/r/squirting/comments/124lj4e/having_my_ass_stuffed_makes_me_squirt/",
 "/r/squirting/comments/12407tp/double_shower/",
 "/r/squirting/comments/123mo08/opening_the_floodgates/",
 "/r/squirting/comments/123ypbu/i_want_to_fill_your_mouth_with_all_my_pussy_juice/",
 "/r/squirting/comments/123qmbf/cumming_in_little_waves_until_i_soak_the_floor/",
 "/r/squirting/comments/123kz51/fat_flared_head_turns_my_pussy_inside_out_and/",
 "/r/squirting/comments/123oxy9/she_is_squirting_all_over_her_face/",
 "/r/squirting/comments/123mirq/i_love_washing_my_mirrors/",
 "/r/squirting/comments/123v7o5/h2woah/",
 "/r/squirting/comments/123x1aw/surprised_myself_this_time/",
 "/r/squirting/comments/123dhet/its_more_fun_when_i_know_that_you_can_see_it/",
 "/r/squirting/comments/123km3f/fisting_makes_me_squirt/",
 "/r/squirting/comments/1239ypk/it_was_so_hot_knowing_i_was_going_to_squirt_all/",
 "/r/squirting/comments/122mwyh/my_pussy_surprised_me_with_the_biggest_squirt/",
 "/r/squirting/comments/122vsmh/a_little_squirt_wont_hurt/",
 "/r/squirting/comments/1235kvu/if_you_told_me_i_was_a_good_girl_i_would_squirt/",
 "/r/squirting/comments/123cixu/heels_n_squirts/",
 "/r/squirting/comments/122twy3/squirting_all_over_the_front_yard/",
 "/r/squirting/comments/122z6cj/being_a_squirter_is_lots_of_fun/",
 "/r/squirting/comments/122wq4p/_/",
 "/r/squirting/comments/122peda/nothing_like_200_squirting_in_the_sauna/",
 "/r/squirting/comments/122ziqn/i_really_tried_to_keep_it_together_but_gave_in/",
 "/r/squirting/comments/122nml4/queen_tahshaar/",
 "/r/squirting/comments/122vs4h/i_was_so_shocked_at_how_much_i_could_squirt/",
 "/r/squirting/comments/122b608/the_real_reason_to_get_hardwood_floors/",
 "/r/squirting/comments/1226q0u/need_a_friend_willing_to_lick_up_my_sweet_wet/",
 "/r/squirting/comments/121v6vg/i_want_to_fill_your_mouth_with_all_my_pussy_juice/",
 "/r/squirting/comments/121uxgs/maybe_youd_let_me_squirt_on_you_dick_just_like/",
 "/r/squirting/comments/122eili/the_wonderful_release/",
 "/r/squirting/comments/121jbo7/made_the_carpet_wet/",
 "/r/squirting/comments/12239e0/public_train_stop_loved_shooting_this/",
 "/r/squirting/comments/121xiki/im_addicted_to_squirting_there_truly_is_no_better/",
 "/r/squirting/comments/121xhn3/during_anal_masturbation/",
 "/r/squirting/comments/121vogc/gusher/",
 "/r/squirting/comments/121wwh5/slippery_when_wet/",
 "/r/squirting/comments/121juid/squirted_all_over_my_ass_now_we_have_lube_for_anal/",
 "/r/squirting/comments/121uvjr/she_wets_herself/",
 "/r/squirting/comments/121632n/i_was_shocked_i_just_kept_squirting/",
 "/r/squirting/comments/121qjkm/hidden_throat_secrets_the_vagus_nerve/",
 "/r/squirting/comments/121nbb4/riverdance/",
 "/r/squirting/comments/120zr7r/oops_my_pussy_drips_like_a_likeing_waterfall_when/",
 "/r/squirting/comments/121orkp/the_shower_helps_with_clean_up/",
 "/r/squirting/comments/1218mbg/giving_myself_a_shower_with_massive_squirt/",
 "/r/squirting/comments/121mdax/i_squirted_like_a_broken_fountain/",
 "/r/squirting/comments/120y45y/my_pussy_is_a_waterpark/",
 "/r/squirting/comments/120iso7/my_fpov_squirt_at_the_beach/",
 "/r/squirting/comments/1213kp8/levitation_squirting/",
 "/r/squirting/comments/1207ng3/sometimes_i_surprise_myself/",
 "/r/squirting/comments/120g2u0/showing_the_mirror/",
 "/r/squirting/comments/120qppt/turning_a_blowjob_into_a_soak_job/",
 "/r/squirting/comments/120qykh/the_wet_bandits/",
 "/r/squirting/comments/120k15o/let_me_squirt_on_your_cock/",
 "/r/squirting/comments/120goma/watering_the_lawn/",
 "/r/squirting/comments/120fn0j/i_love_it_when_my_squirt_goes_on_and_on_and_on/",
 "/r/squirting/comments/120fnqk/i_almost_pushed_my_buttplug_out_while_squirting/",
 "/r/squirting/comments/120l0hd/this_felt_so_good/",
 "/r/squirting/comments/120ddd6/she_was_ready_for_her_squirt/",
 "/r/squirting/comments/11zvw7f/if_you_drink_my_juice_i_will_drink_yours/",
 "/r/squirting/comments/1208kht/he_loves_getting_all_wet_with_me/",
 "/r/squirting/comments/120b60i/i_take_all_shapes_and_sizes/",
 "/r/squirting/comments/11ziayo/trying_not_to_wakeup_my_housemates/",
 "/r/squirting/comments/11zxbmw/my_lunch_date_cancelled_on_me_so_i_took_care_of/",
 "/r/squirting/comments/11zt9rc/i_think_you_must_open_your_mouth/",
 "/r/squirting/comments/11zo5x4/i_hope_this_is_what_youve_been_looking_for/",
 "/r/squirting/comments/11zpu0c/my_blast_radius_is_my_entire_room_really/",
 "/r/squirting/comments/1200gin/licking_her_squirt_off_the_camera_before/",
 "/r/squirting/comments/1207pd2/guess_my_max_squirts_per_minute_f/",
 "/r/squirting/comments/11zjse4/suck_my_toes_and_watch_me_squirt/",
 "/r/squirting/comments/11yy3bt/im_gonna_make_you_get_addicted_to_clean_squirt/",
 "/r/squirting/comments/11zn8xv/why_choose_between_squirters_or_creamers_when_you/",
 "/r/squirting/comments/11znekb/this_gpop_makes_me_squirts_like_a_broken_pipe/",
 "/r/squirting/comments/11zn8z3/i_can_get_your_juice_ready_in_under_10_seconds/",
 "/r/squirting/comments/11zt7sl/making_my_asian_hotwife_squirt/",
 "/r/squirting/comments/11yh1h7/i_tried_so_hard_to_be_quiet/",
 "/r/squirting/comments/11zbbqo/i_hope_you_like_creammm_with_your_squirt/",
 "/r/squirting/comments/11zdwih/edging_for_days_makes_me_explode/",
 "/r/squirting/comments/11z0hl4/he_loves_to_make_me_squirt_again_and_again/",
 "/r/squirting/comments/11yv4u0/i_want_to_fill_your_mouth_with_all_my_pussy_juice/",
 "/r/squirting/comments/11ys2wi/my_squirt_can_quench_your_thirst_im_sure_about_it/",
 "/r/squirting/comments/11yo78j/the_whole_sofa_got_wet_under_my_booty_3/",
 "/r/squirting/comments/11yqpws/make_me_squirt_more_than_my_toys_do/",
 "/r/squirting/comments/11yn55m/holding_the_door_is_nice_but_im_more_wooed_by_the/",
 "/r/squirting/comments/11yglfn/this_wand_makes_me_go_from_barely_horny_to/",
 "/r/squirting/comments/11xy6p7/im_sorry_if_i_was_too_loud/",
 "/r/squirting/comments/11yxzff/i_love_getting_fingered/",
 "/r/squirting/comments/11ygyqo/lets_give_your_face_a_little_squirt_treatment_lie/",
 "/r/squirting/comments/11y7eoa/after_the_first_squirt_i_knew_i_wasnt_finished/",
 "/r/squirting/comments/11yrofl/nice_and_wet/",
 "/r/squirting/comments/11y7am5/i_wish_i_was_squirting_right_into_a_mouth/",
 "/r/squirting/comments/11yi8cs/my_favourite_way_to_start_the_day/",
 "/r/squirting/comments/11ye9uh/protect_your_face/",
 "/r/squirting/comments/11ymb2j/ohh_like_yes_that_will_make_me_squirt_and_orgasm/",
 "/r/squirting/comments/11xxind/watch_your_step_in_my_foyer/",
 "/r/squirting/comments/11xpejt/just_a_tease_of_my_full_potential/",
 "/r/squirting/comments/11xwjxc/my_leggings_were_soaked/",
 "/r/squirting/comments/11y6bue/pushing_her_squirt_button/",
 "/r/squirting/comments/11xur38/i_need_a_friend_willing_to_clean_my_squirt/",
 "/r/squirting/comments/11xhgqs/i_got_my_feet_wet/",
 "/r/squirting/comments/11y6ap0/g_spot_stimulator/",
 "/r/squirting/comments/11y374n/he_knows_how_to_soak_my_panties/",
 "/r/squirting/comments/11xxuox/i_broke_my_record_and_squirted_16_times_today/",
 "/r/squirting/comments/11xxrqa/a_couple_of_rounds_with_adriana_chechik_after/",
 "/r/squirting/comments/11x7r7l/either_let_me_squirt_all_over_you_or_not_at_all/",
 "/r/squirting/comments/11xmkup/my_pussy_is_a_waterpark/",
 "/r/squirting/comments/11xi593/hopefully_he_doesnt_mind_the_mess_i_left_on_his/",
 "/r/squirting/comments/11xy39p/slimthick_vics_pussy_leaks_as_her_ass_gets_beat/",
 "/r/squirting/comments/11xrqy5/using_my_big_dildo_to_bring_myself_to_a_very_wet/",
 "/r/squirting/comments/11xsa8r/we_always_bring_the_towels_when_we_play/",
 "/r/squirting/comments/11xhmn8/made_to_squirt_hard_and_soak_the_bed_hear_how_wet/",
 "/r/squirting/comments/11x5l7l/mesmerizing_shooting_water_of_pinnacle_apex/",
 "/r/squirting/comments/11xgb7a/my_favorite_way_to_start_the_day/",
 "/r/squirting/comments/11xdnk4/rubbing_my_spot_with_his_cock/",
 "/r/squirting/comments/11x541b/i_want_to_fill_your_mouth_with_all_my_yummy_pussy/",
 "/r/squirting/comments/11x2rhh/_/",
 "/r/squirting/comments/11wnqha/when_the_sex_is_so_good_you_dont_even_care_that/"
]

Download as file:
workupload.com/file/w8bNydP8eua

My results when adding those posts manually are nearly the same (1216 items in total).

Finally, at least the number of posts does match the total number of posts you get via the website.
Here is the last page of that subreddit via old.reddit.com:
old.reddit.com/r/squirting/?count=650&after=t3_11xsa8r
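
If you want to reproduce such a count yourself, it is enough to follow the listing's "after" token until reddit stops returning pages. A hypothetical, self-contained sketch (not the plugin code; r/datahoarder is only a placeholder):

Code:
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Sketch only: count how many posts reddit's public listing hands out for one
// sort order by following the "after" token until no further page is returned.
// Reddit caps every listing at roughly 1000 entries, so a single pass can
// never see everything a subreddit has ever had.
public class ListingCounter {
    private static final Pattern POST = Pattern.compile("\"kind\"\\s*:\\s*\"t3\"");
    private static final Pattern AFTER = Pattern.compile("\"after\"\\s*:\\s*\"(t3_[a-z0-9]+)\"");

    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        String after = null;
        int total = 0;
        while (true) {
            String url = "https://www.reddit.com/r/datahoarder/.json?limit=100"
                    + (after == null ? "" : "&after=" + after);
            HttpRequest request = HttpRequest.newBuilder(URI.create(url))
                    .header("User-Agent", "listing-counter/0.1") // avoid the generic default UA
                    .GET().build();
            String body = client.send(request, HttpResponse.BodyHandlers.ofString()).body();

            int onThisPage = 0;
            Matcher posts = POST.matcher(body);
            while (posts.find()) {
                onThisPage++;
            }
            total += onThisPage;

            Matcher next = AFTER.matcher(body);
            if (onThisPage == 0 || !next.find()) {
                break; // no further page: the listing is exhausted
            }
            after = next.group(1);
            Thread.sleep(1000); // be polite, roughly one request per second
        }
        System.out.println("Posts served by this listing: " + total);
    }
}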

Last edited by pspzockerscene; 03.05.2023 at 10:01. Reason: Added "download as file"
Reply With Quote
  #22  
Old 05.05.2023, 17:49
pspzockerscene's Avatar
pspzockerscene pspzockerscene is offline
Community Manager
 
Join Date: Mar 2009
Location: Deutschland
Posts: 71,076
Default

imgur.com:

We took a deep dive into it.
There are two main issues:
1. We have one internal "request limit" set to 250ms for all domains globally, so JD is allowed to do 1 request every 250ms (a simplified illustration of what such a limit does follows below).
That alone was the biggest slowdown and will be tweaked to differ between the individual domains such as imgur.com, i.imgur.com and api.imgur.com.
-> To be honest, I just overlooked that, otherwise I'd have told you earlier. Either way, those limits are there for a reason and they make our plugins work very well for most users, while power users sometimes complain about them.

2. While investigating this, we found a bug in the wait handling associated with that rate-limiting.
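
To make clear what such a limit does, here is a simplified, hypothetical illustration (this is not JD's actual implementation): before each request, the plugin waits until at least the configured interval has passed since the previous request to that host.

Code:
import java.util.HashMap;
import java.util.Map;

// Hypothetical illustration of a per-host request-interval limit:
// a single global 250 ms interval means at most 4 requests per second in
// total, while separate limits per domain (imgur.com, api.imgur.com,
// i.imgur.com) let those domains be queried independently of each other.
public class RequestIntervalLimiter {
    private final Map<String, Long> lastRequest = new HashMap<>();

    public synchronized void awaitSlot(String host, long intervalMillis) throws InterruptedException {
        long now = System.currentTimeMillis();
        long earliest = lastRequest.getOrDefault(host, 0L) + intervalMillis;
        if (earliest > now) {
            Thread.sleep(earliest - now); // e.g. 250 ms globally vs. 125 ms per imgur domain
        }
        lastRequest.put(host, System.currentTimeMillis());
    }
}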

So here are your options for now:
1. Wait for us to fix this. Then try again, and if our default rate limits are still too restrictive for you, we might add a setting so you can customize them.
and/or:
2. Workaround by changing the source code:
Grab our source code, open jd.plugins.hoster.ImgurComHoster and remove all lines containing "Browser.setRequestIntervalLimitGlobal".

As always, we won't be providing any ETAs for the release of any fixes or requested features.
Thanks for your report and have a nice weekend!
Reply With Quote
  #23  
Old 08.05.2023, 14:28
pspzockerscene's Avatar
pspzockerscene pspzockerscene is offline
Community Manager
 
Join Date: Mar 2009
Location: Deutschland
Posts: 71,076
Default

We've just released CORE-updates.

Regarding the last post (mainly imgur.com):
1. We've lowered the request limits of imgur.com domains.
New internal request limits (in milliseconds) are:
Code:
        Browser.setRequestIntervalLimitGlobal("imgur.com", false, 125);
        Browser.setRequestIntervalLimitGlobal("api.imgur.com", true, 125);
        Browser.setRequestIntervalLimitGlobal("i.imgur.com", true, 100);
If imgur.com downloads are still not starting fast enough for you, we can add a plugin setting for this, but we won't lower the current default values any further.

2. The bug in the rate-limit handling was also fixed, so downloads handled by plugins that use rate limits should run from top to bottom without items seemingly freezing in the "Starting..." state.
Reply With Quote