#41
Quote:
#42
Now crawling reddit links brings up an IMGUR prompt asking me to add API credentials...
I am very sorry, but I can't find the entry fields to add both "Client ID" and "Client Secret"... and where do I activate the API? Please see the screenshots. I registered my very own imgur account and created an app as required, following the steps from the JD prompt.
#43
Quote:
This has been fixed and the current revision will work just fine without the need for API login credentials. You can leave your self-created imgur app active in your imgur account - it might be required some time in the future. -psp-
__________________
JD Supporter, Plugin Dev. & Community Manager
Erste Schritte & Tutorials || JDownloader 2 Setup Download
#44
Added support for a new linktype:
reddit.com/gallery/XXX
This also includes support for reddit galleries inside single posts, which the crawler wasn't able to handle so far.
Are you waiting for recently announced changes to get released? Updates do not necessarily get released immediately! Please read our Update FAQ! -psp-
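For anyone curious how such a gallery link can be resolved outside of JD: reddit exposes each post as JSON, and gallery posts carry a gallery_data item list plus media_metadata with the actual image URLs. The following is only a rough illustrative sketch (not the plugin's actual code), and reddit may change these field names at any time.

```python
# Illustrative sketch only - NOT the JDownloader plugin code.
# Resolves the direct image URLs behind a reddit gallery post via the public JSON endpoint.
import html
import requests

def gallery_image_urls(post_id: str) -> list[str]:
    # Appending ".json" to a post URL returns its data as JSON.
    resp = requests.get(
        f"https://www.reddit.com/comments/{post_id}.json",
        headers={"User-Agent": "gallery-sketch/0.1"},
        timeout=30,
    )
    resp.raise_for_status()
    post = resp.json()[0]["data"]["children"][0]["data"]
    urls = []
    # Gallery posts list their items in order; the per-item metadata holds the source URL.
    for item in (post.get("gallery_data") or {}).get("items", []):
        meta = post["media_metadata"][item["media_id"]]
        # "s" is the source rendition (for still images); the URL comes HTML-escaped.
        urls.append(html.unescape(meta["s"]["u"]))
    return urls

if __name__ == "__main__":
    # "XXX" is a placeholder post id, as in the linktype above.
    for url in gallery_image_urls("XXX"):
        print(url)
```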
__________________
JD Supporter, Plugin Dev. & Community Manager
Erste Schritte & Tutorials || JDownloader 2 Setup Download
#45
I've closed our reddit.com SVN ticket now.
It's working as it should - only reddit self-hosted content cannot be downloaded with sound, which is another issue not related to our reddit crawler plugin. -psp-
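Some background on that sound issue: v.redd.it serves video and audio as separate DASH streams, so a video-only download ends up silent. If both streams have already been downloaded, they can be merged locally; below is a minimal sketch assuming ffmpeg is installed and using placeholder file names.

```python
# Sketch: remux a separately downloaded v.redd.it video and audio stream into one file.
# Assumes ffmpeg is available on PATH; "video.mp4" and "audio.mp4" are placeholder names.
import subprocess

subprocess.run(
    [
        "ffmpeg",
        "-i", "video.mp4",  # video-only DASH stream
        "-i", "audio.mp4",  # audio-only DASH stream
        "-c", "copy",       # copy both streams, no re-encoding
        "merged.mp4",
    ],
    check=True,
)
```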
__________________
JD Supporter, Plugin Dev. & Community Manager
Erste Schritte & Tutorials || JDownloader 2 Setup Download
#46
I got a PM with the question "is it possible to crawl complete subreddits".
The answer is: No, not yet. At this moment, JD can only crawl the first page of an added subreddit URL. I've created a ticket for this feature request:
JD can, however, crawl single reddit posts, so if you want to crawl all of one subreddit you could do it like this:
1. Get a browser addon like "Link Gopher".
2. Scroll all the way down to the end/beginning of the subreddit (this can take a while).
3. Use the mentioned browser addon to extract all links to the individual posts and add them to JD.
-psp-
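As an alternative to the browser-addon route, the post links can also be collected with a small script against reddit's public JSON listing. This is only an illustrative sketch (not part of JD), and reddit's listings only reach back roughly the last ~1000 posts per subreddit, which is the same limit mentioned later in this thread.

```python
# Sketch: collect post permalinks from a subreddit via reddit's public JSON listing.
# Not part of JDownloader; reddit listings are capped at roughly the last ~1000 posts.
import requests

def subreddit_post_links(subreddit: str, max_pages: int = 10) -> list[str]:
    links, after = [], None
    session = requests.Session()
    session.headers["User-Agent"] = "subreddit-sketch/0.1"
    for _ in range(max_pages):
        params = {"limit": 100}
        if after:
            params["after"] = after  # continue where the previous page stopped
        data = session.get(
            f"https://www.reddit.com/r/{subreddit}/new.json",
            params=params,
            timeout=30,
        ).json()["data"]
        for child in data["children"]:
            links.append("https://www.reddit.com" + child["data"]["permalink"])
        after = data.get("after")
        if not after:  # no more pages available
            break
    return links

if __name__ == "__main__":
    # The collected links can then be pasted into JD's LinkGrabber.
    print("\n".join(subreddit_post_links("lordoftherings")))
```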
__________________
JD Supporter, Plugin Dev. & Community Manager
Erste Schritte & Tutorials || JDownloader 2 Setup Download
Last edited by pspzockerscene; 10.03.2022 at 11:12.
#47
I've worked on it a bit but it doesn't yet work as it should:
- By default, JD will only crawl the first "page" (first 100 items) of a user profile/subreddit.
- I've added plugin settings (Settings -> Plugins -> reddit.com) to allow it to crawl complete subreddits/profiles.
- It doesn't yet work as expected, as I was only able to get at most the last ~1000 items, but I've added my changes nevertheless.
Again: this feature is not finished yet, see the progress marked in the ticket I've linked in my last reply!
Are you waiting for recently announced changes to get released? Updates do not necessarily get released immediately! Please read our Update FAQ! -psp-
__________________
JD Supporter, Plugin Dev. & Community Manager
Erste Schritte & Tutorials || JDownloader 2 Setup Download
#48
New feature for the next update:
Text download of added posts (default = 'Always'):
__________________
JD Supporter, Plugin Dev. & Community Manager
Erste Schritte & Tutorials || JDownloader 2 Setup Download
#49
Hi, this appears to work well in general; however, it seems that on repeated crawls of the same subreddit nothing is crawled anymore. The Linkgrabber shows progress for a short time, but nothing is actually added to the LinkGrabber list. I tried this on the subreddit "lordoftherings". The first time, a few days ago, it worked fine and got a few hundred files, but now it doesn't find anything from that subreddit anymore.
On another note, the download of post texts doesn't work properly because every one of the text files has the same name, so JDownloader 2 thinks they are mirrors of the same file. It would be best to add some kind of filename-safe version of the title, or maybe the post id, or both.
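To illustrate the suggestion, here is a minimal sketch of how a unique, filesystem-safe name could be built from the post title plus the post id; whatever scheme JD actually ends up using may of course differ.

```python
# Sketch: derive a unique, filesystem-safe .txt filename from a post title and id.
# The naming scheme JDownloader actually uses may differ; this only shows the idea.
import re

def text_filename(title: str, post_id: str, max_len: int = 80) -> str:
    # Keep letters, digits, whitespace, dashes and underscores; drop everything else.
    safe = re.sub(r"[^\w\s-]", "", title).strip()
    safe = re.sub(r"\s+", "_", safe)[:max_len]
    # Appending the post id keeps names unique even for identical titles.
    return f"{safe}_{post_id}.txt"

print(text_filename("One Ring to rule them all?!", "t3_abc123"))
# -> One_Ring_to_rule_them_all_t3_abc123.txt
```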
#50
Quote:
Edit: I can reproduce the issue
__________________
JD-Dev & Server-Admin
#51
Quote:
https://board.jdownloader.org/showpo...7&postcount=47
https://board.jdownloader.org/showpo...8&postcount=46
__________________
JD-Dev & Server-Admin
#52
Quote:
Crawling complete subreddits is deactivated by default because it can produce a lot of results. You can enable it under Settings -> Plugins -> reddit.com -> "Crawl complete subreddits?" -> Enable this setting.
About the other reported problem: I'm working on it.
__________________
JD Supporter, Plugin Dev. & Community Manager
Erste Schritte & Tutorials || JDownloader 2 Setup Download
#53
Fixed bad filenames for .txt downloads when they were part of a post containing only text and no media content.
@TomArrow You can either wait for the next CORE-Update or work around it right away by using a Packagizer rule to define the filenames for reddit content, which will override the scheme set in the plugin settings and can be used to work around that bug.
For all code changes mentioned in this post, the following applies: please wait for the next CORE-Update!
Are you waiting for recently announced changes to get released? Updates do not necessarily get released immediately! Please read our Update FAQ! -psp-
__________________
JD Supporter, Plugin Dev. & Community Manager
Erste Schritte & Tutorials || JDownloader 2 Setup Download
#54
Cool, thanks for solving the filename issue!
About the "crawl complete subreddit" feature: I had forgotten to mention it, but I actually did have that activated, so consider my entire post to be in the context of having "crawl complete subreddit" active. So I think that problem still remains - or did you fix that as well? An example link of where I had said issue:
Quote:
#55
For the next update:
- Changed the last two settings, for subreddit and user profile crawling, from a checkbox to a numeric "limit how many of the last pages should be crawled" setting.
- Updated the wording to make it clear that the features of crawling complete user profiles and complete subreddits are both unfinished.
Related internal ticket:
Please wait for the next CORE-Update!
Are you waiting for recently announced changes to get released? Updates do not necessarily get released immediately! Please read our Update FAQ! -psp-
__________________
JD Supporter, Plugin Dev. & Community Manager
Erste Schritte & Tutorials || JDownloader 2 Setup Download
#56
Updated reddit plugin so "expected filenames" will also be found when directly adding single "v.redd.it" links.
Are you waiting for recently announced changes to get released? Updates do not necessarily get released immediately! Please read our Update FAQ! -psp-
__________________
JD Supporter, Plugin Dev. & Community Manager
Erste Schritte & Tutorials || JDownloader 2 Setup Download
#57
jDownloader on Mac downloads video but not audio
jDownloader on Mac downloads video but not audio of this:
The Nimbus Data ExaDrive Go is a 16TB 2.5" SATA SSD available in a bunch of colors, but the kicker is, it has a USB-C port integrated so the drive doubles as a portable SSD.
**External links are only visible to Support Staff**
Yet that page shows the video with audio in a browser like Safari on Mac. It would be great if this could be fixed, if technically possible. Thanks!
#58
Merged same-topic threads.
@jDoX Please use the search before creating a new thread.
__________________
JD Supporter, Plugin Dev. & Community Manager
Erste Schritte & Tutorials || JDownloader 2 Setup Download
#59
Thanks. I have seen this thread now, as you have moved my message into it (thanks!). Yet it does not work in the latest version of jDownloader on macOS 13.6.9 (22G830) Ventura on a Mac (Intel). It would be great if you could fix it.
#60
Please re-read this thread.
Read the comments and look at the prefix of this thread.
__________________
JD Supporter, Plugin Dev. & Community Manager
Erste Schritte & Tutorials || JDownloader 2 Setup Download
#61
Thanks for the message. I understand that what I must do is:
"Settings -> Plugins -> reddit.com -> Crawl complete subreddits? -> Enable this setting" But I cannot find the "Crawl complete subreddits?" option there in jDownloader for Mac. How to select it? Thanks again for all. |
#62
That setting was changed to "Subreddit crawler: Crawl max X last pages"
__________________
JD Supporter, Plugin Dev. & Community Manager
Erste Schritte & Tutorials || JDownloader 2 Setup Download