06.03.2018, 02:33
raztoki
English Supporter

Join Date: Apr 2010
Location: Australia
Posts: 17,296

Without a dedicated plugin to scan for filename information (within the page source), and with no filename info in the URL itself, you will have to resort to the following..

If you want fast adding of links, you can get it by disabling link checking via the advanced setting LinkCollector.dolinkcheck. Note: this requires a client restart, and you won't have filename/filesize/availability information.
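Roughly, the steps look like this (the exact label shown in the Advanced Settings panel may differ slightly, so search for the key itself):

Code:
Settings -> Advanced Settings -> type "dolinkcheck" into the filter box
LinkCollector.dolinkcheck -> untick / set to false
restart JDownloader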

The best solution for you, as I see it, if it's just a small number of links:

you can add a "directhttp://" prefix to your URLs (i.e. "directhttp://" + "https://domain/path") to treat them as directly downloadable; the directhttp plugin will pick them up, and with link checking disabled all links will add instantly. You won't have filename, filesize or availability. (See the first example below the options.)
else
you can create a LinkCrawler rule that listens for your URL pattern. Make it a directhttp rule and not a deep-analyse one, as you don't want to crawl. There is plenty of info on the forum about crawler rules (a rough sketch is below the options). With link checking disabled this will be as fast as the solution above, without the need to copy URLs to notepad and add the URL prefix.
else
create dedicated plugin(s) to perform the specific task.
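For the prefix approach, the links you paste would look something like this (domain and paths are made-up placeholders, following the "directhttp://" + url description above):

Code:
directhttp://https://example.com/files/archive-part1.rar
directhttp://https://example.com/files/archive-part2.rar
directhttp://https://example.com/files/archive-part3.rar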
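For the crawler rule approach, a minimal rule sketch could look like the one below. The name and pattern are placeholders, and the exact field set is best checked against the linkcrawler rules thread on this forum:

Code:
[
  {
    "enabled" : true,
    "name"    : "my direct downloads",
    "pattern" : "https?://domain/path/.+",
    "rule"    : "DIRECTHTTP"
  }
]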

I would say, though, that there is a clear design flaw when users deep-analyse directly downloadable content: the task is done within the crawler classes and then has to be passed over to the directhttp plugin to get the filename/filesize checks, even though HTTP requests have already been made in the crawler... == twice as slow.
__________________
raztoki @ jDownloader reporter/developer
http://svn.jdownloader.org/users/170

Don't fight the system, use it to your advantage. :]

Last edited by raztoki; 06.03.2018 at 02:41.