#1
Parsing web page url doesn't add links to LinkGrabber properly
Reproduction steps:
1. Set the filter to filter out all links except those from one service.
2. Copy to the clipboard the URL of a web page containing multiple links to different services.

Result: No link is added to LinkGrabber. A button labeled "Restore 1 filtered Links" is activated. The user has to click this button to parse the web page, and only then is the link properly added based on the filter from step 1.

Expected result: The download link is added automatically, without user interaction. It worked that way in the past (about 1-2 months ago) and stopped after some issues with parsing in JDownloader.

Version: See attached file.
Logs id: 11.06.16 05.49.38 <--> 11.06.16 17.13.44 jdlog://3339925891641/
#2
Please show us your filter, re: 1.
Please provide example links, re: 2.
raztoki
__________________
raztoki @ jDownloader reporter/developer http://svn.jdownloader.org/users/170 Don't fight the system, use it to your advantage. :]
#3
Hello,
Please mark this as not an issue - it was my error. After JDownloader had some issues with link filtering, I changed my filter, and it was set to filter out links whose Source URL doesn't contain a certain service. It should have been Hoster URL instead. I corrected this and it works fine now. Sorry for the confusion.
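For anyone hitting the same mistake, here is a minimal sketch (plain Python, not JDownloader's actual filter engine or API; all names are hypothetical) of why a "Source URL contains X" condition drops links scraped from a multi-service page, while a "Hoster URL contains X" condition keeps them:

```python
from urllib.parse import urlparse

def hoster_of(link_url):
    # Hostname of the download link itself (the "Hoster URL").
    return urlparse(link_url).hostname

def keep_by_source(source_url, link_url, service):
    # Keeps the link only if the *page it was found on* mentions the
    # service -- links scraped from a generic multi-service page get
    # filtered out even when the link itself points at that service.
    return service in source_url

def keep_by_hoster(source_url, link_url, service):
    # Keeps the link based on the link's own host, which is what was
    # actually intended here.
    return service in hoster_of(link_url)

page = "https://example.com/mixed-links-page"   # hypothetical source page
link = "https://someservice.example/file/123"   # hypothetical download link

print(keep_by_source(page, link, "someservice.example"))  # False: link dropped
print(keep_by_hoster(page, link, "someservice.example"))  # True: link kept
```

So with the condition on Source URL, every link found on that page fails the filter, which matches the "Restore 1 filtered Links" behavior described above.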
#4
Thought as much =]
Thanks for the follow-up feedback.
raztoki