Thread: [Solved] Web Monitoring
  #5  
Old 06.01.2020, 10:30
raztoki
English Supporter
 
Join Date: Apr 2010
Location: Australia
Posts: 17,614

Jiaz will be back =]

You can create decrypter plugins or link crawler rules to scrape and parse your desired content.
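As a rough illustration of the link-crawler-rule approach, a DEEPDECRYPT rule that scrapes matching links out of a page might look like the sketch below. The field names are recalled from the LinkCrawlerRules JSON format and the pattern/URL are made up for the example; verify the exact schema against the JDownloader knowledgebase before using it.

```json
[
  {
    "enabled": true,
    "name": "example scrape rule (hypothetical)",
    "pattern": "https?://example\\.com/page/.*",
    "rule": "DEEPDECRYPT",
    "deepPattern": "(https?://example\\.com/files/[^\"]+)"
  }
]
```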

To run that on an interval, you could trigger it via the Event Scripter with an interval trigger.

What is missing is feedback on previously crawled content. Crawlers just return everything, with no awareness of what has already been crawled. You really need that feedback to halt the process; otherwise you could be crawling unnecessarily, creating load on both the server's end and yours.

I have said for years that JD needs database support: for previously downloaded content (SVN tickets on that history exist), but also for previously crawled content, with both a generic (standard) database and specific named databases (for user-defined crawling).

raztoki
__________________
raztoki @ jDownloader reporter/developer
http://svn.jdownloader.org/users/170

Don't fight the system, use it to your advantage. :]

Last edited by raztoki; 06.01.2020 at 10:33.