OK, I'm making progress: I can identify non-standard URLs using a DEEPDECRYPT rule and emit the href links below, which are then handled by a DIRECTHTTP rule.
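For context, my DEEPDECRYPT rule is roughly of this shape (heavily simplified: FORUMHOST is a placeholder and the patterns here are illustrative, not my actual rule):

```json
{
  "enabled": true,
  "name": "phpBB attachments (illustrative sketch)",
  "type": "DEEPDECRYPT",
  "pattern": "https?://FORUMHOST/phpBB2/.*",
  "deepPattern": "href=\"(/phpBB2/[^\"]+)\""
}
```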
However, the page source treats different media types differently:
HTML Code:
<a class="u-anchorTarget" id="attachment-2062965"></a>
<a class="file-preview js-lbImage" href="/phpBB2/index.php?attachments/capture-jpg.2062965/" target="_blank">
<a class="u-anchorTarget" id="attachment-2062963"></a>
<a class="file-preview" href="/phpBB2/data/video/1989/1989225-00fcffd963a4710b33ea59fdd998dba7.mp4" target="_blank">
This results in the video files receiving the 'machine-generated' names instead of the 'human-generated' names which appear on the rendered page.
I have discovered that if I remodel the video URLs using the attachment ID, thus:
HTML Code:
"/phpBB2/index.php?attachments/2062963/"
This gets resolved to
HTML Code:
"/phpBB2/index.php?attachments/Rover1.mp4.2062963/"
and then JD downloads it as 'Rover1.mp4'.
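The remodelling itself is just a regex substitution. Here is a minimal JavaScript sketch of what I want a rule to do (the function name and the hard-coded /phpBB2/ base are my own assumptions):

```javascript
// Build the index.php?attachments/<id>/ form of a video URL from the
// adjacent u-anchorTarget anchor, which carries the attachment ID.
function remodelVideoUrl(anchorTargetHtml) {
    // e.g. '<a class="u-anchorTarget" id="attachment-2062963"></a>'
    var m = anchorTargetHtml.match(/id="attachment-(\d+)"/);
    return m ? "/phpBB2/index.php?attachments/" + m[1] + "/" : null;
}

var url = remodelVideoUrl('<a class="u-anchorTarget" id="attachment-2062963"></a>');
// url === "/phpBB2/index.php?attachments/2062963/"
```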
Bizarrely, if my regex contains errors, there are conditions under which JDownloader finds the 'human-generated' video names but none of the image files, plus dozens of crud files.
I haven't found any clues in the logs, and the URLs it finds are nowhere in the page source.
If my DEEPDECRYPT were to emit the
HTML Code:
"<a class="u-anchorTarget" id="attachment-2062963"></a>"
instead of the URL, would a REWRITE rule's "pattern" be able to recognise this, so that I could capture the '2062963' and form the URL I need?
Or MUST rule "patterns" contain fully-qualified URLs?
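To make the question concrete, this is the sort of REWRITE rule I have in mind, assuming the "pattern" can match the anchor text at all; the field names (in particular "rewriteReplaceWith") follow my reading of the LinkCrawler rule format, and FORUMHOST is a placeholder:

```json
{
  "enabled": true,
  "name": "attachment anchor to URL (sketch)",
  "type": "REWRITE",
  "pattern": "id=\"attachment-(\\d+)\"",
  "rewriteReplaceWith": "https://FORUMHOST/phpBB2/index.php?attachments/$1/"
}
```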
If this isn't technically possible, is it possible within Event Scripter to create new links and submit them to the Link Crawler?
I could then use getPage(myString/*PageURL*/) and pick out the "u-anchorTarget" lines to generate and submit my URLs.
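A sketch of what I mean: extractAttachmentIds() is plain JavaScript, while the getPage()/callAPI() part (shown in comments) is how I assume it would be wired up inside Event Scripter:

```javascript
// Collect attachment IDs from the raw page source.
function extractAttachmentIds(html) {
    var ids = [];
    var re = /class="u-anchorTarget" id="attachment-(\d+)"/g;
    var m;
    while ((m = re.exec(html)) !== null) {
        ids.push(m[1]);
    }
    return ids;
}

// Inside Event Scripter (not runnable outside JDownloader) I would then do roughly:
//   var html = getPage(myString/*PageURL*/);
//   var links = extractAttachmentIds(html).map(function (id) {
//       return "/phpBB2/index.php?attachments/" + id + "/";
//   });
//   callAPI("linkgrabberv2", "addLinks", { "links": links.join("\n") });
```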
I could also have knocked out a JavaScript routine to paste into my browser console, in the time it's taken me to compose this post!