#1
Hello guys,
I'm running a headless CentOS 7 server with JDownloader (jdownloader.jar) installed. I get the error "invalid download directory" when the download directory is set to my mounted G Suite account. When I change the directory to a "real" folder (not an rclone mount), it works. The permissions are the same, so that can't be the problem. Maybe JDownloader has problems with the size of the unlimited G Suite drive? The MyJDownloader settings menu shows 1.13 PB total. This is my log ID: jdlog://1850230900751. Hope someone can help. Regards :)
#2
Short update:
It's possible to store the files locally on the server and then extract them to the mounted Google Drive. But downloading and extracting directly on the Google Drive mount is impossible. :huh:
#3
You can customize the extract-to folder, but it would be better to use scripts/cronjob/EventScripter (see https://board.jdownloader.org/showthread.php?t=70525 ) to move extracted, finished files. Downloading directly to Google Drive mounted filesystems will very likely fail because it requires random read/write access.
The available disk space/size is queried from the OS, and most *cloud mapped/mounted* filesystems report that kind of huge *available* space. The log doesn't contain any errors/downloads. When you create a log via the MyJDownloader web interface, please make sure to select all sessions! I recommend NOT using cloud mapped/mounted filesystems because of the issues with random read/write access.
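The cronjob approach mentioned above could look roughly like this. A minimal sketch, assuming a local extract-to folder and an rclone mount point; both paths (`SRC`, `DST`) are made up for illustration, not taken from this thread:

```shell
#!/bin/sh
# Sketch of the "move after extraction" cronjob idea.
# SRC and DST are assumed paths, not from this thread.
SRC="${SRC:-$HOME/jd-extracted}"      # local extract-to folder (assumed)
DST="${DST:-$HOME/gdrive/extracted}"  # rclone mount point (assumed)

mkdir -p "$SRC" "$DST"

# Move each finished file; across filesystems mv does a sequential
# copy + delete, so the mount never sees random-offset writes.
for f in "$SRC"/*; do
  [ -f "$f" ] || continue
  mv -- "$f" "$DST/"
done
```

Scheduled e.g. every 10 minutes via `crontab -e` with a line like `*/10 * * * * /path/to/move-extracted.sh`. In practice you would also want to skip files that are still being written.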
__________________
JD-Dev & Server-Admin
#4
Please create a new log and I can check the exact cause, though there won't be anything to fix if the issues are caused by what I think they are.
#5
Thanks for your reply :)
I've made a new install and generated a new log. This is my log ID: jdlog://6190230900751. The problem is that I run the server on a VPS host and only have 20 GB of SSD. That's why I want to save the downloads directly on the mount; I thought changing the extraction folder would only be a temporary solution. The user who runs JDownloader has full access to the mount (I mounted the Google Drive with that user). Other programs have full access to the mount without problems, which surprises me. Regards :)
#6
The issue is caused by this:
java.io.IOException: Illegal seek and java.io.IOException: Operation not permitted. As I thought, this is caused by *how Google Drive* works, see stackoverflow.com/questions/36102998/how-to-get-partial-write-access-to-a-file-from-private-folder-of-google-drive-an#
#7
Yes, but the difference is *how they use* the filesystem. Normal copy/open/create/delete works. But opening a file and then randomly seeking and writing chunks of data to it causes this issue.
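That failing access pattern can be sketched from the shell (the temp file here stands in for a file on the mount): the first `dd` is a plain sequential create, while the second seeks into the existing file and rewrites a chunk in place, which is roughly what a segmented downloader or extractor does and what plain cloud mounts reject:

```shell
# Demonstrates the two access patterns on a local temp file. On a plain
# cloud mount, the second (random-offset) write is the one that fails
# with errors like "Illegal seek" / "Operation not permitted".
F="$(mktemp)"

# 1) Sequential create: works on most mounts.
dd if=/dev/zero of="$F" bs=1024 count=4 2>/dev/null

# 2) Random write: seek 2 KiB into the existing file and overwrite
#    1 KiB without truncating; needs real random write access.
dd if=/dev/zero of="$F" bs=1024 seek=2 count=1 conv=notrunc 2>/dev/null

wc -c < "$F"   # size unchanged by the in-place write: 4096 bytes
rm -f "$F"
```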
Last edited by Jiaz; 13.11.2019 at 12:02.
#8
Quote:
If you need help with it or have questions, please ask mgpai for help.
#9
Maybe it's possible to change the file-handling code to avoid those *critical filesystem calls*, but I can't guarantee it.
#10
Thanks for your help :)
I've found another solution: simply adding `--vfs-cache-mode writes --vfs-cache-max-age 0h0m0s` to the rclone mount command makes it work too, but the extracting is slow. This is caused by Google Drive.
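For reference, the full mount command with those flags would look roughly like this; `gdrive:` and `/mnt/gdrive` are assumed names, not taken from this thread. `--vfs-cache-mode writes` buffers writes in a local cache so applications get normal seek/write semantics on the mount, and `--vfs-cache-max-age 0h0m0s` evicts each cached file as soon as it has been uploaded:

```shell
# Hedged sketch; "gdrive:" and /mnt/gdrive are assumed names.
# --vfs-cache-mode writes    : buffer writes in a local cache, so
#                              random read/write access works
# --vfs-cache-max-age 0h0m0s : evict cached files right after upload
rclone mount gdrive: /mnt/gdrive \
  --vfs-cache-mode writes \
  --vfs-cache-max-age 0h0m0s \
  --daemon
```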
#11
Thanks for the feedback! Don't you have to specify where to cache those writes?
#12
No, in my home directory there is a .config folder where the rclone cache is stored.
The `--vfs-cache-max-age 0h0m0s` flag simply deletes the files after they are uploaded to Google Drive. Two weeks now without errors, so that's working great!