I want to download my backups to my local computer (home desktop).
I see I can do it in the Hestia panel by clicking Download in the backup section, but the transfer is incredibly slow (about 12 hours for a 7 GB tar file).
I can connect to my server via SFTP and I can see all the tar files sitting in the backup directory, but I can't download them (I'm using FileZilla).
I assume it is because the user I am connecting with doesn't have read access. But the users that do have read access, like admin and 'user' (the owner of the domain), can not reach that part of the server when they connect with SFTP.
So how can the files be accessed for download?
Do I have to give read permission to my server SU? I have my root user disabled.
I'm not sure it's a good idea to mess around with the permissions of those backup files. Is there an easier way to do this, so I can transfer these large files more quickly?
Since /backup and /home are on the same file system, you could create a hard link (as root or using sudo) to one of your dirs, like the private dir, and you will have access to the backup file using SFTP, and it won't use more space on your disk:
ln /backup/BackupFileName.tar /home/Your_User/web/Your_Domain/private/
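For anyone who wants to check that the link actually worked, a quick sanity check is to compare the inode numbers: a hard link is the same file under a second name, so both paths show the same inode and a link count of 2. A small demonstration in a temp directory (the real paths would of course be /backup and /home/Your_User/...):

```shell
# Demo of hard-link behaviour in a scratch dir (not the real /backup paths)
tmp=$(mktemp -d)
mkdir -p "$tmp/backup" "$tmp/private"
echo "backup data" > "$tmp/backup/Your_User.2023-11-13_05-43-28.tar"

# Same command shape as above: link the backup into the private dir
ln "$tmp/backup/Your_User.2023-11-13_05-43-28.tar" "$tmp/private/"

# Both names point at one inode; link count is now 2 (GNU stat syntax)
stat -c '%h %i' "$tmp/backup/Your_User.2023-11-13_05-43-28.tar"
stat -c '%h %i' "$tmp/private/Your_User.2023-11-13_05-43-28.tar"

rm -rf "$tmp"
```

Because it is the same inode, deleting the copy in private later does not touch the original backup, and no extra disk space is used.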
Thanks Sahanu. I tried your advice and it works great to link a single file. Unfortunately the file name changes daily, as the backup date and time stamp are always appended (Your_User.2023-11-13_05-43-28.tar).
I managed to get it done by adding a link like this. Hopefully that will keep linking the latest backup.
ln /backup/Your_User* /home/Your_User/web/Your_Domain/private/
Now, using my admin user to log on with SFTP, I can download all my backups from the /private folder.
No, it won't keep linking the latest backup, only the files that already existed when you executed the command (the shell expands the * glob at that moment, once).
If this isn't a one-time thing, you should create a script to automate it (and also remove the files in the private dir when the backup files are removed from the /backup dir).
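A minimal sketch of such a sync script, using the paths from this thread (adjust BACKUP_DIR, PRIVATE_DIR, and USER_PREFIX for your setup; this is my own suggestion, not a tool Hestia ships). It hard-links any backup that isn't linked yet and removes links whose original backup has been rotated away:

```shell
#!/bin/sh
# Hypothetical backup-link sync script -- adjust these paths for your server
BACKUP_DIR=/backup
PRIVATE_DIR=/home/Your_User/web/Your_Domain/private
USER_PREFIX=Your_User

# 1) Link any backup that isn't already linked into the private dir
for f in "$BACKUP_DIR/$USER_PREFIX".*.tar; do
    [ -e "$f" ] || continue                       # glob matched nothing
    link="$PRIVATE_DIR/$(basename "$f")"
    [ -e "$link" ] || ln "$f" "$link"
done

# 2) Remove links whose original backup no longer exists in /backup
for link in "$PRIVATE_DIR/$USER_PREFIX".*.tar; do
    [ -e "$link" ] || continue
    [ -e "$BACKUP_DIR/$(basename "$link")" ] || rm "$link"
done
```

Run it from root's crontab (for example shortly after the nightly backup job) so it has permission to read /backup, and the private dir will always mirror the current set of backups.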