How to make remote backup?

zenobie
Posts: 12
Joined: Mon Jul 27, 2020 8:06 pm

Is it possible to back up to an external resource? A remote FTP server, or better, a Google Drive account? It would be interesting to implement a flexible solution that can be adapted to all environments.
Any advice?

myVesta
Site Admin
Posts: 928
Joined: Fri Jun 19, 2020 9:59 am
Has thanked: 8 times
Been thanked: 6 times

(Attached screenshots: Screenshot_49.png, Screenshot_50.png, Screenshot_51.png, Screenshot_52.png)
If you want to keep only the latest copy of the backup on the FTP server (to save remote space), run this in SSH as root:

Code:

echo "ONLY_ONE_FTP_BACKUP='yes'" >> /usr/local/vesta/conf/vesta.conf
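If you run that `echo ... >>` twice you end up with a duplicate line, so a small idempotent sketch like the following can help; on a real myVesta server the file is /usr/local/vesta/conf/vesta.conf, a temporary file stands in for it here:

```shell
# Sketch: enable ONLY_ONE_FTP_BACKUP only if it is not already set, then verify.
# On a real server use CONF=/usr/local/vesta/conf/vesta.conf; mktemp is for illustration.
CONF=$(mktemp)
grep -q "^ONLY_ONE_FTP_BACKUP=" "$CONF" || echo "ONLY_ONE_FTP_BACKUP='yes'" >> "$CONF"
grep -q "^ONLY_ONE_FTP_BACKUP=" "$CONF" || echo "ONLY_ONE_FTP_BACKUP='yes'" >> "$CONF"  # second run is a no-op
grep "^ONLY_ONE_FTP_BACKUP=" "$CONF"   # prints: ONLY_ONE_FTP_BACKUP='yes'
```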
prutser
Posts: 1
Joined: Fri Sep 11, 2020 1:08 pm

I've been a happy user for a few weeks now. What I miss is an incremental backup scenario ... so I made my own. All the "tar" files that myVesta makes are stored in the /backup folder. I made a small bash script that:
  • fires off a backup for all users (standard vesta binary)
  • unpacks the tar file(s)
  • uploads the unpacked user folders (I use Duplicati for that)
  • erases the unpacked folders
And that's it. If you use Duplicati to restore user files, the bash script will see folders in the backup folder and assume that you are trying to restore users. The script will re-create a tar file, using the same name as the unpacked original. Then you are able to restore the users with the standard myVesta tooling. I'm not a programmer, so a real bash-script guru will laugh his ass off if he/she sees my script, but it is fully functional :D

Duplicati is able to use WebDAV, (S)FTP, OneDrive and lots of other nice options for remote backup, with all kinds of options and backup strategies. Works great for me. The script I'm using now is included; be aware that you have to adjust the command that (in my case) Duplicati uses. Duplicati can export the command line, so that is only a copy/paste action. I tried to put everything in variables above the dashed line, but for one reason or another Duplicati refuses to fire itself from a variable. That's why the command line is in there directly.

With crontab I fire this script every 4 hours, using WebDAV to make the remote backup ...
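For reference, a crontab entry for that every-4-hours schedule could look like the line below; the script path and log file are assumptions, adjust them to wherever you saved the script:

```shell
# crontab -e, as root: run the incremental backup script every 4 hours
# (script path and log location are examples, not part of the original post)
0 */4 * * * /root/vesta-incremental-backup.sh >> /var/log/vesta-incremental-backup.log 2>&1
```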

Regards,

Code:

#!/bin/bash
# tar/untar wrapper for vestacp to make use of incremental backups. This script makes a backup and can upload it
# using Duplicati or another tool. To restore files, simply restore a user folder from your backup into the /backup/
# folder and run the script. If a folder is found, we assume it is a user, and we recreate a tar file for that user,
# recreating the tar file with the original name

# Declare variables used for paths, filenames & backup executables
backupdir=/backup                                                                       # Where backups are stored by vestacp
orgfilename_stored_in="filename.txt"                                                    # filename where the original name is kept
command_backup="/usr/local/vesta/bin/v-backup-users"                                    # command to backup users
command_restore="/usr/local/vesta/bin/v-restore-user"
#duplicati_command=mono /usr/lib/duplicati/Duplicati.CommandLine.exe backup "webdavs://myVesta.webdav_provider_url.com:443/remote.php/webdav/duplicati/myVesta-backup?auth-username=justanexample&auth-password=YourPassword" /backup/ --backup-name=Mailserver --dbpath=/root/.config/Duplicati/ASFKQQCVXH.sqlite --encryption-module=aes --compression-module=zip --dblock-size=50mb --passphrase="secret passphrase" --retention-policy="1W:1D,4W:1W,12M:1M" --disable-module=console-password-input --exclude="*.tar"

# ================================================================================================================================
# If a folder exists in /backup/ we assume it is a restore. Restore from the user folders and restore ownership.
# Read filename.txt to recover the original filename and erase it before we recreate the tar file

if [ "$(find "$backupdir" -mindepth 1 -maxdepth 1 -type d | wc -l)" -ne 0 ]; then      # Count folders; if not 0, do a restore
        usersfound=0
        for user in $(find "$backupdir" -mindepth 1 -maxdepth 1 -type d)                # Find user including path
        do
                restore_filename=$(<"$user/$orgfilename_stored_in")                     # get original filename without path
                rm "$user/$orgfilename_stored_in"                                       # erase that file
                user_withoutpath="${user##*/}"                                          # get user without path

                tar -C "$user" -cf "$backupdir/$restore_filename" .                     # create the tar file again
                rm -r "$user"                                                           # and remove the unpacked content

                chown "admin:$user_withoutpath" "$backupdir/$restore_filename"          # restore ownership
                echo "User $user restored to $restore_filename"                         # say something nice
                usersfound=$((usersfound+1))                                            # increase the number of restored users
                "$command_restore" "$user_withoutpath" "$restore_filename"              # let myVesta restore the user
        done
        echo "$usersfound users restored."

# =================================================================================================================================
else
        # We come here if no users are restored, so we assume a backup operation.
        # Save current time to select the backupped files later in case there are already tar files available
        CurrentTime=$(date --rfc-3339=seconds)                                          # save current date/time for tar selection

        #Backup all users
        "$command_backup"                                                               # execute backup command

        # Find all backups made by this script run and loop through them.
        # Untar them one by one so we can use incremental backups from Duplicati,
        # and finally erase the extracted data, until all tar files are processed.
        # Store the original filename in filename.txt for the restore part

        usersfound=0
        for file in $(find "$backupdir" -type f -newermt "$CurrentTime")                # loop through all freshly created tar files
        do
                user=$(basename "$file" | cut -d'.' -f 1)                               # find the username in the tar filename
                mkdir "$backupdir/$user"                                                # create a folder with the username
                basename "$file" > "$backupdir/$user/$orgfilename_stored_in"            # store the original tar filename without path
                tar -xf "$file" -C "$backupdir/$user/"                                  # and unpack the tar file in the user folder
                chmod --reference="$file" "$backupdir/$user/"
                echo "File $file for user $user unpacked and backed up"                 # say something nice
                usersfound=$((usersfound+1))                                            # increase the number of backed-up users
        done                                                                            # and loop through all users

        #Duplicati backup command
        mono /usr/lib/duplicati/Duplicati.CommandLine.exe backup "webdavs://myVesta.webdav_provider_url.com:443/remote.php/webdav/duplicati/myVesta-backup?auth-username=justanexample&auth-password=YourPassword" /backup/ --backup-name=Mailserver --dbpath=/root/.config/Duplicati/ASFKQQCVXH.sqlite --encryption-module=aes --compression-module=zip --dblock-size=50mb --passphrase="secret passphrase" --retention-policy="1W:1D,4W:1W,12M:1M" --disable-module=console-password-input --exclude="*.tar"


        # Do your incremental backup stuff here, in $backupdir are folders with all your usernames
        # so you can do a backup of the complete backupdir and exclude .tar files

        for directory in "$backupdir"/*/                                                # Erase all user directories. It would be nicer to store them in an array so we only erase the folders created above
        do
                rm -r "$directory"
        done

        echo "$usersfound users backed up."

fi
Last edited by prutser on Fri Sep 11, 2020 1:36 pm, edited 1 time in total.
kombajnik
Posts: 20
Joined: Fri Feb 19, 2021 1:42 am

Hello, for me remote backups are not working.

I want to send backups to the /tmp folder on another myVesta panel and I get this message:

Create directory operation failed.
Permission denied.
Error: can't create /tmp/vst.bK76A9SUkt folder on the ftp



But I see on the remote FTP that the directory is created.
myVesta
Site Admin
Posts: 928
Joined: Fri Jun 19, 2020 9:59 am
Has thanked: 8 times
Been thanked: 6 times

Try without the first /.
Anyway, you can't access /tmp on the remote server via FTP; you can only access the local tmp/.
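One way to see what the panel can actually do on the FTP side is to repeat the same operations by hand with curl from the panel server. The host and credentials below are placeholders; these commands cannot run without a real remote server:

```shell
# Probe the remote FTP the way a backup upload would (backup.example.com,
# backupuser and secret are placeholders, not real values).
echo probe > /tmp/probe.txt

# List the login directory (the panel may do something similar when testing the login):
curl --user backupuser:secret ftp://backup.example.com/

# Upload into a relative path, creating the folder if needed:
curl --user backupuser:secret --ftp-create-dirs -T /tmp/probe.txt ftp://backup.example.com/tmp/probe.txt
```

Note that the relative `tmp/` resolves under the FTP user's home directory, which is why a leading `/` can fail even when the login itself works.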
kombajnik
Posts: 20
Joined: Fri Feb 19, 2021 1:42 am

dpeca wrote: Sat Mar 20, 2021 2:33 pm Try without the first /.
Anyway, you can't access /tmp on the remote server via FTP; you can only access the local tmp/.
It's strange, that's not working either.

If I manually upload something via WinSCP from my home PC with the provided user & pass, then I can upload the folder & files.





PS: I have disabled downloading files from the FTP for security reasons; could that be the fault?
myVesta
Site Admin
Posts: 928
Joined: Fri Jun 19, 2020 9:59 am
Has thanked: 8 times
Been thanked: 6 times

Could be, because when you provide the login, myVesta first tests the login, and it may try to list or download a file in order to test the login and connection.
Did you try without a slash at all, just a folder name?
kombajnik
Posts: 20
Joined: Fri Feb 19, 2021 1:42 am

dpeca wrote: Sun Mar 21, 2021 11:35 am Could be, because when you provide the login, myVesta first tests the login, and it may try to list or download a file in order to test the login and connection.
Did you try without a slash at all, just a folder name?
I think that is the problem, because I have disabled the possibility to download from this FTP server.