How to make remote backup?

Posts: 12
Joined: Mon Jul 27, 2020 8:06 pm
How to make remote backup?

Post by zenobie »

Is it possible to back up to an external resource, such as a remote FTP server or, better, a Google Drive account? It would be interesting to implement a flexible solution that can be adapted to all environments.


Site Admin
Posts: 364
Joined: Fri Jun 19, 2020 9:59 am
Re: backup on external resource

Post by dpeca »


Posts: 1
Joined: Fri Sep 11, 2020 1:08 pm
Re: How to make remote backup?

Post by prutser »

I've been a happy user for a few weeks now. What I miss is an incremental backup scenario ... so I made my own. All "tar" files that myVesta makes are stored in the /backup folder. I made a small bash script that:
  • fires off a backup for all users (standard vesta binary)
  • unpacks the tar file(s)
  • uploads the unpacked user folders (I use Duplicati for that)
  • erases the unpacked folders
And that's it. If you use Duplicati to restore user files, the bash script will see folders in the backup folder and assume that you are trying to restore users. The script will then recreate a tar file with the same name as the unpacked original, and you are able to restore the users with the standard myVesta tooling. I'm not a programmer, so a real bash-script guru will laugh his ass off when he/she sees my script, but it is fully functional :D
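The repack trick in the last step can be sketched in isolation like this (all paths and the tar name are throwaway demo values, not the real /backup layout):

```shell
# Demo of the repack idea: the original tar name is kept in filename.txt,
# so after a restore the script can rebuild a tar with the exact same name.
mkdir -p demo/alice
echo 'alice.2020-09-11_01-00-00.tar' > demo/alice/filename.txt
echo 'site data' > demo/alice/index.html

name=$(<demo/alice/filename.txt)        # recover the original tar name
rm demo/alice/filename.txt              # drop it before packing
tar -C demo/alice -cf "demo/$name" .    # recreate the tar under its original name
```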

Duplicati can use WebDAV, (S)FTP, OneDrive and lots of other nice backends for remote backup, with all kinds of options and backup strategies. Works great for me. The script I'm using now is included; be aware that you have to adjust the command that (in my case) Duplicati uses. Duplicati can export its command line, so that is only a copy/paste action. I tried to put everything in variables above the dashed line, but for one reason or another Duplicati refuses to start when invoked through a variable. That's why the full command line is there.
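For what it's worth, the usual reason a command kept in a plain string variable misbehaves is word splitting on its quoted arguments; a bash array preserves each argument intact. A minimal sketch, using echo as a stand-in for the real Duplicati invocation:

```shell
# A plain string variable would split --passphrase="secret passphrase" into
# two separate words when expanded; an array keeps each argument as written.
cmd=(echo backup "webdavs://" --passphrase="secret passphrase" --exclude="*.tar")
"${cmd[@]}"
```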

With crontab I fire this script every 4 hours, using WebDAV to make the remote backup ...
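The cron entry for that could look something like this (the script path is just an assumption; adjust it to wherever you saved the script):

```
# run the incremental backup script every 4 hours
0 */4 * * * /root/vesta-incremental-backup.sh
```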


Code: Select all

# tar/untar helper for vestacp to make use of incremental backups. This script makes a backup and can upload it
# using Duplicati or another tool. To restore files, simply restore a user from your backup into the /backup/ folder
# and run the script. If a folder is found, we assume it is a user, and we recreate the tar file for this user
# under its original name.

# Declare variables used for paths, filenames and the backup executable
backupdir=/backup                                                                       # Where backups are stored by vestacp
orgfilename_stored_in="filename.txt"                                                    # filename where the original tar name is kept
command_backup="/usr/local/vesta/bin/v-backup-users"                                    # command to backup all users
#duplicati_command='mono /usr/lib/duplicati/Duplicati.CommandLine.exe backup "webdavs://" /backup/ --backup-name=Mailserver --dbpath=/root/.config/Duplicati/ASFKQQCVXH.sqlite --encryption-module=aes --compression-module=zip --dblock-size=50mb --passphrase="secret passphrase" --retention-policy="1W:1D,4W:1W,12M:1M" --disable-module=console-password-input --exclude="*.tar"'

# ================================================================================================================================
# If a folder is present in /backup/ we assume it is a restore. Rebuild the tar files from the user folders and restore ownership.
# Read filename.txt to recover the original filename and erase it before we recreate the tar file.

usersfound=0                                                                            # counter for restored / backed-up users

if [ "$(find "$backupdir" -mindepth 1 -maxdepth 1 -type d | wc -l)" -ne 0 ]; then       # Count folders, if <>0 do restore
        for user in $(find "$backupdir" -mindepth 1 -maxdepth 1 -type d); do            # Find user including path
                restore_filename=$(<"$user/$orgfilename_stored_in")                     # get original filename without path
                rm "$user/$orgfilename_stored_in"                                       # erase that file
                user_withoutpath="${user##*/}"                                          # get user without path

                tar -C "$user" -cf "$backupdir/$restore_filename" .                     # create tar file again
                rm -r "$user"                                                           # and remove the unpacked content

                chown admin:"$user_withoutpath" "$backupdir/$restore_filename"          # restore ownership
                echo "User $user restored to $restore_filename"                         # say something nice
                usersfound=$((usersfound+1))                                            # increase amount of restored users
        done
        echo "$usersfound users restored."

# =================================================================================================================================
else
        # We come here if no users are restored, so we assume a backup operation.
        # Save the current time to select the freshly created files later, in case there are already tar files present.
        CurrentTime=$(date --rfc-3339=seconds)                                          # save current date/time for tar selection

        # Backup all users
        $command_backup                                                                 # execute backup command

        # Find all backups made in this script and loop through them.
        # Untar them one by one so we can use incremental backups from Duplicati,
        # and finally erase the extracted data, until all tar files are looped.
        # Store the original filename in filename.txt for the repack part.

        for file in $(find "$backupdir" -maxdepth 1 -type f -newermt "$CurrentTime"); do # loop through all created tar files
                filename="${file##*/}"                                                  # tar filename without path
                user="${filename%%.*}"                                                  # find the username in the tar filename
                mkdir "$backupdir/$user"                                                # create a folder with the username
                echo "$filename" > "$backupdir/$user/$orgfilename_stored_in"            # store the original tar filename without path
                tar -xf "$file" -C "$backupdir/$user/"                                  # and unpack the tar file in the user folder
                chmod --reference="$file" "$backupdir/$user/"                           # copy the tar file's permissions to the folder
                echo "File $file for user $user unpacked and backed up"                 # say something nice
                usersfound=$((usersfound+1))                                            # increase amount of backed-up users
        done                                                                            # and loop through all users

        # Duplicati backup command
        mono /usr/lib/duplicati/Duplicati.CommandLine.exe backup "webdavs://" /backup/ --backup-name=Mailserver --dbpath=/root/.config/Duplicati/ASFKQQCVXH.sqlite --encryption-module=aes --compression-module=zip --dblock-size=50mb --passphrase="secret passphrase" --retention-policy="1W:1D,4W:1W,12M:1M" --disable-module=console-password-input --exclude="*.tar"

        # Do your incremental backup stuff here; in $backupdir are folders with all your usernames,
        # so you can back up the complete backupdir and exclude .tar files

        for directory in "$backupdir"/*/; do                                            # Erase all user directories (nicer would be an array holding only the folders we created earlier)
                rm -r "$directory"
        done

        echo "$usersfound users backed up."
fi
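The file selection in the backup branch relies on find's -newermt test: record a timestamp first, then pick up only files created after it. A small self-contained demonstration (demo paths only):

```shell
# Record a timestamp, create a file afterwards, then select only the newer file.
mkdir -p demo2
touch demo2/old.tar
sleep 1
CurrentTime=$(date --rfc-3339=seconds)
sleep 1
touch demo2/new.tar
find demo2 -type f -newermt "$CurrentTime"   # lists only demo2/new.tar
```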

Last edited by prutser on Fri Sep 11, 2020 1:36 pm, edited 1 time in total.
