Backups to cloud using Rclone utility

Configure backups to the cloud using the Rclone utility.

I wrote this script to create a flexible and cost-efficient backup solution for my organisation.

The Rclone utility can be used to set up backups from a Linux server to any rclone remote that we configure, for example AWS S3, Google Drive, Dropbox, or S3-compatible storage such as DigitalOcean Spaces.

The backups are cost-efficient because they can be stored on inexpensive object storage such as AWS S3 or DigitalOcean Spaces, and they are flexible because the destination, the directories to back up, and the retention policy can all be configured to suit our requirements.

Here is the GitHub link to clone the project:

https://github.com/akash69711/rclone-backups.git
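
To get a local copy, clone it with git (the directory name below comes from the repository name):

git clone https://github.com/akash69711/rclone-backups.git
cd rclone-backups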

Installing Rclone

Guide to install Rclone: https://rclone.org/install/

You can use this command to install Rclone:

curl https://rclone.org/install.sh | sudo bash -s beta
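
Note that the -s beta argument installs the beta build; drop it if you prefer the stable release. To confirm the installation worked, print the installed version:

rclone version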

For backing up a Linux server to DigitalOcean Spaces, I followed these two guides to configure the rclone remote “spaces”:

https://rclone.org/s3/#digitalocean-spaces

https://www.digitalocean.com/community/tutorials/how-to-migrate-from-amazon-s3-to-digitalocean-spaces-with-rclone
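
For reference, here is a minimal sketch of what the resulting remote section in ~/.config/rclone/rclone.conf might look like; the nyc3 region and the key values are assumptions, so adjust the endpoint and credentials to your own Space:

[spaces]
type = s3
provider = DigitalOcean
env_auth = false
access_key_id = YOUR_SPACES_ACCESS_KEY
secret_access_key = YOUR_SPACES_SECRET_KEY
endpoint = nyc3.digitaloceanspaces.com
acl = private

You can then verify the remote with "rclone lsd spaces:", which lists the Spaces (buckets) that the key can access.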

The backup script

This is how the final backup script looks:


#!/usr/bin/env bash

# Rclone Backups
# Author: Akash Mehta Website: http://akash.windhigh.com

#Script Start
CURRENTDATETIME=$(date +%d-%h-%Y_%H-%M);
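# Note: this is a relative path; when the script runs from cron, the log file is created in the cron user's home directory.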
LOGFILE='backup-logs.txt'

echo "BACKUP STARTED AT $(date +'%d-%m-%Y %H:%M:%S')" >> $LOGFILE

echo "Now performing backup ..."
 
# Use the --include flags to select the directories you want to back up.
# --retries 1 overrides rclone's default of 3 retries.
# To enable any of the commented-out flags at the end, uncomment them and add a
# trailing backslash to the last --include line.
time rclone sync "/" "rclone-remote-name-goes-here:the-spaces-name-goes-here/backups/${HOSTNAME}/files/$(date +%d-%h-%Y)" \
    --delete-excluded \
    --skip-links \
    --stats 60s \
    --stats-log-level NOTICE \
    --transfers=32 \
    --checkers=128 \
    --retries 1 \
    --include "/root/**" \
    --include "/home/**" \
    --include "/var/www/**"
#    --no-check-dest \
#    --no-traverse \
#    --dry-run \
#    -v
 
echo "... performing backup done!"

echo "BACKUP FINISHED AT $(date +'%d-%m-%Y %H:%M:%S')" >> $LOGFILE

# # If the lifecycle policy is not expiring old objects, enable the block below.
# # It deletes the backup taken 14 days ago.
# echo "Purge started at $(date +'%d-%m-%Y %H:%M:%S')" >> $LOGFILE
# rclone purge rclone-remote-name-goes-here:the-spaces-name-goes-here/backups/${HOSTNAME}/files/$(date -d '-14 day' '+%d-%h-%Y')/
# echo "Purge completed at $(date +'%d-%m-%Y %H:%M:%S')" >> $LOGFILE
# echo "##################################################" >> $LOGFILE

#Script End

Configuring the script

Edit the backup script (backup.sh) to suit your requirements: set the rclone remote name, the Spaces (bucket) name, and the directories you want to include in the backup.
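
Before scheduling it, a trial run is a good idea. Here is a minimal sketch that reuses the include rules from the script; "spaces" and "my-backups" are placeholder names for the remote and the Space, and --dry-run ensures nothing is actually uploaded:

rclone sync "/" "spaces:my-backups/backups/${HOSTNAME}/files/$(date +%d-%h-%Y)" \
    --skip-links \
    --include "/root/**" \
    --include "/home/**" \
    --include "/var/www/**" \
    --dry-run \
    -v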

Set up cron

The next step is to set up a cron job that runs this backup script daily, weekly, monthly, or as required.

First make the script executable:

chmod +x backup.sh

Then open the crontab for the user using:

crontab -e

Then add an entry with the schedule you need.

For example:

24 2 * * * /root/files/cron/backup.sh 2>&1 | /usr/bin/logger -t RcloneBackup

This runs the script and takes a backup every day at 2:24 AM.
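
Other schedules only need a different time specification; for example (using the same script path as in the example above):

# Weekly: every Sunday at 03:00
0 3 * * 0 /root/files/cron/backup.sh 2>&1 | /usr/bin/logger -t RcloneBackup

# Monthly: the 1st of every month at 03:30
30 3 1 * * /root/files/cron/backup.sh 2>&1 | /usr/bin/logger -t RcloneBackup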

To check the backup logs

Use either of the following commands:

grep "RcloneBackup" /var/log/syslog

tail backup-logs.txt
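
On distributions that use systemd-journald instead of a plain /var/log/syslog (an assumption about your setup), the tagged output can also be read with:

journalctl -t RcloneBackup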

Configure lifecycle policies to expire objects

We will need lifecycle policies to expire objects after a specific period of time. This is also called a retention policy.

Here is the policy in XML format for use with DigitalOcean Spaces; it sets the expiration to 14 days.

<LifecycleConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
  <Rule>
    <ID>ExampleRule</ID>
    <Prefix></Prefix>
    <Status>Enabled</Status>
    <Expiration>
      <Days>14</Days>
    </Expiration>
  </Rule>
</LifecycleConfiguration>
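
One way to apply the XML policy is with s3cmd, assuming s3cmd has already been configured with your Spaces key, secret, and endpoint, and that the policy above is saved as lifecycle.xml (both names are placeholders):

s3cmd setlifecycle lifecycle.xml s3://the-spaces-name-goes-here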

Lifecycle policy in JSON format:

{
    "Rules": [
        {
            "Filter": {
                "Prefix": ""
            },
            "Status": "Enabled",
            "Expiration": {
                "Days": 14
            },
            "ID": "ExampleRule"
        }
    ]
}
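
Another option is the AWS CLI pointed at the Spaces endpoint. A sketch, assuming the JSON above is saved as lifecycle.json, the Space lives in the nyc3 region, and the Spaces key and secret are configured as an AWS CLI profile named "spaces" (all of these are assumptions):

aws s3api put-bucket-lifecycle-configuration \
    --bucket the-spaces-name-goes-here \
    --endpoint-url https://nyc3.digitaloceanspaces.com \
    --profile spaces \
    --lifecycle-configuration file://lifecycle.json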

If the expiration policy is not working, add this line to the backup script; it deletes the dated backup folder from 14 days ago:

rclone purge rclone-remote-name-goes-here:the-spaces-name-goes-here/backups/${HOSTNAME}/files/$(date -d '-14 day' '+%d-%h-%Y')/
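
To confirm that old backups are actually being removed, you can list the dated folders that remain (the remote and Space names are the same placeholders as above):

rclone lsd rclone-remote-name-goes-here:the-spaces-name-goes-here/backups/${HOSTNAME}/files/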

Hope this helps. If you need any help, do contact me.