Backups to cloud using Rclone utility
Configure backups to the cloud using the Rclone utility.
I wrote this script to create a flexible and cost-efficient backup solution for my organisation.
The Rclone utility can be used to set up backups from a Linux server to any rclone remote that we configure, for example AWS S3, Google Drive, Dropbox, or S3-compatible storage such as DigitalOcean Spaces.
The backups are cost-efficient because they can be stored on inexpensive object storage such as AWS S3 or DigitalOcean Spaces, and flexible because both the storage provider and the retention policy can be configured to suit our requirements.
Here is the GitHub link to clone the project:
Guide to install Rclone: https://rclone.org/install/
You can use this command to install Rclone (the -s beta argument installs the beta release; drop it to install the stable release):
curl https://rclone.org/install.sh | sudo bash -s beta
For backups from a Linux server to DigitalOcean Spaces, I followed these two guides to configure the rclone remote “spaces”:
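For reference, a typical rclone.conf entry for a DigitalOcean Spaces remote named “spaces” looks roughly like this; the access keys and the endpoint region are placeholders, not values from the original post:

```ini
[spaces]
type = s3
provider = DigitalOcean
env_auth = false
access_key_id = YOUR_SPACES_ACCESS_KEY
secret_access_key = YOUR_SPACES_SECRET_KEY
endpoint = nyc3.digitaloceanspaces.com
acl = private
```

You can create an equivalent remote interactively by running rclone config and choosing the S3-compatible storage type.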
The backup script
This is how the final backup script looks:
Configuring the script
Configure the backup script (backup.sh) as per your requirements.
The next step is to set up a cron job to run the backup script daily, weekly, monthly, or as required.
First make the script executable:
chmod +x backup.sh
Then open the crontab for the user:
crontab -e
and add the required entry, for example:
24 2 * * * /root/files/cron/backup.sh 2>&1 | /usr/bin/logger -t RcloneBackup
This runs the script and takes a backup every day at 2:24 AM.
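Since the entry above is a daily schedule, here are hypothetical weekly and monthly variants for reference (the cron fields are minute, hour, day of month, month, and day of week):

```
# Weekly: every Sunday at 2:24 AM
24 2 * * 0 /root/files/cron/backup.sh 2>&1 | /usr/bin/logger -t RcloneBackup

# Monthly: on the 1st of every month at 2:24 AM
24 2 1 * * /root/files/cron/backup.sh 2>&1 | /usr/bin/logger -t RcloneBackup
```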
To check the backup logs
grep "RcloneBackup" /var/log/syslog
Configure lifecycle policies to expire objects
We need lifecycle policies to expire objects after a specific period of time; this is also called a retention policy.
The following policy, in XML format for use with DigitalOcean Spaces, sets expiration to 14 days:
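A minimal S3-style lifecycle configuration that expires all objects after 14 days looks like this (the rule ID below is a placeholder, not taken from the original post):

```xml
<LifecycleConfiguration>
  <Rule>
    <ID>expire-backups-14d</ID>
    <Prefix></Prefix>
    <Status>Enabled</Status>
    <Expiration>
      <Days>14</Days>
    </Expiration>
  </Rule>
</LifecycleConfiguration>
```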
Lifecycle policy in JSON format:
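An equivalent AWS S3-style JSON lifecycle document, with the same 14-day expiration (the rule ID is again a placeholder), would be roughly:

```json
{
  "Rules": [
    {
      "ID": "expire-backups-14d",
      "Filter": { "Prefix": "" },
      "Status": "Enabled",
      "Expiration": { "Days": 14 }
    }
  ]
}
```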
If the expiration policy does not work, add this code to the script:
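One way to enforce the same 14-day retention from the script itself, as a fallback when bucket lifecycle rules are not applied, is rclone's own age filter; the remote and bucket names below are placeholders:

```shell
# Fallback retention: delete objects older than 14 days on the remote.
# "spaces:my-backup-bucket" is a placeholder; replace with your remote and bucket.
rclone delete --min-age 14d spaces:my-backup-bucket
```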
Hope this helps. If you need any help, do contact me.