bacup
An easy-to-use backup tool designed for servers - written in Rust.
The bacup service runs as a daemon and backs up the configured services to the remotes.
The goal of bacup is to make configuration straightforward: a single file where everything is defined in a simple way.
Configuration
Configuration takes 3 steps:
- Configure the remotes. A remote is a cloud provider, an SSH host, or a git server.
- Configure the services. A service is a well-known software (e.g. PostgreSQL) with its own backup tool, or a location on the filesystem.
- Map services (what to back up) to remotes (where to back up). This configures the backups.
When configuring the backups, the field when accepts configuration strings in the format:
- daily $hh:$mm, e.g. daily 15:30
- weekly $day $hh:$mm, e.g. weekly mon 12:23 or weekly monday 12:23. The weekly keyword can be omitted, e.g. mon 12:23.
- monthly $day $hh:$mm, e.g. monthly 1 00:30
- a cron string. If you really have to use it, use crontab guru to create the cron string.
NOTE: The time is ALWAYS in UTC timezone.
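For illustration, here is a minimal Python sketch of how such a when string could be parsed. This is not bacup's actual parser (bacup is written in Rust); the function name, return shape, and cron handling are assumptions:

```python
from datetime import time

DAYS = {"mon", "tue", "wed", "thu", "fri", "sat", "sun"}

def parse_when(when: str):
    """Parse a bacup-style `when` string into (kind, day, time).

    Handles `daily hh:mm`, `weekly day hh:mm` (with `weekly` optional),
    and `monthly n hh:mm`. Cron strings belong to a real cron parser.
    """
    parts = when.split()
    if parts[0] == "daily" and len(parts) == 2:
        hh, mm = map(int, parts[1].split(":"))
        return ("daily", None, time(hh, mm))
    if parts[0] == "weekly" and len(parts) == 3:
        day = parts[1][:3].lower()  # "monday" and "mon" both accepted
        if day not in DAYS:
            raise ValueError(f"unknown day: {parts[1]}")
        hh, mm = map(int, parts[2].split(":"))
        return ("weekly", day, time(hh, mm))
    if parts[0] == "monthly" and len(parts) == 3:
        hh, mm = map(int, parts[2].split(":"))
        return ("monthly", int(parts[1]), time(hh, mm))
    # `weekly` may be omitted: "mon 12:23"
    if len(parts) == 2 and parts[0][:3].lower() in DAYS:
        hh, mm = map(int, parts[1].split(":"))
        return ("weekly", parts[0][:3].lower(), time(hh, mm))
    raise ValueError(f"unrecognized schedule: {when!r}")
```

All times are interpreted as UTC, matching the note above.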
```toml
# remotes definitions
[aws]
    [aws.bucket_name]
    region = "" # "eu-west-3"
    access_key = ""
    secret_key = ""

# Not available yet!
#[gcloud]
#    [gcloud.bucket1]
#    service_account_path = ""

[ssh]
    [ssh.remote_host1]
    host = "" # example.com
    port = "" # 22
    username = "" # myname
    private_key = "" # ~/.ssh/id_rsa

[localhost]
    # Like copy-paste in local. The underlying infrastructure manages
    # the remote (if any) part. Below 2 examples
    [localhost.samba]
    path = "" # local path where samba is mounted
    [localhost.disk2]
    path = "" # local path where the second disk of the machine is mounted

[git]
    [git.github]
    host = "" # github.com
    port = "" # 22
    username = "" # git
    private_key = "" # ~/.ssh/id_rsa
    repository = "" # "galeone/bacup"
    branch = "" # master

# what to backup. Service definition
[postgres]
    [postgres.service1]
    username = ""
    db_name = ""
    host = ""
    port = ""

[folders]
    [folders.service1]
    pattern = ""

[docker]
    [docker.service1]
    container_name = "docker_postgres_1"
    command = "pg_dumpall -c -U postgres" # dump to stdout always

# mapping services to remote
[backup]
    # Compress the DB dump and upload it to aws
    # every day at 01:00 UTC
    [backup.db_aws_compressed]
    what = "postgres.service1"
    where = "aws.bucket_name"
    when = "daily 01:00"
    remote_path = "/service1/database/"
    compression = true
    keep_last = 7

    # Dump the DB and upload it to aws (no compression)
    # every first day of the month
    [backup.db_aws_plain]
    what = "postgres.service1"
    where = "aws.bucket_name"
    when = "monthly 1 00:00"
    remote_path = "/service1/database/"
    compression = false

    # Archive the files of service1 and upload them to
    # ssh.remote_host1, in the remote ~/backups/service1 folder.
    # Every Friday at 05:00
    [backup.files_ssh_archive]
    what = "folders.service1"
    where = "ssh.remote_host1"
    when = "weekly friday 05:00"
    remote_path = "~/backups/service1"
    compression = true

    # Incrementally sync folders.service1 with the remote host
    # using rsync (authenticated through ssh)
    # At 00:05 in August
    [backup.files_ssh_incremental]
    what = "folders.service1"
    where = "ssh.remote_host1"
    when = "5 0 * 8 *"
    remote_path = "~/backups/service1_incremental/"
    compression = false # no compression = incremental sync

    # Dump the DB and copy it to the localhost "remote"
    # where, for example, samba is mounted
    # every day at 01:00 UTC
    [backup.db_local_samba]
    what = "postgres.service1"
    where = "localhost.samba"
    when = "daily 01:00"
    remote_path = "/path/inside/the/samba/location"
    compression = false

    # Push folders.service1 to a git repository every day at 15:30
    [backup.files_git]
    what = "folders.service1"
    where = "git.github"
    when = "daily 15:30"
    remote_path = "/" # the root of the repo
    compression = false
```
When compression = true, the file/folder is compressed using Gzip and the archive is stored (in the desired remote location) with the format:

```
YYYY-MM-DD-hh:mm-filename.gz # or .tar.gz if filename is an archive
```
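A small sketch of how such a name can be produced (illustrative only; the function name is an assumption, not bacup's actual code):

```python
from datetime import datetime, timezone

def archive_name(filename: str, now: datetime) -> str:
    """Prefix a UTC timestamp in the YYYY-MM-DD-hh:mm format described above."""
    return now.strftime("%Y-%m-%d-%H:%M-") + filename + ".gz"

# A fixed timestamp keeps the example reproducible.
name = archive_name("dump.sql", datetime(2021, 5, 1, 1, 0, tzinfo=timezone.utc))
print(name)  # 2021-05-01-01:00-dump.sql.gz
```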
Installation & service setup
```shell
cargo install bacup
```

Then put the config.toml file in $HOME/.bacup/config.toml.

There's a ready-to-use systemd service file:

```shell
sudo cp misc/systemd/bacup@.service /usr/lib/systemd/system/
```

Then the service can be enabled/started in the usual systemd way:

```shell
sudo systemctl start bacup@$USER.service
sudo systemctl enable bacup@$USER.service
```
Remote configuration
Configuring the remotes is straightforward. Every remote has a different way of getting the access credentials; here we share some useful references.
AWS
- Access Key & Secret Key: Understanding and getting your AWS credentials: programmatic access
- Region: the region where your bucket is located.
SSH
You need a valid SSH account on your remote - only authentication via an SSH key without a passphrase is supported.
For incremental backups, rsync is used - you need this tool installed both locally and remotely.
Git
You need a valid account on a Git server, together with a repository. Only SSH is supported.
Localhost
Not properly a remote, but you can use bacup to back up from one path to another (with or without compression). It's even better if the localhost remote is mounted on a network filesystem :)