How to set up Directus with Litestream to continuously back up your data
A how-to guide on setting up Directus with Litestream to continuously back up your data
Litestream is a new tool that continuously backs up your SQLite database by replicating it to a remote storage provider. This makes SQLite even more attractive: it is still incredibly simple to set up and maintain, but now also has a quick and easy way to back up your data. Litestream can seem a little fiddly to set up at first, but once you get the hang of it, it is actually quite simple.
The basic premise of Litestream is to have a local database and a remote replica. The local database is the one you interact with, and the remote copy is the backup. Since SQLite is just a file, that replica can live on simple file-based storage, so S3 or even SFTP are valid options.
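To get a feel for how this works before involving Docker, here is a minimal sketch of running Litestream directly from the command line; the database path and bucket name are placeholders, not values from this guide.
# Continuously replicate a local SQLite file to an S3-compatible bucket
# (placeholder paths/bucket; S3 credentials are expected in the environment,
#  e.g. AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY)
litestream replicate /path/to/database.db s3://my-backup-bucket/database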
I personally recommend Cloudflare R2 because of its price and ease of use (looking at you, AWS).
What is Directus?
Directus is an open-source headless CMS. It is a great tool for managing your content and data, it offers reliable APIs, and it has a vibrant community around it. Luckily, it even supports SQLite out of the box, which makes it a great fit for Litestream.
Setting up Litestream
I will assume that you are running Directus in a Docker container (preferably via Docker Compose) and that you have an S3 bucket set up. If you are not using Docker, you can still use Litestream; just leave out the Directus part of the Docker Compose file.
Setting up the Litestream config
The first step is to set up the Litestream config. This is a YAML file that tells Litestream where to find the database and where to replicate it to. You can find the full documentation here.
# The local database
dbs:
  - path: /directus/database.db
    # The remote database
    replicas:
      - type: s3
        endpoint: BUCKET_ENDPOINT_HERE # e.g. s3.eu-west-1.amazonaws.com or https://${LITESTREAM_PERSONAL_KEY}.r2.cloudflarestorage.com/
        bucket: BUCKET_NAME_HERE
        region: BUCKET_REGION_HERE # e.g. eu-west-1, optional if using Cloudflare R2
        # AWS/Cloudflare R2/other S3 compatible storage credentials
        access-key-id: ${LITESTREAM_ACCESS_KEY}
        secret-access-key: ${LITESTREAM_SECRET_ACCESS_KEY}
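If you have the Litestream binary installed locally, you can sanity-check this config before wiring it into Docker. The snippet below is just a sketch: the exported values are placeholders for your own credentials, and ./litestream.yml is wherever you saved the file above.
# Export the credentials referenced in the config (placeholder values)
export LITESTREAM_ACCESS_KEY=xxxx-xxxx
export LITESTREAM_SECRET_ACCESS_KEY=xxxx-xxxx
# List the databases and replicas Litestream resolves from the config
litestream databases -config ./litestream.yml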
Setting up the Litestream Docker container
The second step on our adventure is to set up the Litestream Docker container. This container will run Litestream and replicate the database to the remote storage provider. I have included a healthcheck in the Docker Compose file to make sure that the replication is working, but this is optional. You can find additional documentation here.
version: '3'
services:
  litestream:
    image: litestream/litestream
    volumes:
      - '/local/path/to/database.db:/directus/database.db'
      - '/local/path/to/litestream.yml:/etc/litestream.yml'
    environment:
      - LITESTREAM_ACCESS_KEY=xxxx-xxxx
      - LITESTREAM_SECRET_ACCESS_KEY=xxxx-xxxx
    # We want to healthcheck the replication to make sure it is working
    # Adjust the timeout to your needs (if you have a large database, you might need to increase it!)
    healthcheck:
      test: replicate
      timeout: 15s
      retries: 3
    # The Litestream config file is placed at the default location, therefore we don't need to specify it
    command:
      - replicate
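The whole point of the replica is being able to get the database back. Here is a rough sketch of a one-off restore using the same image; the host paths and credentials are placeholders, and it assumes the config above is mounted at the default location.
# One-off restore of the replicated database into a local directory
# (host paths and credentials are placeholders for your own values)
docker run --rm \
  -v /local/path/to/litestream.yml:/etc/litestream.yml \
  -v /local/path/to/data:/directus \
  -e LITESTREAM_ACCESS_KEY=xxxx-xxxx \
  -e LITESTREAM_SECRET_ACCESS_KEY=xxxx-xxxx \
  litestream/litestream restore -o /directus/database.db /directus/database.db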
Setting it up together with Directus in Docker Compose
Personally, I love the simplicity of having my whole stack in a single Docker Compose file. It makes it easy to start, stop, and manage everything at once.
This is how I would set up Directus and Litestream together in a single Docker Compose file. Simply fill in the environment variables and you are good to go. Run docker-compose up -d to get your containers up and running.
You can read more about self-hosting Directus over here.
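A quick note on the KEY and SECRET variables below: they should be long random values. One common way to generate them (not the only one) is with openssl:
# Generate random base64 values for the KEY and SECRET placeholders
openssl rand -base64 32   # use the output as KEY
openssl rand -base64 32   # run it again and use the output as SECRET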
version: '3'
services:
  directus:
    # It is recommended to use a specific version instead of latest, see https://docs.directus.io/self-hosted/docker-guide.html#installing-specific-versions
    image: 'directus/directus:latest'
    volumes:
      - '/local/path/to/database.db:/directus/database.db'
      # I recommend using S3 for the uploads as well, but you can also use a local volume
      # See https://docs.directus.io/self-hosted/config-options.html#file-storage
      - '/local/path/to/uploads:/directus/uploads'
      - '/local/path/to/extensions:/directus/extensions'
    environment:
      - KEY=BASE64_KEY
      - SECRET=BASE64_SECRET
      - ADMIN_EMAIL=admin@example.com
      - ADMIN_PASSWORD=xxxx-xxxx
      - DB_CLIENT=sqlite3
      - DB_FILENAME=/directus/database.db
      - WEBSOCKETS_ENABLED=true
  litestream:
    image: litestream/litestream
    volumes:
      - '/local/path/to/database.db:/directus/database.db'
      - '/local/path/to/litestream.yml:/etc/litestream.yml'
    environment:
      - LITESTREAM_ACCESS_KEY=xxxx-xxxx
      - LITESTREAM_SECRET_ACCESS_KEY=xxxx-xxxx
    # We want to healthcheck the replication to make sure it is working
    # Adjust the timeout to your needs (if you have a large database, you might need to increase it!)
    healthcheck:
      test: replicate
      timeout: 15s
      retries: 3
    # The Litestream config file is placed at the default location, therefore we don't need to specify it
    command:
      - replicate
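Once everything is up, it is worth checking that replication is actually happening before you rely on it. Two quick checks, assuming the service names from the file above:
# Follow the Litestream logs to see snapshots and WAL segments being shipped
docker-compose logs -f litestream
# Ask Litestream which backup generations exist for the database
docker-compose exec litestream litestream generations /directus/database.db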
How to manage all these Docker containers?
I personally use Coolify to manage all my Docker containers. It is very easy to use and, while still in beta, already pretty stable. As an alternative, CapRover is another fabulous tool that makes everything Docker-related easier.
Conclusion: Is it worth it?
I personally think that Litestream is a great tool. It is easy to set up and very reliable. It is also very cheap to run, especially if you use Cloudflare R2 or even SFTP as your storage provider. I would definitely recommend it if you are using SQLite and want a reliable backup solution.