Mastodon PG_Dump to S3 bash script

Here’s a complete bash script that backs up one or more PostgreSQL databases with pg_dump, lets you specify a local backup path, and uploads the dumps to an S3-compatible service using the AWS CLI (or a compatible tool such as s3cmd).

You will need the following tools installed:

  • pg_dump (PostgreSQL client tool)
  • aws-cli (or s3cmd for S3 interaction)
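
On Ubuntu, for example, both can usually be installed from the standard repositories (package names may vary on other distributions):

sudo apt update
sudo apt install -y postgresql-client awscli
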
#!/bin/bash

# Check for required tools
if ! command -v pg_dump &> /dev/null; then
    echo "pg_dump could not be found. Please install the PostgreSQL client tools."
    exit 1
fi

if ! command -v aws &> /dev/null; then
    echo "AWS CLI could not be found. Please install the AWS CLI or configure s3cmd."
    exit 1
fi

# Require at least one database name as an argument
if [ "$#" -eq 0 ]; then
    echo "Usage: $0 <database> [database ...]"
    exit 1
fi

# Variables - replace or use environment variables as needed
DATABASES=("$@")                                                  # List of databases to back up, passed as arguments
BACKUP_PATH=${BACKUP_PATH:-"/backups"}                            # Default local backup directory
S3_BUCKET=${S3_BUCKET:-"my-s3-bucket"}                            # S3 bucket name
S3_ENDPOINT=${S3_ENDPOINT:-"https://s3.us-west-1.amazonaws.com"}  # S3 endpoint (for S3-compatible services)
DB_USER=${DB_USER:-"your_db_user"}                                # Default DB user
DB_PASSWORD=${DB_PASSWORD:-"your_db_password"}                    # Database password
DB_HOST=${DB_HOST:-"localhost"}                                   # Default DB host
DB_PORT=${DB_PORT:-"5432"}                                        # Default DB port
S3_REGION=${S3_REGION:-"us-west-1"}                               # Default S3 region (should match the endpoint)
BACKUP_RETENTION_DAYS=${BACKUP_RETENTION_DAYS:-7}                 # Retain local backups for X days

# Ensure the backup directory exists
mkdir -p "$BACKUP_PATH"

# Export the database password so pg_dump can use it
export PGPASSWORD="$DB_PASSWORD"

# Loop through each database
for DB_NAME in "${DATABASES[@]}"; do
    TIMESTAMP=$(date +%Y-%m-%d_%H-%M-%S)
    BACKUP_FILE="$BACKUP_PATH/$DB_NAME-$TIMESTAMP.dump"

    echo "Backing up PostgreSQL database '$DB_NAME'..."

    # Perform the backup using pg_dump in custom format (-Fc)
    if pg_dump -U "$DB_USER" -h "$DB_HOST" -p "$DB_PORT" -Fc "$DB_NAME" -f "$BACKUP_FILE"; then
        echo "Backup successfully created: $BACKUP_FILE"
    else
        echo "Error creating backup for database '$DB_NAME'. Skipping to next."
        continue
    fi

    # Upload to S3-compatible storage
    echo "Uploading backup to S3 bucket: $S3_BUCKET..."
    if aws s3 cp "$BACKUP_FILE" "s3://$S3_BUCKET/$DB_NAME/" --endpoint-url "$S3_ENDPOINT" --region "$S3_REGION"; then
        echo "Backup for '$DB_NAME' successfully uploaded to S3 bucket: $S3_BUCKET"
    else
        echo "Error uploading backup for '$DB_NAME' to S3. Skipping to next."
        continue
    fi

    # Clean up local backups older than the retention period
    echo "Cleaning up old local backups for '$DB_NAME'..."
    find "$BACKUP_PATH" -type f -name "$DB_NAME-*.dump" -mtime "+$BACKUP_RETENTION_DAYS" -exec rm {} \;
done

# Remove the password from the environment when done
unset PGPASSWORD

echo "All backups and uploads completed."

How It Works:

  1. Database List: Pass the databases to back up as arguments when running the script; it loops through each one and performs the backup (see the usage example after this list).
  2. Backup Directory: BACKUP_PATH is the local directory where dumps are written before uploading (default: /backups).
  3. Backup Retention: Local backups older than the retention period (default: 7 days) are deleted.
  4. AWS CLI for S3: The script uses aws s3 cp to upload each backup to an S3-compatible service.
  5. Automatic Cleanup: The cleanup step only runs after a successful upload, so a failed upload never deletes the local copy.
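
For example, a one-off run as the mastodon user might look like this (the paths, bucket name, and password below are placeholders; mastodon_production is the default Mastodon database name):

chmod +x /home/mastodon/pg_multi_backup_s3.sh

export DB_PASSWORD='your_db_password'
BACKUP_PATH=/home/mastodon/backups \
S3_BUCKET=mastodon-db-backups \
S3_ENDPOINT=https://s3.us-west-1.amazonaws.com \
DB_USER=mastodon \
/home/mastodon/pg_multi_backup_s3.sh mastodon_production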

To ensure the script runs as the mastodon user, set up the cron job under the mastodon user account. Here's how to do that:

1. Switch to the mastodon User

First, switch to the mastodon user by running:

sudo su - mastodon

This switches to the mastodon user account.

2. Edit the Crontab for the mastodon User

Now, edit the crontab for the mastodon user:

crontab -e
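
If you would rather not switch users first, the same crontab can be edited from an administrative account instead (run as root or via sudo):

sudo crontab -u mastodon -e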

3. Add the Cron Job Entry

Add the following line to run the backup script at midnight every day:

0 0 * * * /path/to/pg_multi_backup_s3.sh db1 db2 db3 >> /path/to/logs/pg_backup.log 2>&1

  • Replace /path/to/pg_multi_backup_s3.sh with the actual path to your script.
  • Replace db1 db2 db3 with the actual database names.
  • Replace /path/to/logs/pg_backup.log with the path to the log file (create the directory first if it doesn't exist; see the example below).
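
For example, with the script and logs kept under the mastodon user's home directory (these paths and the database name are only illustrations; adjust them to your layout):

# Run once in a shell to create the log directory
mkdir -p /home/mastodon/logs

# Then add this crontab entry for a nightly backup at midnight
0 0 * * * /home/mastodon/pg_multi_backup_s3.sh mastodon_production >> /home/mastodon/logs/pg_backup.log 2>&1

Keep in mind that cron starts jobs with a minimal environment, so any BACKUP_PATH, S3_*, or DB_* variables the script should use must either be set in the crontab itself or hard-coded in the script.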

4. Save and Exit

Save and exit the crontab editor.

5. Verify the Cron Job

To check that the cron job has been added for the mastodon user, run:

crontab -l

You should see the scheduled job listed. This ensures that the backup script will be run as the mastodon user every day at midnight.

To securely add AWS S3 credentials and configure the endpoint for the backup script, you have two options: the AWS credentials file (recommended) or environment variables. Here's how to set up both.

Option 1: Use AWS Credentials File

  • Create or Edit the AWS Credentials File: Create a file at ~/.aws/credentials if it doesn't exist, and add your AWS credentials:

[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY

  • Create or Edit the AWS Config File: Create a file at ~/.aws/config if it doesn't exist, and add your desired region and endpoint (if using a custom S3-compatible service):

[default]
region = us-west-1
# Change the region and endpoint if you use a different AWS region or an S3-compatible service
endpoint_url = https://s3.us-west-1.amazonaws.com
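
Because these files contain secrets, restrict them so only the mastodon user can read them:

chmod 600 ~/.aws/credentials ~/.aws/config

Option 2: Use Environment Variables

Alternatively, provide the credentials through the environment the script runs in. This is a minimal sketch using the standard AWS CLI environment variables; for cron, you can set the same variables at the top of the crontab (as plain NAME=value lines) or keep the exports in a file that the script sources, readable only by the mastodon user:

export AWS_ACCESS_KEY_ID=YOUR_ACCESS_KEY_ID
export AWS_SECRET_ACCESS_KEY=YOUR_SECRET_ACCESS_KEY
export AWS_DEFAULT_REGION=us-west-1

The aws CLI picks these up automatically, so no changes to the backup script itself are needed.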
