The Ultimate PostgreSQL Database Backup Guide


Laurent Lemaire

Co-founder, SimpleBackups

Last update on November 29, 2023
Originally posted on June 9, 2021

This article is part of “The Ultimate Backup Script” series we are creating to provide you with database backup scripts that not only create database backups, but also upload the dumps to Amazon S3 and automate the whole process to run daily.

Why PostgreSQL Backup is Crucial

Why is a backup necessary for your database? The answer is simple: a backup creates a copy of your physical, logical, and operational data, which you can store somewhere safe such as Amazon S3. This copy comes into play if the running database becomes corrupted.
A PostgreSQL backup can include data files, configuration files, and archived write-ahead log (WAL) segments.

PostgreSQL backups are vital to protect against data loss from various risks like data corruption, device failure, human error, and software issues.

How to Automate PostgreSQL Database Backups

We'll develop a straightforward script to back up your PostgreSQL database and store it in your Amazon S3 bucket. Additionally, we will automate this process using Cron.

Key Points to Understand Before We Begin:

Why Choose Amazon S3?
In this tutorial, we've selected Amazon S3 for its widespread use and reliability.
If you prefer a different cloud storage provider, feel free to make that choice. The steps will remain largely similar as long as the chosen provider is compatible with S3.

What is Cron?
Cron is a time-based job scheduler available on Unix-like operating systems. Developers use Cron to schedule tasks, such as running commands or shell scripts, at specific intervals. These intervals can be daily, weekly, or any other frequency your setup requires.
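
A crontab entry has five time fields (minute, hour, day of month, month, day of week) followed by the command to run. As a quick, hypothetical illustration (the script path is just a placeholder):

# m  h  dom  mon  dow  command
30 2 * * 1 /path/to/task.sh   # run task.sh every Monday at 02:30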

Understanding Chmod
Chmod, short for ‘change mode’, is the command used to set file access permissions. Through the “chmod” command, an administrator can control who may read, write, or execute a file system object.
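
For example, the first command below grants the file's owner execute permission, and the second sets read/write for the owner and read-only for everyone else (the file names are placeholders):

chmod u+x my_script.sh
chmod 644 notes.txt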

🧑‍💻 Let's get down to coding now. You can automate creating a PostgreSQL database backup and storing it on Amazon S3 by following these steps:

  1. Crafting a script that creates a dated backup directory and dumps the PostgreSQL database into it.
  2. Uploading/synchronizing the backups with Amazon S3.
  3. Setting up Cron to execute this backup command daily.

1. Create a backup script that dumps the PostgreSQL database

Navigate to your home directory and create a directory for your scripts:

cd ~
mkdir scripts
cd scripts
nano db_backup.sh

Create the PostgreSQL database backup script:

#!/bin/bash

# Name the backup directory after today's date, e.g. ~/db_backups/29-11-23
DIR=$(date +%d-%m-%y)
DEST=~/db_backups/$DIR
mkdir -p "$DEST"

# Dump the database into the dated directory
PGPASSWORD='postgres_password' pg_dump --inserts --column-inserts --username=postgres_user --host=postgres_host --port=postgres_port postgres_database_name > "$DEST/dbbackup.sql"

Be sure to replace the following variables with your own values: postgres_password, postgres_user, postgres_host, postgres_port, and postgres_database_name.
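
Plain-text dumps with --inserts are easy to read but can be slow and large for big databases. If that becomes a problem, one optional variation (not required for the rest of this guide) is pg_dump's compressed custom format, which is restored with pg_restore rather than psql:

# Same placeholders as above; -Fc produces a compressed, custom-format archive
PGPASSWORD='postgres_password' pg_dump -Fc --username=postgres_user --host=postgres_host --port=postgres_port postgres_database_name > "$DEST/dbbackup.dump"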

Now chmod the backup script to make it executable:

chmod +x ~/scripts/db_backup.sh

You now have a working shell script that backs up your PostgreSQL database.
The next step is to store the backups on Amazon S3 and automate the whole process.
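
Before scheduling anything, it's worth running the backup script once by hand to confirm the dump ends up in the dated directory (the directory name will reflect today's date):

~/scripts/db_backup.sh
ls ~/db_backups/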

2. Configure the AWS CLI

We'll use the AWS CLI to sync the backup files with Amazon S3.

Before installing the AWS CLI you need pip, Python's package manager. Type the following commands (on newer Debian/Ubuntu releases the package is named python3-pip):

apt-get update
apt-get -y install python-pip

If pip is not available from your package manager, you can download the official installer and run it with Python instead:

curl "https://bootstrap.pypa.io/get-pip.py" -o "get-pip.py"
python get-pip.py

Install the AWS CLI

Type the following command:

pip install awscli
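
Once the installation finishes, you can confirm the CLI is available on your PATH:

aws --version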

Set up AWS key & Secret

Set up the AWS configuration and credentials files:

cd ~
mkdir .aws
nano ~/.aws/credentials

Paste in your access key ID and secret access key as shown below (the values here are AWS's documentation placeholders; use your own keys):

[default]
aws_access_key_id=AKIAIOSFODNN7EXAMPLE
aws_secret_access_key=wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
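
Alternatively, if you prefer not to edit the file by hand, the CLI can write these values for you: running the command below prompts for the access key ID, secret access key, default region, and output format, and stores them under ~/.aws/ for you.

aws configure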

3. Sync the PostgreSQL backups to Amazon S3

Now, let's build the script that will allow you to sync your PostgreSQL database backups with Amazon S3.
This script will be used in conjunction with the script you created in the previous section.

Create a new file, ~/scripts/db_sync.sh, then copy and paste the script below into it.

#!/bin/bash

# Set your AWS CLI path (update with your actual path if needed)
AWS_CLI_PATH="/usr/local/bin/aws"

# Set your Amazon S3 bucket name
S3_BUCKET="my-bucket-name"

# Specify the source directory where your backups are stored
# (use $HOME rather than ~, which is not expanded inside quotes)
SOURCE_DIR="$HOME/db_backups"

# Synchronize the backups with Amazon S3
"$AWS_CLI_PATH" s3 sync "$SOURCE_DIR" "s3://$S3_BUCKET"

Save the file and exit the text editor.

Make the script executable:

chmod +x ~/scripts/db_sync.sh

Now, you have a shell script that can be executed to sync your Postgres database backups with your specified Amazon S3 bucket.
Don't forget to replace "my-bucket-name" with your actual S3 bucket name and update the AWS_CLI_PATH if necessary to point to your AWS CLI installation location.
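
As a quick sanity check, you can run the sync script once manually and then list the bucket to confirm the backup arrived (this assumes your credentials allow listing the bucket):

~/scripts/db_sync.sh
aws s3 ls s3://my-bucket-name/ --recursive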

4. Schedule your PostgreSQL database backup with Cron

The last step is to automate the PostgreSQL database backup process with Cron. Start by opening your crontab for editing:

crontab -e

Paste the commands below at the bottom of the file to automate the process:

0 0 * * * ~/scripts/db_backup.sh # take a backup every midnight
0 2 * * * ~/scripts/db_sync.sh # upload the backup at 2am

This way, the backup script runs every night at midnight and the sync script uploads the result to Amazon S3 at 2 a.m.
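
You can confirm the entries were saved by running crontab -l. If you also want a record of each run, an optional variation is to append each script's output to a log file (the log path here is just an example):

0 0 * * * ~/scripts/db_backup.sh >> ~/db_backup_cron.log 2>&1
0 2 * * * ~/scripts/db_sync.sh >> ~/db_backup_cron.log 2>&1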

Conclusion

By using these scripts, you achieve three goals:

  1. Creating the database backup via a shell script
  2. Uploading the dump to Amazon S3
  3. Automating this process using Cron.

Want to trust that your PostgreSQL backups are running well, without the hassle?

Try SimpleBackups Now →


