PostgreSQL – Backing up a big PostgreSQL database continuously

Tags: backup, postgresql

I have a 20 TB database in PostgreSQL 9.2 and I have no idea how to efficiently create backups. It's a database that grows a little bit every day, and on the 1st of each month it grows by about 200 GB.

I was thinking of doing a pg_dump, but it takes too long to do every month. I then thought about doing it once and afterwards backing up just the new or changed tables with COPY, but I can't stop using the database in order to create that first dump, so I won't know exactly which data the backup contains.

How can I do an incremental backup without stopping use of the database?

Best Answer

Backups are extensively covered in the PostgreSQL manual.

To achieve a continuous backup, archive the write-ahead log. Suggested settings are:

archive_mode = on
wal_level = hot_standby
archive_command = '/usr/bin/rsync --archive --ignore-existing "%p" "/backup-dest/%f"'

# Ensures that a log file is written at least once every 30 minutes even if little
# activity has occurred
archive_timeout = 30min
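The `--ignore-existing` flag above silently skips segments that are already in the archive; the examples in the PostgreSQL manual instead refuse to overwrite an existing file and fail, so that a duplicate segment name is noticed rather than hidden. A minimal sketch of that stricter behaviour as a shell function (the function name, the `ARCHIVE_DIR` variable, and all paths are illustrative assumptions, not part of the original answer):

```shell
# "Copy but never overwrite" archiving logic. In postgresql.conf you would
# point archive_command at a script containing this logic, for example:
#   archive_command = '/usr/local/bin/archive_wal.sh %p %f'
archive_wal() {
    src="$1"                                  # %p: path of the WAL segment
    dest="${ARCHIVE_DIR:-/backup-dest}/$2"    # %f: file name of the segment
    if [ -f "$dest" ]; then
        # A non-zero exit makes PostgreSQL retry and keep the segment,
        # instead of silently losing a file needed for recovery.
        echo "refusing to overwrite $dest" >&2
        return 1
    fi
    cp "$src" "$dest"
}
```

The important property is that any failure exits non-zero: PostgreSQL will then retain the WAL segment and retry, so nothing needed for recovery is lost.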

Instead of doing a pg_dump for your baseline, you can do a pg_basebackup, which does not require you to stop or freeze the database. However, if you do not already have archive_mode on, you'll need to restart the database server to change that setting.
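Assuming a superuser (or a role with the REPLICATION privilege) and a matching `replication` entry in pg_hba.conf, the baseline could be taken along these lines; the host, user, and destination directory here are placeholders to adapt:

```shell
# Plain-format base backup taken from the running server (flags exist in 9.2):
#   -D         destination directory (must be empty)
#   -X stream  also stream the WAL generated during the backup,
#              so the baseline is consistent on its own
#   -P         show progress, useful for a copy this large
pg_basebackup -h localhost -U postgres -D /backup-dest/base -X stream -P
```

From then on, the archived WAL from archive_command gives you continuous, incremental coverage on top of this baseline.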