I have a pretty big postgres DB on Amazon RDS (about 9GB when zipped), and sometimes we need to copy it and do some tests on it on our local machines.
Doing a DB dump (pg_dump) and downloading it is simply too slow, and honestly it has just been getting stuck the last few times we tried.
Is there a simple way to get parts of the DB in a smart way? For example, getting only the changes from the last 10 days so we can merge them into the local DB we have, or maybe getting the DB in chunks, etc.?
I'm sure I'm not the first one with that need, but couldn't find a decent method or tutorial to explain the best ways to do it.
Thanks!
Best Answer
A 9GB compressed dump isn't really that large. You just need to do it right:

- Use the directory dump format (`--format=directory` or `-Fd`); it's automatically compressed.
- Use a parallel dump (`--jobs=16` or `-j16`) and a parallel restore.
- Use `sslmode=disable` in the connection string, or `env PGSSLMODE=disable pg_dump …`, to disable SSL, because some versions of AWS RDS have a 64GB limit on SSL data over a single connection.
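Putting the points above together, a dump-and-restore might look like this. This is a sketch: the host, user, database names, and output path are placeholders you'd replace with your own, and the job count should roughly match your CPU cores.

```shell
# Parallel, compressed dump in directory format, with SSL disabled
# (hypothetical RDS endpoint and credentials; substitute your own).
PGSSLMODE=disable pg_dump \
    --host=mydb.abc123.us-east-1.rds.amazonaws.com \
    --username=myuser \
    --format=directory \
    --jobs=16 \
    --file=/tmp/mydb_dump \
    mydb

# Restore locally, also in parallel, into an existing empty database.
pg_restore \
    --jobs=16 \
    --dbname=mydb_local \
    /tmp/mydb_dump
```

Note that `--jobs` only works with the directory format, which is part of why `-Fd` is the right choice here: it lets both the dump and the restore run one worker per table.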