I am looking for a decent way of keeping a secondary Oracle database up to date without exporting and importing the database each time. There are three users on the instance that I would essentially like to 'log ship', if that's what it is called in Oracle!
Can anyone suggest anything?
The database is well under a GB in total, and we are running 10g Express (although I have thought about using 10g Standard, as we have a spare license).
Cheers
Chris
Best Answer
Here's what I do:
My primary database has transaction log shipping turned on. The logs are written to /db/archive.
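For anyone setting this up from scratch, "turning on" log shipping essentially means running the primary in ARCHIVELOG mode with an archive destination. A minimal sketch, assuming the /db/archive destination above (run as SYSDBA; verify against your own init parameters before using):

```sql
-- Assumed setup sketch: enable ARCHIVELOG mode on the primary,
-- writing archived redo logs to /db/archive (path from the answer above).
SHUTDOWN IMMEDIATE;
STARTUP MOUNT;
ALTER SYSTEM SET log_archive_dest_1 = 'LOCATION=/db/archive';
ALTER DATABASE ARCHIVELOG;
ALTER DATABASE OPEN;
```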
Every hour, a cron job runs as the oracle user. This cron job does the following things:
moves the contents of /db/archive/ to /db_archive/YYYYMMDD/ (using the following script, which I didn't write, so don't hold me accountable for its ugliness)
rsyncs the contents of /db_archive/YYYYMMDD/ to /db_archive/YYYYMMDD on the secondary server.
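A minimal sketch of what that hourly cron job might look like. The paths come from the answer above; the standby host name and the use of find/rsync flags are my assumptions, not the author's actual script:

```shell
#!/bin/sh
# Hypothetical sketch of the hourly cron job described above.
# /db/archive and /db_archive come from the answer; STANDBY is a
# made-up host name -- substitute your secondary server.
set -u

SRC=/db/archive
TODAY=$(date +%Y%m%d)
DEST=/db_archive/$TODAY
STANDBY=standby-host    # hypothetical standby server name

# 1. Move newly archived redo logs into today's dated directory.
if [ -d "$SRC" ]; then
    mkdir -p "$DEST"
    find "$SRC" -type f -exec mv {} "$DEST"/ \;
fi

# 2. Ship today's directory to the same path on the standby.
if [ -d "$DEST" ]; then
    rsync -a "$DEST"/ "$STANDBY:$DEST"/
fi
```

Running it hourly from the oracle user's crontab would look something like `0 * * * * /usr/local/bin/ship_logs.sh`.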
That takes care of getting the files there.
On the secondary server, /db/archive is a symlink to /db_archive/YYYYMMDD, and "recover standby database" automatically reads from /db/archive/. Whatever script you use to recover your database should therefore be able to handle the errors generated at the end of a day, so it can switch the symlink over to the new day's directory.
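The standby-side step could be sketched roughly like this. This is a guess at the author's setup, not their actual recovery script: the directory layout comes from the answer, but the symlink-switch logic and the exact SQL*Plus invocation are assumptions:

```shell
#!/bin/sh
# Hypothetical sketch of the standby-side recovery step.
# Paths come from the answer above; everything else is assumed.
set -u

TODAY=$(date +%Y%m%d)
LINK=/db/archive

# Point /db/archive at today's dated directory before recovering,
# so "recover standby database" finds the newest shipped logs.
if [ -d "/db_archive/$TODAY" ]; then
    ln -sfn "/db_archive/$TODAY" "$LINK"
fi

# Apply the shipped logs. AUTOMATIC keeps applying until none remain,
# so a "file not found" error on the last log is expected and harmless.
if command -v sqlplus >/dev/null 2>&1; then
    sqlplus -s / as sysdba <<'EOF'
RECOVER AUTOMATIC STANDBY DATABASE;
EXIT;
EOF
fi
```

The guard around sqlplus just makes the sketch safe to dry-run on a machine without Oracle installed; in practice you would let any unexpected recovery error surface.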
Does this help, or do you need more info?