Easy Oracle Log-Shipping


I am looking for a decent way of keeping a secondary Oracle database up to date without exporting and importing the database each time. There are three users on the instance that I would essentially like to 'log ship', if that's what it's called on Oracle!

Can anyone suggest anything?

The database is well under a GB in total, and we are running 10g Express (although I have thought about using 10g Standard, as we have a spare license).

Cheers

Chris

Best Answer

Here's what I do:

My primary database has archive logging (ARCHIVELOG mode) turned on; the archived redo logs are written to /db/archive.
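
For context, putting the primary into ARCHIVELOG mode looks roughly like this (a sketch, run as SYSDBA; the destination matches the path above, but treat the exact parameters as assumptions about this setup):

 sqlplus -s "/ as sysdba" <<'EOF'
 -- Write archived redo logs where the shipping job expects them
 alter system set log_archive_dest_1='LOCATION=/db/archive' scope=spfile;
 shutdown immediate
 startup mount
 alter database archivelog;
 alter database open;
 EOF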

Every hour, a cron job runs as the oracle user (a sample crontab entry is shown after the steps). This job does the following:

moves the contents of /db/archive/ to /db_archive/YYYYMMDD/, using the following script (which I didn't write, so don't hold me accountable for its ugliness):

 #!/bin/bash
 # args: <src> <dest> <date>
 # Moves all but the two newest files from <src> to <dest>/<date>/,
 # leaving the newest two behind in case they are still being written.

 src=$1
 dest=$2
 datechunk=$3
 echo "Processing $datechunk"

 mkdir -p "$dest/$datechunk"

 check=$(ls -A "$src" | wc -l)

 # Nothing to do unless there are more than two files
 if [ "$check" -le 2 ]; then
     exit 0
 fi

 check2=$((check - 2))
 echo "Processing $check2 files"

 # List the files in version order, skipping the two newest
 ls -vA "$src" | head -n "$check2" > "$dest/dirlist.$datechunk"

 while read -r line; do
     cp "$src/$line" "$dest/$datechunk"
     # Only delete the original once a non-empty copy exists
     if [ -s "$dest/$datechunk/$line" ]; then
         rm "$src/$line"
     fi
 done < "$dest/dirlist.$datechunk"

rsyncs the contents of /db_archive/YYYYMMDD/ to /db_archive/YYYYMMDD on the secondary server.
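
The rsync step might look something like the following (a sketch; the standby hostname, user, and flags are assumptions):

 # Push today's directory to the same path on the standby;
 # "standby" and the oracle user are placeholders
 datechunk=$(date +%Y%m%d)
 rsync -avz "/db_archive/$datechunk/" "oracle@standby:/db_archive/$datechunk/"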

That takes care of getting the files there.
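
For completeness, the hourly cron entry might look something like this (ship_logs.sh is a hypothetical wrapper that runs the move script above and then the rsync; the log path is also an assumption):

 # Hourly: move completed archive logs aside, then push them to the standby
 0 * * * * /home/oracle/bin/ship_logs.sh >> /home/oracle/ship_logs.log 2>&1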

On the secondary server, /db/archive is a symlink to /db_archive/YYYYMMDD. "recover standby database" automatically reads from /db/archive/, so whatever script you use to recover your database should handle the errors generated at the end of a day and switch the symlink to the new day's directory.
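
A minimal sketch of that standby-side job, assuming the standby instance is mounted and /db/archive is where recovery looks for logs (the symlink switch and recovery call are illustrative):

 #!/bin/bash
 # Repoint /db/archive at today's directory, then apply the shipped logs
 datechunk=$(date +%Y%m%d)
 ln -sfn "/db_archive/$datechunk" /db/archive

 # AUTOMATIC applies the archived logs without prompting for each file
 sqlplus -s "/ as sysdba" <<'EOF'
 recover automatic standby database;
 exit
 EOF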

Does this help, or do you need more info?
