Backup script that excludes large files using Duplicity and Amazon S3


I'm trying to write a backup script that will exclude files over a certain size.

My script echoes what looks like the correct command, but when the command is run from within the script it produces an error. However, if the same command is pasted into a shell and run by hand, everything works fine. Why?

Here is the script, based on one easily found with Google:


#!/bin/bash
# Export some ENV variables so you don't have to type anything
export AWS_ACCESS_KEY_ID="accesskey"
export AWS_SECRET_ACCESS_KEY="secretaccesskey"
export PASSPHRASE="password"

SOURCE=/home/
DEST=s3+http://s3bucket

GPG_KEY="7743E14E"

# exclude files over 100MB
exclude ()
{
 find /home/jason -size +100M \
 | while read FILE; do 
  echo -n " --exclude "
  echo -n \'**${FILE##/*/}\' | sed 's/\ /\\ /g' #Replace whitespace with "\ "
 done
}

echo "Using Command"
echo "duplicity --encrypt-key=$GPG_KEY --sign-key=$GPG_KEY `exclude` $SOURCE $DEST"

duplicity --encrypt-key=$GPG_KEY --sign-key=$GPG_KEY `exclude` $SOURCE $DEST

# Reset the ENV variables.
export AWS_ACCESS_KEY_ID=
export AWS_SECRET_ACCESS_KEY=
export PASSPHRASE=

When run, I receive the error:


Command line error: Expected 2 args, got 6
Enter 'duplicity --help' for help screen.

Any help you could offer would be greatly appreciated.

Best Answer

I solved the problem. The root cause is how the shell treats the output of the exclude function: the result of a command substitution is word-split, but the quote characters it contains are never re-parsed as quoting, so duplicity receives the stray quotes and the split-up pieces as extra positional arguments (hence "Expected 2 args, got 6").
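Here is a minimal sketch of that pitfall, separate from the backup script (the opts function and the pattern are invented purely for illustration):

#!/bin/bash
set -f  # disable globbing so the * characters survive the demo unchanged
# The function prints what looks like a properly quoted option...
opts() { echo -n "--exclude '**some file'"; }
# ...but after command substitution the shell only word-splits the output;
# the single quotes are passed through as literal characters.
printf '[%s]\n' `opts`
# Output: three separate arguments instead of one option plus one pattern:
# [--exclude]
# ['**some]
# [file']

Rather than fight the quoting, I switched to --exclude-filelist, which reads the patterns from a file and avoids building an option string altogether. Here is the working script: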


#!/bin/bash
# Export some ENV variables so you don't have to type anything
export AWS_ACCESS_KEY_ID="accesskey"
export AWS_SECRET_ACCESS_KEY="secretaccesskey"
export PASSPHRASE="password"

SOURCE=/home/
DEST=s3+http://s3bucket

GPG_KEY="gpgkey"

# Generate a filelist of excluded files over 100MB
find "$SOURCE" -size +100M > /tmp/filelist

duplicity --exclude-filelist /tmp/filelist --encrypt-key="$GPG_KEY" --sign-key="$GPG_KEY" "$SOURCE" "$DEST"

# Reset the ENV variables.
export AWS_ACCESS_KEY_ID=
export AWS_SECRET_ACCESS_KEY=
export PASSPHRASE=
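
As a small follow-up refinement (a sketch, not part of the original fix): using mktemp instead of a fixed /tmp/filelist avoids clobbering or leaking the list between runs, and a trap removes it on exit.

#!/bin/bash
# Hypothetical variant of the exclude-list step; assumes SOURCE, DEST,
# GPG_KEY and the AWS/PASSPHRASE variables are set as in the script above.
FILELIST=$(mktemp) || exit 1
trap 'rm -f "$FILELIST"' EXIT

find "$SOURCE" -size +100M > "$FILELIST"

duplicity --exclude-filelist "$FILELIST" \
    --encrypt-key="$GPG_KEY" --sign-key="$GPG_KEY" "$SOURCE" "$DEST"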