I've got a remote FTP server where I store some backups via cron job.
The problem is that I only have a small amount of space, so I'm doing incremental backups. I want to keep the backups for about 14 days.
I don't have any access to the server other than FTP, and I don't know how to delete files older than x days. Every file has a date in its name:
yxzNamezxy-date-y-m-d.tar.bz2 (e.g. datev-20100111.tar.bz2)
Hope to get some help here.
Cheers, and thanks in advance for any answers,
Dennis
/Edit
I'm trying to use the curl approach mentioned in one of the answers, with this bit of code:
curl "ftp://$FTP_SERVER" --user "$FTP_USER:$FTP_PASS" --list-only > files.tmp

# read the listing into a bash array, one filename per element
declare -a aFiles
let iCount=0

exec < files.tmp
while read sLine
do
    aFiles[$iCount]=$sLine
    ((iCount++))
done

echo -e "\n Files: \n\n\n"
echo "${aFiles[@]}"

echo -e "\nfor ...\n\n\n"
for sFile in "${aFiles[@]}"
do
    echo -e "\nFile:" "$sFile"
done

Originally the second loop never printed more than one filename: I had written "for sFile in $aFiles", and an unsubscripted array name expands to only its first element. Iterating over "${aFiles[@]}" as above fixes that.
Cheers,
Dennis
Best Answer
You could send a series of commands to lftp and wrap them in a shell script which calculates the filenames of the too-old files (see the script below). Make sure nobody except root can read this script, or keep the credentials somewhere else. TMPDIR is the directory where the backups reside locally. Of course you need to edit the obvious parts.
HTH,
PEra
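A minimal sketch of such a script, assuming GNU date, filenames of the form datev-YYYYMMDD.tar.bz2, and credentials passed in via FTP_SERVER, FTP_USER and FTP_PASS environment variables (all of these names are assumptions for illustration, not from the original answer):

```shell
#!/bin/bash
# Sketch only: delete FTP backups older than KEEP_DAYS days.
# Assumes GNU date and filenames like datev-20100111.tar.bz2.
KEEP_DAYS=14

# Cutoff date in YYYYMMDD form (BSD date would need: date -v-14d +%Y%m%d).
cutoff=$(date -d "$KEEP_DAYS days ago" +%Y%m%d)

# Filter stdin, keeping only names whose embedded date is older than the
# cutoff. A plain string comparison works because YYYYMMDD sorts like a date.
old_files() {
    local name d
    while read -r name; do
        d=${name#datev-}            # strip the assumed prefix
        d=${d%.tar.bz2}             # strip the suffix
        [[ $d =~ ^[0-9]{8}$ ]] || continue
        [[ $d < $cutoff ]] && printf '%s\n' "$name"
    done
}

# Run the actual cleanup only when a server is configured:
# fetch the listing with curl, turn each stale name into an "rm" command,
# and feed the batch to lftp on stdin.
if [ -n "${FTP_SERVER:-}" ]; then
    curl "ftp://$FTP_SERVER" --user "$FTP_USER:$FTP_PASS" --list-only \
      | old_files \
      | sed 's/^/rm /' \
      | lftp -u "$FTP_USER,$FTP_PASS" "$FTP_SERVER"
fi
```

Invoke it from the cron job as FTP_SERVER=... FTP_USER=... FTP_PASS=... ./cleanup.sh; with FTP_SERVER unset, the script only defines the filter, which makes the date logic easy to test offline.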