
Currently I am using the following simple one-line script to back up a server directory to a remote FTP location:

# Take backup to remote location
ncftpput -z -R -uusername -ppassword ftp.example.org /destination/directory /source/directory

Now I just need to delete the backups on the remote FTP site that are older than 7 days. For a normal directory on the server itself, I simply do it with the following script:

# Delete backups older than 7 days
find /backup -name '*' -type d -mtime +6 -exec rm -rfv "{}" \;

But running the same thing against FTP does not work. Is there any way I can delete data on the remote FTP location that is older than 7 days, perhaps via lftp or some other command?

  • Simply running that same find script as a batch job on the FTP server is not an option? Commented Nov 8, 2016 at 13:38
  • @HBruijn actually this is a remote FTP location provided only for backups; there is no access to the host other than via FTP commands. Commented Nov 8, 2016 at 13:53
  • This related ServerFault post may help... Commented Nov 8, 2016 at 14:51
  • Anybody providing a backup service without an expiration option is ripe for competition. Are you sure they won't do this on the server side? Commented Nov 12, 2016 at 15:13

2 Answers


There is a FUSE filesystem called curlftpfs which will allow you to mount the remote FTP site as if it were a normal filesystem, thus allowing you to run your find command to delete files older than 7 days. First install the package by running apt-get install curlftpfs, yum install curlftpfs, or the equivalent for your distribution.

Once installed, you can mount the remote ftp site like this:

mkdir /tmp/ftp_mount
curlftpfs -o user=username:password ftp.example.org /tmp/ftp_mount

where username and password get replaced by your actual username and password. Now just cd into /tmp/ftp_mount and run your find command to delete the older files. When you're done, cd out of the filesystem and unmount it with:

fusermount -u /tmp/ftp_mount 

Since curlftpfs is a FUSE filesystem, you can do this as any user (you don't need root for it to work).
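Putting it together, a minimal sketch of the whole cleanup as a single script might look like the following; the mount point, the credentials, and the /destination/directory path under the mount are placeholders to adapt:

#!/bin/bash
# Sketch: mount the FTP site, prune directories older than 7 days, unmount.
MOUNT=/tmp/ftp_mount
mkdir -p "$MOUNT"
curlftpfs -o user=username:password ftp.example.org "$MOUNT"

# Delete top-level backup directories older than 7 days, just as on a local filesystem
find "$MOUNT/destination/directory" -mindepth 1 -maxdepth 1 -type d -mtime +6 -exec rm -rfv "{}" \;

fusermount -u "$MOUNT"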

  • I should have mentioned: I know about this, but the sad part is that mounting that many TBs of remote FTP storage is not recommended at all, since scanning the mount puts a real load on the server. Commented Nov 12, 2016 at 16:51

It seems that there is no easy way to do what you want using just FTP access. Here are some ideas that might be useful:

  • Organize/rename your directories by date, if applicable, so that you always know which directory name to delete (see the sketch after this list).
  • Keep an exact copy of the same directories/files on your local machine. Apply find on the local machine to get the list of files to delete, then iterate over that list and delete each entry via FTP.
  • Iterate over the remote directory structure, read each entry's modification time, and decide whether to delete that file/folder.
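As an illustration of the first idea: if each upload goes into a directory named after its date (a hypothetical backup-YYYY-MM-DD layout under /destination/directory), a small sketch using lftp, which the question already mentions, could compute the cutoff date and remove the matching remote directory; the host, credentials, and path are placeholders:

#!/bin/bash
# Assumed layout: /destination/directory/backup-YYYY-MM-DD on the FTP server
OLD=$(date -d "7 days ago" +%Y-%m-%d)

# Recursively remove the directory that has just crossed the 7-day threshold
lftp -u username,password ftp.example.org -e "rm -r /destination/directory/backup-$OLD; bye"

Run daily from cron, this removes exactly the one directory that crossed the threshold; a loop over older dates would catch any that were missed.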
