How to Delete Files Older Than X Days on Remote Server Using SCP/SFTP Commands


Managing outdated backup files on remote servers is a common sysadmin task. While SCP and SFTP protocols weren't originally designed for complex file management operations, we can combine them with standard Unix tools to achieve our goal.

If you have SSH shell access to the remote machine, the most straightforward approach is to run find over SSH:

ssh user@remote.server "find /path/to/backups -type f -mtime +30 -delete"

Breakdown:
- -type f: Only target files (not directories)
- -mtime +30: Files last modified more than 30 days ago (find counts whole 24-hour periods, so in practice this matches files at least 31 days old)
- -delete: Remove matching files
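
Before running the destructive version, it is worth previewing exactly what would match; a safe variant of the same command simply swaps -delete for -ls:

ssh user@remote.server "find /path/to/backups -type f -mtime +30 -ls"

Once the listing looks right, switch -ls back to -delete.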

For environments where you cannot execute remote commands (for example, an account restricted to SFTP only), we can drive sftp in batch mode. sftp has no built-in way to filter by age, so the first step is simply to list the directory:

#!/bin/bash
sftp -b - user@remote.server <<EOF
ls -l /path/to/backups
EOF

Then parse the output to identify old files and create a deletion script.
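
As a rough sketch of that step, assuming the backups embed their date in the filename (e.g. backup-2024-01-15.tar.gz — a hypothetical naming scheme, not something the setup above guarantees), you can build the rm batch locally and feed it back to sftp:

#!/bin/bash
# Hypothetical: backup files are named backup-YYYY-MM-DD.tar.gz
CUTOFF=$(date -d '30 days ago' +%Y-%m-%d)   # GNU date

# Grab the directory listing over SFTP
sftp -b - user@remote.server <<EOF > listing.txt
ls -1 /path/to/backups
EOF

# Emit an "rm" line for every file whose embedded date is older than the cutoff
grep -oE 'backup-[0-9]{4}-[0-9]{2}-[0-9]{2}\.tar\.gz' listing.txt | while read -r f; do
    filedate=${f#backup-}
    filedate=${filedate%.tar.gz}
    if [[ "$filedate" < "$CUTOFF" ]]; then
        echo "rm /path/to/backups/$f"
    fi
done > delete.batch

# Review delete.batch by hand, then execute it
sftp -b delete.batch user@remote.server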

Here's a more robust script that handles edge cases:

#!/bin/bash
REMOTE_USER="user"
REMOTE_HOST="remote.server"
BACKUP_DIR="/path/to/backups"
DAYS_OLD=30

ssh "${REMOTE_USER}@${REMOTE_HOST}" <<EOF
if [ -d "${BACKUP_DIR}" ]; then
    find "${BACKUP_DIR}" -type f -mtime +${DAYS_OLD} -print -delete
else
    echo "Backup directory ${BACKUP_DIR} not found on ${REMOTE_HOST}" >&2
    exit 1
fi
EOF

When automating file deletions:

  • Always test with -ls before using -delete
  • Consider implementing a dry-run mode first (a sketch combining this with logging follows this list)
  • Set proper permissions on backup directories
  • Log your deletion operations
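
For example, reusing the variables from the script above, a small wrapper can gate -delete behind a dry-run flag and append every run to a log file; this is only a sketch, and the log path is a placeholder:

DRY_RUN=true                              # flip to false to actually delete
LOG_FILE=/var/log/backup-cleanup.log      # placeholder path

if [ "$DRY_RUN" = true ]; then
    ACTION="-ls"
else
    ACTION="-print -delete"
fi

{
    echo "=== $(date '+%F %T') cleanup of ${BACKUP_DIR} on ${REMOTE_HOST} (dry run: ${DRY_RUN}) ==="
    ssh "${REMOTE_USER}@${REMOTE_HOST}" "find ${BACKUP_DIR} -type f -mtime +${DAYS_OLD} ${ACTION}"
} >> "$LOG_FILE" 2>&1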

For more complex scenarios, consider:

  • rsync with the --remove-source-files flag (sketched just below)
  • lftp with its mirroring capabilities
  • Ansible for orchestrated cleanup across multiple servers
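
For instance, --remove-source-files lets you archive old backups locally and have rsync delete each file from the remote once it has transferred. A sketch, which still relies on SSH to produce the file list and assumes an existing /local/archive/ directory:

rsync -av --remove-source-files \
    --files-from=<(ssh user@remote.server 'cd /path/to/backups && find . -type f -mtime +30') \
    user@remote.server:/path/to/backups/ /local/archive/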

The harder case deserves a closer look. When you only have SCP/SFTP access without direct shell access to the remote server, purging old files to reclaim disk space takes more work.

Traditional methods like find -mtime +X -delete won't work here since we can't execute commands on the remote server. We need solutions that work purely through file transfer protocols.


lftp -e "mirror --only-newer --no-perms --no-umask --delete \
--verbose=3 /remote/path /local/path; quit" sftp://user:pass@host

This command will:

  • Connect via SFTP
  • Pull new and updated files into a local mirror of /remote/path
  • Delete local files that no longer exist on the remote (--delete cleans the target of the mirror, which here is the local side)
  • Work without any shell access on the remote
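
On its own, then, the mirror only keeps the local copy current. A common pattern for actually purging old files on the remote over SFTP alone is to prune the local mirror and push the deletions back with a reverse mirror. A rough sketch with placeholder paths and credentials (note that -R --delete removes any remote file missing from the local mirror, not just old ones, so keep the mirror complete):

# 1. Refresh the local mirror of the remote backup directory
lftp -e "mirror --only-newer --no-perms --no-umask /remote/path /local/path; quit" \
    sftp://user:pass@host

# 2. Prune files older than 30 days from the local mirror
find /local/path -type f -mtime +30 -delete

# 3. Push the deletions back: -R reverses the direction (local -> remote) and
#    --delete removes remote files that are no longer present locally
lftp -e "mirror -R --only-newer --no-perms --no-umask --delete \
    --verbose=3 /local/path /remote/path; quit" sftp://user:pass@host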

For more control, here's a Python script using Paramiko:


import paramiko
import stat
from datetime import datetime, timedelta

ssh = paramiko.SSHClient()
ssh.set_missing_host_key_policy(paramiko.AutoAddPolicy())
ssh.connect('hostname', username='user', password='pass')

sftp = ssh.open_sftp()
threshold = datetime.now() - timedelta(days=30)

for fileattr in sftp.listdir_attr('/backup/path'):
    # Only consider regular files, mirroring find's -type f
    if not stat.S_ISREG(fileattr.st_mode):
        continue
    if datetime.fromtimestamp(fileattr.st_mtime) < threshold:
        print(f"Deleting {fileattr.filename}")
        sftp.remove(f"/backup/path/{fileattr.filename}")

sftp.close()
ssh.close()

If you have rsync available on both ends (note this still relies on SSH to run find remotely, and on rsync 3.1.0+ for --delete-missing-args, since plain --delete does not remove files that are merely listed via --files-from):

rsync -av --delete-missing-args \
    --files-from=<(ssh user@host 'cd /backup && find . -type f -mtime +30') \
    empty/ user@host:/backup/

Here empty/ is an empty local directory; every listed file is missing from it, so rsync issues a deletion for the matching file under /backup/ on the remote. Add -n (--dry-run) on a first pass to see what would be removed.

Always:

  • Test with --dry-run first
  • Backup before deletion
  • Use SSH keys instead of passwords (a quick setup is sketched below)
  • Limit script permissions
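
On the SSH-key point: key-based logins are what make unattended cleanup jobs practical. A typical one-time setup looks like this (the key type and comment are just examples):

# Generate a key pair locally and install the public key on the remote host
ssh-keygen -t ed25519 -C "backup-cleanup"
ssh-copy-id user@remote.server

# ssh/sftp/rsync commands to that host will then authenticate without a password prompt
ssh user@remote.server "echo key-based login works"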