When automating file transfers between servers, we often need to maintain clean directories by removing outdated files. The combination of SCP for secure transfers and SSH for remote command execution provides a robust solution.
Prerequisites:
- SSH access to the remote server
- Proper permissions on both the source and destination directories
- Key-based authentication setup (recommended; a quick check follows)
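With keys in place, you can confirm non-interactive login works before scripting anything. This one-liner is a sketch using the example user and host from the script below; BatchMode makes ssh fail instead of prompting for a password:
ssh -o BatchMode=yes username@example.com true && echo "Key-based auth OK"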
Here's a bash script that handles both file transfer and cleanup:
#!/bin/bash
# Configuration
REMOTE_USER="username"
REMOTE_HOST="example.com"
REMOTE_DIR="/path/to/destination"
LOCAL_FILES="z*.foo"   # glob pattern, expanded by the local shell
DAYS_TO_KEEP=5
# SCP transfer with compression; $LOCAL_FILES stays unquoted so the
# shell expands the glob into the matching file names
scp -C $LOCAL_FILES "$REMOTE_USER@$REMOTE_HOST:$REMOTE_DIR"
# SSH runs find remotely to delete matching files older than DAYS_TO_KEEP days
ssh "$REMOTE_USER@$REMOTE_HOST" "find $REMOTE_DIR -name 'z*.foo' -type f -mtime +$DAYS_TO_KEEP -exec rm {} \;"
The script uses two main commands:
- scp transfers every file matching z*.foo; the -C flag compresses data in transit, which helps on slower links (see the no-match caveat below)
- ssh runs find on the remote host to locate and remove matching files whose modification time is more than DAYS_TO_KEEP days old
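One caveat with the scp line: if nothing matches z*.foo, the shell passes the literal pattern to scp and the transfer fails. A minimal guard, assuming bash and reusing the configuration variables from the script above, uses nullglob:
#!/bin/bash
shopt -s nullglob          # unmatched globs expand to nothing instead of themselves
files=( z*.foo )
if (( ${#files[@]} == 0 )); then
    echo "No files matching z*.foo; nothing to transfer." >&2
    exit 0
fi
scp -C "${files[@]}" "$REMOTE_USER@$REMOTE_HOST:$REMOTE_DIR"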
For production environments, consider adding logging:
#!/bin/bash
LOG_FILE="/var/log/remote_file_cleanup.log"
TIMESTAMP=$(date +"%Y-%m-%d %T")
{
    echo "=== Starting transfer at $TIMESTAMP ==="
    scp -v -C z*.foo user@remote:/path/
    echo "=== Starting cleanup at $(date +"%Y-%m-%d %T") ==="
    ssh user@remote "find /path/ -name 'z*.foo' -type f -mtime +5 -exec rm -v {} \;"
} >> "$LOG_FILE" 2>&1
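Grouping the two commands in braces lets a single redirection capture stdout and stderr for the whole run, and scp's -v flag records per-file detail in the log.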
Security recommendations:
- Use SSH keys instead of passwords
- Restrict SSH access to specific IPs where possible
- Consider using a dedicated limited-permission account (see the authorized_keys sketch below)
- Test with '-exec echo {} \;' before actual deletion
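For the IP-restriction and dedicated-account points, one approach is to constrain the key itself in the remote account's ~/.ssh/authorized_keys. This is a sketch; the IP, key, and comment are placeholders:
from="203.0.113.10",no-pty,no-agent-forwarding,no-port-forwarding ssh-ed25519 AAAA... transfer-key
A command="..." option can lock the key down further, but the forced command must then handle both the scp server invocation and the cleanup command.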
Alternatives for more complex scenarios:
- rsync: Mirrors directories efficiently; its --delete-after removes destination files missing from the source (not age-based cleanup)
- Ansible: Better for orchestration across multiple servers
- Custom Python script using Paramiko library
Common issues and solutions:
# Test SSH connection first
ssh -v user@remote "echo 'Connection test successful'"
# Verify find command works as expected
ssh user@remote "find /path/ -name 'z*.foo' -type f -mtime +5 -print"
# Check permissions on remote directory
ssh user@remote "ls -ld /path/"
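A quick sanity check before enabling deletion is to count how many files the cleanup would touch:
# Count the files the cleanup would remove
ssh user@remote "find /path/ -name 'z*.foo' -type f -mtime +5 | wc -l"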
To harden the setup, use key-based authentication and tighten permissions on the destination:
# Generate an SSH key pair for authentication (more secure than passwords);
# ed25519 is a good modern default (rsa -b 4096 also works)
ssh-keygen -t ed25519
ssh-copy-id $REMOTE_USER@$REMOTE_HOST
# Tighten permissions on the destination directory
ssh $REMOTE_USER@$REMOTE_HOST "chmod 750 $REMOTE_DIR"
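Mode 750 gives the owner full access, the group read and execute, and everyone else nothing; adjust it to match how the destination directory is shared.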
For more complex scenarios, here's an improved version with error handling:
#!/bin/bash
set -euo pipefail
# Configurable parameters
readonly SOURCE_DIR="/local/source"
readonly DEST_DIR="/remote/destination"
readonly FILE_PATTERN="z*.foo"
readonly RETENTION_DAYS=5
readonly SSH_USER="deployer"
readonly SSH_HOST="production-server"
readonly SSH_PORT=22
readonly LOG_FILE="/var/log/file_transfer.log"
log() {
    # Append a timestamped message to the log file
    echo "$(date '+%Y-%m-%d %H:%M:%S') - $1" >> "$LOG_FILE"
}
log "Starting file transfer process"
# Transfer files; $SOURCE_DIR/$FILE_PATTERN stays unquoted so the local shell expands the glob
if scp -P "$SSH_PORT" $SOURCE_DIR/$FILE_PATTERN "$SSH_USER@$SSH_HOST:$DEST_DIR"; then
    log "File transfer completed successfully"
else
    log "ERROR: File transfer failed"
    exit 1
fi
# Clean up old files; the inner single quotes keep the remote shell
# from expanding the pattern before find sees it
if ssh -p "$SSH_PORT" "$SSH_USER@$SSH_HOST" \
    "find $DEST_DIR -type f -name '$FILE_PATTERN' -mtime +$RETENTION_DAYS -delete"; then
    log "Old file cleanup completed"
else
    log "WARNING: Cleanup operation failed"
fi
log "Process completed"
For more efficient transfers (especially with many files), consider rsync:
#!/bin/bash
# rsync can move matching files: --remove-source-files deletes each
# local file once it has been transferred. Note that rsync's --delete-*
# options remove destination files that are missing from the source
# (mirroring), not files older than a cutoff; combined with
# --remove-source-files they would wipe previously transferred files
# on the next run, so they are omitted here.
rsync -avz --remove-source-files \
    --include='z*.foo' --exclude='*' \
    /local/source/ user@remote:/remote/path/
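rsync's -n (--dry-run) flag previews the transfer without copying or removing anything, which is worth running first:
rsync -avzn --remove-source-files --include='z*.foo' --exclude='*' \
    /local/source/ user@remote:/remote/path/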
To automate this process, add to crontab:
# Run daily at 2 AM
0 2 * * * /path/to/your/script.sh >> /var/log/file_transfer.log 2>&1
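If a nightly run could still be going when the next one starts, flock (from util-linux) can skip overlapping runs; the lock file path here is arbitrary:
# Skip the run if the previous one is still holding the lock
0 2 * * * flock -n /tmp/file_transfer.lock /path/to/your/script.sh >> /var/log/file_transfer.log 2>&1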
Remember to test any cleanup command with -print before switching to -delete or -exec rm, so you can verify which files would be affected.