When using Apache's rotatelogs to create daily log files in the format host.DD.MM.YYYY.access.log, we often need to compress these logs after rotation and move them to an archive directory. The native rotatelogs doesn't provide built-in compression functionality.
Here are four effective methods to handle this:
Use rotatelogs' built-in post-rotation hook (the -p option, available in Apache 2.4 and later) so each log file is compressed the moment rotatelogs closes it. In your Apache configuration:
CustomLog "|/usr/sbin/rotatelogs -l -p /usr/local/bin/compress-rotated.sh /var/log/apache2/host.%d.%m.%Y.access.log 86400" combined
The hook program receives the newly opened file as $1 and, whenever a rotation has just occurred, the file that was closed as $2:
#!/bin/bash
# Invoked by rotatelogs -p: $1 = newly opened log, $2 = previous log (absent on first open)
ARCHIVEDIR="/var/log/apache2/archive"
if [ -n "$2" ] && [ -f "$2" ]; then
    gzip -9 "$2" && mv "${2}.gz" "$ARCHIVEDIR/"
fi
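Whichever method triggers it, the compress-then-move step itself is worth exercising in isolation first. A minimal sketch using a throwaway directory (all paths here are temporary; nothing touches real logs):

```shell
# Create a scratch log plus an archive dir, then compress and move it
TMP=$(mktemp -d)
mkdir -p "$TMP/archive"
printf 'GET / 200\n' > "$TMP/host.01.01.2024.access.log"
gzip -9 "$TMP/host.01.01.2024.access.log" && \
    mv "$TMP/host.01.01.2024.access.log.gz" "$TMP/archive/"
ls "$TMP/archive"   # prints: host.01.01.2024.access.log.gz
```

If gzip fails (e.g. out of disk space), the && ensures the move never runs, so a half-compressed file is never archived.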
Configure logrotate to compress and archive the date-stamped files. Because rotatelogs already switches to a fresh file at midnight and keeps its pipe open, no Apache reload is needed; minage keeps logrotate away from the file rotatelogs still has open, so each finished log is picked up on the first run more than a day after its last write:
/var/log/apache2/host.*.access.log {
    daily
    missingok
    notifempty
    minage 1
    rotate 1
    compress
    nocreate
    lastaction
        mv /var/log/apache2/host.*.access.log.1.gz /var/log/apache2/archive/ 2>/dev/null || true
    endaction
}
For more real-time processing, use inotify-tools:
#!/bin/bash
LOGDIR="/var/log/apache2"
ARCHIVEDIR="/var/log/apache2/archive"
mkdir -p "$ARCHIVEDIR"
# close_write fires when rotatelogs closes the finished day's file;
# the regex excludes the .gz files this script itself creates
inotifywait -m -e close_write --format '%w%f' "$LOGDIR" | \
while read -r file; do
    if [[ $file =~ host\.[0-9]{2}\.[0-9]{2}\.[0-9]{4}\.access\.log$ ]]; then
        gzip -9 "$file" && mv "${file}.gz" "$ARCHIVEDIR/"
    fi
done
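The filename filter deserves a check of its own before inotify enters the picture. This snippet runs the same bash regex against one matching and one non-matching name (sample paths only):

```shell
pattern='host\.[0-9]{2}\.[0-9]{2}\.[0-9]{4}\.access\.log$'
for f in /var/log/apache2/host.05.03.2024.access.log \
         /var/log/apache2/host.access.log.gz; do
    if [[ $f =~ $pattern ]]; then
        echo "would compress: $f"
    else
        echo "would skip:     $f"
    fi
done
# would compress: /var/log/apache2/host.05.03.2024.access.log
# would skip:     /var/log/apache2/host.access.log.gz
```

The anchored .log$ is what prevents the watcher from re-processing the .gz files it writes into the same directory.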
When implementing any of these solutions:
- Ensure proper file permissions are maintained
- Consider adding error handling for the move operation
- Monitor disk space in the archive directory
- Test with small rotation intervals first
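The disk-space point in particular is easy to automate. A minimal guard, assuming an illustrative threshold of 95% and directory of /tmp (df -P keeps the output parseable on any POSIX system):

```shell
ARCHIVEDIR="/tmp"   # illustrative; point at your real archive directory
used=$(df -P "$ARCHIVEDIR" | awk 'NR==2 { gsub(/%/, ""); print $5 }')
if [ "$used" -ge 95 ]; then
    echo "archive filesystem ${used}% full; refusing to archive" >&2
else
    echo "space check passed (${used}% used)"
fi
```

Drop a check like this at the top of any of the scripts above and exit non-zero when it fails, so a full partition stops the pipeline loudly instead of silently losing logs.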
For maximum control, consider writing a custom log rotator:
#!/usr/bin/perl
use strict;
use warnings;
use POSIX qw(strftime);
use IO::Handle;
use IO::Compress::Gzip qw(gzip $GzipError);

my $logdir     = '/var/log/apache2';
my $archivedir = '/var/log/apache2/archive';
my $logname    = 'host';

my ($current_file, $log_fh);
while (my $line = <STDIN>) {
    my $today = strftime("%d.%m.%Y", localtime);
    unless ($current_file && $current_file =~ /\Q$today\E/) {
        close $log_fh if $log_fh;
        $current_file = "$logdir/$logname.$today.access.log";
        open $log_fh, '>>', $current_file or die "open $current_file: $!";
        $log_fh->autoflush(1);
        # Compress and archive every log except the one just opened
        foreach my $file (glob "$logdir/$logname.*.access.log") {
            next if $file eq $current_file;
            my $gzfile = "$file.gz";
            gzip $file => $gzfile or die "gzip failed: $GzipError";
            unlink $file or warn "unlink $file: $!";
            system('mv', $gzfile, $archivedir) == 0 or warn "mv failed: $?";
        }
    }
    print $log_fh $line;
}
close $log_fh if $log_fh;
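Assuming the script is saved as /usr/local/bin/apache-rotator.pl and made executable (the path is an illustrative choice), it replaces rotatelogs entirely as a piped logger:

```apache
CustomLog "|/usr/local/bin/apache-rotator.pl" combined
```

Apache restarts the pipe program if it exits, but test the script standalone first by feeding it a few lines on stdin and confirming the dated file appears.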
The methods above hook into the rotation mechanism itself; alternatively, you can leave rotatelogs untouched and handle compression entirely out of band. The most reliable way is a small script executed on a schedule shortly after each rotation. Start from the usual piped-log configuration:
CustomLog "|/usr/sbin/rotatelogs -l /var/log/apache2/host.%d.%m.%Y.access.log 86400" combined
Then create a post-rotation script (/usr/local/bin/compress_logs.sh):
#!/bin/bash
# Directory where logs are being rotated to
LOG_DIR="/var/log/apache2"
ARCHIVE_DIR="/var/log/apache2/archives"
# Find yesterday's log (most recently rotated)
YESTERDAY=$(date -d "yesterday" +"%d.%m.%Y")
LOG_FILE="${LOG_DIR}/host.${YESTERDAY}.access.log"
# Compress and move if file exists
if [ -f "$LOG_FILE" ]; then
    mkdir -p "$ARCHIVE_DIR"
    gzip -9 "$LOG_FILE" && mv "${LOG_FILE}.gz" "$ARCHIVE_DIR/"
fi
For more control, set up a daily cron job that runs shortly after midnight:
# Daily at 12:05 AM
5 0 * * * /usr/local/bin/compress_logs.sh
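The yesterday stamp is the linchpin of the cron approach, so verify that your date produces the expected DD.MM.YYYY form (the -d flag is GNU coreutils, not POSIX):

```shell
stamp=$(date -d "yesterday" +"%d.%m.%Y")
echo "$stamp"   # e.g. 14.03.2024 - always two-digit day and month
```

The zero-padded %d and %m matter: they are what lets the inotify regex and the find patterns elsewhere in this article match reliably.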
Note that chaining gzip after rotatelogs does not work: rotatelogs writes entries into its log files, not to stdout, so a pipeline like "rotatelogs ... | gzip" would compress nothing, and strftime escapes such as %d.%m.%Y are expanded only by rotatelogs itself. If you want entries compressed as they are written and can live without daily rotation, pipe Apache straight into gzip instead:
CustomLog "|/bin/sh -c 'exec gzip -9 >> /var/log/apache2/archives/host.access.log.gz'" combined
Because gzip buffers its output, recent entries appear in the file only after a flush, and each server restart appends a new gzip member (zcat and zgrep handle concatenated members transparently).
For environments with many virtual hosts, use this enhanced script:
#!/bin/bash
LOG_DIR="/var/log/apache2"
ARCHIVE_DIR="/var/log/apache2/archives"
DAYS_TO_KEEP=30
YESTERDAY=$(date -d "yesterday" +"%d.%m.%Y")
mkdir -p "$ARCHIVE_DIR"
# Compress yesterday's log for every virtual host; filenames are
# null-delimited so paths containing spaces survive
find "$LOG_DIR" -maxdepth 1 -name "*.${YESTERDAY}.access.log" -print0 |
while IFS= read -r -d '' log; do
    gzip -9 "$log" && mv "${log}.gz" "$ARCHIVE_DIR/"
done
# Cleanup archives older than the retention window
find "$ARCHIVE_DIR" -name "*.access.log.gz" -mtime +"$DAYS_TO_KEEP" -delete
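The retention rule can be rehearsed on synthetic files before it is pointed at real archives (GNU touch -d assumed; the directory is throwaway):

```shell
DIR=$(mktemp -d)
touch -d '40 days ago' "$DIR/host.old.access.log.gz"   # should be deleted
touch "$DIR/host.new.access.log.gz"                    # should survive
find "$DIR" -name '*.access.log.gz' -mtime +30 -delete
ls "$DIR"   # prints only host.new.access.log.gz
```

Note that -mtime +30 means strictly more than 30 full days, so a file rotated exactly 30 days ago is kept for one more run.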
Always test your compression script manually first. Consider adding logging to your script:
echo "$(date) - Compressed ${LOG_FILE} to ${ARCHIVE_DIR}/" >> /var/log/log_compression.log