When working with PostgreSQL 8.3+ in multi-database environments, administrators often need to isolate query logging for specific databases without capturing all database activity. The default logging configuration applies server-wide, which creates unnecessary log noise when you only need to monitor one database.
PostgreSQL offers several configuration parameters that can help achieve selective logging:
# postgresql.conf modifications
log_statement = 'all' # or 'mod', 'ddl' based on needs
log_line_prefix = '%m [%d] %p %r '
log_filename = 'postgresql-%Y-%m-%d.log'
log_rotation_age = '1d'
While there's no built-in per-database logging, you can implement filtering using these approaches:
-- Create a logging function in your target database
CREATE OR REPLACE FUNCTION log_query(query_text text) RETURNS void AS $$
BEGIN
    INSERT INTO query_log_table(query, timestamp)
    VALUES (query_text, now());
END;
$$ LANGUAGE plpgsql SECURITY DEFINER;
For production environments, consider these specialized tools:
- pgBadger: Creates detailed HTML reports with database-level filtering
- pgFouine: Specialized query analyzer with database segmentation
- pg_stat_statements: Built-in extension for statement statistics
Here's a complete setup for isolating logs:
# In postgresql.conf
log_destination = 'csvlog'
logging_collector = on
log_directory = 'pg_log'
log_filename = 'postgresql-%Y-%m-%d.csv'
log_rotation_age = 1d
log_rotation_size = 0
log_min_duration_statement = 0
# Then filter the CSV log on the database-name column, e.g.:
grep ',yourdbname,' /path/to/pg_log/postgresql-*.csv > yourdb_queries.log
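A plain grep can misfire when the database name also appears inside query text. As a slightly more robust sketch (assuming the csvlog column order, where database_name is the third field, and using synthetic sample lines), you can filter on the database-name column with Python's csv module:

```python
import csv
import io

def filter_csvlog(lines, dbname):
    """Yield csvlog rows whose database_name column (3rd field in the
    csvlog format: log_time, user_name, database_name, ...) matches
    dbname exactly."""
    for row in csv.reader(lines):
        if len(row) > 2 and row[2] == dbname:
            yield row

# Example with two synthetic log lines (paths and content are illustrative):
sample = io.StringIO(
    '2024-01-01 10:00:00 UTC,alice,yourdb,1234,...,LOG,00000,"SELECT 1"\n'
    '2024-01-01 10:00:01 UTC,bob,otherdb,1235,...,LOG,00000,"SELECT 2"\n'
)
matches = list(filter_csvlog(sample, "yourdb"))
print(len(matches))  # rows from yourdb only
```

Unlike a bare grep, matching on the parsed column cannot be fooled by a query whose text happens to contain the database name.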
For auditing critical operations, you can use an event trigger (note: event triggers require PostgreSQL 9.3 or later, so this option is not available on 8.3, and it only fires on DDL commands):
CREATE TABLE query_audit (
    id serial PRIMARY KEY,
    dbname text,
    username text,
    query text,
    executed_at timestamp
);
CREATE OR REPLACE FUNCTION audit_trigger()
RETURNS event_trigger AS $$
BEGIN
    INSERT INTO query_audit(dbname, username, query, executed_at)
    VALUES (current_database(), current_user, current_query(), now());
END;
$$ LANGUAGE plpgsql;

CREATE EVENT TRIGGER audit_queries ON ddl_command_end
    EXECUTE PROCEDURE audit_trigger();
When working with PostgreSQL 8.3 in a multi-database environment, you might need to monitor queries for a specific database without capturing activity from other databases. The default logging configuration doesn't provide direct filtering by database name.
While you can set log_line_prefix = '%d'
to include the database name in each log entry, this doesn't address the fundamental issues of:
- Log files becoming unnecessarily large
- Having to manually filter logs
- Potential performance impact from excessive logging
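The manual-filtering burden can at least be automated. With a %d-bearing prefix such as log_line_prefix = '%m [%d] %p %r ' (the prefix suggested earlier), a minimal sketch that keeps only the stderr-log lines tagged with the target database might look like this (the sample lines and bracket pattern are assumptions; adjust the regex to your actual prefix):

```python
import re

# Matches the "[dbname]" tag produced by log_line_prefix = '%m [%d] %p %r '.
# This pattern is an assumption tied to that prefix; change it if yours differs.
DB_TAG = re.compile(r"\[(?P<db>[^\]]*)\]")

def lines_for_db(log_lines, dbname):
    """Keep log lines whose [database] tag matches dbname exactly.
    Continuation lines without a prefix are dropped in this simple sketch."""
    for line in log_lines:
        m = DB_TAG.search(line)
        if m and m.group("db") == dbname:
            yield line

# Synthetic example lines in the shape that prefix would produce:
sample = [
    "2024-01-01 10:00:00.000 UTC [yourdb] 1234 10.0.0.5(5432) LOG:  statement: SELECT 1",
    "2024-01-01 10:00:01.000 UTC [otherdb] 1235 10.0.0.6(5432) LOG:  statement: SELECT 2",
]
print(list(lines_for_db(sample, "yourdb")))
```

Note that multi-line statements span several log lines, and only the first carries the prefix; a production filter would need to track the current session to keep continuation lines too.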
Here are several approaches to achieve database-specific logging:
1. Using pgBadger for Post-Processing
Configure standard logging and then use pgBadger to filter by database:
pgbadger --dbname your_database_name postgresql.log
2. Custom Logging Function
Create a PL/pgSQL function that writes to a separate log table:
CREATE OR REPLACE FUNCTION log_query(dbname text, query text) RETURNS void AS $$
BEGIN
    IF dbname = 'your_target_db' THEN
        INSERT INTO query_log (dbname, query, timestamp)
        VALUES (dbname, query, now());
    END IF;
END;
$$ LANGUAGE plpgsql;
3. syslog Filtering
When using syslog logging, configure filtering in rsyslog:
if $msg contains 'your_database_name' then {
    action(type="omfile" file="/var/log/postgresql/your_db.log")
}
A word of caution about a tempting workaround for PostgreSQL 8.3:
log_destination = 'csvlog'
log_filename = 'postgresql-%d.log'
log_rotation_age = '1d'
This does not create one file per database: log_filename only understands strftime escapes, so %d here expands to the day of the month, not the database name. PostgreSQL has no per-database log file option; the practical approach on 8.3 remains csvlog plus post-filtering on the database-name column.
When implementing any logging solution, consider:
- Disk I/O impact from additional logging
- Storage requirements for log files
- Potential locking issues with custom logging functions
On newer PostgreSQL versions, the cleanest solution is a per-database setting (log_statement is a superuser-only parameter, so run this as a superuser):
ALTER DATABASE your_database SET log_statement = 'all';
New sessions connecting to that database then log every statement, which is exactly the per-database behavior needed, while the other databases on the cluster stay quiet.