Mastering Linux CLI: Essential Best Practices for Efficient System Administration and Shell Scripting

Organizing your command history is crucial for productivity. Add these to your ~/.bashrc:

# Better history management
export HISTSIZE=10000              # commands kept in memory per session
export HISTFILESIZE=20000          # lines kept in the history file on disk
export HISTTIMEFORMAT="%F %T "     # timestamp each history entry
shopt -s histappend                # append to the file instead of overwriting it
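
Note that histappend only writes at shell exit; if you want each command visible to other open terminals immediately, a common companion setting (sketched here for bash) is:

# Flush each command to the history file as soon as it runs
export PROMPT_COMMAND="history -a; ${PROMPT_COMMAND:-}"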

Combine grep, sed, and awk for advanced text manipulation:

# Find and modify Apache virtual hosts
grep -l "example.com" /etc/apache2/sites-available/*.conf | \
xargs sed -i 's/old.ip.address/new.ip.address/g'

# Top ten client IPs in an nginx access log, by request count
awk '{print $1}' /var/log/nginx/access.log | sort | uniq -c | sort -nr | head -10
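
Before committing an in-place edit like the sed -i above, it is worth previewing the result; one cautious pattern (the .conf path here is illustrative) is:

# Run the same substitution without -i so it writes to stdout for review
sed 's/old.ip.address/new.ip.address/g' /etc/apache2/sites-available/example.conf | less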

Monitor and debug processes effectively:

# List open files for a process (pgrep -o picks the oldest match;
# the /proc path breaks if pgrep returns more than one PID)
ls -l /proc/$(pgrep -o nginx)/fd

# Check process memory usage
grep -E 'VmSize|VmRSS' /proc/$(pgrep -o java)/status
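
When a service runs several worker processes, a short loop covers them all; a minimal sketch:

# Report resident memory for every matching PID
for pid in $(pgrep nginx); do
    printf '%s: ' "$pid"
    awk '/VmRSS/ {print $2, $3}' "/proc/$pid/status"
done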

Always include these safety measures:

#!/bin/bash
set -euo pipefail  # Exit on error, undefined variables, and pipe failures
IFS=$'\n\t'        # Prevent word splitting issues

# Use parameter expansion for safety
archive_dir="${1:-/tmp/default_archive}"
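
Two related expansions worth knowing (the names here are illustrative):

# Fail fast with a usage message if a required argument is missing
config_file="${2:?usage: script.sh ARCHIVE_DIR CONFIG_FILE}"

# Strip a trailing slash so paths concatenate cleanly
archive_dir="${archive_dir%/}"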

Leverage process substitution and parallel execution:

# Compare two directory listings
diff <(ls /dir1) <(ls /dir2)

# Parallel processing with xargs
find . -name "*.log" -print0 | xargs -0 -P 4 -I {} gzip {}
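
Process substitution also avoids the subshell you get when piping into a while loop, so variables set inside the loop survive; a sketch (the log path is a placeholder):

# count keeps its value because read runs in the current shell
count=0
while IFS= read -r line; do
    count=$((count + 1))
done < <(grep 'ERROR' /var/log/app.log)
echo "$count error lines"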

Optimize remote work with these ~/.ssh/config settings:

Host *
    ControlMaster auto
    ControlPath ~/.ssh/control:%h:%p:%r
    ControlPersist 1h
    ServerAliveInterval 60
    TCPKeepAlive yes
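
With ControlMaster enabled, the first connection opens a master socket and later sessions reuse it. You can inspect or close the master explicitly (user@host is a placeholder):

# Check whether a master connection is alive
ssh -O check user@host

# Close the shared master connection
ssh -O exit user@host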

Customizing your .bashrc or .zshrc can dramatically improve productivity:

# Add these to your ~/.bashrc
alias ll='ls -alFh --color=auto'
alias grep='grep --color=auto'
alias fgrep='fgrep --color=auto'
alias egrep='egrep --color=auto'
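
Small functions earn a place here too; one common convenience, sketched below:

# Create a directory and change into it in one step
mkcd() {
    mkdir -p "$1" && cd "$1" || return
}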

Grep patterns:

# Recursive search with context
grep -r --include='*.py' -n -A3 -B2 'import requests' /path/to/code

# Inverse match with line numbers
grep -vn '^#' /etc/nginx/nginx.conf
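
grep can also count and extract rather than just locate; for example (the paths and patterns are illustrative):

# Count matching lines per file
grep -rc --include='*.py' 'import requests' /path/to/code

# Print only the matched text, one match per line
grep -oE 'error [0-9]+' /var/log/app.log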

Sed in-place editing:

# Replace text across multiple files
find . -type f -name '*.conf' -exec sed -i 's/old-text/new-text/g' {} +

# Delete lines matching pattern
sed -i '/pattern-to-delete/d' file.txt
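
Two safety-minded variants: keep a backup when editing in place, or print a slice without editing at all:

# In-place edit, saving the original with a .bak suffix
sed -i.bak 's/old-text/new-text/g' file.txt

# Print only lines 10-20, leaving the file untouched
sed -n '10,20p' file.txt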

AWK for data extraction:

# Sum column values (set the field separator for CSV input)
awk -F',' '{sum += $3} END {print sum}' data.csv

# Show user, %CPU, and command for processes using more than 5% CPU
ps aux | awk '$3 > 5.0 {print $1, $3, $11}' | sort -k2 -n
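
awk's associative arrays make quick group-by reports easy; for instance, tallying login shells from /etc/passwd:

# Count users per login shell
awk -F: '{count[$7]++} END {for (s in count) print count[s], s}' /etc/passwd | sort -rn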

Process management tricks:

# Stop processes by name (send SIGTERM first; save kill -9 for processes that ignore it)
pkill -f 'python script.py'

# Monitor system resources
watch -n 1 'df -h; echo; free -h; echo; uptime'
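
A gentler pattern is to ask politely first and escalate only if the process ignores SIGTERM; a sketch:

# Try SIGTERM, wait briefly, then force-kill any survivors
pkill -f 'python script.py'
sleep 5
pgrep -f 'python script.py' > /dev/null && pkill -9 -f 'python script.py'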

File transfer and verification:

# Resumable file copy with progress reporting
rsync -ah --progress --partial source/ destination/

# Record checksums so a large copy or move can be verified afterwards
find /target/dir -type f -exec md5sum {} + | sort > checksums.txt
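
The recorded list pays off after the copy: verify it on the destination (run from a directory where the recorded paths resolve):

# Re-check files against the recorded checksums, showing only failures
md5sum -c checksums.txt | grep -v ': OK$'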

Always include these at the top of your scripts:

#!/usr/bin/env bash
set -euo pipefail
IFS=$'\n\t'
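
A trap pairs well with these options, so temporary state is cleaned up even when set -e aborts the script; a minimal sketch:

# Remove scratch space no matter how the script exits
tmpdir=$(mktemp -d)
trap 'rm -rf "$tmpdir"' EXIT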

Proper error handling example:

#!/bin/bash

log_error() {
    echo "[ERROR] $(date '+%Y-%m-%d %H:%M:%S') - $1" >&2
}

backup_dir() {
    local src=$1
    local dest=$2
    
    if [[ ! -d "$src" ]]; then
        log_error "Source directory $src does not exist"
        return 1
    fi
    
    mkdir -p "$dest" || {
        log_error "Failed to create destination directory $dest"
        return 1
    }
    
    if ! tar -czf "$dest/backup_$(date +%Y%m%d).tar.gz" "$src"; then
        log_error "Backup of $src failed"
        return 1
    fi
}
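
Calling the function and acting on its exit status keeps failures visible; a hypothetical invocation (adjust the paths to your setup):

if ! backup_dir /var/www /backups/www; then
    log_error "Aborting: backup step failed"
    exit 1
fi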