How to Execute Remote Commands via SSH in a Shell Script and Continue Local Processing


When automating workflows between local and remote machines, we often need to:

  1. Connect to a remote server via SSH
  2. Execute commands to gather information
  3. Disconnect while capturing the output
  4. Use that output in subsequent local commands

The key technique involves using command substitution with SSH. Here's the basic pattern:


REMOTE_OUTPUT=$(ssh user@remote "command_to_run")
# Now use $REMOTE_OUTPUT in local commands

Here's a full solution for your file copying scenario:


#!/bin/bash

# Connect to remote and find the target file
FILE_PATH=$(ssh user@remote-host "find /data/storage -name 'important_file*.log' | head -1")

# Verify we got a result
if [ -z "$FILE_PATH" ]; then
    echo "Error: No file found on remote server" >&2
    exit 1
fi

# Copy using the obtained path
echo "Copying $FILE_PATH from remote..."
scp "user@remote-host:$FILE_PATH" ./local_directory/

# Continue with local processing
echo "File copied. Beginning local analysis..."
# ... rest of your script ...

Handling Multiple Results

When the remote command might return multiple files:


# Capture all matching files into an array
mapfile -t FILES < <(ssh user@remote "find /logs -name 'app_*.log'")

for file in "${FILES[@]}"; do
    scp "user@remote-host:$file" ./archive/
done

Error Handling

Add robust error checking:


# Note: ssh returns the remote command's exit status, so this catches
# connection failures and remote failures alike. find exits 0 even when
# nothing matches, so still check for empty output afterwards.
if ! FILE_PATH=$(ssh user@remote "find /data -name 'report.xml'"); then
    echo "SSH command failed" >&2
    exit 1
fi

Using SSH Config

Simplify with ~/.ssh/config entries:


Host my-remote
    HostName remote.example.com
    User deployuser
    IdentityFile ~/.ssh/deploy_key

Then your script becomes cleaner:


FILE_PATH=$(ssh my-remote "locate target.dat")

Performance Tips

  • Use SSH ControlMaster for multiple connections
  • Consider parallel transfers for multiple files
  • Compress large transfers with the -C flag
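
The ControlMaster tip can be sketched as an ~/.ssh/config entry; the socket path and persist time below are illustrative choices, not required values:

```
Host my-remote
    ControlMaster auto
    ControlPath ~/.ssh/cm-%r@%h-%p
    ControlPersist 10m
```

With this in place, the first ssh connection opens a master socket and subsequent ssh/scp calls to my-remote reuse it instead of re-authenticating.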

# Example using GNU parallel plus compression; note that the unquoted
# $( ) word-splits the file list, so this assumes paths without spaces
parallel -j 4 scp -C my-remote:{} ./ ::: $(ssh my-remote "find /data -type f -mtime -1")

Security Considerations

When automating SSH in scripts:

  • Use SSH keys with passphrases (consider ssh-agent)
  • Limit remote user permissions
  • Validate all remote output before processing
  • Consider using ssh-keyscan for host verification
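
The "validate all remote output" advice can be sketched as a small bash helper. `validate_remote_path` is a hypothetical name and the exact character checks are this answer's assumption; the idea is simply to reject anything that is not a single plain absolute path before handing it to scp:

```shell
#!/bin/bash

# Hypothetical helper: accept only a non-empty absolute path with no
# shell metacharacters or embedded newlines in the remote output.
validate_remote_path() {
    local path=$1
    # Must start with / and not end with / (rejects "" and bare "/")
    case $path in
        /*[!/]) ;;
        *) return 1 ;;
    esac
    # Reject newlines and characters that could be abused downstream
    case $path in
        *['&;|$`<>']*|*$'\n'*) return 1 ;;
    esac
    return 0
}
```

For example, run `validate_remote_path "$FILE_PATH" || exit 1` right after the command substitution and before the scp call.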

When automating file transfers between machines, we often need to:

  1. SSH into a remote machine
  2. Execute commands to prepare files
  3. Return to the local machine
  4. Perform SCP operations using gathered information

The key is executing remote commands while maintaining script flow control locally.

Here's the approach I've found most effective:

#!/bin/bash

# 1. SSH command execution with output capture
remote_output=$(ssh user@remote-machine "command_to_locate_files")

# 2. Process remote output locally (grep -oP requires GNU grep)
file_path=$(echo "$remote_output" | grep -oP '/path/pattern.*')

# 3. Local SCP operation
scp "user@remote-machine:$file_path" ./local-destination/

Let's implement a complete solution for transferring the latest log file:

#!/bin/bash

# Get latest log file path from remote
latest_log=$(ssh admin@logs.example.com \
"ls -t /var/log/app/* | head -n 1")

# Verify we got a result
if [ -z "$latest_log" ]; then
    echo "Error: No log file found" >&2
    exit 1
fi

# Perform the transfer
echo "Transferring $latest_log..."
scp "admin@logs.example.com:$latest_log" ./logs/

# Continue with local processing
echo "File transfer complete. Beginning analysis..."
# Additional local commands...

Error Handling

Make the script more robust with proper error checking:

if ! remote_output=$(ssh -o ConnectTimeout=30 user@remote "command"); then
    echo "SSH command failed" >&2
    exit 1
fi

Multi-Step Remote Operations

For complex remote preparations:

remote_data=$(ssh user@remote <<'ENDSSH'
# Multiple commands executed on remote
file_path=$(find /data -name "*.tar.gz" -mtime -1 | head -n 1)
size=$(du -h "$file_path" | cut -f1)
echo "$file_path|$size"
ENDSSH
)

# Parse the multi-value output
IFS='|' read -r file_path size <<< "$remote_data"

Performance Tips

  • Use SSH connection multiplexing for multiple operations
  • Consider parallel transfers when dealing with multiple files
  • Compress data during transfer for large files

Example with compression:

ssh user@remote "tar czf - /path/to/files" | tar xzf - -C ./local-dir

Security Considerations

  • Use SSH keys instead of passwords
  • Restrict remote commands with command="..." in authorized_keys
  • Validate all remote output before processing locally
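
The command="..." restriction can be sketched as a single authorized_keys line on the remote side. The key material is abbreviated here, and the find command is just an example of the one operation this key is allowed to run:

```
command="find /data -name '*.tar.gz' -mtime -1",no-port-forwarding,no-X11-forwarding,no-pty ssh-ed25519 AAAA...key... deploy@local-host
```

Note that sshd runs the forced command no matter what the client asks for, so a key restricted this way cannot also be used for scp; use a second, separate key for the transfer step.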