How to Pipe Downloaded Files to stdout in Bash Using wget and Alternatives



When scripting in Bash, you'll often need to process downloaded files directly without saving them to disk. The standard wget command defaults to saving files, but we can modify this behavior.

For simple cases, wget can output to stdout using:

wget -qO- https://example.com/file.txt

The flags, broken down:

-q: Quiet mode (suppresses wget's progress and status messages)

-O-: Write output to the file named -, which by convention means stdout
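Note that -q silences error messages too, which can make failures hard to diagnose. A sketch of a less quiet variant (-nv is wget's --no-verbose mode) that keeps a one-line status on stderr while the body still streams to stdout:

wget -nv -O- https://example.com/file.txt | head -n 5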

Process a JSON file directly with jq:

wget -qO- https://api.example.com/data.json | jq '.results'

Count lines in a remote CSV:

wget -qO- https://data.example.com/large.csv | wc -l
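A useful side effect of piping: if the downstream command exits early, the pipe closes and wget stops fetching, so previewing a huge remote file doesn't download all of it. A sketch with the same placeholder URL:

wget -qO- https://data.example.com/large.csv | head -n 5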

For more flexibility, consider these alternatives:

curl

Often the simpler choice for piping, since curl writes to stdout by default:

curl -sL https://example.com/file.txt

Flags:

-s: Silent mode

-L: Follow redirects
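curl also has built-in retry support, useful for flaky connections in scripts. A sketch (-S shows errors even in silent mode; --retry and --retry-delay are standard curl options):

curl -sSL --retry 3 --retry-delay 2 https://example.com/file.txt | wc -l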

aria2

aria2c can speed up large downloads by fetching several segments in parallel, but it is built to write to disk and has no stdout mode, so download to a file first and pipe afterwards:

aria2c -q -x4 -o large-file.bin https://example.com/large-file.bin
sha256sum large-file.bin

Always check for success:

if ! data=$(wget -qO- "$url"); then
    echo "Download failed" >&2
    exit 1
fi
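The command substitution above captures wget's exit status directly. When piping instead, a pipeline's status comes from its last command, so a failed download is silently ignored; bash's pipefail option propagates it. A minimal sketch (with $url as a placeholder):

set -o pipefail
if ! wget -qO- "$url" | sha256sum; then
    echo "Download failed" >&2
    exit 1
fi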

For large files, streaming avoids memory issues:

wget -qO- https://example.com/huge.log | \
    while IFS= read -r line; do
        # Process each line
        echo "${#line}"
    done
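One bash gotcha with this pattern: a while loop fed by a pipe runs in a subshell, so any variables it sets disappear when the loop ends. Process substitution keeps the loop in the current shell. A sketch:

count=0
while IFS= read -r line; do
    count=$((count + 1))    # survives the loop, unlike in a piped subshell
done < <(wget -qO- https://example.com/huge.log)
echo "$count lines"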

So why do many scripts prefer curl for piping? Wget is primarily designed as a file retrieval tool, with features like:

  • Resume capabilities
  • Progress tracking
  • Complex download scenarios

These features make it less suitable for direct piping compared to simpler tools.
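When you do need those features, the natural pattern is to let wget write to disk and process the file afterwards. A sketch (-c resumes a partial download):

wget -cq https://example.com/large-file.bin
sha256sum large-file.bin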

Using curl for Streaming

curl -sL "https://example.com/file.txt" | process_command

The -s and -L flags are the same ones described earlier.

Wget with Output Redirection

If you must use wget:

wget -qO- "https://example.com/file.txt" | grep "search_term"

The -O- option tells wget to write to stdout.

JSON Processing Pipeline

curl -sL "https://api.example.com/data.json" | jq '.items[] | select(.value > 10)'

Compressed File Handling

curl -sL "https://example.com/archive.tar.gz" | tar -xzO | head -n 20

When working with large files:

  • Add --compressed to curl for compressed transfers
  • Use pv to monitor throughput: curl -sL url | pv | process (see the sketch after this list)
  • Consider aria2c for segmented downloads
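By default pv shows only throughput, since it cannot know a stream's total size. A sketch that asks the server for Content-Length first so pv can draw a percentage bar (assumes the server answers HEAD requests with that header; the URL is a placeholder):

url="https://example.com/huge.log"
# tail -n1 keeps the final header value when redirects produce several.
size=$(curl -sIL "$url" | awk 'tolower($1) == "content-length:" {print $2+0}' | tail -n 1)
curl -sL "$url" | pv -s "$size" | wc -l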

Always check for failures:

curl -f -sL "https://example.com/file" || echo "Download failed" >&2

The -f flag makes curl exit with a non-zero status on HTTP errors (4xx and 5xx) instead of printing the server's error page to stdout.
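To act on the exact failure, capture the exit status; curl reports HTTP errors seen via -f as exit code 22 (the variable names here are illustrative):

data=$(curl -fsSL "https://example.com/file")
status=$?
if [ "$status" -ne 0 ]; then
    echo "curl exited with status $status" >&2
    exit 1
fi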