How to Perform an HTTP HEAD Request with a Hard Timeout Using curl: Get the Final Status Code and URL



When you need to verify server availability without downloading content, a HEAD request is the perfect solution. Here's the basic curl command structure:

curl -I -L -s -o /dev/null -w "%{http_code} %{url_effective}\n" https://example.com

Key options breakdown:
- -I: Performs a HEAD request (instead of GET)
- -L: Follows redirects
- -s: Silent mode (no progress meter)
- -o /dev/null: Discards output
- -w: Custom output format
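
For example, if https://example.com redirected to https://www.example.com/ (a hypothetical redirect chain for illustration), the command above would print:

200 https://www.example.com/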

For production environments, you must implement timeouts to prevent hanging. curl provides several timeout-related options:

curl -I -L -m 10 --connect-timeout 5 https://example.com
  • -m 10: Total maximum time of 10 seconds (-m is the short form of --max-time)
  • --connect-timeout 5: Connection timeout (5 seconds)
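
If either limit is exceeded, curl aborts and exits with status 28 (operation timed out), which a script can check. A minimal sketch:

curl -I -L -s -o /dev/null -m 10 --connect-timeout 5 https://example.com
if [ $? -eq 28 ]; then
    echo "Request timed out"
fi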

Combine all requirements into a single robust command:

curl -I -L -s -o /dev/null -w "%{http_code} %{url_effective}\n" \
     -m 15 --connect-timeout 7 \
     --max-redirs 5 \
     https://example.com
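
If you run this check against many URLs, it can be convenient to wrap the command in a small shell function (the name check_url below is arbitrary):

check_url() {
    # Prints "<status_code> <final_url>" for the given URL
    curl -I -L -s -o /dev/null -w "%{http_code} %{url_effective}\n" \
         -m 15 --connect-timeout 7 --max-redirs 5 "$1"
}

check_url https://example.com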

Basic health check:

response=$(curl -I -s -o /dev/null -w "%{http_code}" -m 5 http://localhost:8080/health)
if [ "$response" -eq 200 ]; then
    echo "Service is healthy"
else
    echo "Service check failed"
fi
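
Note that if the request fails entirely (timeout, DNS failure, connection refused), %{http_code} is reported as 000, so you may want to handle that case separately. One possible extension of the check above:

response=$(curl -I -s -o /dev/null -w "%{http_code}" -m 5 http://localhost:8080/health)
if [ "$response" -eq 200 ]; then
    echo "Service is healthy"
elif [ "$response" -eq 000 ]; then
    echo "Service unreachable (no HTTP response)"
else
    echo "Service check failed (HTTP $response)"
fi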

Monitoring script example:

#!/bin/bash

URLS=("https://api.service1.com" "https://api.service2.com/v1/status")

for url in "${URLS[@]}"; do
    result=$(curl -I -L -s -o /dev/null -w "%{http_code} %{url_effective}" -m 10 "$url")
    read -r code final_url <<< "$result"
    
    echo "$(date) - $final_url returned $code" >> monitor.log
    if [ "$code" -ne 200 ]; then
        send_alert "Service $url is down (HTTP $code)"
    fi
done
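
The script assumes a send_alert function is defined elsewhere. A minimal placeholder, here only appending to the log and writing to stderr, could look like this; replace the body with your real notification channel (mail, chat webhook, pager, etc.):

send_alert() {
    local message="$1"
    # Placeholder implementation: log the alert and echo it to stderr
    echo "$(date) - ALERT: $message" >> monitor.log
    echo "ALERT: $message" >&2
}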

When your curl HEAD requests fail:

  1. Verify the server actually supports HEAD (some APIs only implement GET); see the fallback sketch after this list
  2. Check if intermediate proxies block HEAD requests
  3. Test with -v flag for verbose output during debugging
  4. Ensure DNS resolution is working (try with IP address)
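
For point 1, one approach is to retry the check as a GET when HEAD is rejected. A minimal sketch, assuming a 405 or 501 status means the server does not implement HEAD:

url="https://example.com"
code=$(curl -I -L -s -o /dev/null -w "%{http_code}" -m 10 "$url")
if [ "$code" -eq 405 ] || [ "$code" -eq 501 ]; then
    # HEAD rejected; repeat the request as a normal GET, still discarding the body
    code=$(curl -L -s -o /dev/null -w "%{http_code}" -m 10 "$url")
fi
echo "Final status: $code"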

When working with web services, it's often necessary to verify if a server is responsive without downloading the entire content. A HEAD request is perfect for this purpose, as it retrieves only the HTTP headers. Additionally, setting a hard timeout ensures your script doesn't hang indefinitely if the server is unresponsive.

Here's what we need to accomplish:

  • Send a HEAD request using curl
  • Follow redirects automatically
  • Get the final HTTP status code
  • Get the final URL after redirects
  • Set a hard timeout for the entire operation

Here's the complete curl command that meets all requirements:

curl -I -L -s -w "%{http_code} %{url_effective}\n" -o /dev/null --max-time 10 https://example.com

Let's break down the options:

  • -I: Sends a HEAD request (headers only)
  • -L: Follows redirects
  • -s: Silent mode (suppresses the progress meter)
  • -w: Custom output format showing status code and final URL
  • -o /dev/null: Discards the output
  • --max-time 10: Sets a 10-second hard timeout

Here's how you might use this in a bash script:

#!/bin/bash

url="https://example.com"
timeout=5

result=$(curl -I -L -sS -w "%{http_code} %{url_effective}\n" -o /dev/null --max-time "$timeout" "$url" 2>&1)

if [ $? -eq 0 ]; then
    echo "Success: $result"
else
    echo "Error: $result"
    exit 1
fi

Case 1: Server returns 200 OK

200 https://example.com/

Case 2: Server redirects (301/302)

200 https://www.example.com/

Case 3: Timeout occurs

Error: curl: (28) Operation timed out after 5001 milliseconds with 0 bytes received

For more complex scenarios, you might want to:

  1. Store the output in variables (see the combined sketch after this list):

     read -r status final_url <<< "$(curl -I -L -s -o /dev/null -w "%{http_code} %{url_effective}" --max-time 10 "$url")"

  2. Add a connection timeout (--connect-timeout) separately from the total timeout:

     curl --connect-timeout 3 --max-time 10 ...
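
Putting both points together, a sketch that captures the status code and final URL while applying separate connection and total timeouts:

url="https://example.com"
read -r status final_url <<< "$(curl -I -L -s -o /dev/null \
    -w "%{http_code} %{url_effective}" \
    --connect-timeout 3 --max-time 10 "$url")"
echo "Status: $status, final URL: $final_url"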
    

While curl is excellent for command-line use, you might consider these alternatives for other environments:

  • Python: Use requests library with timeout
  • Node.js: Use axios with timeout configuration
  • PHP: Use stream_context_create with timeout settings