When testing websites from the Linux command line, the following tools provide comprehensive HTTP response analysis and performance metrics:
# cURL with detailed timing metrics
curl -s -w "\n\nTime Breakdown:\n-------------\nDNS Lookup: %{time_namelookup}\nConnect: %{time_connect}\nTLS Handshake: %{time_appconnect}\nPre-transfer: %{time_pretransfer}\nRedirect: %{time_redirect}\nStart Transfer: %{time_starttransfer}\nTotal: %{time_total}\n" -o /dev/null https://example.com
HTTPie is a more modern alternative to cURL with more readable, colorized output:
# Install HTTPie
sudo apt install httpie
# Request and response headers plus bodies
http --print=hHbB --timeout=30 https://example.com
# Response metadata, including elapsed time (requires HTTPie 3.0+)
http --print=hbm https://example.com
For benchmarking under concurrent load, Siege simulates multiple users:
# Install Siege
sudo apt install siege
# Run 25 concurrent users for 30 seconds
siege -c25 -t30S -v https://example.com
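Siege can also read its targets from a file with -f, which is handy once you test more than one endpoint. A quick sketch, where urls.txt is a hypothetical file with one URL per line:
# Build a target list, then hit every URL in it
printf '%s\n' "https://example.com" "https://example.com/api" > urls.txt
siege -c10 -t30S -f urls.txt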
For more sophisticated performance testing, wrk delivers high throughput and supports Lua scripting:
# Basic usage
wrk -t12 -c400 -d30s https://example.com
# With Lua scripting
wrk -t4 -c100 -d10s -s script.lua https://example.com/api
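As a minimal sketch of what such a script.lua might contain, wrk exposes wrk.method, wrk.headers, and wrk.body for shaping each request (the JSON body below is just an illustrative example):
# Generate the script, then benchmark JSON POST requests with it
cat > script.lua <<'EOF'
-- Turn each benchmark request into a JSON POST
wrk.method = "POST"
wrk.headers["Content-Type"] = "application/json"
wrk.body = '{"query": "test"}'
EOF
wrk -t4 -c100 -d10s -s script.lua https://example.com/api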
Combine tools for comprehensive testing:
#!/bin/bash
URL="https://example.com"
TIMESTAMP=$(date +%Y%m%d_%H%M%S)
LOG_FILE="website_test_${TIMESTAMP}.log"
echo "Testing $URL at $(date)" > $LOG_FILE
echo -e "\n=== cURL Results ===" >> $LOG_FILE
curl -s -w "\nStatus: %{http_code}\nTotal Time: %{time_total}s\n" -o /dev/null $URL >> $LOG_FILE
echo -e "\n=== HTTPie Results ===" >> $LOG_FILE
http --print=hHbB --timeout=30 $URL >> $LOG_FILE 2>&1
echo -e "\nTest completed at $(date)" >> $LOG_FILE
When testing websites from the Linux command line, you'll want tools that can:
- Return HTTP status codes
- Measure response times
- Analyze page components
- Handle various protocols (HTTP/HTTPS)
For basic status checking and timing:
curl -s -o /dev/null -w "%{http_code} %{time_total}\n" https://example.com
Breaking down the flags:
- -s: silent mode
- -o /dev/null: discard output
- -w: custom output format
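Because -w prints just the machine-readable fields, the same one-liner drops neatly into a health check. A minimal sketch:
# Exit non-zero when the site does not return HTTP 200
status=$(curl -s -o /dev/null -w "%{http_code}" https://example.com)
if [ "$status" -ne 200 ]; then
  echo "Site unhealthy (HTTP $status)" >&2
  exit 1
fi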
HTTPie is more human-friendly than curl:
http --print=h https://example.com
For load testing with ApacheBench (ab):
ab -n 100 -c 10 https://example.com/
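ab can also dump per-request timing data in a gnuplot-friendly format with -g (results.tsv is just an example filename), which pairs well with the graphing suggestion at the end of this answer:
ab -n 100 -c 10 -g results.tsv https://example.com/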
Siege offers more sophisticated benchmarking:
siege -b -c 50 -t 1M https://example.com
wrk is a high-performance testing tool:
wrk -t12 -c400 -d30s https://example.com
To test individual elements with curl:
curl -s -w "%{time_total} %{http_code} %{size_download}\\n" -o /dev/null \
https://example.com/style.css
Sample bash script to test multiple URLs:
#!/bin/bash
urls=("https://example.com" "https://example.com/style.css" "https://example.com/script.js")
for url in "${urls[@]}"; do
  curl -s -o /dev/null -w \
    "URL: $url\nStatus: %{http_code}\nTotal Time: %{time_total}\n\n" "$url"
done
For better analysis, pipe the output to tools like the following (see the awk sketch after this list):
- grep for filtering
- awk for processing
- gnuplot for graphing
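For example, assuming the multi-URL script above is saved as check_urls.sh (a hypothetical name), awk can flag slow responses:
# Print only results whose total time exceeds 0.5 s
./check_urls.sh | awk '/^Total Time:/ { if ($3 + 0 > 0.5) print "SLOW:", $0 }'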