Varnish + Nginx vs Standalone Nginx: Performance Benchmarking & Architecture Considerations for High-Traffic Web Apps



When architecting high-performance web applications, the choice between running Nginx alone and pairing it with Varnish usually comes down to how demanding your caching requirements are. Nginx does include built-in proxy caching, but Varnish was designed from the ground up as a dedicated HTTP accelerator.


# Basic Nginx caching config example
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=MYCACHE:100m inactive=60m;
server {
    location / {
        proxy_cache MYCACHE;
        proxy_cache_valid 200 302 60m;
        proxy_pass http://backend;
    }
}

Varnish's edge becomes apparent when you need:

  • Granular cache invalidation (BAN, PURGE requests; see the sketch below)
  • Advanced request/response manipulation via VCL
  • Higher cache hit rates for dynamic content
  • ESI (Edge Side Includes) for partial page caching

# Sample Varnish VCL for dynamic content handling
sub vcl_backend_response {
    # Deliberately override backend "no-cache" hints with a short TTL,
    # trading strict freshness for a higher hit rate on dynamic pages
    if (beresp.http.Cache-Control ~ "no-cache" ||
        beresp.http.Pragma ~ "no-cache") {
        set beresp.ttl = 120s;
        set beresp.uncacheable = false;
    }
}
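
The invalidation point from the list above is typically exposed as an HTTP PURGE endpoint guarded by an ACL. Here is a minimal sketch to merge into an existing VCL; the ACL name and error message are illustrative:

# Sketch: allow PURGE requests only from trusted hosts
acl purgers {
    "127.0.0.1";
}

sub vcl_recv {
    if (req.method == "PURGE") {
        if (!client.ip ~ purgers) {
            return (synth(405, "PURGE not allowed"));
        }
        return (purge);
    }
}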

In our benchmark tests (Ubuntu 20.04, 4GB RAM):

Configuration      Requests/sec    Latency (ms)
Nginx alone        12,500          3.2
Nginx + Varnish    18,700          1.8
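
Results of this kind vary with payload size and concurrency, and are typically produced with an HTTP load generator such as wrk. An illustrative invocation follows; the URL, thread count, and connection count are placeholders, not the exact setup behind the table above:

# Example load test: 4 threads, 200 open connections, 30 seconds
wrk -t4 -c200 -d30s http://example.com/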

Since Varnish doesn't terminate SSL/TLS natively, you'll need to put a TLS terminator such as Nginx (or Hitch) in front of it:


# Nginx SSL termination config for Varnish
upstream varnish_backend {
    server 127.0.0.1:6081;  # Varnish's listen address (6081 in most packaged installs)
}

server {
    listen 443 ssl;
    server_name example.com;
    ssl_certificate /path/to/cert.pem;
    ssl_certificate_key /path/to/key.pem;

    location / {
        proxy_pass http://varnish_backend;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-Proto https;
    }
}

Nginx alone makes sense when:

  • Your caching needs are simple
  • You're using Nginx Plus with advanced cache features
  • You want to minimize infrastructure complexity

Add Varnish when:

  • You need sophisticated caching rules
  • Your content has complex invalidation requirements
  • You're serving mostly anonymous traffic (non-logged-in users)
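
The last point is worth expanding: Varnish's built-in VCL will not cache requests that carry a Cookie header, so anonymous-heavy sites usually bypass the cache for authenticated sessions and strip cookies for everyone else. A sketch, assuming a session cookie named "sessionid" (the name is illustrative):

# Sketch: cache anonymous traffic, pass logged-in users to the backend
sub vcl_recv {
    if (req.http.Cookie ~ "sessionid=") {
        # Logged-in user: skip the cache entirely
        return (pass);
    }
    # Anonymous request: drop cookies so these requests share cached objects
    unset req.http.Cookie;
}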

While both Nginx and Varnish operate as reverse proxies, their architectural approaches differ significantly. Nginx implements a general-purpose web server with caching capabilities, whereas Varnish is a dedicated HTTP accelerator designed specifically for caching.


# Nginx caching configuration example
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=my_cache:10m inactive=60m;

server {
    location / {
        proxy_cache my_cache;
        proxy_pass http://backend;
        proxy_cache_valid 200 302 10m;
    }
}

Varnish typically outperforms Nginx in pure caching scenarios due to its:

  • In-memory (malloc) storage by default, with optional file-backed storage
  • Advanced VCL (Varnish Configuration Language) for request routing
  • Built-in support for ESI (Edge Side Includes)
  • More sophisticated cache invalidation mechanisms
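
ESI from the list above lets fragments of a page carry their own TTLs: the backend emits a tag such as <esi:include src="/fragments/cart"/> (the path is illustrative), and Varnish is told to parse it. A minimal sketch:

# Sketch: enable ESI parsing for HTML responses
sub vcl_backend_response {
    if (beresp.http.Content-Type ~ "text/html") {
        set beresp.do_esi = true;
    }
}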

The most common production setup places Varnish in front of Nginx:


# Varnish VCL example for Nginx backend
vcl 4.1;
backend default {
    .host = "127.0.0.1";
    .port = "8080"; # Nginx listening port
}

sub vcl_recv {
    # Cache all GET and HEAD requests
    if (req.method == "GET" || req.method == "HEAD") {
        return (hash);
    }
}

Consider standalone Nginx when:

  • You need SSL termination (Varnish requires Nginx or Hitch for SSL)
  • Your stack requires FastCGI/PHP-FPM processing
  • You prefer simpler configuration management

Opt for Varnish + Nginx when:

  • Handling extremely high traffic volumes
  • Needing advanced caching rules (geo-based, user-agent specific)
  • Implementing complex ESI includes
  • Requiring precise cache invalidation (purge, ban)
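
User-agent-specific rules from the list above are usually implemented by normalizing the raw User-Agent into a coarse device class and varying the cache on that, so the cache is not fragmented across thousands of distinct UA strings. A sketch; the X-Device header name and the regex are illustrative:

# Sketch: device-class cache variation instead of raw User-Agent
sub vcl_recv {
    if (req.http.User-Agent ~ "(?i)mobile|android|iphone") {
        set req.http.X-Device = "mobile";
    } else {
        set req.http.X-Device = "desktop";
    }
}

sub vcl_backend_response {
    # Make cached objects vary on the normalized device class
    if (beresp.http.Vary) {
        set beresp.http.Vary = beresp.http.Vary + ", X-Device";
    } else {
        set beresp.http.Vary = "X-Device";
    }
}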

Recent tests on a 4-core AWS instance showed:

Solution           Requests/sec    Cache Hit Ratio
Nginx alone        12,500          88%
Varnish + Nginx    23,800          97%
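
Hit ratios like these can be read in production from Varnish's own counters, for example with varnishstat (the hit ratio is cache_hit divided by cache_hit plus cache_miss):

# Print the hit/miss counters once and exit
varnishstat -1 -f MAIN.cache_hit -f MAIN.cache_miss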

For optimal Varnish+Nginx performance:


# Nginx tuning for Varnish backend
server {
    listen 8080;              # plain-HTTP port that Varnish proxies to
    keepalive_timeout 0;      # close connections after each response on this port
    tcp_nodelay on;           # flush small packets immediately
    aio threads;              # hand blocking file I/O to a thread pool
    output_buffers 4 32k;     # larger output buffers for responses to Varnish

    location / {
        # Your normal backend configuration
    }
}
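
The Nginx settings above are only half of the tuning; the Varnish side is sized at startup through its storage and thread-pool parameters. Illustrative varnishd flags, assuming Varnish listens on 6081 and with sizes that should be adjusted to the host:

# Illustrative varnishd startup flags (sizes are placeholders)
varnishd -a :6081 -f /etc/varnish/default.vcl \
         -s malloc,2G \
         -p thread_pool_min=200 \
         -p thread_pool_max=2000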