Nginx's reverse proxy caching functionality allows you to create a highly performant architecture where frequently accessed dynamic content from backend servers (like Apache/Django) can be cached at the Nginx level. This significantly reduces load on your application servers while improving response times.
Here's a fundamental configuration to enable caching in your Nginx reverse proxy setup:
http {
    proxy_cache_path /path/to/cache levels=1:2 keys_zone=my_cache:10m inactive=60m use_temp_path=off;

    server {
        location / {
            proxy_pass http://backend;
            proxy_cache my_cache;
            proxy_cache_valid 200 302 10m;
            proxy_cache_valid 404 1m;
            proxy_cache_use_stale error timeout updating;
            proxy_cache_background_update on;
            add_header X-Proxy-Cache $upstream_cache_status;
        }
    }
}
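Under the hood, Nginx names each cache file after the MD5 hash of its cache key, and the levels=1:2 parameter splits those files into subdirectories taken from the end of the hash. A small Python sketch of that layout (the example key mimics the default $scheme$proxy_host$request_uri key and is purely illustrative):

```python
import hashlib
import os

def nginx_cache_file(cache_dir: str, cache_key: str) -> str:
    """Compute where Nginx stores a cached response for a given key.

    Nginx hashes the cache key with MD5; with levels=1:2 the first-level
    directory is the last hex character of the hash and the second-level
    directory is the two characters before it.
    """
    digest = hashlib.md5(cache_key.encode()).hexdigest()
    level1 = digest[-1]        # last hex character
    level2 = digest[-3:-1]     # the two characters before it
    return os.path.join(cache_dir, level1, level2, digest)

# Default cache key is $scheme$proxy_host$request_uri
print(nginx_cache_file("/path/to/cache", "httpbackend/index.html"))
```

Knowing this layout is handy when you need to verify on disk that a response was cached, or to delete an entry manually.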
For more granular control, you can set cache headers from your Django application; Nginx honors upstream Cache-Control and Expires headers when deciding how long to cache a response:

from django.http import HttpResponse
from django.views.decorators.cache import cache_control

@cache_control(max_age=3600, public=True)
def my_view(request):
    # Your view logic
    return HttpResponse("OK")
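Nginx reads the max-age directive from that upstream Cache-Control header (it takes priority over proxy_cache_valid unless you set proxy_ignore_headers). A minimal sketch of extracting it, for reasoning about what Nginx will do:

```python
import re

def max_age_seconds(cache_control: str):
    """Extract max-age from a Cache-Control header value, or None.

    Nginx uses this upstream directive (along with Expires and
    X-Accel-Expires) to decide how long a response stays cached.
    """
    match = re.search(r"max-age=(\d+)", cache_control)
    return int(match.group(1)) if match else None

print(max_age_seconds("public, max-age=3600"))  # 3600, as the decorator above emits
```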
Two effective methods for cache invalidation:
Time-based Expiration
proxy_cache_valid 200 301 302 1h;
proxy_cache_valid any 5m;
Purging Specific Items
Note that the proxy_cache_purge directive is not part of stock Nginx: it comes from the third-party ngx_cache_purge module (or, in a different form, from the commercial NGINX Plus). With that module installed, add this to your Nginx configuration:

location ~ /purge(/.*) {
    proxy_cache_purge my_cache $host$1$is_args$args;
}
Then trigger purges from Django:
import requests

# path: URI of the cached page to invalidate, e.g. '/articles/42/'
requests.request('PURGE', f'http://nginx-server/purge{path}')
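A slightly more defensive version of that call, sketched with only the standard library so it works without extra dependencies (the nginx-server host and /purge prefix are placeholders matching the location above):

```python
from urllib import request
from urllib.error import URLError

NGINX_BASE = "http://nginx-server"  # placeholder: your Nginx host

def purge_url(base: str, path: str) -> str:
    """Build the purge URL matching the /purge location above."""
    return f"{base}/purge{path}"

def purge_path(path: str) -> bool:
    """Ask Nginx to drop the cached entry for `path` (e.g. '/articles/42/').

    ngx_cache_purge answers 200 when the entry was purged and 404 when
    the key was not in the cache; treat anything else as failure.
    """
    req = request.Request(purge_url(NGINX_BASE, path), method="PURGE")
    try:
        with request.urlopen(req, timeout=2) as resp:
            return resp.status == 200
    except URLError:
        return False
```

Swallowing the URLError keeps a purge failure from breaking the request that triggered it; log it instead if you need visibility.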
To control exactly which requests share a cache entry, for example keeping per-user content separate, customize your cache keys:

proxy_cache_key "$scheme$request_method$host$request_uri$cookie_user";

Be aware that every variable you add fragments the cache further, so include only what actually changes the response.
Add these directives to track cache efficiency:
add_header X-Cache-Status $upstream_cache_status;
log_format cache_log '$remote_addr - $upstream_cache_status [$time_local] "$request"';
access_log /var/log/nginx/cache.log cache_log;
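With that log format in place, you can compute the hit ratio offline. A minimal sketch, assuming lines exactly as the cache_log format above produces them:

```python
def cache_hit_ratio(log_lines):
    """Compute the fraction of HIT entries among cacheable requests.

    The third whitespace-separated field is $upstream_cache_status;
    '-' means the request never touched the cache (e.g. a static file).
    """
    statuses = []
    for line in log_lines:
        fields = line.split()
        if len(fields) >= 3 and fields[2] != "-":
            statuses.append(fields[2])
    if not statuses:
        return 0.0
    return statuses.count("HIT") / len(statuses)

sample = [
    '10.0.0.1 - HIT [12/Jan/2024:10:00:00 +0000] "GET / HTTP/1.1"',
    '10.0.0.2 - MISS [12/Jan/2024:10:00:01 +0000] "GET /a HTTP/1.1"',
    '10.0.0.1 - HIT [12/Jan/2024:10:00:02 +0000] "GET / HTTP/1.1"',
    '10.0.0.3 - EXPIRED [12/Jan/2024:10:00:03 +0000] "GET /b HTTP/1.1"',
]
print(cache_hit_ratio(sample))  # 0.5
```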
Here's a comprehensive Nginx configuration for Django apps:
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=django_cache:10m inactive=24h max_size=1g;

server {
    location /static/ {
        alias /path/to/static/files;
    }

    location / {
        proxy_pass http://django_backend;
        proxy_cache django_cache;
        proxy_cache_key "$scheme$host$request_uri";
        proxy_cache_valid 200 302 10m;
        proxy_cache_valid 404 1m;
        proxy_cache_use_stale error timeout http_500 http_502 http_503 http_504;
        proxy_cache_lock on;
        # Any non-empty Cache-Control request header bypasses the cache
        proxy_cache_bypass $http_cache_control;
        add_header X-Cache $upstream_cache_status;
    }

    # Requires the third-party ngx_cache_purge module
    location ~ /purge(/.*) {
        allow 127.0.0.1;
        deny all;
        proxy_cache_purge django_cache $host$1$is_args$args;
    }
}
Nginx's built-in proxy cache is a mature and powerful tool for caching dynamic content. The proxy_cache machinery lets Nginx store responses from backend servers (like Apache serving Django applications) and serve subsequent requests directly from cache.
Here's a minimal configuration to get started with Nginx caching:
http {
    proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=my_cache:10m inactive=60m use_temp_path=off;

    server {
        listen 80;

        location / {
            proxy_pass http://apache_backend;
            proxy_cache my_cache;
            proxy_cache_valid 200 302 10m;
            proxy_cache_valid 404 1m;

            # Pass headers that might affect content
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
        }
    }
}
Time-based Expiration
The simplest approach uses the proxy_cache_valid directive:
proxy_cache_valid 200 302 10m; # Cache successful responses for 10 minutes
proxy_cache_valid 404 1m; # Cache 404s for 1 minute
proxy_cache_valid any 5m; # Cache all other responses for 5 minutes
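The durations above use Nginx's time units (s, m, h, d, w). A small converter to seconds, assuming the common single-unit form seen in these examples (Nginx also accepts compound values like "1h 30m", which this sketch does not handle):

```python
def nginx_time_to_seconds(value: str) -> int:
    """Convert a single-unit Nginx time value like '10m' or '1h' to seconds.

    A bare number is interpreted as seconds, matching Nginx's default.
    """
    units = {"s": 1, "m": 60, "h": 3600, "d": 86400, "w": 604800}
    if value[-1].isdigit():
        return int(value)
    return int(value[:-1]) * units[value[-1]]

print(nginx_time_to_seconds("10m"))  # 600
print(nginx_time_to_seconds("1h"))   # 3600
```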
Manual Cache Purging
For explicit invalidation (which relies on the third-party ngx_cache_purge module, or NGINX Plus), you'll need to:
- Configure a special location for cache purging
- Secure it properly (IP restriction or authentication)

location ~ /purge(/.*) {
    allow 127.0.0.1;
    deny all;
    proxy_cache_purge my_cache $scheme$host$1$is_args$args;
}
Cache Key Customization
For better cache control, customize your cache keys:
proxy_cache_key "$scheme$request_method$host$request_uri$cookie_user";
Bypassing Cache
Sometimes you need to bypass cache:
location /no-cache/ {
    proxy_pass http://apache_backend;
    proxy_cache_bypass 1;
    proxy_no_cache 1;
}
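proxy_cache_bypass and proxy_no_cache both treat a parameter as "true" when it is non-empty and not equal to "0". A one-line sketch of that rule, useful when reasoning about variables like $http_cache_control or $cookie_nocache:

```python
def nginx_truthy(values):
    """Mirror Nginx's proxy_cache_bypass test: bypass when at least one
    parameter value is non-empty and not equal to '0'."""
    return any(v not in ("", "0") for v in values)

print(nginx_truthy([""]))          # False: serve from cache
print(nginx_truthy(["0"]))         # False
print(nginx_truthy(["no-cache"]))  # True: bypass the cache
```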
To integrate with Django for cache invalidation:
# views.py
import requests

from django.http import HttpResponse

def purge_cache(request, path):
    # Ask Nginx's /purge location to drop the cached entry for this path.
    # In production, restrict who may call this view (e.g. staff only).
    purge_url = f"http://nginx-server/purge/{path}"
    response = requests.get(purge_url)
    return HttpResponse(f"Cache purged: {response.status_code}")
Note that $upstream_cache_status is only populated on requests that actually pass through the proxy, so a standalone status location cannot report it (the variable would be empty there). To monitor cache hits, expose the status as a response header and record it in the access log instead:

add_header X-Cache-Status $upstream_cache_status;
log_format cache_log '$remote_addr - $upstream_cache_status [$time_local] "$request"';
access_log /var/log/nginx/cache.log cache_log;
Here's a comprehensive configuration example:
http {
    proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=my_cache:10m
                     inactive=60m use_temp_path=off max_size=1g;

    upstream apache_backend {
        server 127.0.0.1:8080;
    }

    server {
        listen 80;
        server_name example.com;

        # Server status endpoint (connection counts, not cache hits)
        location /cache-status {
            allow 127.0.0.1;
            deny all;
            stub_status;
            access_log off;
        }

        # Purge endpoint (requires the third-party ngx_cache_purge module)
        location ~ /purge(/.*) {
            allow 127.0.0.1;
            allow 192.168.1.0/24;
            deny all;
            proxy_cache_purge my_cache $scheme$host$1$is_args$args;
        }

        # Main proxy configuration
        location / {
            proxy_pass http://apache_backend;
            proxy_cache my_cache;
            proxy_cache_valid 200 302 10m;
            proxy_cache_valid 404 1m;
            proxy_cache_valid any 5m;
            proxy_cache_use_stale error timeout updating;
            proxy_cache_background_update on;
            proxy_cache_lock on;
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        }
    }
}