While response compression (negotiated via Accept-Encoding) is universally implemented, request compression remains surprisingly underutilized despite its potential for bandwidth-heavy operations. The HTTP/1.1 spec (RFC 2616, since superseded by RFC 9110) supports request compression through the same Content-Encoding header, but browser support lags behind: no major browser compresses request bodies on its own.
Consider these scenarios where request compression delivers tangible benefits:
- Batch API operations with 10MB+ JSON payloads
- File uploads via POST/PUT
- GraphQL requests carrying large query documents and variables
- WebSocket messages (via the permessage-deflate extension rather than Content-Encoding)
Since the browser won't do it for us, we can implement request compression manually:
// Browser-side JavaScript using pako.js (npm install pako)
import pako from 'pako';

async function sendCompressedRequest(url, data) {
  // pako.gzip returns a Uint8Array, which fetch accepts directly as a body
  const compressed = pako.gzip(JSON.stringify(data));
  const response = await fetch(url, {
    method: 'POST',
    headers: {
      'Content-Encoding': 'gzip',
      'Content-Type': 'application/json'
    },
    body: compressed
  });
  return response.json();
}
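Where library weight matters, newer browsers expose a native CompressionStream API that can replace pako entirely. A minimal sketch, assuming the API is available (feature-detect with 'CompressionStream' in window before relying on it):

// Native alternative to pako: gzip a JSON body with CompressionStream
async function gzipBody(data) {
  const stream = new Blob([JSON.stringify(data)])
    .stream()
    .pipeThrough(new CompressionStream('gzip'));
  // Collect the compressed stream into an ArrayBuffer for fetch()
  return new Response(stream).arrayBuffer();
}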
For Node.js/Express servers:
const express = require('express');
const zlib = require('zlib');
const app = express();

// Capture the raw body as a Buffer so we can decompress it ourselves
app.use(express.raw({ type: '*/*', limit: '50mb' }));

app.post('/api', (req, res) => {
  const encoding = req.headers['content-encoding'];
  if (encoding === 'gzip') {
    zlib.gunzip(req.body, (err, buffer) => {
      if (err) return res.status(400).send('Invalid compressed data');
      try {
        const data = JSON.parse(buffer.toString());
        // Process decompressed data...
        res.json({ status: 'processed' });
      } catch (e) {
        res.status(400).send('Invalid JSON');
      }
    });
  } else {
    // Handle uncompressed requests the usual way
    const data = JSON.parse(req.body.toString());
    res.json({ status: 'processed' });
  }
});
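Worth knowing: Express's built-in express.json() (body-parser under the hood) already inflates gzip- and deflate-encoded request bodies by default (its inflate option defaults to true), so the manual branch above is only necessary when you need the raw buffer:

// Simpler variant relying on body-parser's built-in inflation
app.use(express.json({ limit: '50mb' })); // handles Content-Encoding: gzip/deflate transparently
app.post('/api/simple', (req, res) => {
  // req.body is already the parsed, decompressed JSON
  res.json({ keys: Object.keys(req.body).length });
});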
Compression isn't free. Benchmark these factors before enabling it (see the sketch after this list):
- CPU overhead vs. bandwidth savings
- Optimal compression level (zlib's 1-9 scale; higher is smaller but slower)
- Minimum payload size threshold: below ~1KB the overhead exceeds the benefit, 1KB-10KB yields marginal gains (10-30% reduction), and beyond 10KB the savings become significant (60-90% for typical JSON/XML)
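To ground those numbers for your own traffic, a quick measurement loop is enough. A minimal Node.js sketch (the payload shape is an arbitrary stand-in for your real data):

const zlib = require('zlib');

// Synthetic JSON payload -- substitute a representative sample of your traffic
const payload = Buffer.from(JSON.stringify({
  rows: Array.from({ length: 5000 }, (_, i) => ({ id: i, name: `item-${i}` }))
}));

for (const level of [1, 6, 9]) {
  const start = process.hrtime.bigint();
  const out = zlib.gzipSync(payload, { level });
  const ms = Number(process.hrtime.bigint() - start) / 1e6;
  console.log(`level ${level}: ${payload.length} -> ${out.length} bytes in ${ms.toFixed(2)} ms`);
}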
For modern applications, also consider:
- HTTP/2 header compression (HPACK), which shrinks headers but not request bodies
- Binary protocols like Protocol Buffers
- WebTransport for QUIC/UDP-based transmission
Note that compressed data can:
- Obscure malicious payloads from WAFs
- Be vulnerable to BREACH-style side channels when secrets and attacker-controlled data share a compression context
- Require additional validation after decompression (see the size-cap sketch below)
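That last point deserves code: a gzipped kilobyte can inflate to gigabytes (a "zip bomb"). Node's zlib convenience methods accept a maxOutputLength option (Node >= 12.19) that caps the inflated size; a minimal sketch:

// Reject anything that inflates beyond 10 MB before JSON.parse ever runs
zlib.gunzip(req.body, { maxOutputLength: 10 * 1024 * 1024 }, (err, buffer) => {
  if (err) return res.status(413).send('Payload too large or invalid');
  // ...continue with JSON.parse and schema validation
});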
For REST APIs outside the browser, the same technique works from the command line:
# Compress on the fly and stream the result as the request body
gzip -c large_file.xml | curl -X PUT \
  -H "Content-Encoding: gzip" \
  -H "Content-Type: application/xml" \
  --data-binary @- \
  https://api.example.com/resource
HTTP/2 improves transport efficiency but doesn't replace payload compression. For binary protocols, evaluate (a size-comparison sketch follows this list):
- Protocol Buffers for compact, schema-driven serialization (it has no built-in compression, so pair it with gzip or similar when needed)
- MessagePack for compact, schema-less serialization
- Custom binary formats with LZ4/Snappy when compression speed matters most
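As a quick illustration of how serialization and compression gains stack, a sketch comparing JSON and MessagePack sizes (assumes the @msgpack/msgpack package is installed; the sample document is arbitrary):

const { encode } = require('@msgpack/msgpack');
const zlib = require('zlib');

const doc = { ids: Array.from({ length: 1000 }, (_, i) => i), label: 'sample' };
const json = Buffer.from(JSON.stringify(doc));
const packed = Buffer.from(encode(doc));

// Compare raw sizes, then sizes after gzip -- compact serialization and
// compression are complementary, not interchangeable
console.log(`JSON: ${json.length} B  MessagePack: ${packed.length} B`);
console.log(`JSON+gzip: ${zlib.gzipSync(json).length} B  MessagePack+gzip: ${zlib.gzipSync(packed).length} B`);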