How To Cache Content on Nginx
In this tutorial, we will show you how to cache content on the Nginx web server. Nginx stands as one of the most powerful and versatile web servers available today, serving millions of websites across the internet. Among its many capabilities, content caching is a critical feature that can dramatically improve your web application’s performance, reduce server load, and enhance user experience. By intelligently storing copies of frequently accessed content, Nginx eliminates redundant processing and delivers responses at lightning speed.
This guide will walk you through everything you need to know about implementing effective caching strategies in Nginx, from basic setup to advanced techniques that can transform your website’s performance.
Understanding Content Caching Fundamentals
At its core, caching refers to the process of storing copies of resources to reduce the need for repetitive work. When it comes to web servers like Nginx, caching allows the server to save previously generated responses and reuse them when similar requests arrive.
How caching improves performance
When a user visits your website, their browser requests various resources – HTML pages, CSS files, JavaScript, images, and more. Without caching, Nginx would forward each request to your application server (like PHP, Python, or Node.js), which must process the request and generate a response every time. This repetitive processing creates unnecessary load and increases response times.
With caching enabled, Nginx can intercept these requests, check if it already has a valid cached response, and deliver it directly to the client without involving the application server. This process dramatically reduces:
- Server CPU and memory utilization
- Database load
- Network traffic between servers
- Response time (often by orders of magnitude)
- Energy consumption and hosting costs
Caching particularly benefits high-traffic websites where the same content is frequently requested by different users. For instance, popular blog posts, product pages, or media files can be served directly from the cache rather than regenerating identical content thousands of times.
Prerequisites for Nginx Caching
Before diving into configuration, ensure your system meets these requirements:
- A functioning Nginx installation (a reasonably recent release; some directives used later, such as proxy_cache_background_update and the slice module, require newer versions)
- Root or sudo access to modify Nginx configuration files
- Basic familiarity with Linux file permissions
- Sufficient disk space for cache storage
- A text editor for configuration (nano, vim, etc.)
Essential directory structure
You’ll need to create a dedicated directory for Nginx to store cached content. This directory should be:
- Owned by the Nginx user (usually www-data or nginx)
- Located on a fast storage device (SSD recommended for production)
- Configured with appropriate permissions (typically 700 or 755)
For example, create a cache directory with:
sudo mkdir -p /var/cache/nginx
sudo chown -R www-data:www-data /var/cache/nginx
sudo chmod 755 /var/cache/nginx
After making configuration changes, remember to verify the syntax and reload Nginx (a reload applies changes without dropping active connections; a full restart also works):
sudo nginx -t
sudo systemctl reload nginx
Caching Static Content in Nginx
Static content includes resources that rarely change, such as images, CSS files, JavaScript, PDFs, and other media files. These are perfect candidates for aggressive caching strategies.
Basic static content caching
The simplest way to enable caching for static content is the expires directive in your Nginx configuration. This directive adds Cache-Control and Expires headers to responses, instructing browsers how long to cache content locally.
Open your Nginx configuration file:
sudo nano /etc/nginx/sites-available/default
Add caching rules for different content types:
# Add before the server block, inside the http context
map $sent_http_content_type $expires {
    default                 off;
    text/html               epoch;
    text/css                max;
    application/javascript  max;
    ~image/                 max;
    ~font/                  max;
}

server {
    listen 80;
    server_name example.com;

    # Apply expiration map
    expires $expires;

    # Other configuration...
}
In this configuration:
- max sets expiration to the maximum value (about one year)
- epoch effectively disables caching (useful for HTML that changes frequently)
- off doesn’t add any caching headers
For more granular control, you can set specific cache durations for different locations:
location ~* \.(jpg|jpeg|png|gif|ico)$ {
    expires 30d;
}

location ~* \.(css|js)$ {
    expires 7d;
}
After implementing these rules, use your browser’s developer tools, or a command-line client such as curl -I, to verify caching headers are correctly applied. You should see Cache-Control and Expires headers in the response.
Caching Dynamic Content in Nginx
Dynamic content represents a greater challenge for caching as it’s generated on-demand and may vary based on user input, database queries, or other factors. Nginx offers powerful tools for caching dynamic content through its reverse proxy capabilities.
Setting up proxy cache
First, configure a cache storage location in your main Nginx configuration file:
http {
    # Other directives...

    proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=my_cache:10m
                     max_size=1g inactive=60m use_temp_path=off;
}
This configuration:
- Creates a cache in /var/cache/nginx
- Uses a two-level directory hierarchy (levels=1:2) for better performance
- Allocates 10MB of shared memory for cache metadata (keys_zone=my_cache:10m)
- Limits total cache size to 1GB (max_size=1g)
- Removes entries inactive for 60 minutes (inactive=60m)
- Writes cached files directly into the cache path instead of a temporary area first (use_temp_path=off)
Next, enable caching in your server or location blocks:
server {
    # Server configuration...

    location / {
        proxy_cache my_cache;
        proxy_cache_valid 200 302 10m;
        proxy_cache_valid 404 1m;
        proxy_pass http://backend_server;

        # Optional: Add cache status header
        add_header X-Cache-Status $upstream_cache_status;
    }
}
This configuration:
- Enables caching using the zone defined earlier
- Caches successful responses (200) and redirects (302) for 10 minutes
- Caches not-found responses (404) for 1 minute
- Passes requests to your backend application
- Adds a header showing cache status (HIT, MISS, EXPIRED, etc.)
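Two related directives are worth knowing here, though they don’t appear in the example above: proxy_cache_key controls what identifies a cached entry, and proxy_cache_min_uses delays caching until a resource has been requested several times. A sketch (the min_uses value of 3 is illustrative):

```nginx
location / {
    proxy_cache my_cache;

    # Explicit cache key; this is also Nginx's default for proxied requests
    proxy_cache_key $scheme$proxy_host$request_uri;

    # Only cache a resource after it has been requested 3 times,
    # keeping one-off URLs from filling the cache
    proxy_cache_min_uses 3;

    proxy_cache_valid 200 302 10m;
    proxy_pass http://backend_server;
}
```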
Caching PHP content with FastCGI
For PHP applications, use FastCGI caching:
http {
    # Other directives...

    fastcgi_cache_path /var/cache/nginx/fcgi levels=1:2 keys_zone=fcgi_cache:10m
                       max_size=1g inactive=60m;
    fastcgi_cache_key "$scheme$request_method$host$request_uri";
}

server {
    # Server configuration...

    location ~ \.php$ {
        fastcgi_cache fcgi_cache;
        fastcgi_cache_valid 200 10m;
        fastcgi_cache_bypass $cookie_PHPSESSID;
        fastcgi_no_cache $cookie_PHPSESSID;

        # Standard FastCGI configuration
        fastcgi_pass unix:/var/run/php/php-fpm.sock;
        fastcgi_index index.php;
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
    }
}
This setup:
- Creates a dedicated cache for FastCGI responses
- Generates cache keys based on the request details
- Caches successful responses for 10 minutes
- Bypasses cache for authenticated users (with PHP sessions)
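One refinement not shown above: when a popular page expires, many requests can miss simultaneously and all hit PHP at once. The fastcgi_cache_lock directive lets a single request populate a given cache entry while the others wait; a sketch building on the fcgi_cache zone (the timeout value is illustrative):

```nginx
location ~ \.php$ {
    fastcgi_cache fcgi_cache;
    fastcgi_cache_valid 200 10m;

    # Let one request regenerate an expired entry; others wait up to 5s
    fastcgi_cache_lock on;
    fastcgi_cache_lock_timeout 5s;

    fastcgi_pass unix:/var/run/php/php-fpm.sock;
    include fastcgi_params;
    fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
}
```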
Advanced Caching Techniques
Once you’ve mastered basic caching, these advanced techniques can further optimize performance.
Micro-caching for dynamic content
Micro-caching involves caching dynamic content for very short periods (1-10 seconds). This technique is particularly effective for high-traffic pages where even slightly outdated content is acceptable:
location / {
    proxy_cache my_cache;
    proxy_cache_valid 200 5s;
    proxy_cache_use_stale updating error timeout invalid_header http_500;
    proxy_cache_background_update on;
    proxy_pass http://backend_server;
}
This configuration caches responses for just 5 seconds but can dramatically reduce server load during traffic spikes.
Serving stale content during updates
The proxy_cache_use_stale directive allows Nginx to serve expired cached content in specific situations, such as when the backend is unreachable or while the cache entry is being updated:
location / {
    proxy_cache my_cache;
    proxy_cache_valid 200 5m;
    proxy_cache_use_stale updating error timeout http_500 http_502 http_503 http_504;
    proxy_cache_background_update on;
    proxy_pass http://backend_server;
}
This approach ensures users always receive a response, even if it’s slightly outdated, improving perceived performance and availability.
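The same serve-stale idea can also be delegated to browsers and downstream caches with the stale-while-revalidate Cache-Control extension (RFC 5861); client support varies, so treat this as an optional hint rather than a guarantee:

```nginx
location / {
    # Fresh for 60s; afterwards clients may keep reusing the response for
    # up to 5 more minutes while revalidating it in the background
    add_header Cache-Control "max-age=60, stale-while-revalidate=300";
    proxy_pass http://backend_server;
}
```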
Cache slicing for large files
For large media files, Nginx’s Cache Slice module allows efficient handling of byte-range requests:
http {
    # Other directives...

    proxy_cache_path /var/cache/nginx/slices levels=1:2 keys_zone=slice_cache:10m
                     max_size=1g inactive=60m;
}

server {
    # Server configuration...

    location /videos/ {
        slice 1m;
        proxy_cache slice_cache;
        proxy_cache_key $uri$is_args$args$slice_range;
        proxy_set_header Range $slice_range;
        proxy_http_version 1.1;
        proxy_cache_valid 200 206 24h;
        proxy_pass http://backend_server;
    }
}
This configuration divides large files into 1MB slices, allowing Nginx to cache and serve partial content efficiently. This is particularly valuable for streaming media like videos.
Cache-Control Headers Management
HTTP cache headers determine how browsers and proxies cache content. Configuring these properly is essential for an effective caching strategy.
Understanding key cache headers
- Cache-Control: Primary header for defining caching policies
- Expires: Legacy header specifying an exact expiration date/time
- ETag: Validator that changes when the content changes, enabling conditional requests
- Last-Modified: Timestamp of when the resource was last changed
Setting cache headers in Nginx
You can set or modify these headers using Nginx directives:
location ~* \.(jpg|jpeg|png|gif)$ {
    # Cache images for 30 days
    expires 30d;
    add_header Cache-Control "public";

    # Enable validation; Nginx generates ETag and Last-Modified
    # headers automatically for static files
    etag on;
}

location / {
    # For dynamic content - prevent caching
    add_header Cache-Control "no-store, must-revalidate";
    expires 0;
}
Overriding upstream headers
Sometimes you need to modify cache headers sent by your application:
location / {
    proxy_ignore_headers Cache-Control Expires;
    proxy_hide_header Cache-Control;
    proxy_hide_header Expires;
    add_header Cache-Control "public, max-age=3600";
    proxy_pass http://backend_server;
}
This configuration removes original cache headers and applies your own caching policy, giving you complete control over how content is cached.
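A related gotcha: by default Nginx refuses to cache any response that carries a Set-Cookie header. If your backend sets an inconsequential cookie on otherwise cacheable pages, you can ignore and hide it; only do this when you are sure the cookie doesn’t personalize the response:

```nginx
location / {
    # Responses with Set-Cookie are never cached unless the header is ignored
    proxy_ignore_headers Set-Cookie;
    proxy_hide_header Set-Cookie;

    proxy_cache my_cache;
    proxy_cache_valid 200 10m;
    proxy_pass http://backend_server;
}
```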
Caching for Single-Page Applications
Single-page applications (SPAs) built with frameworks like React, Angular, or Vue present unique caching challenges due to their architecture.
Optimal caching strategy for SPAs
server {
    # Server configuration...

    # Cache static assets aggressively
    location /static/ {
        expires max;
        add_header Cache-Control "public, immutable";
    }

    # Don't cache the HTML entry point
    location = /index.html {
        expires -1;
        add_header Cache-Control "no-store, must-revalidate";
    }

    # For all other routes, serve index.html but don't cache
    location / {
        try_files $uri $uri/ /index.html;
        add_header Cache-Control "no-store, must-revalidate";
    }
}
This setup applies different caching policies to different parts of your SPA:
- Static assets (JavaScript, CSS, images) are cached aggressively
- The main HTML file is never cached, ensuring users always get the latest version
- All routes serve the main HTML file without caching
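The aggressive /static/ policy is only safe when your build tool emits content-hashed filenames (app.3f9c2b.js is a hypothetical example), because immutable tells browsers never to revalidate. To be stricter, match only fingerprinted names; a sketch:

```nginx
# Cache only fingerprinted assets forever; the hash in the
# filename changes whenever the content does
location ~* \.[0-9a-f]{8,}\.(js|css)$ {
    expires max;
    add_header Cache-Control "public, immutable";
}
```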
For API requests from your SPA:
# Assumes an api_cache zone defined with proxy_cache_path in the http block
location /api/ {
    proxy_cache api_cache;
    proxy_cache_valid 200 30s;
    proxy_cache_methods GET HEAD;
    proxy_pass http://api_backend;
}
This configuration applies short-lived caching to API responses, reducing load while keeping data relatively fresh.
Cache Invalidation and Purging
No caching system is complete without the ability to invalidate cached content when the underlying data changes.
Manual cache purging
To manually purge specific cached items, install and configure the third-party ngx_cache_purge module (it is not bundled with stock Nginx, so it must be compiled in or obtained from a package that includes it):
http {
    # Other directives...
    proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=my_cache:10m;
}

server {
    # Server configuration...

    # Cached entries must use a key the purge location can reproduce
    location / {
        proxy_cache my_cache;
        proxy_cache_key $uri$is_args$args;
        proxy_pass http://backend_server;
    }

    location ~ /purge(/.*) {
        # Restrict access to purge requests
        allow 127.0.0.1;
        deny all;
        proxy_cache_purge my_cache $1$is_args$args;
    }
}
The argument to proxy_cache_purge must mirror proxy_cache_key exactly; avoid including $request_method in the key, since purge requests arrive with the PURGE method while entries are cached under GET.
With this configuration, you can purge specific URLs:
curl -X PURGE http://localhost/purge/path/to/cached/resource
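If you cannot add the module, cached entries can also be located on disk: Nginx names each cache file after the MD5 hash of its cache key, with the levels=1:2 subdirectories taken from the end of that hash. A sketch that derives the path for one entry; the key shown is illustrative and must match your proxy_cache_key:

```shell
# Illustrative cache key (default format: $scheme$proxy_host$request_uri)
key="httpbackend_server/path/to/resource"

# Nginx stores the entry under md5(key); with levels=1:2 the last hash
# character names the first directory, the two before it the second
hash=$(printf '%s' "$key" | md5sum | awk '{print $1}')
l1=$(printf '%s' "$hash" | cut -c32)      # last character
l2=$(printf '%s' "$hash" | cut -c30-31)   # the two preceding characters
echo "/var/cache/nginx/$l1/$l2/$hash"
```

Deleting that file (as root) evicts the entry; Nginx simply treats the next request for that URL as a miss.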
Automated cache invalidation
For automated invalidation, integrate your application’s content management system with Nginx caching:
- Create a script that sends PURGE requests to Nginx when content changes
- Configure your CMS to call this script after updates
- Consider using webhooks or event-driven architecture for real-time invalidation
For example, in a WordPress plugin:
function purge_nginx_cache($post_id) {
    $url       = get_permalink($post_id);
    $parts     = parse_url($url);
    $purge_url = 'http://localhost/purge' . $parts['path'];
    wp_remote_request($purge_url, array('method' => 'PURGE'));
}
add_action('save_post', 'purge_nginx_cache');
Cache Monitoring and Troubleshooting
Effective monitoring ensures your caching strategy works as expected and helps identify potential issues.
Monitoring cache performance
Enable cache status headers to track cache effectiveness:
location / {
    proxy_cache my_cache;
    add_header X-Cache-Status $upstream_cache_status;
    proxy_pass http://backend_server;
}
This adds a header showing:
- HIT: Response came from cache
- MISS: Response wasn’t cached
- EXPIRED: Cached response had expired
- UPDATING: Serving stale content while the entry is refreshed
- STALE: Serving stale content
Use log formats to track cache statistics:
# In the http block
log_format cache_log '$remote_addr - $upstream_cache_status [$time_local] '
                     '"$request" $status $body_bytes_sent '
                     '"$http_referer" "$http_user_agent"';

server {
    # Server configuration...
    access_log /var/log/nginx/cache.log cache_log;
}
Analyze logs with tools like GoAccess or parse them with scripts to calculate hit ratios and identify frequently missed content.
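With the cache_log format above in place, a hit ratio is one awk invocation away. A sketch against a fabricated sample log (the log lines are illustrative; point awk at /var/log/nginx/cache.log in practice):

```shell
# Fabricated sample in the cache_log format; field 3 is $upstream_cache_status
cat > /tmp/cache_sample.log <<'EOF'
10.0.0.1 - HIT [01/Jan/2025:00:00:01 +0000] "GET / HTTP/1.1" 200 512 "-" "curl"
10.0.0.2 - MISS [01/Jan/2025:00:00:02 +0000] "GET /a HTTP/1.1" 200 1024 "-" "curl"
10.0.0.3 - HIT [01/Jan/2025:00:00:03 +0000] "GET / HTTP/1.1" 200 512 "-" "curl"
10.0.0.4 - EXPIRED [01/Jan/2025:00:00:04 +0000] "GET /b HTTP/1.1" 200 256 "-" "curl"
EOF

# Count HITs against all requests
awk '{ total++; if ($3 == "HIT") hits++ }
     END { printf "hit ratio: %.0f%%\n", 100 * hits / total }' /tmp/cache_sample.log
# prints: hit ratio: 50%
```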
Common caching issues and solutions
- Content not being cached
  - Check cache configuration paths and permissions
  - Verify cache keys are properly defined
  - Ensure response headers don’t prevent caching
- Stale content being served
  - Review cache invalidation mechanisms
  - Adjust cache validity durations
  - Implement cache purging for critical updates
- Cache growing too large
  - Set appropriate max_size and inactive parameters
  - Monitor disk usage and adjust as needed
  - Consider segregating caches for different content types
Real-World Optimization Scenarios
Different applications require different caching strategies. Here are some tailored approaches:
E-commerce site optimization
# Cache product images and static assets aggressively
location ~* \.(jpg|jpeg|png|gif|ico|css|js)$ {
    expires 30d;
    add_header Cache-Control "public, immutable";
}

# Product pages - short cache with validation
# (assumes a product_cache zone defined with proxy_cache_path in the http block)
location ~* ^/product/ {
    proxy_cache product_cache;
    proxy_cache_valid 200 5m;
    proxy_cache_bypass $cookie_session $arg_nocache;
    proxy_cache_key "$host$request_uri$cookie_user";
    proxy_pass http://backend_server;
}

# Shopping cart and checkout - no caching
location ~* ^/(cart|checkout)/ {
    proxy_no_cache 1;
    proxy_pass http://backend_server;
}
Content management system optimization
For WordPress, Drupal, or similar CMS platforms:
# Cache static assets
location ~* \.(jpg|jpeg|png|gif|ico|css|js)$ {
    expires 7d;
}

# Cache pages for logged-out users only
# (assumes a cms_cache zone defined with proxy_cache_path in the http block)
location / {
    proxy_cache cms_cache;
    proxy_cache_valid 200 10m;
    proxy_cache_bypass $cookie_wordpress_logged_in;
    proxy_no_cache $cookie_wordpress_logged_in;
    proxy_pass http://backend_server;
}

# Admin areas - no caching
location ~* ^/wp-admin/ {
    proxy_no_cache 1;
    proxy_pass http://backend_server;
}
High-traffic media site
For news sites or blogs with traffic spikes:
# Micro-caching for homepage and article pages
# (assumes a media_cache zone defined with proxy_cache_path in the http block)
location / {
    proxy_cache media_cache;
    proxy_cache_valid 200 30s;
    proxy_cache_use_stale updating error timeout;
    proxy_cache_background_update on;
    proxy_pass http://backend_server;
}

# Aggressive caching for media files
location /media/ {
    expires max;
    add_header Cache-Control "public, immutable";
}