# CDN vs Application Cache — Where to Cache What
**TL;DR:** Your app has caching opportunities at 5 distinct layers — browser, CDN edge, reverse proxy, app memory, database. Putting the right cache at the right layer can cut response time from 500ms to 20ms. This guide maps each cache decision, with DomainIndia + Cloudflare examples.
## The 5 caching layers
| Layer | Latency to user | Typical TTL | Who sets it |
|---|---|---|---|
| Browser (HTTP cache) | 0ms | minutes-days | HTTP headers |
| CDN edge (Cloudflare) | 10-50ms | seconds-days | HTTP headers + rules |
| Reverse proxy (nginx, Varnish) | 1-5ms | seconds-minutes | proxy config |
| Application cache (Redis, memcached) | 1-2ms | minutes-hours | app code |
| Database (query cache, materialised views) | 0-10ms | varies | DB config / app |
Each layer catches different things. Use them together.
## Decision matrix — what goes where
| Content type | Best cache layer | TTL |
|---|---|---|
| Static assets (CSS, JS, images, fonts) | Browser + CDN | 1 year |
| Public HTML pages (marketing) | CDN | 1-24 hours |
| Personalised HTML (dashboard) | App cache (per-user) | Minutes |
| DB query results (user profile) | App (Redis) | Minutes |
| DB query results (product list) | App (Redis) + CDN API cache | 1-5 min |
| Third-party API responses | App (Redis) | Hours |
| Computed views (aggregations) | Materialised view + App | Hours-days |
| Session data | App (Redis) | Until expiry |
| Rate-limit counters | App (Redis) | Window |
## Layer 1 — Browser cache
The cheapest cache: responses are served without any network request at all.
Set HTTP headers:
```nginx
location ~* \.(jpg|jpeg|png|gif|webp|svg|css|js|woff2|ttf)$ {
    expires 1y;
    add_header Cache-Control "public, immutable";
    add_header Vary "Accept-Encoding";
}

location /api/ {
    expires -1;  # no caching for API responses
    add_header Cache-Control "no-store";
}

location / {
    expires 1h;
    add_header Cache-Control "public, must-revalidate";
}
```
**Fingerprinting** your static files busts the cache automatically:
`bundle.abc123.js` — when content changes, hash changes, browser refetches. Set TTL = forever.
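Most bundlers do this for you, but the idea is simple enough to sketch. A minimal TypeScript version — the `fingerprint` helper and its naming scheme are illustrative, not any particular bundler's API:

```typescript
import { createHash } from "node:crypto";

// Derive a content-addressed filename: bundle.js -> bundle.<hash8>.js.
// When the file's bytes change, the hash changes, so browsers holding a
// "cache forever" copy still fetch the new version under the new name.
function fingerprint(name: string, contents: string | Buffer): string {
  const hash = createHash("sha256").update(contents).digest("hex").slice(0, 8);
  const dot = name.lastIndexOf(".");
  return `${name.slice(0, dot)}.${hash}${name.slice(dot)}`;
}
```

Serve the fingerprinted name with `Cache-Control: public, immutable` and a 1-year TTL; old names simply age out of caches.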
## Layer 2 — CDN edge (Cloudflare)
Cloudflare caches what you tell it to. By default: static files yes, HTML no.
### Cache everything with Page Rule
Cloudflare Dashboard → Caching → Page Rules → Add:
```
URL: *yourcompany.com/*
Settings:
Cache Level: Cache Everything
Edge Cache TTL: 2 hours
Browser Cache TTL: 30 minutes
```
Now static HTML pages cache at edge too.
### Bypass cache for dynamic paths
```
URL: *yourcompany.com/admin/*
Settings:
Cache Level: Bypass
```
And for cookies:
```
URL: *yourcompany.com/*
Settings:
Cache Level: Cache Everything
Edge Cache TTL: 2h
Cache by Device Type: On (separate mobile/desktop variants)
Bypass Cache on Cookie: (wp_logged_in|sessionid|auth_token)
```
When `auth_token` cookie is set → Cloudflare skips cache, hits origin. Logged-out users see cached; logged-in users see fresh.
### Cloudflare's own analytics
See hit ratio: Cloudflare → Analytics → Cache.
Target: >70% hit ratio for content sites. Dashboard apps may be lower but still benefit from static-asset caching.
### Purge cache after deploy
```bash
curl -X POST "https://api.cloudflare.com/client/v4/zones/YOUR_ZONE_ID/purge_cache" \
  -H "Authorization: Bearer YOUR_API_TOKEN" \
  -H "Content-Type: application/json" \
  --data '{"purge_everything":true}'
```
Or purge specific URLs:
```bash
--data '{"files":["https://yourcompany.com/api/users","https://yourcompany.com/"]}'
```
Add to your deploy pipeline.
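If your deploy tooling is Node-based, the same purge can be a few lines of TypeScript instead of curl. A sketch against the same endpoint — `purgeBody` and `purgeCloudflare` are names invented here; the zone ID and token come from your environment:

```typescript
// Build the JSON body: specific URLs if given, otherwise purge everything.
function purgeBody(files?: string[]): string {
  return JSON.stringify(files?.length ? { files } : { purge_everything: true });
}

// POST to Cloudflare's purge_cache endpoint; throws if the API reports failure.
async function purgeCloudflare(zoneId: string, token: string, files?: string[]) {
  const res = await fetch(
    `https://api.cloudflare.com/client/v4/zones/${zoneId}/purge_cache`,
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${token}`,
        "Content-Type": "application/json",
      },
      body: purgeBody(files),
    },
  );
  const json = await res.json();
  if (!json.success) throw new Error(`Purge failed: ${JSON.stringify(json.errors)}`);
}
```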
## Layer 3 — Reverse proxy (nginx / Varnish)
Runs on your VPS — it caches responses after the origin generates them, so repeat requests from Cloudflare's edge never reach the app.
nginx:
```nginx
proxy_cache_path /var/cache/nginx keys_zone=STATIC:10m max_size=1g inactive=60m;

location /api/ {
    proxy_cache STATIC;
    proxy_cache_valid 200 5m;
    proxy_cache_valid 404 1m;
    proxy_cache_key "$scheme$request_method$host$request_uri";
    proxy_cache_use_stale error timeout updating http_500 http_502 http_503 http_504;
    add_header X-Cache-Status $upstream_cache_status;
    proxy_pass http://app;
}
```
`X-Cache-Status: HIT` / `MISS` header shows cache state.
Varnish (alternative, more powerful):
```vcl
sub vcl_backend_response {
    set beresp.ttl = 5m;
    set beresp.grace = 1h;  # serve stale during outages
}
```
See our [Server-Side Caching article](https://domainindia.com/support/kb/server-side-caching-redis-memcached-varnish-opcache).
## Layer 4 — Application cache (Redis)
The most flexible layer — your code decides what to cache.
See full patterns in our [Redis Beyond Caching](https://domainindia.com/support/kb/redis-data-structures-beyond-caching) article.
Classic cache-aside:
```typescript
async function getUser(id: string) {
  const cached = await redis.get(`user:${id}`);
  if (cached) return JSON.parse(cached);
  const user = await db.user.findUnique({ where: { id } });
  if (user) await redis.setex(`user:${id}`, 300, JSON.stringify(user)); // don't cache misses
  return user;
}
```
**Cache tags** (Laravel, Rails) — invalidate related keys together:
```php
// Laravel
Cache::tags(['users', "user:{$id}"])->put("user.{$id}.profile", $data, 300);

// Later, invalidate just this user:
Cache::tags(["user:{$id}"])->flush();
```
## Layer 5 — Database-level
- **Postgres materialized views** for expensive aggregations:
```sql
CREATE MATERIALIZED VIEW daily_stats AS
SELECT date(created_at) AS day, count(*) AS orders, sum(amount) AS revenue
FROM orders
GROUP BY 1;

REFRESH MATERIALIZED VIEW daily_stats;  -- schedule hourly; plain REFRESH locks reads
```
- **Postgres `pg_prewarm`** — load hot tables into shared_buffers on startup
- **MySQL query cache** (removed in 8.0; use Redis instead)
- **Read replicas** — not caching exactly, but distributes load
## Invalidation strategies
Three approaches:
### TTL-only (easy, eventual consistency)
```
Cache: 300s TTL
Updates: ignored
Result: users see stale up to 5 min
```
Works for: product catalogs, articles, rankings.
### Explicit invalidation
```typescript
async function updateUser(id, data) {
  await db.user.update({ where: { id }, data });
  await redis.del(`user:${id}`);
}
```
Works for: user profiles, settings, anything with clear "owner".
### Write-through
```typescript
async function updateUser(id, data) {
  const user = await db.user.update({ where: { id }, data });
  await redis.setex(`user:${id}`, 300, JSON.stringify(user));
  return user;
}
```
Cache always matches DB. Best consistency, most code.
## Cache stampede prevention
When cache expires, 1000 concurrent requests all hit DB. Prevent with:
**Lock (SETNX):**
```typescript
async function withLock(key, callback, lockTtl = 10) {
  const lockKey = `lock:${key}`;
  const acquired = await redis.set(lockKey, '1', { NX: true, EX: lockTtl });
  if (acquired) {
    try { return await callback(); } finally { await redis.del(lockKey); }
  }
  // Someone else is computing — wait briefly and re-check the cache
  await new Promise(r => setTimeout(r, 100));
  const cached = await redis.get(key);
  return cached ? JSON.parse(cached) : callback(); // still empty: compute anyway
}
```
**Probabilistic early refresh:** (see Laravel Cache article)
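The idea behind probabilistic early refresh fits in a few lines: as a key nears expiry, each request recomputes with increasing probability, so the refresh usually happens once, before the hard deadline. The `shouldRefreshEarly` helper below follows the published "XFetch" formula; all names and parameters are illustrative:

```typescript
// XFetch-style early expiration: refresh when
//   now - delta * beta * ln(random()) >= expiresAt.
// delta = how long the recompute takes (seconds); beta = aggressiveness
// (1.0 is the usual default). ln(random()) is negative, so the left side
// drifts past expiresAt with rising probability as expiry approaches.
function shouldRefreshEarly(
  expiresAt: number,          // unix seconds when the key expires
  delta: number,              // seconds the recompute takes
  beta = 1.0,
  now = Date.now() / 1000,
): boolean {
  return now - delta * beta * Math.log(Math.random()) >= expiresAt;
}
```

On a cache hit, call this before returning: if it says refresh, recompute and re-set the key; every other request keeps serving the cached value.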
## Hit ratio targets
Measure. Without data, you're guessing:
| Layer | Target hit rate |
|---|---|
| Browser cache | 60-80% |
| Cloudflare | >70% (content), >50% (apps) |
| nginx proxy cache | >80% for what you cache |
| Redis app cache | >80% |
| DB query cache | >90% for cached queries |
Monitor via:
- `redis-cli INFO stats` → `keyspace_hits / (keyspace_hits + keyspace_misses)`
- Cloudflare Analytics → Cache
- `X-Cache-Status` nginx header aggregated in logs
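The Redis ratio is easy to compute programmatically. A small parser for the text `redis-cli INFO stats` returns (assumes only the standard `keyspace_hits` / `keyspace_misses` fields):

```typescript
// Parse "field:value" lines from Redis INFO output and compute the hit ratio.
function hitRatio(infoStats: string): number {
  const num = (field: string) =>
    Number(infoStats.match(new RegExp(`${field}:(\\d+)`))?.[1] ?? 0);
  const hits = num("keyspace_hits");
  const misses = num("keyspace_misses");
  return hits + misses === 0 ? 0 : hits / (hits + misses);
}
```

Feed it the raw INFO text from your Redis client and alert when the ratio drops below your target.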
## FAQ
**Q: Cloudflare + nginx cache — overkill?**
Depends on traffic. Cloudflare absorbs most requests at the edge; an nginx cache then shields the origin from what Cloudflare misses (each edge location fetches from origin independently). For most sites, Cloudflare alone is fine.
**Q: How do I cache logged-in users?**
Either per-user keys (expensive in Redis memory) or differentiate static vs dynamic parts. See our Next.js App Router article on Partial Prerendering.
**Q: Does Cloudflare cost extra for high cache usage?**
Free plan includes unlimited Cloudflare bandwidth. Origin bandwidth is what's reduced — huge savings.
**Q: CDN for APIs?**
Yes for public read-only APIs. Set short TTL (30-60s) and Cache-Control: public. User-specific APIs: bypass.
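In app code that split can be a one-line policy. A hedged sketch — the `/api/public` prefix is an assumption about your routing, and `s-maxage` targets shared caches like Cloudflare:

```typescript
// Choose a Cache-Control value per route: public read-only endpoints get a
// short shared-cache TTL; anything user-specific bypasses caches entirely.
function cacheControlFor(path: string): string {
  return path.startsWith("/api/public")
    ? "public, s-maxage=60, stale-while-revalidate=30"
    : "private, no-store";
}
```

Set the returned value as the `Cache-Control` response header in your framework's middleware.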
**Q: When does caching hurt?**
When it hides bugs (stale data looks like app is broken). When cost of inconsistency > cost of slower response. When debug cycles lengthen ("is it cache or code?").
Combine DomainIndia hosting + Cloudflare for a fast cache stack.