Response Caching

Two-Tier Cache.
Sub-Millisecond Hits.

L1 in-memory LRU for instant access. L2 Redis for shared cache across workers. Early cache middleware serves responses before the proxy pipeline even starts.

Cache Lookup Flow

GET /api/products

Early Cache Middleware

Checks cache BEFORE auth, transforms, or proxy

~0.1ms

L1: In-Memory LRU

5,000 entries, 3s TTL. Per-worker, zero network.

~0.001ms

L2: Redis

Shared across workers. Configurable TTL per route.

~0.1ms

Cache MISS → Proxy to Backend

Response cached on return (fire-and-forget write)
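The lookup flow above can be sketched as a small two-tier read path. The class names and the Map-based LRU below are illustrative, not Nolxy's implementation; the 5,000-entry / 3s defaults come from the L1 description, and `AsyncStore` stands in for a Redis client.

```typescript
interface AsyncStore {
  get(key: string): Promise<string | null>;
}

// Per-worker L1: a Map doubles as an LRU because it preserves insertion order.
class L1Cache {
  private entries = new Map<string, { value: string; expiresAt: number }>();
  constructor(private maxEntries = 5000, private ttlMs = 3000) {}

  get(key: string): string | null {
    const hit = this.entries.get(key);
    if (!hit) return null;
    if (Date.now() > hit.expiresAt) {
      this.entries.delete(key); // expired: treat as a miss
      return null;
    }
    // Re-insert to mark as most-recently-used.
    this.entries.delete(key);
    this.entries.set(key, hit);
    return hit.value;
  }

  set(key: string, value: string): void {
    if (!this.entries.has(key) && this.entries.size >= this.maxEntries) {
      // Evict the least-recently-used entry (first key in insertion order).
      const oldest = this.entries.keys().next().value;
      if (oldest !== undefined) this.entries.delete(oldest);
    }
    this.entries.set(key, { value, expiresAt: Date.now() + this.ttlMs });
  }
}

async function lookup(key: string, l1: L1Cache, l2: AsyncStore): Promise<string | null> {
  const local = l1.get(key);                // ~0.001ms: no network hop
  if (local !== null) return local;
  const shared = await l2.get(key);         // ~0.1ms: one Redis round trip
  if (shared !== null) l1.set(key, shared); // promote to L1 for the next hit
  return shared;
}
```

On a miss in both tiers the request proceeds to the backend, and the response is written back without blocking the client.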

Engineered for Performance

Every detail optimized for maximum cache hit rate and minimum latency.

FNV-1a Cache Keys

Ultra-fast non-cryptographic hash for cache key generation. Roughly 10x faster than MD5, with a negligible collision rate across URL-shaped keys.
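As a rough sketch, 32-bit FNV-1a over a method-plus-path string might look like this in TypeScript. The key layout is an assumption; only the hash itself is standard FNV-1a (XOR each byte in, multiply by the FNV prime).

```typescript
const FNV_OFFSET_BASIS = 0x811c9dc5; // 2166136261
const FNV_PRIME = 0x01000193;        // 16777619

// 32-bit FNV-1a. Math.imul keeps the multiply in 32-bit integer space,
// which is all FNV needs. Fine for ASCII inputs such as URLs.
function fnv1a(input: string): number {
  let hash = FNV_OFFSET_BASIS;
  for (let i = 0; i < input.length; i++) {
    hash ^= input.charCodeAt(i);
    hash = Math.imul(hash, FNV_PRIME);
  }
  return hash >>> 0; // as unsigned 32-bit
}

// Hypothetical usage: hash the request line into a compact cache key.
const key = fnv1a("GET /api/products");
```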

Order-Independent Query Hashing

XOR-based hash ensures ?a=1&b=2 and ?b=2&a=1 produce the same cache key. Zero array allocation.
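One way to get this property with zero array allocation is to FNV-hash each key=value pair in a single pass over the string and XOR the per-pair hashes together: XOR is commutative, so parameter order cannot affect the result, and nothing is split or sorted. A sketch (the exact scheme Nolxy uses is not shown here):

```typescript
// Walk the query string once, maintaining a running FNV-1a hash for the
// current key=value segment. On '&', fold the finished pair into the
// accumulator with XOR and reset. No arrays, no sorting.
function orderIndependentQueryHash(query: string): number {
  let acc = 0;
  let h = 0x811c9dc5; // FNV-1a offset basis
  for (let i = 0; i < query.length; i++) {
    const c = query.charCodeAt(i);
    if (c === 0x26 /* '&' */) {
      acc = (acc ^ h) >>> 0; // XOR is commutative: order can't matter
      h = 0x811c9dc5;
    } else {
      h = Math.imul(h ^ c, 0x01000193) >>> 0; // 16777619, the FNV prime
    }
  }
  if (h !== 0x811c9dc5) acc = (acc ^ h) >>> 0; // fold the final pair, if any
  return acc;
}
```

A trade-off worth noting: because XOR discards ordering entirely, two identical pairs cancel out; real query strings rarely repeat an exact key=value pair, which is presumably why the trade is acceptable for cache keys.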

Security Header Filtering

Sensitive headers (Authorization, Cookie, Set-Cookie) are automatically stripped before caching.
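A minimal filter could look like the following. The three header names come from the text above; the case-insensitive comparison is an assumption (HTTP header names are case-insensitive, so a cautious filter normalizes before checking).

```typescript
// Headers that must never be written to a shared cache.
const SENSITIVE_HEADERS = new Set(["authorization", "cookie", "set-cookie"]);

// Return a copy of the headers with sensitive entries removed, so the
// cached response body and headers are safe to serve to any caller.
function stripSensitiveHeaders(headers: Record<string, string>): Record<string, string> {
  const safe: Record<string, string> = {};
  for (const [name, value] of Object.entries(headers)) {
    if (!SENSITIVE_HEADERS.has(name.toLowerCase())) safe[name] = value;
  }
  return safe;
}
```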

Fail-Open Design

Redis failures never block requests. Cache writes are fire-and-forget. Your API stays fast even if Redis is down.
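The fail-open pattern can be sketched in two small helpers: reads swallow store errors and fall through to a cache miss, and writes are started but never awaited, so a Redis outage degrades hit rate rather than availability. `CacheStore` is a hypothetical stand-in for a Redis client.

```typescript
interface CacheStore {
  get(key: string): Promise<string | null>;
  set(key: string, value: string, ttlSec: number): Promise<void>;
}

// Fail-open read: any store error is treated as a miss, never surfaced.
async function cacheGetFailOpen(store: CacheStore, key: string): Promise<string | null> {
  try {
    return await store.get(key);
  } catch {
    return null; // Redis down => behave as if the entry simply isn't cached
  }
}

// Fire-and-forget write: the response has already been sent to the client,
// so we neither await the write nor let its failure propagate.
function cacheSetFireAndForget(store: CacheStore, key: string, value: string, ttlSec: number): void {
  store.set(key, value, ttlSec).catch(() => {});
}
```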

Config Version Keys

Cache keys include a config version. Route changes automatically invalidate stale cache entries without manual purging.
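A possible key layout (the exact format is not documented here) simply prefixes every key with the config version. Bumping the version orphans all old entries at once, and TTL expiry cleans them up with no explicit purge.

```typescript
// Hypothetical key builder: route-config changes increment `configVersion`,
// so every previously written key becomes unreachable immediately.
function buildCacheKey(configVersion: number, method: string, url: string): string {
  return `v${configVersion}:${method}:${url}`;
}
```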

Cross-Instance Invalidation

Redis Pub/Sub broadcasts cache invalidation to all workers. Update a route on node 1, cache clears on all nodes.
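The broadcast can be sketched with an in-process stand-in for Redis Pub/Sub; in production the `Bus` below would be a pair of Redis connections (one subscriber, one publisher), and the channel name `cache:invalidate` is hypothetical.

```typescript
type InvalidationHandler = (msg: string) => void;

// In-process stand-in for Redis Pub/Sub, just to show the wiring end to end.
class Bus {
  private subs = new Map<string, InvalidationHandler[]>();
  subscribe(channel: string, fn: InvalidationHandler): void {
    const list = this.subs.get(channel) ?? [];
    list.push(fn);
    this.subs.set(channel, list);
  }
  publish(channel: string, msg: string): void {
    for (const fn of this.subs.get(channel) ?? []) fn(msg);
  }
}

class CacheWorker {
  readonly l1 = new Map<string, string>();
  constructor(bus: Bus) {
    // Every worker listens, including the one that triggered the update.
    bus.subscribe("cache:invalidate", key => this.l1.delete(key));
  }
}

// Updating a route publishes the affected key so every worker drops it.
function invalidate(bus: Bus, key: string): void {
  bus.publish("cache:invalidate", key);
}
```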

The Early Cache Advantage

Most gateways check the cache after authentication and transformation. Nolxy checks it first. On a cache HIT:

  • Authentication is still verified (security first)
  • Route matching is done once and reused downstream
  • Request transforms are SKIPPED entirely
  • Proxy call is SKIPPED entirely
  • Response is sent directly from memory
  • On a cache MISS, gatewayContext is already pre-populated, so downstream stages skip duplicate DB lookups
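The HIT/MISS split above can be sketched as a framework-agnostic middleware. `Ctx`, `send`, and `next` are hypothetical stand-ins (the real gatewayContext shape is not shown in the text), and the auth check that still runs on a HIT is omitted for brevity.

```typescript
interface Ctx {
  key: string;            // cache key computed from the request
  cached?: string;        // set on HIT
  routeMatched?: boolean; // matched once here, reused downstream
}

// Early-cache step: runs before transforms and the proxy call.
function earlyCache(
  cache: Map<string, string>,
  ctx: Ctx,
  send: (body: string) => void,
  next: () => void,
): void {
  ctx.routeMatched = true;   // route matching happens once, on both paths
  const hit = cache.get(ctx.key);
  if (hit !== undefined) {
    ctx.cached = hit;
    send(hit);               // HIT: respond from memory; transforms and the
    return;                  //      proxy call are skipped entirely
  }
  next();                    // MISS: continue down the pipeline with the
}                            //       context already populated
```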

Cache Smarter, Not Harder

Response caching is available on all plans. Configure TTL per route, and Nolxy handles the rest.