An L1 in-memory LRU for instant access, backed by an L2 Redis cache shared across workers. Early cache middleware serves responses before the proxy pipeline even starts.
GET /api/products
Early Cache Middleware
Checks cache BEFORE auth, transforms, or proxy
L1: In-Memory LRU
5,000 entries, 3s TTL. Per-worker, zero network.
~0.001ms
L2: Redis
Shared across workers. Configurable TTL per route.
~0.1ms
Cache MISS → Proxy to Backend
Response cached on return (fire-and-forget write)
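The flow above can be sketched as a two-tier lookup. This is an illustrative sketch, not Nolxy's actual implementation: the class and method names are ours, L1 is a per-worker Map used as an insertion-order LRU, and L2 is any async store (Redis in production, anything implementing the interface here).

```typescript
// Two-tier cache: L1 in-memory LRU (per worker), L2 shared async store.
type AsyncStore = {
  get(key: string): Promise<string | null>;
  set(key: string, value: string): Promise<void>;
};

class TwoTierCache {
  private l1 = new Map<string, { value: string; expires: number }>();

  constructor(
    private l2: AsyncStore,
    private maxEntries = 5000, // matches the 5,000-entry figure above
    private ttlMs = 3000,      // matches the 3s L1 TTL above
  ) {}

  async get(key: string): Promise<string | null> {
    // L1: in-memory, zero network hops.
    const hit = this.l1.get(key);
    if (hit && hit.expires > Date.now()) {
      this.l1.delete(key); // re-insert to refresh LRU position
      this.l1.set(key, hit);
      return hit.value;
    }
    // L2: shared across workers.
    const value = await this.l2.get(key);
    if (value !== null) this.setL1(key, value);
    return value;
  }

  async set(key: string, value: string): Promise<void> {
    this.setL1(key, value);
    // Fire-and-forget: an L2 failure must never block the response.
    this.l2.set(key, value).catch(() => {});
  }

  private setL1(key: string, value: string): void {
    if (this.l1.size >= this.maxEntries) {
      // Evict the least recently used entry (first insertion-order key).
      this.l1.delete(this.l1.keys().next().value!);
    }
    this.l1.set(key, { value, expires: Date.now() + this.ttlMs });
  }
}
```

On a miss in both tiers, the gateway proxies to the backend and writes the response back through `set`, which populates L1 immediately and L2 in the background.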
Every detail optimized for maximum cache hit rate and minimum latency.
Ultra-fast hash function for cache key generation. 10x faster than MD5, collision-resistant for URL patterns.
XOR-based hash ensures ?a=1&b=2 and ?b=2&a=1 produce the same cache key. Zero array allocation.
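The order-insensitive property can be sketched like this (an illustration of the technique, not Nolxy's exact hash): each `key=value` pair is hashed independently with FNV-1a and the results are XOR-combined. XOR is commutative, so parameter order cannot change the result, and scanning the query string by index avoids allocating an array of pairs.

```typescript
// FNV-1a over a substring of s, without slicing (no allocation).
function fnv1a(s: string, start: number, end: number): number {
  let h = 0x811c9dc5; // FNV offset basis
  for (let i = start; i < end; i++) {
    h ^= s.charCodeAt(i);
    h = Math.imul(h, 0x01000193); // FNV prime, 32-bit multiply
  }
  return h >>> 0;
}

// XOR-combine the hash of each &-separated pair: order-insensitive.
function queryHash(query: string): number {
  let combined = 0;
  let start = 0;
  for (let i = 0; i <= query.length; i++) {
    if (i === query.length || query[i] === "&") {
      if (i > start) combined ^= fnv1a(query, start, i);
      start = i + 1;
    }
  }
  return combined >>> 0;
}
```

With this scheme `queryHash("a=1&b=2")` and `queryHash("b=2&a=1")` are identical, so both URL spellings map to one cache entry.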
Sensitive headers (Authorization, Cookie, Set-Cookie) are automatically stripped before caching.
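A minimal sketch of that stripping step (the header list comes from the text above; the function name and shape are ours, not Nolxy's API):

```typescript
// Headers that must never be written to a shared cache.
const SENSITIVE = new Set(["authorization", "cookie", "set-cookie"]);

// Return a copy of the headers with sensitive entries removed,
// matching case-insensitively as HTTP header names require.
function sanitizeHeaders(headers: Record<string, string>): Record<string, string> {
  const safe: Record<string, string> = {};
  for (const [name, value] of Object.entries(headers)) {
    if (!SENSITIVE.has(name.toLowerCase())) safe[name] = value;
  }
  return safe;
}
```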
Redis failures never block requests. Cache writes are fire-and-forget. Your API stays fast even if Redis is down.
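The fire-and-forget pattern is simple but worth spelling out (a generic sketch under our own names, with any writer standing in for the Redis client): the write is started but never awaited, and any error is swallowed, so the response path can neither be delayed nor failed by the cache layer.

```typescript
type CacheWriter = (key: string, value: string) => Promise<void>;

// Start the cache write and return immediately. Errors are handled
// (e.g. counted in a metric) but never propagated to the request.
function cacheInBackground(write: CacheWriter, key: string, value: string): void {
  write(key, value).catch(() => { /* e.g. increment a failure metric */ });
}
```

The caller sends the response first and lets the write race in the background; a dead Redis only costs cache hits, never availability.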
Cache keys include a config version. Route changes automatically invalidate stale cache entries without manual purging.
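Version-stamped keys can be sketched as follows (field and function names are illustrative): the route config's version is baked into every key, so bumping the version on a config change makes all old entries unreachable, with no scan and no explicit purge.

```typescript
interface RouteConfig {
  path: string;
  version: number; // incremented on every config change
}

// Old entries keyed under the previous version simply stop matching.
function cacheKey(route: RouteConfig, method: string, url: string): string {
  return `${route.version}:${method}:${url}`;
}
```

Stale entries then age out of Redis via TTL instead of being deleted eagerly.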
Redis Pub/Sub broadcasts cache invalidation to all workers. Update a route on node 1, cache clears on all nodes.
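The broadcast pattern looks roughly like this. This sketch uses Node's EventEmitter as a stand-in for Redis Pub/Sub (in production each worker would SUBSCRIBE to an invalidation channel on Redis instead), and all names here are ours, not Nolxy's:

```typescript
import { EventEmitter } from "node:events";

class CacheWorker {
  private local = new Map<string, string>();

  constructor(private bus: EventEmitter) {
    // Every worker subscribes to the invalidation channel.
    bus.on("invalidate", (prefix: string) => {
      for (const key of this.local.keys()) {
        if (key.startsWith(prefix)) this.local.delete(key);
      }
    });
  }

  put(key: string, value: string): void { this.local.set(key, value); }
  get(key: string): string | undefined { return this.local.get(key); }

  // Called on the node where the route was updated; the broadcast
  // clears matching entries on every subscribed worker, itself included.
  invalidateRoute(prefix: string): void { this.bus.emit("invalidate", prefix); }
}
```

One publish on the node that changed the route fans out to every worker's in-memory cache.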
Most gateways check the cache after authentication and transformation. Nolxy checks it first. On a cache HIT, the request skips authentication, transformation, and proxying entirely; the response is served straight from the cache.
Response caching is available on all plans. Configure TTL per route, and Nolxy handles the rest.