undici-cache-redis
A high-performance Redis-backed cache store for Undici's cache interceptor. This library provides seamless HTTP response caching with Redis/Valkey as the storage backend, featuring client-side caching, cache invalidation by tags, and support for managed Redis environments.
Built on top of iovalkey for optimal Redis/Valkey connectivity.
Features
- 🚀 High Performance: Redis-backed caching with client-side optimization
- 🏷️ Cache Tags: Invalidate cached responses by custom tags
- 🔄 Automatic Invalidation: Smart cache invalidation on mutating operations
- 📊 Cache Management: Built-in cache manager for monitoring and administration
- 🌐 Vary Header Support: Proper handling of content negotiation
- ☁️ Cloud Ready: Works with managed Redis services (AWS ElastiCache, etc.)
- 💾 Binary Support: Handles both text and binary response data
- 📈 Tracking Cache: Client-side caching for improved performance
Installation
npm install undici-cache-redis
Quick Start
Basic Usage
import { Agent, interceptors } from 'undici'
import { RedisCacheStore } from 'undici-cache-redis'
// Create a Redis cache store
const store = new RedisCacheStore({
clientOpts: {
host: 'localhost',
port: 6379,
keyPrefix: 'my-app:cache:'
}
})
// Create Undici agent with caching
const agent = new Agent()
.compose(interceptors.cache({ store }))
// Make requests - responses will be automatically cached
const response = await agent.request({
origin: 'https://api.example.com',
method: 'GET',
path: '/users/123'
})
console.log(await response.body.text())
Cache Invalidation by Tags
import { Agent, interceptors } from 'undici'
import { RedisCacheStore } from 'undici-cache-redis'
const store = new RedisCacheStore({
cacheTagsHeader: 'cache-tags' // Header to read cache tags from
})
const agent = new Agent()
.compose(interceptors.cache({ store }))
// Server responds with: Cache-Tags: user:123,profile
const response = await agent.request({
origin: 'https://api.example.com',
method: 'GET',
path: '/users/123'
})
// Later, invalidate all cached responses tagged with 'user:123'
await store.deleteTags(['user:123'])
Advanced Cache Management with RedisCacheManager
import { RedisCacheStore, RedisCacheManager } from 'undici-cache-redis'
// Create both store and manager
const store = new RedisCacheStore({
cacheTagsHeader: 'cache-tags'
})
const manager = new RedisCacheManager({
clientOpts: { host: 'localhost', port: 6379 }
})
// Subscribe to cache events
await manager.subscribe()
manager.on('add-entry', (entry) => {
console.log('Cache entry added:', entry.path, entry.cacheTags)
})
manager.on('delete-entry', ({ id, keyPrefix }) => {
console.log('Cache entry deleted:', id)
})
// Analyze cache contents
await manager.streamEntries((entry) => {
console.log(`Entry: ${entry.path}, Tags: [${entry.cacheTags.join(', ')}]`)
}, '')
// Invalidate by tags using the store
await store.deleteTags(['user:123', 'products'])
// Clean up specific entries by ID
const entriesToDelete = []
await manager.streamEntries((entry) => {
if (entry.path.startsWith('/api/products/')) {
entriesToDelete.push(entry.id)
}
}, '')
if (entriesToDelete.length > 0) {
await manager.deleteIds(entriesToDelete, '')
}
// Get response body for debugging
const responseBody = await manager.getResponseById('some-entry-id', '')
Cache Management
import { RedisCacheManager } from 'undici-cache-redis'
const manager = new RedisCacheManager({
clientOpts: {
host: 'localhost',
port: 6379
}
})
// Subscribe to cache events
await manager.subscribe()
manager.on('add-entry', (entry) => {
console.log('Cache entry added:', entry.id)
})
manager.on('delete-entry', ({ id, keyPrefix }) => {
console.log('Cache entry deleted:', id)
})
// Stream all cache entries
await manager.streamEntries((entry) => {
console.log('Entry:', entry.origin, entry.path, entry.statusCode)
}, 'my-app:cache:')
// Get response body by ID
const responseBody = await manager.getResponseById('entry-id', 'my-app:cache:')
Configuration Options
RedisCacheStore Options
interface RedisCacheStoreOpts {
// Redis client options (passed to iovalkey)
clientOpts?: {
host?: string
port?: number
keyPrefix?: string
// ... other iovalkey options
}
// Maximum size in bytes for a single cached response
maxEntrySize?: number
// Maximum total cache size (for client-side cache)
maxSize?: number
// Maximum number of entries (for client-side cache)
maxCount?: number
// Enable/disable client-side tracking cache (default: true)
tracking?: boolean
// Header name to read cache tags from responses
cacheTagsHeader?: string
// Error callback function
errorCallback?: (err: Error) => void
}
RedisCacheManager Options
interface RedisCacheManagerOpts {
// Redis client options
clientOpts?: {
host?: string
port?: number
// ... other iovalkey options
}
// Whether to configure keyspace event notifications (default: true)
// Set to false for managed Redis services
clientConfigKeyspaceEventNotify?: boolean
}
Advanced Usage Examples
Using with fetch()
import { Agent, interceptors, setGlobalDispatcher } from 'undici'
import { RedisCacheStore } from 'undici-cache-redis'
// Create a Redis cache store
const store = new RedisCacheStore()
// Create agent with caching
const agent = new Agent()
.compose(interceptors.cache({ store }))
// Set as global dispatcher to enable caching for fetch
setGlobalDispatcher(agent)
// Now fetch() automatically uses the cache!
const response = await fetch('https://api.example.com/users/123')
const data = await response.json()
// Cache headers are available
if (response.headers.get('x-cache') === 'HIT') {
console.log('Response was served from cache!')
}
Working with Vary Headers
const store = new RedisCacheStore()
const agent = new Agent()
.compose(interceptors.cache({ store }))
// Different responses cached based on Accept-Language header
const responseEn = await agent.request({
origin: 'https://api.example.com',
method: 'GET',
path: '/content',
headers: { 'Accept-Language': 'en' }
})
const responseFr = await agent.request({
origin: 'https://api.example.com',
method: 'GET',
path: '/content',
headers: { 'Accept-Language': 'fr' }
})
Manual Cache Operations
const store = new RedisCacheStore()
// Delete specific cache entries
await store.deleteKeys([
{ origin: 'https://api.example.com', method: 'GET', path: '/users/123' }
])
// Delete by cache tags
await store.deleteTags(['user:123', 'profile'])
// Close the store when done
await store.close()
Error Handling
const store = new RedisCacheStore({
errorCallback: (err) => {
console.error('Cache error:', err.message)
// Send to monitoring service
monitoringService.error('cache_error', err)
}
})
Managed Redis Services
When using managed Redis services like AWS ElastiCache, some Redis commands may be restricted. Configure the cache manager accordingly:
const manager = new RedisCacheManager({
clientConfigKeyspaceEventNotify: false, // Disable auto-configuration
clientOpts: {
host: 'your-elasticache-endpoint.cache.amazonaws.com',
port: 6379
}
})
If keyspace event notifications are not configured automatically, ensure your managed Redis instance enables them:
notify-keyspace-events AKE
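On a self-managed instance the setting can be applied with redis-cli; note that managed services such as AWS ElastiCache typically restrict the CONFIG command and require a parameter group change instead (the endpoint below is a placeholder):

```shell
# Enable keyspace event notifications:
# A = all event classes, K = keyspace events, E = keyevent events
redis-cli -h your-redis-host config set notify-keyspace-events AKE

# Verify the current setting
redis-cli -h your-redis-host config get notify-keyspace-events
```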
Multi-Host Architecture
graph TB
subgraph "Users"
U1[User 1]
U2[User 2]
U3[User N]
end
subgraph "Host 1"
A1[App]
B1[Local Cache]
end
subgraph "Host 2"
A2[App]
B2[Local Cache]
end
subgraph "Host N"
A3[App]
B3[Local Cache]
end
subgraph "Redis/Valkey"
R[Shared Cache Storage<br/>+ Invalidation Events]
end
subgraph "External APIs"
API[HTTP APIs]
end
U1 --> A1
U2 --> A2
U3 --> A3
A1 <--> B1
A2 <--> B2
A3 <--> B3
B1 <--> R
B2 <--> R
B3 <--> R
A1 --> API
A2 --> API
A3 --> API
R -.-> B1
R -.-> B2
R -.-> B3
classDef users fill:#e8f5e8
classDef app fill:#e3f2fd
classDef cache fill:#f3e5f5
classDef redis fill:#ffebee
classDef api fill:#fff3e0
class U1,U2,U3 users
class A1,A2,A3 app
class B1,B2,B3 cache
class R redis
class API api
Flow: Users make requests → Apps check local/Redis cache → If miss, fetch from APIs → Cache responses → Invalidation events sync all hosts.
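The flow above amounts to a cache-aside lookup. This is an illustrative sketch only, not the library's internal implementation: plain Map objects stand in for the local tracking cache and Redis, and fetchFromApi is a hypothetical stand-in for the real HTTP call.

```javascript
// Illustrative cache-aside flow; Maps stand in for the real stores.
const localCache = new Map()   // per-host, in-memory tracking cache
const sharedCache = new Map()  // stands in for Redis/Valkey

async function cachedRequest (key, fetchFromApi) {
  // 1. Check the local (client-side tracking) cache first
  if (localCache.has(key)) return localCache.get(key)

  // 2. Fall back to the shared Redis cache
  if (sharedCache.has(key)) {
    const value = sharedCache.get(key)
    localCache.set(key, value) // warm the local cache
    return value
  }

  // 3. Cache miss: fetch from the origin API and cache the response
  const value = await fetchFromApi(key)
  sharedCache.set(key, value)
  localCache.set(key, value)
  return value
}

// Invalidation events from Redis clear the local copy on every host
function onInvalidate (key) {
  localCache.delete(key)
}
```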
Cache Key Structure
The library uses a structured approach to Redis keys:
- Metadata keys: {prefix}metadata:{origin}:{path}:{method}:{id}
- Value keys: {prefix}values:{id}
- ID keys: {prefix}ids:{id}
- Tag keys: {prefix}cache-tags:{tag1}:{tag2}:{id}
Where {prefix} is your configured keyPrefix and {id} is a UUID generated for each cache entry.
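Following the layout above, key construction can be sketched like this; buildKeys is a hypothetical helper for illustration, not part of the library's API, and the library generates the id itself:

```javascript
// Hypothetical helper mirroring the documented key layout.
function buildKeys ({ prefix = '', origin, path, method, id }) {
  return {
    metadata: `${prefix}metadata:${origin}:${path}:${method}:${id}`,
    value: `${prefix}values:${id}`,
    id: `${prefix}ids:${id}`
  }
}

const keys = buildKeys({
  prefix: 'my-app:cache:',
  origin: 'https://api.example.com',
  path: '/users/123',
  method: 'GET',
  id: 'abc-123' // normally a generated UUID
})
// keys.metadata === 'my-app:cache:metadata:https://api.example.com:/users/123:GET:abc-123'
```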
Cache Invalidation Flow
The following diagram illustrates how cache invalidation works across different scenarios:
flowchart TD
A[Cache Invalidation Request] --> B{Invalidation Type}
B -->|Delete by Key| C[delete key]
B -->|Delete by Tags| D[deleteTags tags]
B -->|Automatic Cleanup| E[Redis Expiration Events]
C --> F[Find Metadata Keys]
F --> G[Get Metadata from Redis]
G --> H[Extract Associated Keys]
H --> I[Delete Redis Keys]
I --> JJ[Redis/Valkey Key Deleted]
JJ --> J[Update Tracking Cache]
J --> K[Clean up Tags]
K --> L[Complete]
D --> M[Scan for Tag Patterns]
M --> N[Find Matching Tag Keys]
N --> O[Get Metadata References]
O --> P[Delete Tag Keys]
P --> PP[Redis/Valkey Keys Deleted]
PP --> Q[Delete Referenced Entries]
Q --> R[Update Tracking Cache]
R --> S[Complete]
E --> T[Redis Keyspace Event]
T --> U{Event Type}
U -->|expired| V[Entry Expiration]
U -->|del| W[Manual Deletion]
V --> X[Parse Key Type]
X --> Y{Key Type}
Y -->|metadata| Z[Emit delete-entry Event]
Y -->|cache-tags| AA[Parse Tags from Key]
AA --> BB[Global Tag Cleanup]
BB --> CC[Delete Tag Entries]
CC --> DD[Complete]
W --> X
Z --> DD
%% Client-side tracking invalidation triggered by Redis updates
JJ --> EE[Redis Client Tracking Detects Change]
PP --> EE
EE --> FF[__redis__:invalidate Event]
FF --> GG[Parse Metadata Key]
GG --> HH[Remove from Tracking Cache]
HH --> II[Tracking Complete]
%% Styling
classDef primary fill:#e1f5fe
classDef process fill:#f3e5f5
classDef decision fill:#fff3e0
classDef complete fill:#e8f5e8
classDef redis fill:#ffebee
class A,C,D,E primary
class F,G,H,I,J,K,M,N,O,P,Q,R,T,V,W,X,AA,BB,CC,FF,GG,HH process
class B,U,Y decision
class L,S,DD,II complete
class JJ,PP,EE redis
Invalidation Methods
- Direct Key Deletion: Targets specific cache entries by URL pattern
- Tag-based Deletion: Removes all entries associated with given cache tags
- Automatic Expiration: Handles Redis TTL expiration and manual deletions
- Client-side Tracking: Maintains local cache consistency via Redis invalidation notifications
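Tag-based deletion can be pictured as a secondary index from tags to entry IDs. This in-memory sketch uses plain Map/Set objects to show the idea; in the real library the index lives in Redis under the cache-tags keys:

```javascript
// In-memory sketch of tag-based invalidation.
const entries = new Map()   // id -> cached response
const tagIndex = new Map()  // tag -> Set of entry ids

function addEntry (id, response, tags = []) {
  entries.set(id, response)
  for (const tag of tags) {
    if (!tagIndex.has(tag)) tagIndex.set(tag, new Set())
    tagIndex.get(tag).add(id)
  }
}

function deleteByTags (tags) {
  for (const tag of tags) {
    // Delete every entry referenced by the tag, then the tag itself
    for (const id of tagIndex.get(tag) ?? []) entries.delete(id)
    tagIndex.delete(tag)
  }
}
```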
Performance Considerations
- Client-side Tracking: Enabled by default, provides in-memory caching of metadata
- Pipeline Operations: Uses Redis pipelining for batch operations
- Binary Data: Efficiently handles binary responses with base64 encoding
- Memory Management: Configurable size limits prevent memory exhaustion
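The binary-data point can be illustrated with a simple Buffer round-trip. This shows the general base64 technique, not the library's exact storage format:

```javascript
// Base64 lets binary response bodies be stored as plain Redis strings.
const binary = Buffer.from([0x89, 0x50, 0x4e, 0x47]) // e.g. the start of a PNG signature

// Encode before writing to Redis...
const stored = binary.toString('base64')

// ...and decode after reading it back
const restored = Buffer.from(stored, 'base64')

console.log(binary.equals(restored)) // true: lossless round-trip
```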
API Reference
RedisCacheStore
Methods
- get(key: CacheKey): Promise&lt;GetResult | undefined&gt; - Retrieve a cached response
- createWriteStream(key: CacheKey, value: CachedResponse): Writable - Create a write stream for caching
- delete(key: CacheKey): Promise&lt;void&gt; - Delete cache entries by key pattern
- deleteKeys(keys: CacheKey[]): Promise&lt;void&gt; - Delete multiple cache entries
- deleteTags(tags: string[]): Promise&lt;void&gt; - Delete entries by cache tags
- close(): Promise&lt;void&gt; - Close Redis connections
Events
- write - Emitted when a cache entry is written
RedisCacheManager
Methods
- streamEntries(callback, keyPrefix): Promise&lt;void&gt; - Stream all cache entries
- subscribe(): Promise&lt;void&gt; - Subscribe to cache events
- getResponseById(id, keyPrefix): Promise&lt;string | null&gt; - Get a response body by ID
- getDependentEntries(id, keyPrefix): Promise&lt;CacheEntry[]&gt; - Get entries sharing cache tags
- deleteIds(ids, keyPrefix): Promise&lt;void&gt; - Delete entries by IDs
- close(): Promise&lt;void&gt; - Close connections
Events
- add-entry - Emitted when a cache entry is added
- delete-entry - Emitted when a cache entry is deleted
- error - Emitted on errors
Troubleshooting
Common Issues
Connection Errors
// Ensure Redis is running and accessible
const store = new RedisCacheStore({
clientOpts: {
host: 'localhost',
port: 6379,
connectTimeout: 10000,
retryDelayOnFailover: 100
}
})
Memory Issues
// Limit cache size to prevent memory exhaustion
const store = new RedisCacheStore({
maxEntrySize: 1024 * 1024, // 1MB per entry
maxSize: 100 * 1024 * 1024, // 100MB total
maxCount: 10000 // Max 10k entries
})
Managed Redis Issues
// For AWS ElastiCache or similar services
const manager = new RedisCacheManager({
clientConfigKeyspaceEventNotify: false,
clientOpts: {
host: 'your-cluster.cache.amazonaws.com',
port: 6379,
family: 4, // Force IPv4
enableReadyCheck: false
}
})
Requirements
- Node.js >= 20
- Redis >= 6.0 or Valkey >= 7.2
- Undici >= 7.0
License
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
Benchmarking
This project includes comprehensive benchmarks to measure performance improvements with different caching strategies.
Quick Benchmark
# Automated benchmark with all prerequisites checked
./run-benchmarks.sh
# Or run manually
npm run bench
The benchmarks test a realistic proxy server architecture:
- Server Foo (Proxy): Uses Undici with different cache configurations
- Server Bar (Backend): API server with simulated latency
- Autocannon: Load testing tool measuring performance
Typical results show a 10-15x performance improvement with caching enabled.
For detailed benchmarking instructions, see benchmarks/README.md.
Contributing
This project is part of the Platformatic ecosystem. For contributing guidelines, please refer to the main Platformatic repository.
Related Projects
- Undici - HTTP/1.1 client for Node.js
- iovalkey - High-performance Valkey client
- Platformatic - Enterprise-Ready Node.js