TidyScripts Cache System

A comprehensive, type-safe caching library with support for multiple backends, memoization, TTL (Time To Live) expiration, statistics tracking, and an extensible architecture.

The cache system provides a unified interface for different storage backends through the ICache interface, with built-in implementations for in-memory storage and file system persistence, plus extension points for custom backends such as Redis or DynamoDB.
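
The operations used throughout this README suggest roughly the following shape for ICache (a sketch reconstructed from the examples below, not the library's actual declaration):

interface ICache<T = any> {
  get(key: string): Promise<T | null>;
  set(key: string, value: T, options?: { ttl?: number }): Promise<void>;
  has(key: string): Promise<boolean>;
  delete(key: string): Promise<void>; // return type is an assumption
  getStats(): { hits: number; misses: number; hitRate: number };
  resetStats(): void;
}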

  • 🏭 Factory Pattern: Create cache instances using CacheFactory.create()
  • 💾 Multiple Backends: Memory, filesystem, and extensible for custom implementations
  • ⚡ Memoization: Function result caching with CacheUtils.memoize()
  • ⏰ TTL Support: Automatic expiration of cache entries
  • 📊 Statistics: Hit/miss rates and performance monitoring
  • 🏠 Namespacing: Isolate cache entries by context
  • 🔧 Type Safety: Full TypeScript support with generics
  • 🛡️ Error Handling: Graceful degradation when cache operations fail
Quick start:

import { MemoryCache, CacheFactory } from 'tidyscripts_common';

// Direct instantiation
const cache = new MemoryCache<string>({
  defaultTtl: 300000, // 5 minutes
  verbose: true,
  namespace: 'user-data'
});

// Factory pattern
const cache2 = CacheFactory.create<User>('memory', {
  defaultTtl: 600000, // 10 minutes
  logPrefix: '[UserCache]'
});

// Basic operations
await cache.set('user:123', 'John Doe', { ttl: 60000 });
const user = await cache.get('user:123');
const exists = await cache.has('user:123');
await cache.delete('user:123');
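
Entries written with a ttl expire automatically; a minimal sketch of the observable behavior, using only the operations shown above:

// Write an entry that expires after 100 ms
await cache.set('session:abc', 'token', { ttl: 100 });

// Wait past the TTL
await new Promise((resolve) => setTimeout(resolve, 150));

const expired = await cache.get('session:abc'); // expected: null once the TTL has elapsed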

Memoization wraps a function so that repeated calls with the same arguments are served from the cache:

import { CacheUtils, MemoryCache } from 'tidyscripts_common';

const cache = new MemoryCache();

// Memoize expensive functions
const memoizedApiCall = CacheUtils.memoize(fetchUserData, cache, {
  ttl: 300000, // 5 minutes
  namespace: 'api',
  keyGenerator: (args) => `user:${args[0]}` // custom key generation
});

// Factory pattern for multiple functions
const memoizer = CacheUtils.createMemoizer(cache, {
  namespace: 'calculations',
  ttl: 3600000 // 1 hour
});

const memoizedCalc1 = memoizer(heavyComputation);
const memoizedCalc2 = memoizer(anotherComputation, { ttl: 1800000 }); // 30 min override

MemoryCache: fast in-memory storage backed by a JavaScript Map. Data is lost when the process exits.

const memoryCache = new MemoryCache<UserData>({
  defaultTtl: 300000,
  enableStats: true,
  verbose: true,
  logPrefix: '[UserCache]'
});

Pros: Extremely fast, no I/O overhead.
Cons: Data lost on restart, limited by available RAM.
Best for: Session data, temporary computations, frequently accessed data.

FileSystemCache: persistent storage using JSON files (requires the ts_node package).

import { FileSystemCache } from 'ts_node';

const fileCache = new FileSystemCache<ApiResponse>({
  cacheDir: '/tmp/my-app-cache',
  defaultTtl: 3600000, // 1 hour
  namespace: 'api-responses'
});

// Or via factory
const fileCache2 = CacheFactory.create('filesystem', {
  cacheDir: '/var/cache/myapp'
});

Pros: Persistent across restarts, capacity limited only by disk space.
Cons: Slower than memory, file I/O overhead.
Best for: API responses, computed results, data that needs persistence.

Extend BaseCache to implement Redis, DynamoDB, or other storage systems:

import { BaseCache, CacheFactory } from 'tidyscripts_common';

class RedisCache<T> extends BaseCache<T> {
  constructor(config: RedisCacheConfig) {
    super({ logPrefix: '[RedisCache]', ...config });
  }

  async get(key: string): Promise<T | null> {
    // Redis implementation
  }
  // ... implement other methods
}

// Register with factory
CacheFactory.register('redis', RedisCache);
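
Once registered, the custom backend is created the same way as the built-ins. A sketch, assuming RedisCacheConfig extends the common CacheConfig (the Session type is hypothetical):

const sessionCache = CacheFactory.create<Session>('redis', {
  namespace: 'sessions',
  defaultTtl: 900000 // 15 minutes
});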

A common pattern is caching API responses with a memoized fetch wrapper:

const apiCache = CacheFactory.create('memory', {
  namespace: 'api',
  defaultTtl: 300000 // 5 minutes
});

const cachedFetch = CacheUtils.memoize(
  async (url: string, options?: RequestInit) => {
    const response = await fetch(url, options);
    return response.json();
  },
  apiCache,
  {
    keyGenerator: (args) => `${args[0]}:${JSON.stringify(args[1] || {})}`
  }
);
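
Calls with the same URL and options then share one cache entry (the URL below is a placeholder):

const users = await cachedFetch('https://api.example.com/users');      // hits the network
const usersAgain = await cachedFetch('https://api.example.com/users'); // served from cache until the TTL expires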

Memoization also composes cleanly with service classes:

class UserService {
  private cache = CacheFactory.create('memory', { namespace: 'users' });
  private memoizer = CacheUtils.createMemoizer(this.cache, { ttl: 600000 });

  // Automatically cached for 10 minutes
  getUserById = this.memoizer(async (id: string): Promise<User> => {
    return await db.users.findById(id);
  });

  // Custom TTL for different operations
  getUserStats = this.memoizer(
    async (id: string) => await db.userStats.compute(id),
    { ttl: 3600000 } // 1 hour
  );
}

Backends can also be layered, with a fast in-memory L1 in front of a persistent L2:

// L1: Fast memory cache
const l1Cache = new MemoryCache({ defaultTtl: 60000 }); // 1 minute

// L2: Persistent file cache
const l2Cache = CacheFactory.create('filesystem', {
  cacheDir: '/tmp/l2-cache',
  defaultTtl: 3600000 // 1 hour
});

async function multiLayerGet<T>(key: string): Promise<T | null> {
  // Try L1 first
  let result = await l1Cache.get<T>(key);
  if (result !== null) return result;

  // Try L2
  result = await l2Cache.get<T>(key);
  if (result !== null) {
    // Populate L1 for next time
    await l1Cache.set(key, result);
    return result;
  }

  return null;
}
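
A write-through companion keeps both layers in sync; a minimal sketch under the same assumptions as multiLayerGet above:

async function multiLayerSet<T>(key: string, value: T, ttl?: number): Promise<void> {
  // Write both layers in parallel; without an explicit ttl,
  // each layer falls back to its own defaultTtl
  await Promise.all([
    l1Cache.set(key, value, ttl ? { ttl } : undefined),
    l2Cache.set(key, value, ttl ? { ttl } : undefined)
  ]);
}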

Batch helpers run multiple cache operations in parallel:

// Parallel cache reads
const userIds = ['1', '2', '3'];
const cachedUsers = await CacheUtils.batchGet(cache, userIds);

// Batch write
await CacheUtils.batchSet(cache, [
  { key: 'user:1', value: user1, options: { ttl: 300000 } },
  { key: 'user:2', value: user2 },
  { key: 'user:3', value: user3 }
]);

All backends accept a common configuration object:

interface CacheConfig {
  defaultTtl?: number;   // Default expiration time in ms
  enableStats?: boolean; // Track hit/miss statistics
  keyPrefix?: string;    // Prefix for all keys
  namespace?: string;    // Logical grouping of entries
  verbose?: boolean;     // Enable detailed logging
  logPrefix?: string;    // Custom log message prefix
}

Memoization takes its own options:

interface MemoizeOptions {
  ttl?: number;                           // TTL for cached results
  namespace?: string;                     // Cache key namespace
  keyGenerator?: (args: any[]) => string; // Custom key generation
  includeThis?: boolean;                  // Include 'this' in cache key
  errorOnCacheFailure?: boolean;          // Throw on cache errors
}
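
includeThis matters when a memoized function reads instance state. A sketch of the idea, assuming the memoized wrapper forwards this as the option implies (PriceService and fetchPrice are hypothetical):

class PriceService {
  constructor(public region: string) {}

  // Without includeThis, two instances with different regions
  // could collide on the same cache key for the same sku
  getPrice = CacheUtils.memoize(
    async function (this: PriceService, sku: string) {
      return fetchPrice(this.region, sku);
    },
    cache,
    { includeThis: true, namespace: 'prices' }
  );
}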

With enableStats on, hit/miss counters are tracked per cache instance:

const cache = new MemoryCache({ enableStats: true });

// After some operations...
const stats = cache.getStats();
console.log(`Hit rate: ${(stats.hitRate * 100).toFixed(2)}%`);
console.log(`Total operations: ${stats.hits + stats.misses}`);

// Reset statistics
cache.resetStats();

Verbose mode logs each cache operation with the configured prefix:

const cache = new MemoryCache({
  verbose: true,
  logPrefix: '[MyAppCache]'
});
// Logs: "2023-12-07T10:30:00Z [MyAppCache] Cache hit: { key: 'user:123' }"

Best practices:
  • Memory Cache: Use for frequently accessed, small-to-medium datasets
  • File Cache: Use for larger datasets that need persistence
  • TTL Values: Balance between freshness and performance
  • Key Generation: Avoid overly complex key generation functions (see the sketch after this list)
  • Namespace Usage: Prevents key collisions and enables logical grouping
  • Batch Operations: Use for multiple related cache operations
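
A key generator should produce short, deterministic keys built only from the arguments that affect the result; a sketch (getOrders is hypothetical):

// Good: short, stable, and collision-free for the arguments that matter
const memoizedOrders = CacheUtils.memoize(getOrders, cache, {
  namespace: 'orders',
  keyGenerator: (args) => `${args[0]}:${args[1]}` // userId:status
});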

The cache system is designed for graceful degradation:

// Memoized functions continue working even if the cache fails
const memoizedFn = CacheUtils.memoize(originalFn, cache, {
  errorOnCacheFailure: false // Default: continue on cache errors
});

// Manual error handling
try {
  await cache.set('key', 'value');
} catch (error) {
  console.warn('Cache write failed, continuing without cache');
  // Application continues normally
}

A test suite for the memoization layer ships with the library:

import { MemoizationTests } from 'tidyscripts_common';

// Run comprehensive test suite
await MemoizationTests.runAllTests();
