Custom FeatureCache
A FeatureCache stores computed indicator series so they are not recalculated on every backtest run. The SDK ships with MemoryFeatureCache, a simple in-process Map with no eviction. For production deployments that need cross-process sharing, persistence across runs, or TTL-based expiry, you write your own implementation. This page explains the interface, the content-addressing key scheme, and when and how to replace the default.
Contract
The FeatureCache interface has two required methods and one optional one:
```ts
interface FeatureCache {
  get(key: FeatureKey): Promise<Series | undefined>;
  set(key: FeatureKey, series: Series): Promise<void>;
  invalidate?(prefix: Partial<FeatureKey>): Promise<void>;
}
```

get(key)
Returns the cached Series for the given key, or undefined on a miss. Always async — even an in-memory implementation must return a Promise so the interface is compatible with remote stores (Redis, a database, a filesystem) without requiring a different signature.
set(key, series)
Stores a computed Series. Must be idempotent: calling set twice with the same key overwrites the previous value without error.
invalidate(prefix) — optional
Removes all cache entries whose key matches all fields specified in prefix. A Partial<FeatureKey> with only feature: 'sma' set removes every SMA series regardless of asset or date range. Useful when you update an indicator's implementation and want to force recomputation. Omit this method when your store does not support prefix-based deletion.
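The matching rule can be stated precisely: an entry is removed when every field present in the prefix equals the corresponding field of the entry's key. A minimal sketch of that predicate, using a simplified key type (`FeatureKeyLite` and `matchesPrefix` are illustrative names, not part of the SDK):

```typescript
// Illustrative sketch of the prefix-matching rule invalidate() implements.
// Fields absent from the prefix act as wildcards.
type FeatureKeyLite = { feature: string; paramsHash: string; freq: string };

function matchesPrefix(key: FeatureKeyLite, prefix: Partial<FeatureKeyLite>): boolean {
  return (Object.keys(prefix) as (keyof FeatureKeyLite)[]).every(
    (f) => key[f] === prefix[f],
  );
}

const key: FeatureKeyLite = { feature: 'sma', paramsHash: 'p50', freq: '1d' };
console.log(matchesPrefix(key, { feature: 'sma' })); // true  — freq, paramsHash are wildcards
console.log(matchesPrefix(key, { feature: 'rsi' })); // false — feature differs
```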
Content-addressed keys
Cache keys are FeatureKey objects with five fields:
```ts
type FeatureKey = {
  feature: string;     // indicator name, e.g. 'sma'
  paramsHash: string;  // stable hash of the indicator's parameters
  scope: FeatureScope; // { kind: 'asset', asset: AssetId } | { kind: 'universe', ... }
  range: DateRange;    // the date window the series covers
  freq: Frequency;     // bar frequency, e.g. '1d'
};
```

The key is content-addressed: the same indicator with the same parameters computed over the same asset and date range always produces the same FeatureKey, regardless of which strategy triggered the computation. This means two different strategies using sma(SPY, 50) share a single cached series — there is no per-strategy cache namespace.
MemoryFeatureCache serialises the key to a pipe-delimited string for Map lookup. A Redis implementation would use the same serialisation as a key string. A filesystem implementation would hash the serialised key to a filename.
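To make the scheme concrete, here is a hedged sketch of such a pipe-delimited serialisation. The real canonicalKey lives in src/reference/memory-feature-cache.ts; the field order, the scope encoding, and the `..` range separator below are assumptions for illustration only:

```typescript
// Illustrative sketch of a deterministic, pipe-delimited key serialisation.
// The SDK's actual canonicalKey may order or encode fields differently.
type FeatureKeySketch = {
  feature: string;
  paramsHash: string;
  scope: { kind: 'asset'; asset: string } | { kind: 'universe' };
  range: { from: Date; to: Date };
  freq: string;
};

function canonicalKeySketch(k: FeatureKeySketch): string {
  const scope = k.scope.kind === 'asset' ? `asset:${k.scope.asset}` : 'universe';
  return [
    k.feature,
    k.paramsHash,
    scope,
    `${k.range.from.toISOString()}..${k.range.to.toISOString()}`,
    k.freq,
  ].join('|');
}

console.log(
  canonicalKeySketch({
    feature: 'sma',
    paramsHash: 'a1b2',
    scope: { kind: 'asset', asset: 'us:SPY' },
    range: { from: new Date('2023-01-01T00:00:00Z'), to: new Date('2024-01-01T00:00:00Z') },
    freq: '1d',
  }),
);
// → sma|a1b2|asset:us:SPY|2023-01-01T00:00:00.000Z..2024-01-01T00:00:00.000Z|1d
```

Whatever encoding you choose, it must be deterministic: the same FeatureKey must always produce the same string, or cache hits silently become misses.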
When to write a custom cache
| Scenario | Recommended approach |
|---|---|
| Single-process backtest | Use MemoryFeatureCache (default) |
| Multiple backtest processes on the same machine | Filesystem cache keyed by the serialised FeatureKey |
| Distributed backtesting (multiple nodes) | Redis or Memcached with the serialised key |
| Long-running live strategy (persist across restarts) | SQLite or a hosted key-value store |
| Indicator data that expires (e.g. real-time feeds) | TTL-aware cache; implement get to check expiry before returning |
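For the last row, the key idea is that get() checks expiry before returning, so stale entries behave as misses and get recomputed. A minimal sketch of that pattern (the wrapper name, `ttlMs` parameter, and string keys are illustrative, not SDK API):

```typescript
// Illustrative TTL-aware cache: expired entries are treated as misses and
// lazily evicted on read. Keys are pre-serialised strings for simplicity.
type Series = number[]; // stand-in for the SDK's Series type

class TTLCacheSketch {
  private store = new Map<string, { series: Series; expiresAt: number }>();

  constructor(private readonly ttlMs: number) {}

  async get(key: string): Promise<Series | undefined> {
    const entry = this.store.get(key);
    if (!entry) return undefined;
    if (Date.now() >= entry.expiresAt) {
      this.store.delete(key); // lazily evict the expired entry
      return undefined;       // report a miss so the series is recomputed
    }
    return entry.series;
  }

  async set(key: string, series: Series): Promise<void> {
    this.store.set(key, { series, expiresAt: Date.now() + this.ttlMs });
  }
}
```

Lazy eviction keeps set() cheap; if memory pressure matters more than write latency, a periodic sweep over the Map is the usual alternative.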
Reference: MemoryFeatureCache
```ts
class MemoryFeatureCache implements FeatureCache {
  private store = new Map<string, Series>();

  async get(key: FeatureKey): Promise<Series | undefined> {
    return this.store.get(canonicalKey(key));
  }

  async set(key: FeatureKey, series: Series): Promise<void> {
    this.store.set(canonicalKey(key), series);
  }

  async invalidate(prefix: Partial<FeatureKey>): Promise<void> {
    const needles = canonicalPrefix(prefix).split('|').filter(Boolean);
    if (needles.length === 0) return;
    for (const k of [...this.store.keys()]) {
      if (needles.every(n => k.includes(n))) this.store.delete(k);
    }
  }
}
```

canonicalKey serialises all five fields into a deterministic pipe-delimited string. invalidate does a substring-based scan — fine for hundreds of entries, but consider a more efficient index for caches holding thousands.
Sample: InstrumentedCache
The sample at scripts/docs/guides-runtime/custom-feature-cache.ts wraps MemoryFeatureCache to track hit/miss rate:
```sh
npx tsx scripts/docs/guides-runtime/custom-feature-cache.ts
```

```ts
// Custom FeatureCache — guide sample
// Demonstrates an InstrumentedCache that wraps MemoryFeatureCache to track
// hit/miss rate — a pattern you'd extend for Redis, filesystem, or TTL caches.
//
//   npx tsx scripts/docs/guides-runtime/custom-feature-cache.ts

import {
  fromSpec,
  runBacktest,
  FeatureRuntime,
  NYSEExchangeCalendar,
  MemoryFeatureCache,
  BacktestExecutor,
} from '@livefolio/sdk';
import type {
  FeatureCache,
  FeatureKey,
  Series,
  DataFeed,
  Asset,
  Bar,
  DateRange,
  Frequency,
  TacticalSpec,
} from '@livefolio/sdk';

// ─── 1. Implement FeatureCache ───────────────────────────────────────────────
//
// Contract:
//   get(key)           — return cached Series or undefined on miss
//   set(key, series)   — store a series; must be idempotent
//   invalidate(prefix) — optional; remove all entries whose key matches prefix
//
// Keys are content-addressed: the same feature + params + asset + date range
// + frequency always maps to the same key, regardless of which strategy
// triggered the computation. This means a shared cache (Redis, filesystem)
// lets multiple backtest processes reuse indicator results.

/**
 * InstrumentedCache wraps any FeatureCache to record hit/miss statistics.
 * Replace MemoryFeatureCache with a Redis client, SQLite store, or any other
 * FeatureCache implementation — the tracking logic stays unchanged.
 */
class InstrumentedCache implements FeatureCache {
  private hits = 0;
  private misses = 0;

  constructor(private readonly inner: FeatureCache) {}

  async get(key: FeatureKey): Promise<Series | undefined> {
    const result = await this.inner.get(key);
    if (result !== undefined) {
      this.hits++;
    } else {
      this.misses++;
    }
    return result;
  }

  async set(key: FeatureKey, series: Series): Promise<void> {
    return this.inner.set(key, series);
  }

  async invalidate(prefix: Partial<FeatureKey>): Promise<void> {
    return this.inner.invalidate?.(prefix);
  }

  /** Returns hit rate as a value in [0, 1]. Returns 0 if no requests yet. */
  hitRate(): number {
    const total = this.hits + this.misses;
    return total === 0 ? 0 : this.hits / total;
  }

  printStats(): void {
    const total = this.hits + this.misses;
    console.log(
      `FeatureCache stats — hits: ${this.hits}, misses: ${this.misses}, total: ${total}, hit-rate: ${(this.hitRate() * 100).toFixed(1)}%`,
    );
  }
}

// ─── 2. Synthetic DataFeed ────────────────────────────────────────────────────

const MS_DAY = 86_400_000;

function makeBars(startIso: string, count: number, base: number, drift: number): Bar[] {
  const bars: Bar[] = [];
  let price = base;
  let t = new Date(`${startIso}T00:00:00Z`);
  for (let i = 0; i < count; i++) {
    const dow = t.getUTCDay();
    if (dow !== 0 && dow !== 6) {
      price = price * (1 + drift + Math.sin(i / 8) * 0.006);
      bars.push({
        t: new Date(t),
        open: price,
        high: price * 1.004,
        low: price * 0.996,
        close: price,
        volume: 1_000_000,
      });
    }
    t = new Date(t.getTime() + MS_DAY);
  }
  return bars;
}

const FIXTURES: Record<string, Bar[]> = {
  'us:SPY': makeBars('2023-01-02', 700, 390, 0.0005),
  'us:IEF': makeBars('2023-01-02', 700, 96, 0.0001),
};

const dataFeed: DataFeed = {
  bars: async function* (asset: Asset, range: DateRange, _freq: Frequency) {
    const all = FIXTURES[asset.id];
    if (!all) throw new Error(`no fixture for ${asset.id}`);
    for (const b of all) {
      if (b.t >= range.from && b.t < range.to) yield b;
    }
  },
};

// ─── 3. Wire into a backtest ──────────────────────────────────────────────────

const SPY = { id: 'us:SPY', symbol: 'SPY' };
const IEF = { id: 'us:IEF', symbol: 'IEF' };

const spec: TacticalSpec = {
  kind: 'tactical/v1',
  universe: [SPY, IEF],
  rebalance: { frequency: 'Monthly' },
  features: [
    { id: 'spy_price', kind: 'price', asset: SPY },
    { id: 'spy_sma20', kind: 'sma', asset: SPY, period: 20 },
    { id: 'spy_rsi14', kind: 'rsi', asset: SPY, period: 14 },
  ],
  rules: {
    op: 'if',
    cond: { op: 'gt', left: { ref: 'spy_price' }, right: { ref: 'spy_sma20' } },
    then: { op: 'allocate', weights: { 'us:SPY': 1.0 } },
    else: { op: 'allocate', weights: { 'us:IEF': 1.0 } },
  },
};

const calendar = new NYSEExchangeCalendar();
const cache = new InstrumentedCache(new MemoryFeatureCache());
const range: DateRange = {
  from: new Date('2023-04-01T00:00:00Z'),
  to: new Date('2024-06-01T00:00:00Z'),
};
const runtime = new FeatureRuntime({ dataFeed, featureCache: cache, range, freq: '1d' });

const executor = new BacktestExecutor({
  calendar,
  nextOpen: async (asset, t) => {
    const bars = FIXTURES[asset.id];
    if (!bars) throw new Error(`no fixture for ${asset.id}`);
    const next = bars.find((b) => b.t.getTime() > t.getTime());
    if (!next) throw new Error(`no bar after ${t.toISOString()} for ${asset.id}`);
    return { t: next.t, price: next.open };
  },
});

const strategy = fromSpec(spec, { runtime, calendar });

await runBacktest({
  strategy,
  range,
  initialPortfolio: { cash: 100_000, positions: [], t: range.from },
  dataFeed,
  executor,
  calendar,
});

// Print cache statistics after the backtest completes.
cache.printStats();
```

The InstrumentedCache class is the pattern to extend for any real backing store. Replace MemoryFeatureCache with a Redis client or SQLite adapter — the get/set/invalidate wrappers stay unchanged:
```ts
class RedisFeatureCache implements FeatureCache {
  constructor(private readonly redis: RedisClient) {}

  async get(key: FeatureKey): Promise<Series | undefined> {
    const raw = await this.redis.get(canonicalKey(key));
    return raw ? (JSON.parse(raw) as Series) : undefined;
  }

  async set(key: FeatureKey, series: Series): Promise<void> {
    await this.redis.set(canonicalKey(key), JSON.stringify(series));
  }

  async invalidate(prefix: Partial<FeatureKey>): Promise<void> {
    // KEYS is shown for brevity; prefer cursor-based SCAN in production,
    // since KEYS blocks Redis while it scans the whole keyspace.
    const pattern = `*${canonicalPrefix(prefix)}*`;
    const keys = await this.redis.keys(pattern);
    if (keys.length > 0) await this.redis.del(...keys);
  }
}
```

You would need to copy (or extract) the canonicalKey/canonicalPrefix serialisation from src/reference/memory-feature-cache.ts for this to work correctly.
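For the "multiple processes on one machine" row in the table above, a filesystem-backed store follows the same shape. A hedged sketch, hashing the serialised key to a filename as the key-scheme section suggests (`cacheDir`, the JSON encoding, and the class name are assumptions; the real canonicalKey must be reused for keys to line up across processes):

```typescript
// Illustrative filesystem cache: one JSON file per cache entry, named by a
// SHA-256 hash of the pre-serialised canonical key string.
import { createHash } from 'node:crypto';
import { mkdir, readFile, writeFile } from 'node:fs/promises';
import { join } from 'node:path';

type Series = number[]; // stand-in for the SDK's Series type

class FileFeatureCacheSketch {
  constructor(private readonly cacheDir: string) {}

  private pathFor(canonical: string): string {
    const name = createHash('sha256').update(canonical).digest('hex');
    return join(this.cacheDir, `${name}.json`);
  }

  async get(canonical: string): Promise<Series | undefined> {
    try {
      return JSON.parse(await readFile(this.pathFor(canonical), 'utf8')) as Series;
    } catch {
      return undefined; // a missing or unreadable file counts as a miss
    }
  }

  async set(canonical: string, series: Series): Promise<void> {
    await mkdir(this.cacheDir, { recursive: true });
    await writeFile(this.pathFor(canonical), JSON.stringify(series));
  }
}
```

Hashing keeps filenames short and filesystem-safe regardless of what characters the canonical key contains; set() stays idempotent because writing the same file twice simply overwrites it.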
Things to verify
- [ ] `get` returns `undefined` on a miss — not `null`, not an empty `Series`.
- [ ] `set` is idempotent: calling it twice with the same key leaves the cache consistent.
- [ ] `get` and `set` are both `async` and return a `Promise` — even for synchronous in-memory stores.
- [ ] Key serialisation is deterministic: the same `FeatureKey` always produces the same canonical string.
- [ ] Your implementation compiles: `npm run docs:check`.
- [ ] Integration: run a backtest twice with a warm cache and confirm the second run makes zero `set` calls (all hits).
What's next
- DataFeed — `FeatureRuntime` uses `FeatureCache` and `DataFeed` together. The cache sits in front of the feed; a miss triggers a `DataFeed.bars()` fetch. See Custom DataFeed.
- `FeatureRuntime` — the built-in runtime that wires `DataFeed`, `FeatureCache`, and indicator definitions together. Pass your custom cache to `new FeatureRuntime({ dataFeed, featureCache, range, freq })`.
- API reference — `FeatureCache` · `MemoryFeatureCache` · `FeatureKey`.