memories.sh
SDK

Core Client

MemoriesClient for use with any LLM framework.

@memories.sh/core provides a standalone client that works with any LLM SDK — Anthropic, OpenAI, or custom integrations. Zero runtime dependencies, edge-compatible.

Install

pnpm add @memories.sh/core

Basic Usage

import { MemoriesClient } from "@memories.sh/core"

const client = new MemoriesClient({
  apiKey: "mem_xxx", // or uses MEMORIES_API_KEY env var
  tenantId: "acme-prod",
})

const { rules, memories } = await client.context.get({
  query: "auth patterns",
  projectId: "github.com/acme/platform",
  userId: "user_123",
  mode: "all",
})

// Use with any LLM SDK
const response = await anthropic.messages.create({
  model: "claude-sonnet-4-5-20250929",
  system: client.buildSystemPrompt({ rules, memories }),
  messages: [{ role: "user", content: userMessage }],
})

API Reference

Constructor

const client = new MemoriesClient({
  apiKey: "mem_xxx",      // API key (default: MEMORIES_API_KEY env var)
  baseUrl: "https://memories.sh", // optional (default)
  transport: "auto",      // "auto" (default), "sdk_http", or "mcp"
  tenantId: "acme-prod",  // AI SDK Project (security/database boundary)
  userId: "user_123",     // End-user scope inside tenantId (optional)
  projectId: "github.com/acme/platform", // Optional repo context filter (not auth boundary)
})

Transport behavior:

  • auto (default): uses sdk_http for normal base URLs (for example https://memories.sh), and switches to mcp only when baseUrl ends with /api/mcp
  • sdk_http: uses /api/sdk/v1/* endpoints
  • mcp: uses JSON-RPC /api/mcp

The SDK client never shells out to the CLI. CLI commands are a separate integration path for local/dev workflows.
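The auto rule above can be sketched as a small resolver. This helper is illustrative only (not an export of @memories.sh/core), and the library's actual detection logic may differ:

```typescript
// Illustrative sketch of "auto" transport detection as described above:
// mcp only when the base URL ends with /api/mcp, sdk_http otherwise.
// Hypothetical helper, not part of the @memories.sh/core API.
function resolveTransport(baseUrl: string): "sdk_http" | "mcp" {
  const trimmed = baseUrl.replace(/\/+$/, "") // ignore trailing slashes
  return trimmed.endsWith("/api/mcp") ? "mcp" : "sdk_http"
}

console.log(resolveTransport("https://memories.sh"))         // sdk_http
console.log(resolveTransport("https://memories.sh/api/mcp")) // mcp
```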

For the canonical SDK HTTP contracts (envelopes, status codes, and route list), see SDK endpoint contract.

client.context.get(input?)

Fetch rules and relevant memories with explicit runtime scope controls.

const { rules, memories } = await client.context.get({
  query: "deployment process",
  projectId: "github.com/acme/platform",
  userId: "user_123",
  mode: "all",
})

// Rules + working only
const ctx = await client.context.get({
  query: "auth",
  mode: "working",
  limit: 20,
  includeRules: true, // default true
})

mode controls tier selection:

  • all (default): rules + working + long-term
  • working: rules + working only
  • long_term: rules + long-term only
  • rules_only: rules only
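The tier table above can be expressed as a mapping. This is a hypothetical helper mirroring the list, written for illustration only:

```typescript
// Illustrative mapping from mode to the tiers it selects (mirrors the
// list above; hypothetical helper, not an SDK export).
type ContextMode = "all" | "working" | "long_term" | "rules_only"

function tiersFor(mode: ContextMode): string[] {
  switch (mode) {
    case "all":        return ["rules", "working", "long_term"]
    case "working":    return ["rules", "working"]
    case "long_term":  return ["rules", "long_term"]
    case "rules_only": return ["rules"]
  }
}

console.log(tiersFor("working")) // [ 'rules', 'working' ]
```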

Legacy signature remains supported:

await client.context.get("auth", { projectId: "github.com/acme/platform", limit: 20 })

Hybrid Retrieval Usage

Enable graph-augmented recall per request:

const ctx = await client.context.get({
  query: "why did billing alerts spike?",
  strategy: "hybrid_graph", // or "baseline"
  graphDepth: 1,            // 0 | 1 | 2
  graphLimit: 8,            // max graph-expanded memories
  userId: "user_123",
  projectId: "github.com/acme/platform",
})

console.log(ctx.trace)
// {
//   requestedStrategy: "hybrid_graph",
//   strategy: "baseline" | "hybrid_graph",
//   rolloutMode: "off" | "shadow" | "canary",
//   shadowExecuted: boolean,
//   qualityGateStatus: "pass" | "warn" | "fail" | "insufficient_data" | "unavailable",
//   qualityGateBlocked: boolean,
//   qualityGateReasonCodes: string[],
//   fallbackTriggered: boolean,
//   fallbackReason: string | null
// }

trace.strategy is the strategy actually applied after rollout guardrails run, which lets you detect when a hybrid request safely falls back to baseline.
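Given the trace shape above, a fallback check might look like the sketch below. RetrievalTrace is a hand-written subset of the trace fields for this example, not an exported type, and the fallbackReason value shown is invented:

```typescript
// Hand-written subset of the trace fields shown above (for this sketch
// only; not an exported type of @memories.sh/core).
type RetrievalTrace = {
  requestedStrategy: "baseline" | "hybrid_graph"
  strategy: "baseline" | "hybrid_graph"
  fallbackTriggered: boolean
  fallbackReason: string | null
}

// True when hybrid was requested but guardrails applied baseline instead.
function didFallBack(trace: RetrievalTrace): boolean {
  return trace.requestedStrategy === "hybrid_graph" && trace.strategy === "baseline"
}

// Example values (invented for illustration).
const trace: RetrievalTrace = {
  requestedStrategy: "hybrid_graph",
  strategy: "baseline",
  fallbackTriggered: true,
  fallbackReason: "quality_gate_blocked",
}

if (didFallBack(trace)) {
  console.warn(`hybrid fell back to baseline: ${trace.fallbackReason ?? "rollout guardrail"}`)
}
```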

Workspace rollout and safety signals are available from SDK HTTP graph endpoints:

  • GET /api/sdk/v1/graph/rollout for mode + shadow metrics + quality gate snapshot
  • GET /api/sdk/v1/graph/status for counts + alarms (HIGH_FALLBACK_RATE, GRAPH_EXPANSION_ERRORS, CANARY_QUALITY_GATE_BLOCKED)
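If you call these endpoints directly, a request might be built as below. The Bearer authorization scheme is an assumption in this sketch; see the SDK endpoint contract for the canonical headers:

```typescript
// Sketch: constructing requests for the graph safety endpoints listed
// above. The Bearer auth header is an assumption, not confirmed by the
// endpoint contract excerpted here.
function graphRequest(baseUrl: string, apiKey: string, path: "rollout" | "status"): Request {
  return new Request(`${baseUrl}/api/sdk/v1/graph/${path}`, {
    method: "GET",
    headers: { Authorization: `Bearer ${apiKey}` },
  })
}

const req = graphRequest("https://memories.sh", "mem_xxx", "rollout")
console.log(req.url) // https://memories.sh/api/sdk/v1/graph/rollout
```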

client.memories.add(input)

Create a new memory.

await client.memories.add({
  content: "Use TypeScript with strict mode",
  type: "rule",
  tags: ["code-style"],
  paths: ["src/**"],
})

client.memories.search(query, options?)

Full-text search across memories.

const results = await client.memories.search("auth patterns", {
  type: "rule",
  limit: 10,
})

client.memories.list(options?)

List memories with optional filters.

const memories = await client.memories.list({
  type: "rule",
  tags: ["code-style"],
  limit: 50,
})

client.memories.edit(id, updates)

Update an existing memory.

await client.memories.edit("mem_abc123", {
  content: "Updated content",
  tags: ["updated-tag"],
})

client.memories.forget(id)

Soft-delete a memory.

await client.memories.forget("mem_abc123")

client.buildSystemPrompt(context)

Format rules and memories into a system prompt block.

const { rules, memories } = await client.context.get({ query: "auth" })
const systemPrompt = client.buildSystemPrompt({ rules, memories })

// Returns formatted text like:
// ## Rules (always follow)
// - Use TypeScript with strict mode
// - Prefer functional patterns
//
// ## Relevant Context (from memory)
// - [decision] Chose Supabase over Firebase for auth
// - [fact] Project uses pnpm

Scoping

Use all three scopes with clear roles:

  • tenantId = AI SDK Project (security/database boundary)
  • userId = end-user scope inside tenantId
  • projectId = optional repo context filter (not auth boundary)

If tenantId is omitted, requests route to the API key's default workspace database.

If tenantId is provided and no mapping exists yet, the runtime can auto-provision a project Turso database (when server provisioning credentials are configured). This lets SaaS apps avoid manual project DB setup during first use.

const client = new MemoriesClient({
  apiKey: "mem_xxx",
  tenantId: "acme-prod",
  userId: "user_123",
})

await client.memories.add({
  content: "User prefers concise updates in repository reviews",
  type: "note",
  projectId: "github.com/acme/platform",
})

// Per-call override is also supported:
const context = await client.context.get({
  userId: "user_456",
  projectId: "github.com/acme/platform",
  mode: "working",
})

Management APIs

When you manage API keys or AI SDK Projects (tenantId mappings) programmatically, prefer the SDK management endpoints:

  • /api/sdk/v1/management/keys
  • /api/sdk/v1/management/tenants

The core client exposes these directly:

import { MemoriesClient } from "@memories.sh/core"

const client = new MemoriesClient({
  apiKey: process.env.MEMORIES_API_KEY,
  baseUrl: "https://memories.sh",
  transport: "sdk_http",
})

const keyStatus = await client.management.keys.get()
const rotatedKey = await client.management.keys.create({
  expiresAt: "2027-01-01T00:00:00.000Z",
})
const revoked = await client.management.keys.revoke()

const sdkProjects = await client.management.tenants.list()
const upsertedProject = await client.management.tenants.upsert({
  tenantId: "acme-prod",
  mode: "provision",
})
const disabledProject = await client.management.tenants.disable("acme-prod")

void [keyStatus, rotatedKey, revoked, sdkProjects, upsertedProject, disabledProject]

Legacy MCP management routes (/api/mcp/key, /api/mcp/tenants) remain available for backward compatibility.

For single-tenant apps, tenantId can be your app environment or workspace id:

const client = new MemoriesClient({
  apiKey: "mem_xxx",
  tenantId: "my-app-prod",
  userId: "user_123",
})

// All operations are now scoped to user_123
await client.memories.add({ content: "User prefers dark mode", type: "note" })
const { memories } = await client.context.get({ query: "user preferences" })
// Only returns memories for user_123

Skill Files (Scoped)

Skill files are first-class scoped records in the SDK (tenantId + userId + projectId), distinct from memories labeled with type: "skill".

await client.skills.upsertFile({
  path: ".agents/skills/review/SKILL.md",
  content: "---\nname: review\n---\nUse strict checks.",
  tenantId: "acme-prod",
  userId: "user_123",
  projectId: "github.com/acme/platform",
})

const skillFiles = await client.skills.listFiles({
  tenantId: "acme-prod",
  userId: "user_123",
  projectId: "github.com/acme/platform",
})

await client.skills.deleteFile({
  path: ".agents/skills/review/SKILL.md",
  tenantId: "acme-prod",
  userId: "user_123",
  projectId: "github.com/acme/platform",
})

Edge Runtime

The core client is edge-compatible with zero dependencies:

import { MemoriesClient } from "@memories.sh/core"

export const runtime = "edge"

export async function POST(req: Request) {
  const client = new MemoriesClient({ tenantId: "acme-prod" })
  const { rules } = await client.context.get()
  return Response.json({ rules: rules.map((r) => r.content) })
}
