memories.sh

memories.sh Documentation

One memory store for all your AI coding tools. Store rules, decisions, and knowledge once — generate native config files for every AI editor.

What is memories.sh?

memories.sh is a local-first CLI that gives your AI coding agents persistent memory. Store your coding rules, architectural decisions, and project knowledge in a single local database, then generate native configuration files for Cursor, Claude Code, GitHub Copilot, Windsurf, and more.

The scope model differs by surface: the CLI uses global scope plus git project scope. The SDK uses tenantId (a security/database boundary), userId (end-user scope), and projectId (an optional repo-context filter).
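The CLI's project scope follows the enclosing git repository. A minimal sketch of how such detection could work (illustrative only; detect_scope is a hypothetical helper, not part of memories.sh):

```shell
# Hypothetical helper: report "project" when run inside a git work tree,
# and "global" otherwise. memories.sh's actual detection logic may differ.
detect_scope() {
  if git rev-parse --is-inside-work-tree >/dev/null 2>&1; then
    echo "project"
  else
    echo "global"
  fi
}
```

Run inside a repository this prints "project"; anywhere else it falls back to "global".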

New: segmented memory lifecycle docs are now live. Start with Memory Segmentation and see the release summary in Changelog.

Why memories.sh?

If you use multiple AI coding tools, you know the pain:

  • Change a rule in .cursorrules and forget to update CLAUDE.md
  • Set up a new machine and lose all your carefully curated agent context
  • Want to try a new AI tool but dread recreating all your rules

memories.sh solves this by being the single source of truth:

  • One store, every tool — Add a memory once, generate files for all your tools
  • Local-first — Your data stays on your machine, runs entirely offline
  • AI-powered search — Semantic search finds related memories, not just keyword matches
  • MCP server — Fallback for agents that need real-time access beyond static configs
  • Auto-setup — Detects your tools, configures MCP, and imports existing project skills automatically
  • Sync config files — Your skills, commands, and rules follow you everywhere

Quick Start

# Install
pnpm add -g @memories.sh/cli

# Initialize (auto-configures MCP and imports project skills for detected tools)
memories setup

# Add your first rule
memories add --rule "Always use TypeScript strict mode"

# Add a decision
memories add --decision "Chose PostgreSQL over MySQL for JSONB support"

# Generate config files for all your tools
memories generate all

What memories setup does

When you run memories setup (alias: memories init), it:

  1. Creates the local database at ~/.config/memories/local.db
  2. Detects installed AI tools (Cursor, Claude Code, Windsurf, VS Code)
  3. Configures MCP for each detected tool
  4. Generates instruction files with your existing memories
  5. Imports existing project skills (for example, from .claude/skills and .codex/skills) into memories

Example output:

[1/4] Setting up local storage...
  Database: ~/.config/memories/local.db
[2/4] Detecting scope...
✓ Project scope detected
[3/4] Detecting AI coding tools...
  Cursor ✓ MCP ○ Rules
  Claude Code ○ MCP ✓ Rules
  
✓ Claude Code: MCP configured → .mcp.json
✓ Cursor: Generated .cursor/rules/memories.mdc
[4/4] Finalizing...
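For Claude Code, the MCP step writes an entry into .mcp.json. The fragment below is illustrative: the overall shape follows the standard mcpServers format, but the server name, command, and args shown here are assumptions, not confirmed memories.sh output:

```json
{
  "mcpServers": {
    "memories": {
      "command": "memories",
      "args": ["mcp"]
    }
  }
}
```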

Search

Find memories even when you don't remember the exact keywords:

# Keyword search (default)
memories search "authentication"

# Semantic search (AI-powered)
memories search "how to handle user login" --semantic

The semantic search model runs entirely locally — no API calls, no data leaves your machine.

Memory Segmentation Lifecycle

memories.sh now separates memory by lifecycle role, not just type:

  • Session memory for active working context
  • Semantic memory for stable truths and preferences
  • Episodic memory for daily logs and raw snapshots
  • Procedural memory for reusable workflows

This keeps context coherent across long tasks, reset boundaries, and compaction windows.

Explore the lifecycle model in the Memory Segmentation docs.

Sync Config Files

Beyond memories, sync your entire AI tool setup:

# Import your configs (skills, commands, rules, tasks)
memories files ingest

# Apply on a new machine
memories files apply --global --force

This syncs files from .agents/, .claude/, .cursor/, .codex/, and other tool directories.
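As a rough illustration of that scan (a sketch; list_tool_dirs is a hypothetical helper, and the real ingest may cover more directories than the four named above):

```shell
# Hypothetical helper: list which known tool directories exist under a root.
# Directory names come from the docs above; memories.sh's ingest may differ.
list_tool_dirs() {
  root="$1"
  for dir in .agents .claude .cursor .codex; do
    [ -d "$root/$dir" ] && echo "$dir"
  done
  return 0
}
```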
