oghma

Unified AI memory layer — aggregate context across coding tools.

oghma watches your AI coding session transcripts (Claude Code, Codex, OpenCode, OpenClaw), extracts technical gotchas and learnings via LLM, stores them in SQLite with FTS5 + local vector embeddings, and injects relevant context at the start of each new session.
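The vector half of search boils down to ranking stored embeddings by cosine similarity against the query embedding. A minimal sketch of that comparison, not oghma's actual code (the function name and toy 3-dim vectors are illustrative; the real BGESmallENV15 vectors are 384-dim):

```rust
/// Cosine similarity between two embedding vectors.
/// Returns 0.0 if either vector has zero magnitude.
fn cosine(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let na: f32 = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let nb: f32 = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    if na == 0.0 || nb == 0.0 { 0.0 } else { dot / (na * nb) }
}

fn main() {
    let query = [1.0_f32, 0.0, 1.0];
    let memory_a = [1.0_f32, 0.0, 1.0]; // same direction as the query
    let memory_b = [0.0_f32, 1.0, 0.0]; // orthogonal to the query
    println!("a: {:.2}  b: {:.2}", cosine(&query, &memory_a), cosine(&query, &memory_b));
}
```

At query time, each memory's stored vector is scored this way and the top hits feed into the hybrid reranker.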

Features

  • Automatic ingestion — daemon watches JSONL session files and extracts memories via LLM (OpenRouter/OpenAI)
  • Local embeddings — fastembed-rs (BGESmallENV15, 384-dim) — no API key required for search
  • Hybrid search — FTS5 keyword + vector cosine with RRF reranking and recency boost
  • MCP server — oghma mcp exposes memories as Claude Code MCP tools
  • Zero dependencies at runtime — single static binary, SQLite bundled
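Hybrid mode merges the keyword and vector result lists with reciprocal rank fusion (RRF): each document's score is the sum of 1/(k + rank) over the lists it appears in. A rough sketch of the idea, not oghma's implementation (rrf_scores and the memory IDs are made up for illustration, and the recency boost is omitted):

```rust
use std::collections::HashMap;

/// Reciprocal Rank Fusion: merge two ranked ID lists into one score map.
/// k (60 in the original RRF paper) dampens the dominance of top ranks.
fn rrf_scores(keyword: &[&str], vector: &[&str], k: f64) -> HashMap<String, f64> {
    let mut scores: HashMap<String, f64> = HashMap::new();
    for list in [keyword, vector] {
        for (rank, id) in list.iter().enumerate() {
            // rank is 0-based, so the contribution is 1 / (k + rank + 1)
            *scores.entry(id.to_string()).or_insert(0.0) += 1.0 / (k + rank as f64 + 1.0);
        }
    }
    scores
}

fn main() {
    // "m2" ranks high in both lists, so fusion should put it first.
    let keyword = ["m1", "m2", "m3"];
    let vector = ["m2", "m4", "m1"];
    let scores = rrf_scores(&keyword, &vector, 60.0);
    let mut ranked: Vec<_> = scores.into_iter().collect();
    ranked.sort_by(|a, b| b.1.partial_cmp(&a.1).unwrap());
    println!("{:?}", ranked.iter().map(|(id, _)| id.as_str()).collect::<Vec<_>>());
}
```

The appeal of RRF is that it needs only ranks, so FTS5's BM25 scores and cosine similarities never have to be put on a common scale.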

Install

cargo install oghma

Or build from source:

git clone https://github.com/terry-li-hm/oghma-rs
cd oghma-rs
cargo build --release
cp target/release/oghma ~/bin/

Quick Start

oghma init                        # create ~/.oghma/config.toml
oghma start                       # start background daemon
oghma search "rust lifetime"      # keyword search
oghma search "async tokio" --mode hybrid  # hybrid search
oghma status                      # show DB stats and daemon state

MCP Server (Claude Code)

Add to your Claude Code MCP config:

{
  "oghma": {
    "command": "oghma",
    "args": ["mcp"]
  }
}

Tools exposed: oghma_search, oghma_get, oghma_stats, oghma_categories.
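As an illustration, an MCP client could invoke the search tool with a standard tools/call request over stdio. The argument names below are assumptions for the sketch; the actual parameters are whatever oghma_search advertises in its tool schema:

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "oghma_search",
    "arguments": { "query": "rust lifetime", "mode": "hybrid" }
  }
}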

Configuration

Config file: ~/.oghma/config.toml

[extraction]
model = "google/gemini-2.5-flash-preview"  # or any OpenAI-compatible model
max_content_chars = 4000
confidence_threshold = 0.5

[daemon]
poll_interval = 300  # seconds

Set OPENROUTER_API_KEY or OPENAI_API_KEY for LLM extraction.

Commands

Command                                             Description
oghma init                                          Create default config
oghma start [-f]                                    Start daemon (background or foreground)
oghma stop                                          Stop daemon
oghma search QUERY [--mode keyword|vector|hybrid]   Search memories
oghma add CONTENT [-c CATEGORY]                     Manually add a memory
oghma stats                                         Count by category and source tool
oghma status                                        Show DB path, memory count, daemon state
oghma mcp                                           Start stdio MCP server
oghma migrate-embeddings                            Backfill local embeddings
oghma migrate --from DB_PATH                        Import from another oghma DB
oghma dedup [--threshold 0.95]                      Find/remove semantic duplicates
oghma export [--format json|markdown]               Export all memories
oghma prune-stale --max-age-days N                  Delete old memories

License

MIT
