Aegis Memory

License: Apache 2.0 · Python 3.11+

The memory engine for multi-agent systems.

Aegis Memory is a production-ready, self-hostable memory engine designed to give AI agents a persistent learning loop. By combining semantic search, scope-aware access control, and ACE (Agentic Context Engineering), Aegis allows agents to share state, vote on strategies, and extract actionable reflections from their failures.

Quick Start

1. Start the Server (2 min)

git clone https://github.com/quantifylabs/aegis-memory.git
cd aegis-memory

export OPENAI_API_KEY=sk-...
docker-compose up -d

# Verify
curl http://localhost:8000/health

2. Install the CLI + SDK

pip install aegis-memory

3. Configure & Use

# First-time setup
aegis config init

# Check connection
aegis status

# Add your first memory
aegis add "User prefers concise responses" -a assistant -s global

# Query memories
aegis query "user preferences"

# View stats
aegis stats

CLI Reference

Aegis provides a powerful CLI for all memory operations:

# Core Operations
aegis add "content"              # Add a memory
aegis query "search text"        # Semantic search
aegis get <id>                   # Get single memory
aegis delete <id>                # Delete memory
aegis vote <id> helpful          # Vote on memory

# ACE Patterns
aegis playbook "error handling"  # Query proven strategies
aegis progress show <session>    # View session progress
aegis features list              # Track feature status

# Data Management
aegis export -o backup.jsonl     # Export memories
aegis import backup.jsonl        # Import memories
aegis stats                      # Namespace statistics

→ Full CLI Reference

Python SDK

from aegis_memory import AegisClient

client = AegisClient(api_key="dev-key", base_url="http://localhost:8000")

# Add a memory
client.add("User prefers concise responses", agent_id="assistant")

# Query memories
memories = client.query("user preferences", agent_id="assistant")

# Vote on usefulness (ACE pattern)
client.vote(memories[0].id, "helpful", voter_agent_id="assistant")

# Cross-agent memory sharing
client.add(
    content="Task: Build login. Steps: 1) Form, 2) Validation, 3) API",
    agent_id="planner",
    scope="agent-shared",
    shared_with_agents=["executor"]
)

Why Aegis Memory?

| Challenge | DIY Solution | Aegis Memory |
|---|---|---|
| Multi-agent memory sharing | Custom access control | Built-in scopes (private/shared/global) |
| Long-running agent state | File-based progress tracking | Structured session & feature tracking |
| Context window limits | Dump everything in the prompt | Semantic search + effectiveness scoring |
| Learning from mistakes | Manual prompt tuning | Memory voting + reflection patterns |

Aegis Memory is not just another vector database. It's an active strategy engine with primitives designed to turn agent execution into persistent organizational intelligence.
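For a concrete picture, here is a minimal sketch of that loop using only the calls documented in the Python SDK section above; the query text, agent name, and task are illustrative:

from aegis_memory import AegisClient

client = AegisClient(api_key="dev-key", base_url="http://localhost:8000")

# Retrieve candidate strategies before attempting the task
memories = client.query("API pagination strategies", agent_id="executor")

# ... agent attempts the task using the retrieved strategies ...

# Close the loop: record which memory actually helped
if memories:
    client.vote(memories[0].id, "helpful", voter_agent_id="executor")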

Features

Core Memory

  • Semantic Search — pgvector HNSW index for O(log n) queries at scale
  • Scope-Aware Access — agent-private, agent-shared, global with automatic ACL
  • Multi-Agent Handoffs — Structured state transfer between agents
  • Auto-Deduplication — Hash-based O(1) duplicate detection (see the conceptual sketch after this list)
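
As a conceptual illustration of the hash-based deduplication above (a sketch of the general technique, not Aegis's actual implementation), a content hash plus a set lookup rejects duplicates in O(1):

import hashlib

seen_hashes: set[str] = set()

def is_duplicate(content: str) -> bool:
    # Hash the normalized content, then check membership in O(1)
    digest = hashlib.sha256(content.strip().lower().encode("utf-8")).hexdigest()
    if digest in seen_hashes:
        return True
    seen_hashes.add(digest)
    return False

assert not is_duplicate("User prefers concise responses")
assert is_duplicate("User prefers concise responses")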

ACE Patterns

  • Memory Voting — Track which memories help vs harm task completion
  • Delta Updates — Incremental changes that prevent context collapse
  • Reflections — Store insights from failures for future reference (see the sketch after this list)
  • Session Progress — Track work across context windows
  • Feature Tracking — Prevent premature task completion
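
A minimal sketch of the reflection pattern, using only the documented client.add and client.query calls (the "Reflection:" prefix and the failure details are illustrative, not a prescribed schema):

from aegis_memory import AegisClient

client = AegisClient(api_key="dev-key", base_url="http://localhost:8000")

# After a failed task, store what went wrong and what to try next time
client.add(
    "Reflection: pagination failed because the API caps page size at 100; "
    "request pages of <=100 and follow the next-page cursor.",
    agent_id="executor",
)

# On the next run, retrieve relevant reflections before acting
reflections = client.query("pagination failures", agent_id="executor")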

Production Ready

  • Self-Hostable — Docker, Kubernetes, any cloud
  • Observable — Prometheus metrics, structured logging (see the sketch after this list)
  • Fast — 30-80ms queries on 1M+ memories
  • Safe — Data export, migrations, no vendor lock-in
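
A quick observability check, as a sketch: the /health endpoint is the one used in the Quick Start above, while the /metrics path is an assumption based on the usual Prometheus convention and may differ in your deployment.

import requests

BASE_URL = "http://localhost:8000"

# Liveness: /health is the endpoint used in the Quick Start
requests.get(f"{BASE_URL}/health", timeout=5).raise_for_status()

# Metrics: assumed Prometheus-style /metrics endpoint (verify against your deployment)
metrics = requests.get(f"{BASE_URL}/metrics", timeout=5)
print(metrics.text.splitlines()[:5])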

Framework Integrations

Drop-in support for popular agent frameworks:

# LangChain
from langchain.chains import ConversationChain
from aegis_memory.integrations.langchain import AegisMemory
chain = ConversationChain(llm=llm, memory=AegisMemory(agent_id="assistant"))

# CrewAI
from crewai import Crew
from aegis_memory.integrations.crewai import AegisCrewMemory
crew = Crew(agents=[...], memory=AegisCrewMemory())

→ Integration Guides

ACE Patterns

Aegis implements patterns from recent research on self-improving agents:

Memory Voting

# After a memory helped complete a task
aegis vote <memory-id> helpful -c "Successfully paginated API"

# Query only effective strategies
aegis playbook "API pagination" -e 0.3

Session Progress

# Track work across context windows
aegis progress create build-dashboard -a coder
aegis progress update build-dashboard -c auth -i routing
aegis progress show build-dashboard
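
The progress commands above are shown for the CLI only in this README. If you need something comparable from Python, one workaround sketch is to store progress entries as ordinary memories with the documented client.add and client.query; the "progress:" prefix below is just a naming convention for this sketch, not an Aegis feature.

from aegis_memory import AegisClient

client = AegisClient(api_key="dev-key", base_url="http://localhost:8000")

# Record where the session stopped; "progress:" is only a convention in this sketch
client.add(
    "progress: build-dashboard - auth complete, routing in progress",
    agent_id="coder",
)

# A later session (or another agent) can pick up where the last one left off
latest = client.query("progress: build-dashboard", agent_id="coder")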

→ ACE Patterns Guide

Performance

| Operation | Latency | Notes |
|---|---|---|
| Query (1M memories) | 30-80ms | HNSW index |
| Add single | ~100ms | Includes embedding |
| Add batch (50) | ~300ms | Batched embedding |
| Deduplication | <1ms | Hash lookup |

Documentation

Deployment

Docker Compose

docker-compose up -d

Kubernetes

kubectl apply -f k8s/

Configuration

| Variable | Default | Description |
|---|---|---|
| DATABASE_URL | postgresql+asyncpg://... | PostgreSQL connection |
| OPENAI_API_KEY | | For embeddings |
| AEGIS_API_KEY | dev-key | API authentication |
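
As a small sketch of wiring these variables into the SDK client: AEGIS_API_KEY comes from the table above, while AEGIS_BASE_URL is a hypothetical name used here for illustration, not a documented variable.

import os

from aegis_memory import AegisClient

client = AegisClient(
    api_key=os.environ.get("AEGIS_API_KEY", "dev-key"),
    # AEGIS_BASE_URL is hypothetical; the docker-compose server listens on :8000
    base_url=os.environ.get("AEGIS_BASE_URL", "http://localhost:8000"),
)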

→ Full Configuration

Contributing

We welcome contributions! See CONTRIBUTING.md for guidelines.

# Run tests
pytest tests/ -v

# Run linting
ruff check server/

License

Apache 2.0 — Use it however you want. See LICENSE.

Built with ❤️ for the agent community
