A lightweight, ergonomic framework for building AI agents in Rust.
Machi provides the building blocks for constructing AI agents that reason, use tools, and collaborate — powered by any LLM backend.
## Core Concepts
- `Agent` — A self-contained unit with its own LLM provider, instructions, tools, and optional sub-agents.
- `Runner` — A stateless execution engine driving the ReAct loop (think → act → observe → repeat); sketched below.
- `Tool` / `DynTool` — Capabilities that agents can invoke (filesystem, shell, web search, or custom).
- `ChatProvider` — Trait abstracting over LLM backends (OpenAI, Ollama, or custom).
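To make the think → act → observe cycle concrete, here is a minimal, self-contained sketch of a generic ReAct-style loop. It is illustrative only and does not use machi's actual `Runner`, `Tool`, or `ChatProvider` APIs; the `Llm` and `Tool` traits and the `Step` enum below are hypothetical stand-ins.

```rust
// Illustrative only: a generic ReAct-style loop, not machi's Runner API.
// `Llm`, `Tool`, and `Step` are hypothetical stand-ins.

/// What the model decides to do on each iteration.
enum Step {
    Act { tool: String, input: String }, // call a named tool with an input
    Finish(String),                      // stop with a final answer
}

trait Llm {
    /// Produce the next step from the transcript so far (the "think" phase).
    fn think(&self, transcript: &str) -> Step;
}

trait Tool {
    fn name(&self) -> &str;
    fn call(&self, input: &str) -> String;
}

/// Think -> act -> observe -> repeat, until the model finishes
/// or the step budget runs out.
fn react_loop(llm: &dyn Llm, tools: &[Box<dyn Tool>], task: &str, max_steps: usize) -> String {
    let mut transcript = format!("Task: {task}\n");
    for _ in 0..max_steps {
        match llm.think(&transcript) {
            Step::Finish(answer) => return answer,
            Step::Act { tool, input } => {
                // "Act": run the chosen tool, then "observe" its output.
                let observation = tools
                    .iter()
                    .find(|t| t.name() == tool)
                    .map(|t| t.call(&input))
                    .unwrap_or_else(|| format!("unknown tool: {tool}"));
                transcript.push_str(&format!(
                    "Action: {tool}({input})\nObservation: {observation}\n"
                ));
            }
        }
    }
    "step budget exhausted".to_string()
}
```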
## Feature Flags
| Feature | Description |
|---|---|
| `openai` | OpenAI API backend |
| `ollama` | Ollama local LLM backend |
| `derive` | `#[tool]` proc-macro for deriving tools |
| `toolkit` | Built-in filesystem, shell, and web search tools |
| `mcp` | Model Context Protocol server integration |
| `a2a` | Agent-to-Agent protocol support |
| `wallet` | EVM wallet for blockchain interactions |
| `memory-sqlite` | SQLite-backed session persistence |
| `schema` | Structured output via JSON Schema generation |
| `full` | All of the above (default) |
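If the default `full` feature pulls in more than you need, Cargo lets you disable default features and opt into specific ones. A minimal `Cargo.toml` sketch; the version number is illustrative and the feature names come from the table above:

```toml
[dependencies]
# Opt out of the default "full" feature set and enable only selected capabilities.
machi = { version = "0.7", default-features = false, features = ["openai", "derive", "toolkit"] }
```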
## Quick Start
```rust
use machi::agent::{Agent, RunConfig};
use machi::chat::ChatRequest;
use machi::message::Message;

// Build a chat request
let request = ChatRequest::new("gpt-4o")
    .system("You are a helpful assistant.")
    .user("Hello!")
    .temperature(0.7);

// Configure an agent
let agent = Agent::new("assistant")
    .instructions("You are a helpful assistant.")
    .model("gpt-4o");

// Construct messages manually
let msgs = vec![
    Message::system("You are helpful."),
    Message::user("What is Rust?"),
];
```