# Ceylon - AI Agent Framework
A powerful and flexible Rust framework for building AI agents with goal-oriented capabilities, memory management, and tool integration.
## Features
- Goal-Oriented Agents: Create agents that can analyze tasks, break them into sub-goals, and track progress
- Memory Management: Built-in conversation history, context management, and vector memory support
- Tool Integration: Extensible tool system for adding custom capabilities to your agents
- Multiple LLM Support: Works with 13+ providers including OpenAI, Anthropic, Ollama, Google, Groq, and more
- Async-First: Built on Tokio for efficient async/await support
- Vector Memory: Optional support for semantic search with OpenAI, Ollama, HuggingFace embeddings
- Interactive Runner: Optional CLI runner for interactive agent sessions
- WASM Support: Can be compiled to WebAssembly for browser-based applications
## Quick Start

Add Ceylon to your `Cargo.toml`:

```toml
[dependencies]
ceylon-next = "0.1.0"
tokio = { version = "1", features = ["rt-multi-thread", "macros"] }
```
## Basic Usage
```rust
use ceylon_next::agent::Agent;
use ceylon_next::tasks::TaskRequest;

#[tokio::main]
async fn main() {
    // Create a new agent
    let mut agent = Agent::new("MyAssistant", "openai::gpt-4");

    // Create a task
    let task = TaskRequest::new("What is the capital of France?");

    // Run the agent
    let response = agent.run(task).await;
    println!("Response: {:?}", response.result());
}
```
Set your API key as an environment variable:
```bash
export OPENAI_API_KEY="your-api-key-here"
```
## Working with Tools
Extend your agent's capabilities with custom tools:
```rust
use ceylon_next::agent::Agent;
use ceylon_next::tasks::TaskRequest;
use ceylon_next::tools::ToolTrait;
use serde_json::json;

// Define a custom tool
struct CalculatorTool;

impl ToolTrait for CalculatorTool {
    fn name(&self) -> String {
        "calculator".to_string()
    }

    fn description(&self) -> String {
        "Performs basic arithmetic operations".to_string()
    }

    fn input_schema(&self) -> serde_json::Value {
        json!({
            "type": "object",
            "properties": {
                "operation": {"type": "string", "enum": ["add", "subtract", "multiply", "divide"]},
                "a": {"type": "number"},
                "b": {"type": "number"}
            },
            "required": ["operation", "a", "b"]
        })
    }

    fn execute(&self, input: serde_json::Value) -> serde_json::Value {
        let op = input["operation"].as_str().unwrap();
        let a = input["a"].as_f64().unwrap();
        let b = input["b"].as_f64().unwrap();
        let result = match op {
            "add" => a + b,
            "subtract" => a - b,
            "multiply" => a * b,
            "divide" => a / b,
            _ => 0.0,
        };
        json!({"result": result})
    }
}

#[tokio::main]
async fn main() {
    let mut agent = Agent::new("Calculator Agent", "openai::gpt-4");
    agent.add_tool(CalculatorTool);

    let task = TaskRequest::new("What is 15 multiplied by 7?");
    let response = agent.run(task).await;
    println!("{:?}", response.result());
}
```
## Working with Memory
Agents automatically maintain conversation history:
```rust
use ceylon_next::agent::Agent;
use ceylon_next::tasks::TaskRequest;

#[tokio::main]
async fn main() {
    let mut agent = Agent::new("MemoryAgent", "openai::gpt-4");

    // First conversation
    let task1 = TaskRequest::new("My name is Alice");
    agent.run(task1).await;

    // Second conversation - the agent remembers context
    let task2 = TaskRequest::new("What is my name?");
    let response = agent.run(task2).await;
    // The agent should respond with "Alice"

    // Search memory
    let memories = agent.search_memory("Alice").await;
    println!("Found {} relevant conversations", memories.len());
}
```
## Supported LLM Providers
Ceylon supports 13+ LLM providers out of the box:
| Provider | Example Model String | API Key Env Var |
|---|---|---|
| OpenAI | `openai::gpt-4` | `OPENAI_API_KEY` |
| Anthropic | `anthropic::claude-3-5-sonnet-20241022` | `ANTHROPIC_API_KEY` |
| Ollama | `ollama::llama3.2` | (local) |
| DeepSeek | `deepseek::deepseek-coder` | `DEEPSEEK_API_KEY` |
| X.AI (Grok) | `xai::grok-beta` | `XAI_API_KEY` |
| Google Gemini | `google::gemini-pro` | `GOOGLE_API_KEY` |
| Groq | `groq::mixtral-8x7b-32768` | `GROQ_API_KEY` |
| Azure OpenAI | `azure::gpt-4` | `AZURE_OPENAI_API_KEY` |
| Cohere | `cohere::command` | `COHERE_API_KEY` |
| Mistral | `mistral::mistral-large-latest` | `MISTRAL_API_KEY` |
| Phind | `phind::Phind-CodeLlama-34B-v2` | `PHIND_API_KEY` |
| OpenRouter | `openrouter::anthropic/claude-3-opus` | `OPENROUTER_API_KEY` |
| ElevenLabs | `elevenlabs::eleven_monolingual_v1` | `ELEVENLABS_API_KEY` |
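As the table suggests, switching providers changes only the `provider::model` string passed to `Agent::new`; the rest of the agent code stays the same. A minimal sketch reusing the Quick Start API with the `ollama::llama3.2` string from the table above (Ollama runs locally, so no API key is needed):

```rust
use ceylon_next::agent::Agent;
use ceylon_next::tasks::TaskRequest;

#[tokio::main]
async fn main() {
    // Same Agent API as in Quick Start; only the provider::model string differs.
    let mut agent = Agent::new("LocalAssistant", "ollama::llama3.2");

    let task = TaskRequest::new("Summarize Rust's ownership model in one sentence.");
    let response = agent.run(task).await;
    println!("{:?}", response.result());
}
```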
## Cargo Features
Ceylon uses Cargo features to enable optional functionality:
```toml
[dependencies]
# Default: std features, vector memory, and CLI runner
ceylon-next = "0.1.0"

# Minimal installation (no tokio, no LLM, suitable for WASM)
ceylon-next = { version = "0.1.0", default-features = false }

# With specific vector providers
ceylon-next = { version = "0.1.0", features = ["vector-openai"] }
ceylon-next = { version = "0.1.0", features = ["vector-huggingface-local"] }

# All vector providers
ceylon-next = { version = "0.1.0", features = ["full-vector"] }
```
### Available Features
- `std` (default): Standard features including tokio, LLM support, SQLite memory, and MessagePack serialization
- `vector`: Base vector memory functionality
- `vector-openai`: OpenAI embeddings for vector memory
- `vector-ollama`: Ollama embeddings for vector memory
- `vector-huggingface`: HuggingFace API embeddings
- `vector-huggingface-local`: Local HuggingFace embeddings using Candle
- `full-vector`: All vector providers
- `runner`: Interactive CLI runner
- `wasm`: WebAssembly support
## Goal-Oriented Programming
Create agents that can break down complex tasks:
```rust
use ceylon_next::agent::Agent;
use ceylon_next::goal::Goal;
use ceylon_next::tasks::TaskRequest;

#[tokio::main]
async fn main() {
    let mut agent = Agent::new("ProjectManager", "openai::gpt-4");

    // Create a goal with success criteria
    let mut goal = Goal::new(
        "Launch Product",
        "Successfully launch the new product to market",
    );
    goal.add_criterion("Product is tested and bug-free");
    goal.add_criterion("Marketing materials are ready");
    goal.add_criterion("Launch event is scheduled");

    // Add sub-goals
    goal.add_sub_goal(Goal::new("Development", "Complete development"));
    goal.add_sub_goal(Goal::new("Marketing", "Create marketing campaign"));
    goal.add_sub_goal(Goal::new("Launch", "Execute launch"));

    // Track progress
    println!("Progress: {}%", goal.get_progress());
}
```
## Examples
The repository includes numerous examples:
- 01_basic_agent: Simple agent creation and usage
- 02_with_tools: Custom tool implementation
- 03_with_memory: Working with conversation history
- 04_advanced_agent: Complex agent configurations
- 05_with_goals: Goal-oriented task management
- 08_llm_providers: Using different LLM providers
- 10_file_saving: Creating file-saving tools
- 11_persistent_memory: SQLite-backed memory
- 12_vector_memory: Semantic search with Ollama
- 13_vector_memory_openai: OpenAI embeddings
- 14_vector_memory_huggingface: HuggingFace API embeddings
- 15_vector_memory_huggingface_local: Local embeddings with Candle
Run examples from the repository:
```bash
# Clone the repository
git clone https://github.com/ceylonai/next.git
cd next

# Run an example
cargo run --example 01_basic_agent --manifest-path ceylon/Cargo.toml
```
## Architecture
Ceylon is organized into several core modules:
- `agent`: Core agent implementation and lifecycle management
- `tools`: Tool system and built-in tools
- `memory`: Memory backends (in-memory, SQLite, vector)
- `llm`: LLM provider integrations and abstractions
- `goal`: Goal-oriented task management
- `runner`: Interactive CLI runner
- `tasks`: Task definitions and execution
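As a quick orientation, the sketch below wires together one type from each of the modules exercised earlier in this README (`agent`, `goal`, `tasks`); it is a navigation aid, not a complete API map:

```rust
use ceylon_next::agent::Agent;       // agent: core agent implementation
use ceylon_next::goal::Goal;         // goal: goal-oriented task management
use ceylon_next::tasks::TaskRequest; // tasks: task definitions

// Custom tools implement ceylon_next::tools::ToolTrait (see Working with Tools above).

#[tokio::main]
async fn main() {
    let mut agent = Agent::new("Orientation", "openai::gpt-4");
    let goal = Goal::new("Orientation", "Show where each core type lives");
    let task = TaskRequest::new("Say hello");

    let response = agent.run(task).await;
    println!("{:?} (goal progress: {}%)", response.result(), goal.get_progress());
}
```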
## Contributing
We welcome contributions! Please see our GitHub repository for more information.
## License
Licensed under either of:
- Apache License, Version 2.0 (LICENSE-APACHE or http://www.apache.org/licenses/LICENSE-2.0)
- MIT license (LICENSE-MIT or http://opensource.org/licenses/MIT)

at your option.
## Acknowledgments
Ceylon is built on top of the excellent `llm` crate for LLM provider integrations.