Star us on GitHub! Join us on Discord.
Chidori is an open-source orchestrator, runtime, and IDE for building software in symbiosis with modern AI tools. It is especially catered towards building AI agents by providing solutions to the following problems:
- How do we understand what an agent is doing and how it got into a given state?
- How can we pause execution and then resume after interaction with a human?
- How do we handle the accidental complexity of state-space exploration, evaluating and reverting execution throughout our software?
When using Chidori, you author code in Python or JavaScript, and we provide a layer for interfacing with the complexities of AI models in long-running workflows. We have avoided declaring a new language or SDK in order to provide these capabilities, so you can leverage the software patterns you are already familiar with.
Features:
- Runtime written in Rust, supporting Python and JavaScript code execution
- The ability to cache behaviors and resume from partially executed agents
- Time-travel debugging: execution of the program can be reverted to prior states
- Visual debugging environment: visualize and manipulate the graph of states your code has executed through
- Create and navigate tree-searching code execution workflows
Chidori is available on crates.io and can be installed using cargo. Our expected entrypoint for prototype development is chidori-debugger, which wraps our runtime in a useful visual interface.
```shell
# Install the Rust toolchain and the nightly channel
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
rustup toolchain install nightly

# Required for building dependencies
xcode-select --install

# These dependencies are necessary for a successful build
brew install cmake

# We are investigating if this is necessary or can be removed
brew install [email protected]

# Chidori uses uv for handling Python dependencies
brew install uv

# We depend on features only supported by nightly at the moment
cargo +nightly install chidori-debugger --locked
```
If you prefer to use a different Python interpreter, you can set `PYO3_PYTHON=python3.12` (or any other version newer than 3.7) during installation to change which interpreter is linked against.
Chidori's interactions with LLMs default to http://localhost:4000, hooking into LiteLLM's proxy. If you'd like to leverage gpt-3.5-turbo, the included config file will support that. You will need to install `litellm[proxy]` in order to run the following:
```shell
export OPENAI_API_KEY=...
uv pip install "litellm[proxy]"
uv run litellm --config ./litellm_config.yaml
```
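For reference, a LiteLLM proxy config that routes gpt-3.5-turbo typically follows the standard `model_list` format shown below. This is a sketch of that format, not necessarily the exact contents of the included `litellm_config.yaml`:

```yaml
model_list:
  - model_name: gpt-3.5-turbo
    litellm_params:
      model: openai/gpt-3.5-turbo
      # Reads the key from the environment variable exported above
      api_key: os.environ/OPENAI_API_KEY
```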
The following example shows how to build a simple agent that fetches the top stories from Hacker News, calls the OpenAI API to filter them to AI-related launches, and then formats that data into markdown.
Chidori agents can be a single file or a collection of files structured as a typical TypeScript or Python project. The following example is a single-file agent. Consider it similar to something like a Jupyter/IPython notebook represented as a markdown file.
```javascript (load_hacker_news)
const axios = require('https://deno.land/x/axiod/mod.ts');
const HN_URL_TOP_STORIES = "https://hacker-news.firebaseio.com/v0/topstories.json";

function fetchStory(id) {
    return axios.get(`https://hacker-news.firebaseio.com/v0/item/${id}.json?print=pretty`)
        .then(response => response.data);
}

async function fetchHN() {
    const stories = await axios.get(HN_URL_TOP_STORIES);
    const storyIds = stories.data;
    // only the first 30
    const tasks = storyIds.slice(0, 30).map(id => fetchStory(id));
    return Promise.all(tasks)
        .then(stories => {
            return stories.map(story => {
                const { title, url, score } = story;
                return {title, url, score};
            });
        });
}
```

Prompt "interpret_the_group":

```prompt (interpret_the_group)
Based on the following list of HackerNews threads,
filter this list to only launches of new AI projects: {{fetched_articles}}
```

Prompt "format_and_rank":

```prompt (format_and_rank)
Format this list of new AI projects in markdown, ranking the most
interesting projects from most interesting to least.
{{interpret_the_group}}
```

Using a Python cell as our entrypoint, demonstrating inter-language execution:

```python
articles = await fetchHN()
format_and_rank(articles=articles)
```
At its core, Chidori brings a reactive runtime that orchestrates interactions between different agents and their components. Chidori accepts arbitrary Python or JavaScript code, taking over the brokering and execution of that code to allow for interruptions and reactivity. This gives you the benefits of these runtime behaviors while leveraging the patterns you're already familiar with.
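To illustrate the idea of reactive execution (this is a toy sketch in plain Python, not the Chidori API): cells declare which values they depend on, and when an input changes, every downstream cell is re-evaluated automatically.

```python
# Toy reactive cell graph: illustrative only, not Chidori internals.
class ReactiveGraph:
    def __init__(self):
        self.values = {}       # name -> current value
        self.cells = {}        # name -> (fn, input names)
        self.dependents = {}   # name -> cells that read it

    def cell(self, name, inputs):
        """Register a cell and subscribe it to its inputs."""
        def register(fn):
            self.cells[name] = (fn, inputs)
            for dep in inputs:
                self.dependents.setdefault(dep, set()).add(name)
            return fn
        return register

    def set_value(self, name, value):
        self.values[name] = value
        self._propagate(name)

    def _propagate(self, changed):
        # Re-run every cell that depends on the changed value.
        for cell_name in sorted(self.dependents.get(changed, ())):
            fn, inputs = self.cells[cell_name]
            if all(i in self.values for i in inputs):
                self.values[cell_name] = fn(*(self.values[i] for i in inputs))
                self._propagate(cell_name)

g = ReactiveGraph()

@g.cell("doubled", inputs=["number"])
def doubled(n):
    return n * 2

@g.cell("report", inputs=["doubled"])
def report(d):
    return f"result: {d}"

g.set_value("number", 3)   # re-runs "doubled", then "report"
print(g.values["report"])  # result: 6
```

Chidori's runtime applies this subscription model across Python, JavaScript, and prompt cells rather than within a single process.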
Chidori ensures comprehensive monitoring and observability of your agents. We record all the inputs and outputs emitted by functions throughout the execution of your agent, enabling us to explain precisely what led to what, enhancing your debugging experience and understanding of the system’s production behavior.
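The underlying idea can be sketched in a few lines of plain Python (again, illustrative only, not Chidori's implementation): wrap each function so its inputs and outputs are appended to a trace, which can later explain how the system reached a given state.

```python
import functools

# Each entry: (function name, positional args, keyword args, result)
TRACE = []

def traced(fn):
    """Record every call's inputs and outputs for later inspection."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        result = fn(*args, **kwargs)
        TRACE.append((fn.__name__, args, kwargs, result))
        return result
    return wrapper

@traced
def score_story(title, score):
    # Hypothetical example function for the sketch
    return score if "AI" in title else 0

score_story("Launch: AI gadget", 120)
score_story("Ordinary news", 80)
# TRACE now shows exactly which inputs produced which outputs
```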
With Chidori, you can take snapshots of your system and explore different possible outcomes from that point (branching), or rewind the system to a previous state (time-travel). This functionality improves error handling, debugging, and system robustness by offering alternative pathways and do-overs.
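A minimal sketch of the snapshot/rewind idea in plain Python (not the Chidori API): deep-copy the state at a checkpoint, continue down one branch, then restore the checkpoint and try another.

```python
import copy

class Snapshots:
    """Checkpoint mutable state so execution can branch or rewind."""
    def __init__(self, state):
        self.state = state
        self.history = []

    def snapshot(self):
        self.history.append(copy.deepcopy(self.state))
        return len(self.history) - 1  # snapshot id

    def rewind(self, snapshot_id):
        # Deep-copy again so later mutations can't corrupt the checkpoint
        self.state = copy.deepcopy(self.history[snapshot_id])

run = Snapshots({"step": 0, "results": []})
run.state["step"] = 1
run.state["results"].append("a")
sid = run.snapshot()

# Branch A: continue forward from the checkpoint
run.state["results"].append("b")

# Time-travel: rewind to the checkpoint and try branch B instead
run.rewind(sid)
run.state["results"].append("c")
print(run.state["results"])  # ['a', 'c']
```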
Chidori comes with first-class support for code interpretation for both Python and JavaScript. You can execute code directly within your system, providing quick startup, ease of use, and secure execution. We're continually working on additional safeguards against running untrusted code, with containerized environment support coming soon.
With our execution graph, preservation of state, and tools for debugging, Chidori is an exceptional environment for generating code during the evaluation of your agent. You can use this to leverage LLMs to achieve more generalized behavior and to evolve your agents over time.
- Reactive subscriptions between nodes
- Branching and time travel debugging, reverting execution of a graph
- Node.js, Python, and Rust support for building and executing graphs
- Simple local vector db for development
- Adding support for containerized nodes
- Analysis tools for comparing executions
- Adding support for more vector databases
- Adding support for other LLM sources
- Adding support for more code interpreter environments
- Agent re-evaluation with feedback
- Definitive patterns for human in the loop agents
This is an early open-source release and we're looking for collaborators from the community. A good place to start would be to join our Discord!
Our framework is inspired by the work of many others, including:
- Temporal.io - providing reliability and durability to workflows
- Eve - developing patterns for building reactive systems and reducing accidental complexity
- Timely Dataflow - efficiently streaming changes
- Langchain - developing tools and patterns for building with LLMs
Chidori is under the MIT license. See the LICENSE for more information.
Please star the GitHub repo and join our Discord!