Deep Search CLI

An AI-powered research assistant for your terminal.
Deep Search is a command-line tool that uses local large language models (LLMs) to provide in-depth answers to complex questions. It breaks down your query, scours the web for relevant information, and synthesizes a comprehensive response, all within your terminal.
Features
- AI-Powered Research: Leverages local LLMs (via Ollama) to understand and research your questions.
- Step-by-Step Process: Decomposes questions, searches multiple sources (Wikipedia, DuckDuckGo), filters for relevance, and summarizes findings.
- Local First: Works with your own Ollama-hosted models, keeping your data private (see the example call after this list).
- Minimalist CLI: A clean, focused interface for your research tasks.
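Because every step runs against a local Ollama instance, questions and page content never leave your machine. As a rough illustration of what a single model call looks like against Ollama's HTTP API (`POST /api/generate` on `localhost:11434`), here is a minimal sketch; the `reqwest` and `serde_json` crates and the `llama3` model name are assumptions for this example, not necessarily what deepsearch itself uses.

```rust
// Minimal sketch of one call to a locally running Ollama instance.
// Assumes reqwest (with the "blocking" and "json" features) and serde_json
// as dependencies; deepsearch's own client code may be structured differently.
use serde_json::json;

fn ask_ollama(model: &str, prompt: &str) -> Result<String, Box<dyn std::error::Error>> {
    let body = json!({
        "model": model,
        "prompt": prompt,
        "stream": false, // request a single JSON response instead of a stream
    });
    let response: serde_json::Value = reqwest::blocking::Client::new()
        .post("http://localhost:11434/api/generate")
        .json(&body)
        .send()?
        .json()?;
    // Ollama returns the generated text in the "response" field.
    Ok(response["response"].as_str().unwrap_or_default().to_string())
}

fn main() -> Result<(), Box<dyn std::error::Error>> {
    let answer = ask_ollama("llama3", "In one sentence, what is photosynthesis?")?;
    println!("{answer}");
    Ok(())
}
```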
How It Works
The tool follows a structured research workflow, sketched in code after this list:
- Decompose: The initial question is broken down into smaller, specific sub-questions.
- Search: Each sub-question is researched using Wikipedia or DuckDuckGo.
- Filter: The search results are filtered to identify the most relevant sources.
- Summarize: The content of each relevant page is summarized.
- Evaluate: The summaries are used to construct a final answer. If the answer is incomplete, the process can be iterated with new sub-questions.
- Answer: A final, synthesized answer is presented to the user.
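Put together, the loop looks roughly like the Rust sketch below. The function and type names are purely illustrative stand-ins for the LLM- and web-backed steps; they are not deepsearch's actual API.

```rust
// Illustrative sketch of the research loop; none of these names are the crate's real API.

struct Source {
    title: String,
    content: String,
}

// Stubs standing in for the LLM- and web-backed steps described above.
fn decompose(question: &str) -> Vec<String> {
    vec![format!("What background is needed to answer: {question}?")]
}
fn search(sub_question: &str) -> Vec<Source> {
    vec![Source { title: sub_question.to_string(), content: String::new() }]
}
fn filter_relevant(results: Vec<Source>) -> Vec<Source> {
    results // keep only the most relevant sources (no-op in this stub)
}
fn summarize(source: &Source) -> String {
    format!("{}: {}", source.title, source.content)
}
fn evaluate(question: &str, summaries: &[String]) -> Option<String> {
    // Return Some(answer) once the summaries cover the question well enough.
    Some(format!("Answer to {question:?} built from {} summaries", summaries.len()))
}

fn research(question: &str, max_iterations: usize) -> String {
    let mut summaries: Vec<String> = Vec::new();
    for _ in 0..max_iterations {
        // Decompose -> Search -> Filter -> Summarize
        for sub_question in decompose(question) {
            for source in filter_relevant(search(&sub_question)) {
                summaries.push(summarize(&source));
            }
        }
        // Evaluate: stop iterating as soon as a complete answer can be built.
        if let Some(answer) = evaluate(question, &summaries) {
            return answer;
        }
    }
    // Answer: fall back to whatever was gathered within the iteration budget.
    format!("Partial answer assembled from {} summaries", summaries.len())
}

fn main() {
    println!("{}", research("How does photosynthesis work?", 3));
}
```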
Installation
From Crates.io
The package is published on crates.io, so you can install it directly using cargo:
cargo install deepsearch
This will install the deepsearch binary in your cargo bin directory, allowing you to run it from anywhere in your terminal.
From Source
- Install Rust: If you don't have Rust, install it from rust-lang.org.
- Install Ollama: You need a running Ollama instance. See the Ollama website for installation instructions.
- Clone the repository:
  git clone https://github.com/LightInn/deepsearch.git
  cd deepsearch
- Build the project:
  For a development build, run:
  cargo build
  The executable will be at ./target/debug/deepsearch.
  For a release (production) build, run:
  cargo build --release
  The executable will be at ./target/release/deepsearch.
Usage
Once built, you can run the tool from the command line.
From Release Build
./target/release/deepsearch "Your research question"
With Cargo
For development, you can run the tool directly with cargo:
cargo run -- "Your research question"
Parameters
You can customize the behavior of the tool with the following parameters:
- --max-iterations or -i: Set the maximum number of research iterations.
- --model or -m: Specify the Ollama model to use.
- --verbose or -v: Enable verbose output for debugging purposes.
Example:
./target/release/deepsearch "How does photosynthesis work?" -i 5 -m "llama3"
This will start a research task on "How does photosynthesis work?", with a maximum of 5 iterations, using the llama3 model.
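For contributors, these flags most likely map onto a declarative argument parser. The snippet below is an illustrative guess using clap's derive API; the `clap` dependency (with its `derive` feature) and the default values shown are assumptions, not the crate's confirmed setup.

```rust
// Illustrative clap-based declaration of the CLI described above; requires the
// clap crate with its "derive" feature. The defaults here are assumptions.
use clap::Parser;

#[derive(Parser, Debug)]
#[command(name = "deepsearch", about = "AI-powered research assistant for your terminal")]
struct Cli {
    /// The research question to investigate.
    question: String,

    /// Maximum number of research iterations.
    #[arg(short = 'i', long, default_value_t = 3)]
    max_iterations: usize,

    /// Ollama model to use.
    #[arg(short = 'm', long, default_value = "llama3")]
    model: String,

    /// Enable verbose output for debugging.
    #[arg(short = 'v', long)]
    verbose: bool,
}

fn main() {
    let cli = Cli::parse();
    if cli.verbose {
        eprintln!("model = {}, max iterations = {}", cli.model, cli.max_iterations);
    }
    println!("Researching: {}", cli.question);
}
```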
Contributing
Contributions are welcome! If you'd like to contribute, please feel free to submit a pull request or open an issue.
Prompt Engineering
A core part of this tool is the quality of the prompts used to interact with the LLM. If you have ideas for improving the prompts, you are encouraged to modify the src/prompts.rs file and submit a pull request. Better prompts lead to better research outcomes!
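As a purely hypothetical illustration of what a prompt in src/prompts.rs might look like (the real file may be organized quite differently), a prompt is typically just a string template with placeholders filled in before being sent to the model:

```rust
// Hypothetical example only -- the actual contents of src/prompts.rs may differ.
// A prompt is a plain string template whose placeholder is filled in at call time.
pub const DECOMPOSE_PROMPT: &str = "Break the following research question into \
two to four smaller, specific sub-questions, one per line.\n\nQuestion: {question}\n\nSub-questions:";

/// Fill the `{question}` placeholder before sending the prompt to the model.
pub fn decompose_prompt(question: &str) -> String {
    DECOMPOSE_PROMPT.replace("{question}", question)
}

fn main() {
    println!("{}", decompose_prompt("How does photosynthesis work?"));
}
```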
License
This project is licensed under the MIT License. See the LICENSE file for details.