av/harbor

Effortlessly run LLM backends, APIs, frontends, and services with one command.

Harbor is a containerized LLM toolkit for running LLMs and additional AI services. It consists of a CLI and a companion App that let you manage and run these services with ease.

Screenshot of Harbor CLI and App together

Documentation

What can Harbor do?

Diagram outlining Harbor's service structure

✦ Local LLMs

Run LLMs and related services locally, with no or minimal configuration, typically in a single command or click.

# Starts fully configured
# Open WebUI and Ollama
harbor up

# All backends are pre-connected to Open WebUI
harbor up vllm

Cutting Edge Inference

Harbor supports most of the major inference engines as well as a few of the lesser-known ones.

# We sincerely hope you'll never try to run all of them at once
harbor up vllm llamacpp tgi litellm tabbyapi aphrodite sglang ktransformers mistralrs airllm

Tool Use

Enjoy the benefits of the MCP ecosystem and extend it to your use cases.

# Manage MCPs with a convenient Web UI
harbor up metamcp

# Connect MCPs to Open WebUI
harbor up metamcp mcpo

Generate Images

Harbor includes ComfyUI + Flux + Open WebUI integration.

# Use FLUX in Open WebUI in one command
harbor up comfyui

Local Web RAG / Deep Research

Harbor includes SearXNG, pre-connected to many services out of the box: Perplexica, ChatUI, Morphic, Local Deep Research and more.

# SearXNG is pre-connected to Open WebUI
harbor up searxng

# And to many other services
harbor up searxng chatui
harbor up searxng morphic
harbor up searxng perplexica
harbor up searxng ldr

LLM Workflows

Harbor includes multiple services for building LLM-based data and chat workflows: Dify, LitLytics, n8n, Open WebUI Pipelines, Flowise, LangFlow

# Use Dify in Open WebUI
harbor up dify

Talk to your LLM

Set up voice chats with your LLM in a single command: Open WebUI + Speaches

# Speaches includes OpenAI-compatible STT and TTS
# and is connected to Open WebUI out of the box
harbor up speaches

Chat from the phone

You can access Harbor services from your phone with a QR code. Easily get links for local, LAN or Docker access.

# Print a QR code to open the service on your phone
harbor qr
# Print a link to open the service on your phone
harbor url webui

Chat from anywhere

Harbor includes a built-in tunneling service to expose your Harbor to the internet.

# Expose your Harbor to the internet
harbor tunnel

LLM Scripting

Harbor Boost allows you to easily script workflows and interactions with downstream LLMs.

# Use Harbor Boost to script LLM workflows
harbor up boost
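Once running, Boost behaves like any OpenAI-compatible endpoint, so you can talk to it directly. A minimal sketch (this assumes `harbor url` resolves Boost's URL the same way it does for other services, like `harbor url webui`; the endpoint path follows the standard OpenAI API convention):

```shell
# Resolve Boost's local URL (assumption: works like `harbor url webui`)
BOOST_URL="$(harbor url boost)"

# List the models served through Boost, OpenAI API style
curl "$BOOST_URL/v1/models"
```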

Config Profiles

Save and manage configuration profiles for different scenarios. For example - save llama.cpp args for different models and contexts and switch between them easily.

# Save and use config profiles
harbor profile save llama4
harbor profile use default

Command History

Harbor keeps a local-only history of recent commands. Look them up and re-run them easily, independently of your system shell history.

# Lookup recently used harbor commands
harbor history

Eject

Ready to move to your own setup? Harbor will give you a docker-compose file replicating your setup.

# Eject from Harbor into a standalone Docker Compose setup
# Will export related services and variables into a standalone file.
harbor eject searxng llamacpp > docker-compose.harbor.yml
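The ejected file is a regular Compose spec, so the usual Docker Compose workflow applies from there. For example:

```shell
# Start the ejected stack without Harbor
docker compose -f docker-compose.harbor.yml up -d

# ...and tear it down again when done
docker compose -f docker-compose.harbor.yml down
```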

Services

UIs

Open WebUI ⦁︎ ComfyUI ⦁︎ LibreChat ⦁︎ HuggingFace ChatUI ⦁︎ Lobe Chat ⦁︎ Hollama ⦁︎ parllama ⦁︎ BionicGPT ⦁︎ AnythingLLM ⦁︎ Chat Nio ⦁︎ mikupad ⦁︎ oterm

Backends

Ollama ⦁︎ llama.cpp ⦁︎ vLLM ⦁︎ TabbyAPI ⦁︎ Aphrodite Engine ⦁︎ mistral.rs ⦁︎ openedai-speech ⦁︎ Speaches ⦁︎ Parler ⦁︎ text-generation-inference ⦁︎ LMDeploy ⦁︎ AirLLM ⦁︎ SGLang ⦁︎ KTransformers ⦁︎ Nexa SDK ⦁︎ KoboldCpp

Satellites

Harbor Bench ⦁︎ Harbor Boost ⦁︎ SearXNG ⦁︎ Perplexica ⦁︎ Dify ⦁︎ Plandex ⦁︎ LiteLLM ⦁︎ LangFuse ⦁︎ Open Interpreter ⦁︎ cloudflared ⦁︎ cmdh ⦁︎ fabric ⦁︎ txtai RAG ⦁︎ TextGrad ⦁︎ Aider ⦁︎ aichat ⦁︎ omnichain ⦁︎ lm-evaluation-harness ⦁︎ JupyterLab ⦁︎ ol1 ⦁︎ OpenHands ⦁︎ LitLytics ⦁︎ Repopack ⦁︎ n8n ⦁︎ Bolt.new ⦁︎ Open WebUI Pipelines ⦁︎ Qdrant ⦁︎ K6 ⦁︎ Promptfoo ⦁︎ Webtop ⦁︎ OmniParser ⦁︎ Flowise ⦁︎ Langflow ⦁︎ OptiLLM ⦁︎ Morphic ⦁︎ SQL Chat ⦁︎ gptme ⦁︎ traefik ⦁︎ Latent Scope ⦁︎ RAGLite ⦁︎ llama-swap ⦁︎ LibreTranslate ⦁︎ MetaMCP ⦁︎ mcpo

See services documentation for a brief overview of each.

CLI Tour

# Run Harbor with default services:
# Open WebUI and Ollama
harbor up

# Run Harbor with additional services
# Running SearXNG automatically enables Web RAG in Open WebUI
harbor up searxng

# Speaches includes OpenAI-compatible STT and TTS
# and is connected to Open WebUI out of the box
harbor up speaches

# Run additional/alternative LLM Inference backends
# Open WebUI is automatically connected to them.
harbor up llamacpp tgi litellm vllm tabbyapi aphrodite sglang ktransformers

# Run different Frontends
harbor up librechat chatui bionicgpt hollama

# Get a free quality boost with
# built-in optimizing proxy
harbor up boost

# Use FLUX in Open WebUI in one command
harbor up comfyui

# Use custom models for supported backends
harbor llamacpp model https://huggingface.co/user/repo/model.gguf

# Access service CLIs without installing them
# Caches are shared between services where possible
harbor hf scan-cache
harbor hf download google/gemma-2-2b-it
harbor ollama list

# Shortcut to HF Hub to find the models
harbor hf find gguf gemma-2
# Use HFDownloader and official HF CLI to download models
harbor hf dl -m google/gemma-2-2b-it -c 10 -s ./hf
harbor hf download google/gemma-2-2b-it

# Where possible, cache is shared between the services
harbor tgi model google/gemma-2-2b-it
harbor vllm model google/gemma-2-2b-it
harbor aphrodite model google/gemma-2-2b-it
harbor tabbyapi model google/gemma-2-2b-it-exl2
harbor mistralrs model google/gemma-2-2b-it
harbor opint model google/gemma-2-2b-it
harbor sglang model google/gemma-2-2b-it

# Convenience tools for docker setup
harbor logs llamacpp
harbor exec llamacpp ./scripts/llama-bench --help
harbor shell vllm

# Tell your shell exactly what you think about it
harbor opint
harbor aider
harbor aichat
harbor cmdh

# Use fabric to LLM-ify your linux pipes
cat ./file.md | harbor fabric --pattern extract_extraordinary_claims | grep "LK99"

# Open services from the CLI
harbor open webui
harbor open llamacpp
# Print yourself a QR to quickly open the
# service on your phone
harbor qr
# Feeling adventurous? Expose your Harbor
# to the internet
harbor tunnel

# Config management
harbor config list
harbor config set webui.host.port 8080

# Create and manage config profiles
harbor profile save l370b
harbor profile use default

# Lookup recently used harbor commands
harbor history

# Eject from Harbor into a standalone Docker Compose setup
# Will export related services and variables into a standalone file.
harbor eject searxng llamacpp > docker-compose.harbor.yml

# Run a built-in LLM benchmark with
# your own tasks
harbor bench run

# Gimmick/Fun Area

# Argument scrambling, below commands are all the same as above
# Harbor doesn't care if it's "vllm model" or "model vllm", it'll
# figure it out.
harbor model vllm
harbor vllm model

harbor config get webui.name
harbor get config webui_name

harbor tabbyapi shell
harbor shell tabbyapi

# 50% gimmick, 50% useful
# Ask harbor about itself
harbor how to ping ollama container from the webui?

Harbor App Demo

2024-09-29.17-22-06.mp4

In the demo, the Harbor App is used to launch the default stack with the Ollama and Open WebUI services. Later, SearXNG is also started, and WebUI can use it for Web RAG right out of the box. After that, Harbor Boost is started and connected to the WebUI automatically to induce more creative outputs. As a final step, the Harbor config is adjusted in the App for the klmbr module in Harbor Boost, which makes the output unparsable for the LLM (yet still understandable for humans).

Why?

  • If you're comfortable with Docker and Linux administration, you likely don't need Harbor to manage your local LLM environment. However, as that environment grows, you're likely to eventually arrive at a similar solution. I know this for a fact, since that's exactly how Harbor came to be.
  • Harbor is not designed as a deployment solution, but rather as a helper for the local LLM development environment. It's a good starting point for experimenting with LLMs and related services.
  • Workflow/setup centralisation - you always know where to find a specific service's configs, logs, and data files.
  • Convenience factor - single CLI with a lot of services and features, accessible from anywhere on your host.

Supporters

@av's wife @burnth3heretic