# oterm

The text-based terminal client for Ollama.
- intuitive and simple terminal UI; no need to run servers or frontends, just type `oterm` in your terminal
- multiple persistent chat sessions, stored together with the context embeddings and system prompt customizations in SQLite
- can use any of the models you have pulled in Ollama, or your own custom models
- allows for easy customization of the model's system prompt and parameters
Using `brew` for MacOS:

```sh
brew tap ggozad/formulas
brew install ggozad/formulas/oterm
```
Using `pip`:

```sh
pip install oterm
```
In order to use `oterm` you will need to have the Ollama server running. By default it expects to find the Ollama API running on `http://0.0.0.0:11434/api`. If you are running Ollama inside docker or on a different host/port, use the `OLLAMA_HOST` environment variable to customize the host/port. Alternatively you can use `OLLAMA_URL` to specify the full http(s) URL. Setting `OTERM_VERIFY_SSL` to `False` will disable SSL verification.

```sh
OLLAMA_URL=http://host:port/api
```
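As a sketch, pointing `oterm` at a remote or dockerized Ollama instance could look like the following (the host `gpu-box` and port `11435` are placeholders for your own setup):

```shell
# Placeholder host/port: substitute your own Ollama instance.
export OLLAMA_HOST=gpu-box:11435            # host:port form
export OLLAMA_URL=http://gpu-box:11435/api  # or the full URL form
echo "$OLLAMA_URL"
```

With either variable exported, start `oterm` as usual in the same shell.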
The following keyboard shortcuts are supported:

- `Ctrl+N` - create a new chat session
- `Ctrl+E` - edit the chat session (change template, system prompt or format)
- `Ctrl+R` - rename the current chat session
- `Ctrl+S` - export the current chat session as markdown
- `Ctrl+X` - delete the current chat session
- `Ctrl+T` - toggle between dark/light theme
- `Ctrl+Q` - quit
- `Ctrl+L` - switch to multiline input mode
- `Ctrl+P` - select an image to include with the next message
- `↑` - navigate through the history of previous prompts
While Ollama is inferring the next message, you can press `Esc` to cancel the inference.
Note that some of the shortcuts may not work in certain contexts, for example pressing `↑` while the prompt is in multi-line mode.
When creating a new chat, you may not only select the model, but also customize the `template` as well as the `system` instruction to pass to the model. Checking the `JSON output` checkbox will cause the model to reply in JSON format. Please note that `oterm` will not (yet) pull models for you; use `ollama` to do that. All the models you have pulled or created will be available to `oterm`.
You can also "edit" the chat to change the template, system prompt or format. Note that the model cannot be changed once the chat has started. In addition, whatever "context" the chat had (an embedding of the previous messages) will be kept.
All your chat sessions are stored locally in a SQLite database. You can customize the directory where the database is stored by setting the `OTERM_DATA_DIR` environment variable. You can find the location of the database by running `oterm --db`.
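For example, you could keep the database under a dedicated directory like this (the path below is just an illustrative choice; any writable directory works):

```shell
# Placeholder path: pick any writable directory for the chat database.
export OTERM_DATA_DIR="$HOME/.config/oterm"
echo "$OTERM_DATA_DIR"
```

Running `oterm --db` afterwards shows where the database file actually ends up.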
This project is licensed under the MIT License.