Cellm is an Excel extension that lets you use Large Language Models (LLMs) like ChatGPT in cell formulas.
Cellm's `=PROMPT()` function outputs AI responses to a range of text, similar to how Excel's `=SUM()` function outputs the sum of a range of numbers.

For example, you can write `=PROMPT(A1, "Extract all person names mentioned in the text.")` in a cell's formula and drag the cell to apply the prompt to many rows. Cellm is useful when you want to use AI for repetitive tasks that would normally require copy-pasting data in and out of a chat window many times.
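For instance, with the source text in column A, the filled-down prompt column might look like this (cell addresses are illustrative):

```
B2: =PROMPT(A2, "Extract all person names mentioned in the text.")
B3: =PROMPT(A3, "Extract all person names mentioned in the text.")
B4: =PROMPT(A4, "Extract all person names mentioned in the text.")
```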
This extension does one thing and does it well:
- Calls LLMs in formulas and returns short answers suitable for cells
- Supports models from Anthropic, Mistral, OpenAI, and Google as well as locally hosted models via Llamafiles, Ollama, or vLLM
Say you're reviewing medical studies and need to quickly identify papers relevant to your research. Here's how Cellm can help:
In this example, we copy the papers' titles and abstracts into Excel and write this prompt: "If the paper studies diabetic neuropathy and stroke, return 'Include', otherwise return 'Exclude'." We then use autofill to apply the prompt to many papers. Simple and powerful.
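Assuming the titles sit in column A and the abstracts in column B (the actual demo may lay the cells out differently), the formula for the first paper could look like this:

```
C2: =PROMPT(A2:B2, "If the paper studies diabetic neuropathy and stroke, return 'Include', otherwise return 'Exclude'.")
```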
Green cells denote correct classifications and red cells denote incorrect classifications. The models will make mistakes at times and it is your responsibility to validate that a model is accurate enough for your use case.
- Windows 10 or higher
- .NET 9.0 Runtime
- Excel 2010 or higher (desktop app)
- Go to the Release page and download `Cellm-AddIn64-packed.xll` and `appsettings.json`. Put them in the same folder.
- Double-click on `Cellm-AddIn64-packed.xll` and click on "Enable this add-in for this session only" when Excel opens.
- Download and install Ollama. Cellm uses Ollama and the Gemma 2 2B model by default (an optional pre-download command is shown below).
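If you want to fetch the default model ahead of time, you can pull it from a terminal. This assumes the default corresponds to Ollama's `gemma2:2b` tag; Cellm may also download the model automatically on first use:

```
ollama pull gemma2:2b
```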
For permanent installation and more options, see our Installation Guide.
Select a cell and type `=PROMPT("What model are you and who made you?")`. The default model will tell you that it's called "Gemma" and made by Google DeepMind.

You can also use cell references. For example, copy a news article into cell A1 and type in cell B1: `=PROMPT(A1, "Extract all person names mentioned in the text")`.
For more advanced usage, including function calling and configuration, see our Documentation.
Cellm supports:
- Hosted models from Anthropic, OpenAI, Mistral, and others
- Local models via Ollama, Llamafiles, or vLLM
For detailed information about configuring different models, see our documentation on Local Models and Hosted Models.
Cellm is useful for repetitive tasks on both structured and unstructured data:
- Text classification - Categorize survey responses, support tickets, etc.
- Model comparison - Compare results from different LLMs side by side
- Data cleaning - Standardize names, fix formatting issues
- Content summarization - Condense articles, papers, or reports
- Entity extraction - Pull out names, locations, dates from text
For more use cases and examples, see our Prompting Guide.
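As a concrete illustration of the data cleaning and entity extraction use cases above, the formulas might look like this (prompts and cell references are only examples):

```
=PROMPT(A2, "Standardize this company name, e.g. 'acme CORP.' becomes 'Acme Corp'.")
=PROMPT(B2, "List all dates mentioned in the text in YYYY-MM-DD format.")
```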
For build instructions with Visual Studio or command line, see our Development Guide.
A friend was writing a systematic review paper and had to compare 7,500 papers against inclusion/exclusion criteria. We thought this was a great use case for LLMs but quickly realized that individually copying papers in and out of chat windows was a total pain. This sparked the idea to make an AI tool to automate repetitive tasks for people who would rather avoid programming.
Cellm enables everyone to automate repetitive tasks with AI to a level that was previously available only to programmers.
Fair Core License, Version 1.0, Apache 2.0 Future License