DistiLlama


What is DistiLlama?

DistiLlama is a Chrome extension that leverages a locally running LLM to summarize and chat with web pages and local documents, keeping all of your data and conversations private.

Overview

One of the things I was experimenting with is how to use a locally running LLM instance for various tasks, and summarization (tl;dr) was at the top of my list. It was key that all calls to the LLM stay local so that all data remains private.

This project utilizes Ollama as the locally running LLM instance. Ollama is a great project that is easy to set up and use. I highly recommend checking it out.

To generate the summary I am using the following approach (a rough sketch of the pipeline follows this list):

  • Grab the ID of the current active tab.
  • Use Readability to extract the text content from the page. In my experiments it was clear that the quality of the summary was much better when using Readability, as it removes a lot of unnecessary content from the page.
  • Use LangChain (LangChain.js) to summarize the text content.
  • Display the summary in a popup window.
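
The sketch below is illustrative only, not the extension's actual code: the function name summarizeActiveTab is made up, and it assumes it runs in an extension page (so DOMParser is available) with the tabs and scripting permissions granted.

    import { Readability } from '@mozilla/readability';
    import { ChatOllama } from 'langchain/chat_models/ollama';
    import { loadSummarizationChain } from 'langchain/chains';
    import { RecursiveCharacterTextSplitter } from 'langchain/text_splitter';

    async function summarizeActiveTab(): Promise<string> {
      // 1. Grab the current active tab.
      const [tab] = await chrome.tabs.query({ active: true, currentWindow: true });

      // 2. Pull the page HTML out of the tab and distill it with Readability.
      const [{ result: html }] = await chrome.scripting.executeScript({
        target: { tabId: tab.id! },
        func: () => document.documentElement.outerHTML,
      });
      const dom = new DOMParser().parseFromString(html as string, 'text/html');
      const article = new Readability(dom).parse();

      // 3. Chunk the text and summarize it with LangChain.js against Ollama.
      const splitter = new RecursiveCharacterTextSplitter({ chunkSize: 3000 });
      const docs = await splitter.createDocuments([article?.textContent ?? '']);
      const llm = new ChatOllama({ baseUrl: 'http://localhost:11435', model: 'mistral' });
      const chain = loadSummarizationChain(llm, { type: 'map_reduce' });
      const { text } = await chain.call({ input_documents: docs });

      // 4. The caller renders this string in the popup window.
      return text;
    }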

How to use DistiLlama?

  • Prerequisites: have Ollama installed and running locally, with a model pulled (for example ollama pull mistral), since the extension sends every request to it.

  • Clone this repo

    • Install pnpm: npm install -g pnpm
    • Run pnpm install
    • Run pnpm dev
    • Open Chrome and navigate to chrome://extensions/
      • Enable developer mode (if not already enabled)
      • Click Load unpacked and select the dist folder at the base of the cloned project.
      • You should see DistiLlama added to your Chrome extensions.
      • You may want to pin the extension to your Chrome toolbar for easy access.
  • If you decide to use a different LLM or endpoint, change these lines in src/pages/sidePanel/Summarize.ts:

    const llm = new ChatOllama({
      baseUrl: 'http://localhost:11435', // change if you are using a different endpoint
      temperature: 0.3, // change if you want to experiment with different temperatures
      model: 'mistral', // change if you want to use a different model
    });
  • If you would like to tweak the summarization chain, change these lines in src/pages/sidePanel/Summarize.ts:

    const chain = loadSummarizationChain(llm, {
      type: 'map_reduce', // choose from map_reduce, stuff, or refine
      verbose: true, // to view the steps in the console
    });

Demo

Chat with LLM

[demo GIF]

Chat with Documents (PDF)

[demo GIF]

Chat with Web Page

[demo GIF]

Summarization

[demo GIF]

TODOs

  • Make the summarization chain configurable
  • Make the LLM model configurable
  • Save summaries in local storage
  • Improve the UI (not an expert in this area but will try to learn)
  • Add TTS support
  • Check out performance with different tuned prompts
  • Extend to chat with the page (use embeddings and LLMs for RAG)
  • Use transformers.js for local in-browser embeddings and Voy for storage, similar to Building LLM-Powered Web Apps with Client-Side Technology (a rough sketch follows this list)
  • Focus on improving the quality of the summarization and chat
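
That last pair of items could plausibly build on LangChain.js, which wraps transformers.js as HuggingFaceTransformersEmbeddings and ships a VoyVectorStore. The sketch below is speculative and not part of the extension; the function name and the embedding model are illustrative only.

    import { Voy as VoyClient } from 'voy-search';
    import { HuggingFaceTransformersEmbeddings } from 'langchain/embeddings/hf_transformers';
    import { VoyVectorStore } from 'langchain/vectorstores/voy';
    import type { Document } from 'langchain/document';

    // Everything here runs in the browser: transformers.js computes embeddings
    // in WASM and Voy keeps the vector index in memory, so no data leaves the machine.
    async function searchPageChunks(docs: Document[], query: string) {
      const embeddings = new HuggingFaceTransformersEmbeddings({
        modelName: 'Xenova/all-MiniLM-L6-v2', // example model, not a project choice
      });
      const store = new VoyVectorStore(new VoyClient(), embeddings);
      await store.addDocuments(docs); // e.g. chunks produced by the splitter above
      return store.similaritySearch(query, 4); // top-4 chunks to feed the RAG prompt
    }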

