πŸ¦™ Fully Client-Side Chat Over Documents πŸ¦™

  • 🏑Yes, it's another LLM-powered chat over documents implementation... but this one is entirely local!
  • βš™οΈThe default LLM is Mistral-7B run locally by Ollama. You'll need to install the Ollama desktop app and run the following commands to give this site access to the locally running model:
    $ OLLAMA_ORIGINS=https://webml-demo.vercel.app OLLAMA_HOST=127.0.0.1:11435 ollama serve

    Then, in another window:
    $ OLLAMA_HOST=127.0.0.1:11435 ollama pull mistral
  • πŸ—ΊοΈThe default embeddings are
    "Xenova/all-MiniLM-L6-v2"
    . For higher-quality embeddings on machines that can handle it, switch to
    nomic-ai/nomic-embed-text-v1
    in
    app/worker.ts
    .
  • πŸ™This template is open source - you can see the source code and deploy your own version from the GitHub repo!
  • πŸ‘‡Try embedding a PDF below, then asking questions! You can even turn off your WiFi.
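
Here is a minimal TypeScript sketch of how a browser app can talk to the Ollama server started with the commands above, using LangChain.js's ChatOllama client. It is an illustration only; the import path, option names, and prompt are assumptions, not the actual contents of this project's code.

    // Minimal sketch (assumption: LangChain.js's ChatOllama client at this import path;
    // the real app/worker.ts may differ). Connects to the Ollama server started above.
    import { ChatOllama } from "@langchain/community/chat_models/ollama";

    const model = new ChatOllama({
      baseUrl: "http://127.0.0.1:11435", // matches OLLAMA_HOST used when serving
      model: "mistral",                  // the model pulled above
    });

    // Ask the locally running model a question.
    const response = await model.invoke("Summarize the uploaded document in one sentence.");
    console.log(response.content);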
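
And here is a hedged sketch of the embeddings swap mentioned above. It assumes LangChain.js's HuggingFaceTransformersEmbeddings wrapper, which runs Transformers.js models in the browser; the real code in app/worker.ts may be structured differently.

    // Sketch only: swapping the embedding model. The import path and option names
    // are assumptions; check app/worker.ts for what the project actually uses.
    import { HuggingFaceTransformersEmbeddings } from "@langchain/community/embeddings/hf_transformers";

    // Default: small and fast, works on most machines.
    const embeddings = new HuggingFaceTransformersEmbeddings({
      modelName: "Xenova/all-MiniLM-L6-v2",
    });

    // Higher quality, but a larger download and more compute:
    // const embeddings = new HuggingFaceTransformersEmbeddings({
    //   modelName: "nomic-ai/nomic-embed-text-v1",
    // });

    const vectors = await embeddings.embedDocuments(["Hello, offline world!"]);
    console.log(vectors[0].length); // embedding dimensionality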