🏠 Fully Client-Side Chat Over Documents 🏠

🦀 Voy + 🦙 Ollama + 🦜🔗 LangChain.js + 🤗 Transformers.js

  • 🏡Yes, it's another chat over documents implementation... but this one is entirely local! (A rough sketch of how the pieces fit together appears below this list.)
  • ⚙️The default LLM is Mistral, run locally by Ollama. You'll need to install the Ollama desktop app and run the following commands to give this site access to the locally running model (there's a minimal connection sketch just after this list):
    $ OLLAMA_ORIGINS=https://webml-demo.vercel.app OLLAMA_HOST=127.0.0.1:11435 ollama serve

    Then, in another terminal window:
    $ OLLAMA_HOST=127.0.0.1:11435 ollama pull mistral
  • 🐙This template is open source - you can see the source code and deploy your own version from the GitHub repo!
  • 👇Try embedding a PDF below, then asking questions! You can even turn off your WiFi.
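
If you'd like to double-check that the commands above worked, here's a minimal LangChain.js sketch of talking to that locally served model. It's a hedged example, not this site's actual code: the @langchain/community install step and import path are assumptions that may differ across LangChain.js versions.

    // Sketch: call the Ollama server started above from LangChain.js.
    // Assumes `npm install @langchain/community`; import paths vary by version.
    import { ChatOllama } from "@langchain/community/chat_models/ollama";

    const model = new ChatOllama({
      baseUrl: "http://127.0.0.1:11435", // the non-default port from the serve command
      model: "mistral",                  // the model pulled above
    });

    const response = await model.invoke("Say hi from a fully local model!");
    console.log(response.content);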
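
And here's a rough sketch of how the pieces can fit together in the browser: Transformers.js computes embeddings locally, Voy keeps them in a WASM vector index, and the chunks retrieved for a question are handed to the locally served model as context. Treat it as an illustration under assumed package entry points (@langchain/community, @langchain/core, voy-search, @xenova/transformers) and an assumed embedding model name, not the exact code this site runs.

    // Fully client-side retrieval path: embeddings via Transformers.js,
    // vector search via Voy (WASM), generation via the local Ollama model.
    import { HuggingFaceTransformersEmbeddings } from "@langchain/community/embeddings/hf_transformers";
    import { VoyVectorStore } from "@langchain/community/vectorstores/voy";
    import { Voy as VoyClient } from "voy-search";
    import { ChatOllama } from "@langchain/community/chat_models/ollama";
    import { Document } from "@langchain/core/documents";

    // The embedding model is downloaded once, then runs entirely in the browser.
    const embeddings = new HuggingFaceTransformersEmbeddings({
      modelName: "Xenova/all-MiniLM-L6-v2", // assumed small embedding model
    });

    // In-browser vector store backed by the Voy WASM index.
    const vectorStore = new VoyVectorStore(new VoyClient(), embeddings);

    // In the real app, these chunks would come from the PDF you embed.
    await vectorStore.addDocuments([
      new Document({ pageContent: "Voy is a WASM vector similarity search engine." }),
      new Document({ pageContent: "Ollama serves open models like Mistral locally." }),
    ]);

    const question = "What does Voy do?";
    const context = await vectorStore.similaritySearch(question, 1);

    // Hand the retrieved chunks to the locally served model as context.
    const model = new ChatOllama({ baseUrl: "http://127.0.0.1:11435", model: "mistral" });
    const answer = await model.invoke(
      `Answer using only this context:\n${context.map((d) => d.pageContent).join("\n")}\n\nQuestion: ${question}`,
    );
    console.log(answer.content);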