$ OLLAMA_ORIGINS=https://webml-demo.vercel.app OLLAMA_HOST=127.0.0.1:11435 ollama serve
$ OLLAMA_HOST=127.0.0.1:11435 ollama pull mistral
"Xenova/all-MiniLM-L6-v2". For higher-quality embeddings on machines that can handle it, switch to
nomic-ai/nomic-embed-text-v1in
app/worker.ts.
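As a rough sketch of what that swap might look like, assuming the worker wires up embeddings through LangChain.js's `HuggingFaceTransformersEmbeddings` wrapper (the exact structure of `app/worker.ts` may differ):

```typescript
// Hypothetical embeddings setup in app/worker.ts; the option names below
// follow @langchain/community's Transformers.js embeddings wrapper.
import { HuggingFaceTransformersEmbeddings } from "@langchain/community/embeddings/hf_transformers";

const embeddings = new HuggingFaceTransformersEmbeddings({
  // Default, lightweight model that runs comfortably in the browser:
  model: "Xenova/all-MiniLM-L6-v2",
  // Higher-quality alternative for machines that can handle it:
  // model: "nomic-ai/nomic-embed-text-v1",
});
```

Since the model is downloaded and run client-side via Transformers.js, the larger model trades a bigger initial download and more memory for better retrieval quality.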