We currently ship transformers.js + onnxruntime + all-MiniLM-L6-v2:
- It is never used unless the user explicitly configures it, because we default to nomic-embed-text
- We expect it to be both slower and lower quality than Ollama + nomic-embed-text
- It accounts for a significant part of our bundle size (roughly 50 MB)
We originally needed this stack because the preindexed docs were bound to all-MiniLM-L6-v2, but that is no longer the case.