sentence-transformers as an optional dependency (lose: sem_join, sem_topk)#289
andrewjradcliffe wants to merge 1 commit into mitdbg:main
Conversation
Motivation: there are scenarios in which the `torch` dependency implied by `sentence-transformers` can prevent the inclusion of this package in an application (binary size). With this modification, `sem_join` and `sem_topk` will simply throw an exception, but the rest of the semantic operators are unaffected. Not ideal, but it has the intended effect.
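A minimal sketch of the optional-dependency pattern this PR describes: the heavy package is imported lazily, and the operators that need it raise a clear error when it is absent. The helper name `require_optional` and the extra name are illustrative, not the actual Palimpzest internals.

```python
import importlib


def require_optional(module_name: str, extra: str):
    """Import an optional dependency, raising a helpful error if missing.

    Operators like sem_join / sem_topk would call this at the point of
    use instead of importing sentence_transformers at module load time.
    """
    try:
        return importlib.import_module(module_name)
    except ImportError as exc:
        raise ImportError(
            f"This operator requires the optional dependency "
            f"'{module_name}'; install it via the '{extra}' extra."
        ) from exc


# Example call site (hypothetical):
#   st = require_optional("sentence_transformers", "embeddings")
#   model = st.SentenceTransformer("all-MiniLM-L6-v2")
```

With this shape, the rest of the package imports cleanly without `torch`, and only the embedding-based operators fail, exactly as the PR intends.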
Hi @andrewjradcliffe, thanks again for opening this PR! This PR gets at a major pain point that I would like to address (narrowing the scope of PZ dependencies, especially hefty ones like sentence-transformers and torch). However, we also want to be able to support multimodal queries for all operators out-of-the-box, and making sentence-transformers optional would break that for the embedding-based operators. As a result, I'm going to hold this PR in limbo for at least one more week so that I have time to implement a fallback strategy for Semantic Top-K, which will create a textual description of the image and then embed that description. In the long term, my hope is that the frontier labs will release multimodal embedding models, which will make this problem moot.
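The fallback described above can be sketched as follows. This is a hedged illustration, not the actual implementation: `describe_image` (an LLM captioner) and `embed_text` (a text embedding model) are placeholder callables, and raw `bytes` stand in for image fields.

```python
def embed_field(value, describe_image, embed_text):
    """Embed a field that may be text or an image.

    Fallback strategy: if the field is an image (here modeled as raw
    bytes), first produce a textual description of it, then embed that
    description with the text embedding model.
    """
    if isinstance(value, bytes):
        # No multimodal embedding model available: caption the image
        # and embed the caption instead.
        value = describe_image(value)
    return embed_text(value)
```

Once multimodal embedding models are broadly available, the branch collapses to a single embedding call.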
Quite understandable, and there's no rush. I was viewing it from the perspective of using the package as a dependency of a larger application, where binary size matters. Admittedly, in Rust the idiomatic answer would be a feature flag, and opt-in dependencies are routine there. I think the principle of least surprise should apply, which implies that adding another flag (`--extra`) is not ideal.
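For reference, the `--extra` pattern under discussion corresponds to an optional-dependencies group in packaging metadata. A hypothetical `pyproject.toml` fragment (the extra name `embeddings` is illustrative):

```toml
[project]
name = "palimpzest"
dependencies = [
    # ...core dependencies, without torch or sentence-transformers...
]

[project.optional-dependencies]
embeddings = ["sentence-transformers"]
```

Users who need `sem_join` and `sem_topk` would then install with `pip install "palimpzest[embeddings]"` (or `uv add palimpzest --extra embeddings`), while everyone else avoids pulling in `torch`.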