If you convert facebook/bart-large-mnli, use it to evaluate the demo text from the Hugging Face site, and compare against a local Python setup for verification, I find that:
- the online demo and the local Python run agree with each other on the label scores
- the label probabilities returned by the spaGO version are vastly different from both
- the Python version takes roughly 16 seconds on my local machine, but the spaGO version takes 37 seconds (this is a Mac, so no GPU is available)
The Python code is:
text = "A new model offers an explanation for how the Galilean satellites formed around the solar system’s " \
"largest world. Konstantin Batygin did not set out to solve one of the solar system’s most puzzling " \
"mysteries when he went for a run up a hill in Nice, France. Dr. Batygin, a Caltech researcher, " \
"best known for his contributions to the search for the solar system’s missing “Planet Nine,” spotted a " \
"beer bottle. At a steep, 20 degree grade, he wondered why it wasn’t rolling down the hill. He realized " \
"there was a breeze at his back holding the bottle in place. Then he had a thought that would only pop " \
"into the mind of a theoretical astrophysicist: “Oh! This is how Europa formed.” Europa is one of " \
"Jupiter’s four large Galilean moons. And in a paper published Monday in the Astrophysical Journal, " \
"Dr. Batygin and a co-author, Alessandro Morbidelli, a planetary scientist at the Côte d’Azur Observatory " \
"in France, present a theory explaining how some moons form around gas giants like Jupiter and Saturn, " \
"suggesting that millimeter-sized grains of hail produced during the solar system’s formation became " \
"trapped around these massive worlds, taking shape one at a time into the potentially habitable moons we " \
"know today. "
from transformers import pipeline

cc = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")
labels = ['space & cosmos', 'scientific discovery', 'microbiology', 'robots', 'archeology']
r = cc(text, labels, multi_class=True)  # newer transformers versions rename this to multi_label
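One common cause of "vastly different probabilities" between two zero-shot implementations is a mismatch in how the per-label NLI logits are normalized. A minimal sketch of the two modes the Hugging Face pipeline uses (assuming the standard entailment/contradiction post-processing; the logit values below are made up for illustration):

```python
import math

def zero_shot_probs(entailment_logits, contradiction_logits, multi_label=True):
    """Turn per-label NLI logits into label probabilities, mirroring in spirit
    what the Hugging Face zero-shot pipeline does."""
    if multi_label:
        # Each label is scored independently: softmax over
        # (entailment, contradiction) for that label alone.
        # Probabilities need NOT sum to 1 across labels.
        return [math.exp(e) / (math.exp(e) + math.exp(c))
                for e, c in zip(entailment_logits, contradiction_logits)]
    # Single-label mode: softmax of the entailment logits across ALL labels,
    # so the probabilities sum to 1.
    z = [math.exp(e) for e in entailment_logits]
    s = sum(z)
    return [e / s for e in z]

ent = [2.0, 0.5, -1.0]   # hypothetical entailment logits, one per label
con = [-1.0, 0.0, 2.0]   # hypothetical contradiction logits
print(zero_shot_probs(ent, con, multi_label=True))   # independent per-label scores
print(zero_shot_probs(ent, con, multi_label=False))  # sums to 1
```

If one side is effectively running single-label normalization while the other runs multi-label, the returned numbers will differ dramatically even when the underlying model weights are identical.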
The Go code, with the same text and classes, is:
bartModel, err := zsc.LoadModel(bartDir)
if err != nil {
    // handle error
}
result, err := bartModel.Classify(c.InputText, "", classes, true)
Similarly, using the model valhalla/distilbart-mnli-12-3 also gives wildly different results from the online Hugging Face demo, with the same text and label set as above.
So, is there something else I need to do, or is the zsc code not working? My Go code is essentially the same as the zsc demo code.
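Another thing worth checking is the hypothesis template. The Hugging Face pipeline wraps each candidate label into an NLI hypothesis using the default template "This example is {}.", and if the Go side passes a different (or empty) template, the model is scoring different premise/hypothesis pairs entirely. A minimal sketch of the pair construction (the `build_nli_pairs` helper is hypothetical, written only to show the mechanics):

```python
def build_nli_pairs(premise, labels, template="This example is {}."):
    # Each candidate label becomes one NLI hypothesis; the model then scores
    # entailment of (premise, hypothesis) independently per label.
    return [(premise, template.format(label)) for label in labels]

pairs = build_nli_pairs(
    "Jupiter's moons formed from trapped hail.",
    ["space & cosmos", "robots"],
)
for premise, hypothesis in pairs:
    print(hypothesis)
```

Comparing the exact hypothesis strings (and the raw entailment logits for one pair) between the Python and Go runs would isolate whether the divergence comes from the model conversion or from the post-processing.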