
Fasttext 0.9.2 - Recall Is 'nan' But Precision Is A Number

I trained a supervised model in FastText using the Python interface and I'm getting odd results for precision and recall. First, I trained a model: model = fasttext.train_supervised(...)

Solution 1:

It looks like FastText 0.9.2 has a bug in the computation of recall, which is fixed by commit b64e359.

Installing a bleeding-edge version of FastText that includes that commit, e.g. with

pip install git+https://github.com/facebookresearch/fastText.git@b64e359d5485dda4b4b5074494155d18e25c8d13 --quiet

and rerunning your code, should get rid of the nan values in the recall computation.
