Describe the bug
Hi, I have been testing the DenseHMM implementation in Pomegranate v1.0.3 for models with > 50 states, and have occasionally encountered a bug where initializing the model.edges matrix via model.add_edge results in an edges matrix containing NaNs. The NaNs then propagate through downstream calculations, including model.forward_backward and model.predict.
I tracked this down to the following snippet located here:
where torch.empty sometimes returns an array containing NaNs, and NaN - float("inf") = NaN. Additionally, because in my testing I don't manually set every entry of the edges matrix to a specific probability, subsequent usage of model.edges propagates those NaNs.
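A minimal torch-only sketch of the failure mode (the tensor size `n` and the subtraction pattern are my reconstruction of the initialization, not the library's exact code):

```python
import torch

n = 64  # stand-in for a model with > 50 states

# torch.empty returns uninitialized memory, so entries can be
# arbitrary garbage, including NaN.
edges = torch.empty(n, n) - float("inf")

# Under IEEE 754, NaN - inf == NaN, so any uninitialized NaN
# survives the subtraction instead of becoming -inf:
bad = torch.tensor(float("nan")) - float("inf")
print(torch.isnan(bad))  # tensor(True)
```

Any edge that is later left un-set keeps this NaN, which then flows into the forward-backward recursions.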
To Reproduce
Since the bug (I think) comes from the initialization of a large array using torch.empty, the simplest way I have been able to reproduce it is to use the above snippet with a large n (> 50) and then not fill in every edge.
The quickest fix I have found is to just pre-set the matrix with torch.log(torch.zeros((n, n))).
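The workaround can be sketched as follows (a minimal illustration, not the library's code):

```python
import torch

n = 64

# torch.zeros returns fully initialized memory, and log(0) == -inf,
# so every un-set edge starts at log-probability -inf with no NaNs.
edges = torch.log(torch.zeros(n, n))

print(torch.isinf(edges).all())  # tensor(True)
print(torch.isnan(edges).any())  # tensor(False)
```

An equivalent, slightly more direct option would be `torch.full((n, n), float("-inf"))`, which avoids relying on `log(0)` evaluating to -inf.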
I'm a big fan of the package, and thank you for all the effort you've put in developing it. Just wanted to put this on your radar.