Handle warnings output by tests #32

Open · riley-harper opened this issue Aug 2, 2022 · 0 comments
Labels: tests (Unit and integration tests for code)

Four unique warnings are output when the tests run (though each is repeated many times). Three of them are deprecation warnings coming from PySpark, which we don't have much control over; hopefully they will be fixed in future versions of PySpark.

The last warning is

/usr/local/lib/python3.10/site-packages/sklearn/metrics/_ranking.py:874: UserWarning: No positive class found in y_true, recall is set to one for all thresholds.

This is caused by some of our test sets being very small, so they might not have a good mix of positive and negative classifications. That isn't important for this test, so let's ignore or silence the warning in pytest. It would be good to scope the filter to the single affected test instead of applying it globally to all tests.
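One way to do that (a sketch, not necessarily how we'll implement it) is pytest's filterwarnings marker, which applies a warning filter only to the decorated test. The test name below is a hypothetical placeholder for whichever test triggers the warning:

```python
import pytest


# Hypothetical test name; substitute the actual test that exercises the
# small test set. The marker scopes the ignore filter to this one test
# instead of silencing the warning globally (e.g. in pytest.ini).
@pytest.mark.filterwarnings(
    "ignore:No positive class found in y_true:UserWarning"
)
def test_metrics_on_tiny_test_set():
    ...  # run the evaluation that emits the sklearn recall warning
```

Alternatively, wrapping just the offending call in warnings.catch_warnings() with simplefilter("ignore") would also keep the suppression local to that one spot.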

riley-harper added the tests label on Aug 2, 2022
riley-harper self-assigned this on Aug 2, 2022
riley-harper removed their assignment on Jun 2, 2023