FIX RuntimeWarning from division by zero in test_kernel_gradient #19396
Conversation
Hi @nuka137, thanks for this proposal! Could you add a test to assert that the warning is not thrown with this correction? 🙂
Thanks! I added the no-warning test.
@@ -19,7 +19,7 @@
 from sklearn.utils._testing import (assert_almost_equal, assert_array_equal,
                                     assert_array_almost_equal,
-                                    assert_allclose,
+                                    assert_allclose, assert_no_warnings,
Maybe add a line break here to respect the structure of the imports.
Thanks, I added a line break.
Co-authored-by: Julien Jerphanion <git@jjerphan.xyz>
Thanks @nuka137, here are a couple of comments.
sklearn/gaussian_process/kernels.py
Outdated
denominator = np.sqrt(D.sum(2))[:, :, np.newaxis]
denominator[denominator == 0.0] = 1e-8
K_gradient = K[..., np.newaxis] * D / denominator
I find it a bit cleaner to only divide where it's non-zero:
denominator = np.sqrt(D.sum(2))[:, :, np.newaxis]
K_gradient = K[..., np.newaxis] * np.divide(D, denominator, where=denominator != 0)
Won't it change the normalisation?
D = 0 where denominator = 0, so K_gradient = 0 where denominator = 0, regardless of the normalization
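The effect of `np.divide` with a `where=` mask can be seen on toy arrays. This is a minimal sketch, not the kernel code itself; `out=` is added here so the skipped entries have a defined value (the suggestion above instead relies on `D`, and hence the product with `K`, being zero at those positions):

```python
import numpy as np

# Toy stand-ins: D is zero exactly where the denominator is zero,
# mimicking the kernel-gradient situation discussed above.
D = np.array([[0.0, 2.0], [4.0, 0.0]])
denominator = np.array([[0.0, 1.0], [2.0, 0.0]])

# `where=` skips the masked positions entirely, so no RuntimeWarning
# is raised for 0/0; `out=` supplies the values at skipped positions.
ratio = np.divide(D, denominator,
                  out=np.zeros_like(D),
                  where=denominator != 0)
print(ratio)
# [[0. 2.]
#  [2. 0.]]
```

A plain `D / denominator` on the same arrays would emit `RuntimeWarning: invalid value encountered in divide` and produce NaN at the 0/0 entries.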
Yeah, that is a much cleaner way.
I applied your suggestion, thanks.
@@ -52,7 +53,7 @@
 @pytest.mark.parametrize('kernel', kernels)
 def test_kernel_gradient(kernel):
     # Compare analytic and numeric gradient of kernels.
-    K, K_gradient = kernel(X, eval_gradient=True)
+    K, K_gradient = assert_no_warnings(kernel, X, eval_gradient=True)
I don't think we need to check that no warning is issued. Otherwise we'd have to do that for every line of code. The fact that the warning disappears with this PR should be enough for me.
+1 with @jeremiedbb
OK, I agree.
I reverted this line to the original.
Co-authored-by: Jérémie du Boisberranger <34657725+jeremiedbb@users.noreply.github.com>
LGTM. Thanks @nuka137
Very minor style improvement, otherwise LGTM.
Reference Issues/PRs
#19334
What does this implement/fix? Explain your changes.
The RuntimeWarning is raised in the `Matern.__call__` function when `D` has a zero element. So, I set the zero elements of `np.sqrt(D.sum(2))[:, :, np.newaxis]` to the small value `1e-8` before dividing `K[..., np.newaxis] * D` by it. Because `K[..., np.newaxis] * D` is zero at those positions anyway, I think this tweak causes no problem.
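The originally proposed epsilon fix can be sketched on stand-in arrays (a minimal illustration; `D` and `K` here are hypothetical toy values, not the actual Matern kernel quantities):

```python
import numpy as np

# Stand-ins: D plays the role of the distance-derived tensor, K the
# kernel matrix; shapes are illustrative only. D is all zeros, which
# is the case that triggered the RuntimeWarning.
D = np.zeros((2, 2, 1))
K = np.ones((2, 2))

# Original fix: replace zero denominators with a tiny constant so the
# division never hits 0/0 and no RuntimeWarning is emitted.
denominator = np.sqrt(D.sum(2))[:, :, np.newaxis]
denominator[denominator == 0.0] = 1e-8
K_gradient = K[..., np.newaxis] * D / denominator

print(K_gradient.shape)  # (2, 2, 1)
# K_gradient is all zeros: the numerator K[..., np.newaxis] * D is
# zero wherever D is zero, so the epsilon does not change the result.
```

The merged version instead uses `np.divide(..., where=denominator != 0)`, as suggested in the review, which skips those positions rather than perturbing the denominator.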