Fix failing tests for GPyTorch v1.10 #956

Merged: 3 commits merged into master from gpytorch-v110-fix-tests on Apr 18, 2023

Conversation

@BenjaminBossan (Collaborator) commented on Apr 16, 2023

A couple of changes in GPyTorch v1.10 resulted in our tests failing. This PR fixes those tests. The actual integration has not changed, only the tests.

Be aware that CI will be broken until this PR is merged.

Note

The updated tests will now fail for GPyTorch v1.9 and below. However, I think it is not worth writing the tests in a backwards-compatible manner. First of all, the GPyTorch integration does not appear to be widely used. Second, this only affects the tests, not the actual implementation, so normal users should have no issues. Developers would need to upgrade GPyTorch to avoid the errors.

I had to add code for backwards compatible tests after all because GPyTorch v1.10 is not available for Python 3.7, which we still test.
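For illustration, a version switch for such backwards-compatible tests could look roughly like this. This is only a sketch; the constant and marker names are made up and may not match the actual test code:

```python
# Sketch of a version switch for backwards-compatible tests; the names here
# are illustrative, not necessarily what the PR uses.
import gpytorch
import pytest
from packaging.version import Version

# GPyTorch v1.10 is not available for Python 3.7, so older environments still
# run against v1.9 or below.
GPYTORCH_V110 = Version(gpytorch.__version__) >= Version("1.10")

requires_gpytorch_110 = pytest.mark.skipif(
    not GPYTORCH_V110, reason="requires GPyTorch v1.10 or later"
)
```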

Implementation

Initially, we had issues with GPyTorch kernels not being pickleable/copyable, so the base testing class tested for those errors. Now, two of the three actual test classes (ExactGPRegressor and, since v1.10, GPRegressor) work. Therefore, I refactored the base test class to implement the working tests, and only the one remaining class that still fails, GPBinaryClassifier, implements the tests that expect the errors.

Content-wise, the failing tests have not been changed. The tests for the working cases are now a bit stricter: they compare inference results before and after pickling.
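As a rough sketch (not the exact test code; fixture names such as gp_fit and X are made up), the refactored structure could look like this, with the base class checking that predictions survive a pickle round trip and the binary classifier overriding the test to expect the failure:

```python
import pickle

import numpy as np
import pytest


class BaseGPTest:
    # Shared tests for the estimators whose kernels can be pickled
    # (ExactGPRegressor and, since GPyTorch v1.10, GPRegressor).
    def test_pickle_roundtrip(self, gp_fit, X):
        y_pred_before = gp_fit.predict(X)
        loaded = pickle.loads(pickle.dumps(gp_fit))
        y_pred_after = loaded.predict(X)
        np.testing.assert_allclose(y_pred_before, y_pred_after)


class TestGPBinaryClassifier(BaseGPTest):
    # The one remaining class whose kernel still cannot be pickled; override
    # the shared test to assert that pickling raises instead.
    def test_pickle_roundtrip(self, gp_fit, X):
        with pytest.raises(Exception):  # concrete error type depends on GPyTorch
            pickle.dumps(gp_fit)
```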

Another issue I encountered was that the default likelihood for GPBinaryClassifier, BernoulliLikelihood, no longer takes any arguments. This is annoying because we want to test a few things with likelihood params, e.g. that params in a grid search are passed correctly to the likelihood (since it is an extra module, this could be a real source of errors). Therefore, for testing purposes, I created a likelihood class that does take an argument (the same one it used to take before v1.10).
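A minimal sketch of such a test-only likelihood (the class name is made up; the parameter name matches the rename discussed in the review below):

```python
import gpytorch


class BernoulliLikelihoodWithParam(gpytorch.likelihoods.BernoulliLikelihood):
    """BernoulliLikelihood with an extra, unused constructor argument.

    The argument only exists so that the tests can check that likelihood
    parameters are routed correctly, e.g. by a grid search.
    """

    def __init__(self, *args, some_parameter=1, **kwargs):
        super().__init__(*args, **kwargs)
        self.some_parameter = some_parameter
```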

I also fixed an incorrect comment in a test, test_set_params_uninitialized_net_correct_message. This was probably a copy/paste oversight.

@thomasjpfan (Member) left a comment:

Minor comment, otherwise LGTM

Comment on lines 127 to 128:

def __init__(self, *args, max_plate_nesting=1, **kwargs):
    self.max_plate_nesting = max_plate_nesting
@thomasjpfan (Member):

Given that the parameter is not used, can this name be changed to something that is meaningless in the context of Gaussian Processes?

Suggested change:

-def __init__(self, *args, max_plate_nesting=1, **kwargs):
-    self.max_plate_nesting = max_plate_nesting
+def __init__(self, *args, search_parameter=1, **kwargs):
+    self.search_parameter = search_parameter

@BenjaminBossan (Collaborator, Author):

I renamed the parameter to some_parameter. At first, I thought it would not work because the same test is re-used for the two other GP classes, which use a different likelihood. But it turns out that passing a non-existent parameter does not cause an error:

https://github.com/cornellius-gp/gpytorch/blob/d23099674f67f04f40467964c37fc02f2e2a8dda/gpytorch/likelihoods/gaussian_likelihood.py#L104-L114

It is a bit strange, so I put a comment there, but I guess for the purpose of the test we can live with that.
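For illustration, the observation boils down to the following sketch. As the linked source shows, GaussianLikelihood's __init__ accepts **kwargs, so an unknown argument is swallowed rather than raising a TypeError:

```python
import gpytorch

# Passing a parameter that GaussianLikelihood does not define does not raise;
# the **kwargs in its __init__ swallow it. This is why reusing the same grid
# search test with the other GP classes does not error out, even though their
# default likelihood has no such parameter.
likelihood = gpytorch.likelihoods.GaussianLikelihood(some_parameter=123)  # no TypeError
```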

@thomasjpfan thomasjpfan merged commit 7276c93 into master Apr 18, 2023
@BenjaminBossan BenjaminBossan deleted the gpytorch-v110-fix-tests branch April 18, 2023 16:03