[Bug] settings.variational_cholesky_jitter does not work in run time #2244

Closed
LuhuanWu opened this issue Jan 7, 2023 · 1 comment · Fixed by #2255
Labels: bug

LuhuanWu (Contributor) commented Jan 7, 2023

🐛 Bug

To reproduce

Using the settings.variational_cholesky_jitter context manager at model run time does not change the actual jitter value.

Code snippet to reproduce

import torch

import gpytorch
from gpytorch import settings
from gpytorch.models import ApproximateGP
from gpytorch.variational import CholeskyVariationalDistribution
from gpytorch.variational import VariationalStrategy

class GPModel(ApproximateGP):
    def __init__(self, inducing_points):
        variational_distribution = CholeskyVariationalDistribution(inducing_points.size(0))
        variational_strategy = VariationalStrategy(self, inducing_points, variational_distribution, learn_inducing_locations=True)
        super(GPModel, self).__init__(variational_strategy)
        self.mean_module = gpytorch.means.ConstantMean()
        self.covar_module = gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel())

    def forward(self, x):
        mean_x = self.mean_module(x)
        covar_x = self.covar_module(x)
        return gpytorch.distributions.MultivariateNormal(mean_x, covar_x)

train_x = torch.randn(10)
inducing_points = train_x
model = GPModel(inducing_points=inducing_points)

with settings.variational_cholesky_jitter(float_value=1e-2):
    print(model.variational_strategy.jitter_val)

Stack trace/error message

The printed output is 0.0001, which is the default jitter value.

Expected Behavior

Expected value is 1e-2.

Additional context

In fact, the jitter value can be set through settings when instantiating the model:

with settings.variational_cholesky_jitter(float_value=1e-2):
    model = GPModel(inducing_points=inducing_points)
    print(model.variational_strategy.jitter_val)

The printed output is 1e-2, as expected. However, having to set variational_cholesky_jitter at model construction time is inconsistent with how other settings behave, e.g. settings.cholesky_jitter. A hypothetical sketch of the likely cause follows.
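For illustration, a self-contained, hypothetical sketch (not GPyTorch's actual code) of the caching pattern that would explain this behavior: the global setting is read once in the constructor and stored, so entering the context manager later has no effect.

class FakeJitterSetting:
    _value = 1e-4  # stand-in for the global default

class CachingStrategy:
    def __init__(self, jitter_val=None):
        # the current global value is captured once, at construction time
        self.jitter_val = FakeJitterSetting._value if jitter_val is None else jitter_val

strategy = CachingStrategy()
FakeJitterSetting._value = 1e-2  # "changing the setting" after construction...
print(strategy.jitter_val)       # ...still prints 0.0001: the cached value wins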

@LuhuanWu LuhuanWu added the bug label Jan 7, 2023
gpleiss (Member) commented Jan 17, 2023

Agreed. It should be possible to set this value dynamically, rather than when the model is initialized.

Additionally, it doesn't really make sense to have a bunch of different context managers for jitter values - we should think about consolidating them.

I'll put up a PR soon.
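For illustration, one way to get the dynamic behavior (a hypothetical sketch with made-up names, not necessarily what the PR will do): store only an explicit override and fall back to the global setting whenever jitter_val is read.

class FakeJitterSetting:
    _value = 1e-4  # stand-in for the global default

class DynamicStrategy:
    def __init__(self, jitter_val=None):
        self._jitter_val = jitter_val  # None means "defer to the global setting"

    @property
    def jitter_val(self):
        return FakeJitterSetting._value if self._jitter_val is None else self._jitter_val

strategy = DynamicStrategy()
FakeJitterSetting._value = 1e-2  # changing the setting after construction...
print(strategy.jitter_val)       # ...now prints 0.01: looked up at read time
print(DynamicStrategy(jitter_val=1e-6).jitter_val)  # an explicit value still wins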

gpleiss added a commit that referenced this issue Jan 17, 2023
Previously, this context manager was only used when VariationalStrategy
modules were initialized. With this change,
gpytorch.settings.variational_cholesky_jitter will dynamically change
the jitter value (for variational models already in use), unless the
user specifies a `jitter_val` in the VariationalStrategy constructor.

In addition, this PR adds type hints to a majority of the
VariationalStrategy modules.

[Fixes #2244]
gpleiss added a commit that referenced this issue Mar 6, 2023
…#2255)

* gpytorch.settings.variational_cholesky_jitter can be set dynamically.

Previously, this context manager was only used when VariationalStrategy
modules were initialized. With this change,
gpytorch.settings.variational_cholesky_jitter will dynamically change
the jitter value (for variational models already in use), unless the
user specifies a `jitter_val` in the VariationalStrategy constructor.

In addition, this PR adds type hints to a majority of the
VariationalStrategy modules.

[Fixes #2244]

* Small doc fix

* WIP

* Include side in LMCVarStrat docstring
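With this change merged, the original repro should reflect the context manager at run time. A hedged usage example, assuming a GPyTorch version that includes #2255 and reusing the model from the snippet above:

model = GPModel(inducing_points=inducing_points)

with settings.variational_cholesky_jitter(float_value=1e-2):
    print(model.variational_strategy.jitter_val)  # expected: 0.01

print(model.variational_strategy.jitter_val)  # expected: the default (1e-4) outside the context

# Per the commit message above, a jitter_val passed explicitly to the
# VariationalStrategy constructor should still take precedence over the setting.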