Fix sample from prior for ConstantMean #2042
Conversation
Thanks. What exactly goes wrong when using `fill_` here? Is the issue that `priors.sample()` produces a tensor of some shape rather than a scalar, so `fill_` does weird things? It would be good to add a type annotation to clarify what `_constant_closure` is supposed to operate on.

Also, does that mean all the other modules should be updated in the same way?
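A minimal standalone sketch (not the gpytorch code itself) of the failure mode being discussed: `Tensor.fill_` only accepts a number or a 0-dim tensor, so a prior sample with a non-trivial shape raises an error:

```python
import torch

# fill_ works with a scalar (0-dim) value...
constant = torch.zeros(1)
scalar_sample = torch.tensor(0.5)
constant.fill_(scalar_sample)

# ...but a sample with shape, e.g. from a batched prior, breaks it.
batched_sample = torch.rand(2, 1)
try:
    constant.fill_(batched_sample)
except RuntimeError as e:
    print("fill_ fails:", e)
```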
```python
if not torch.is_tensor(value):
    value = torch.as_tensor(value).to(self.constant)
```
Is this gating necessary, or could we just always apply `torch.as_tensor` (which should be a no-op for tensors)?
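To illustrate the reviewer's point: `torch.as_tensor` returns the input tensor itself (no copy) when called with no `dtype`/`device` arguments, so the `is_tensor` check is mostly redundant aside from the trailing `.to(...)` cast:

```python
import torch

t = torch.tensor([1.0, 2.0])
assert torch.as_tensor(t) is t   # no-op: same tensor object back

v = 0.5                          # a plain Python float, by contrast,
print(torch.as_tensor(v))        # is wrapped into a new 0-dim tensor
```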
I'm just following what is done for other closures, for example: https://github.com/cornellius-gp/gpytorch/blob/master/gpytorch/kernels/kernel.py#L262.
Yes. The use of `fill_` doesn't play well with the shape of the priors.