So we have a setup for a related thing in BoTorch for doing input warping with learnable transforms: https://github.com/pytorch/botorch/blob/main/botorch/models/transforms/input.py#L796. This is not the same thing (it warps the inputs, not the outputs), but it is related. In terms of implementing the output warping, my hunch is that you could use the …
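For context on the linked transform: BoTorch's Warp input transform passes each normalized input dimension through a Kumaraswamy CDF with learnable concentration parameters. Here is a minimal NumPy sketch of the underlying math only (the function name and parameters are illustrative, not the BoTorch API):

```python
import numpy as np

def kumaraswamy_warp(x, a, b):
    """Kumaraswamy-CDF warp, F(x) = 1 - (1 - x**a)**b for x in [0, 1].
    For a, b > 0 this is strictly increasing, so it is a valid monotonic
    warp of a (normalized) input dimension. In BoTorch, a and b are
    learnable hyperparameters fit alongside the GP."""
    x = np.asarray(x, dtype=float)
    return 1.0 - (1.0 - x ** a) ** b
```

The same "monotonic parametric map with learnable parameters" idea carries over to output warping; the difference is only which side of the model the map is applied to.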
Hi all,
I have already successfully implemented some basic models. Now I want to implement warped Gaussian processes (wGP), as presented in this paper. The general idea of wGPs is to transform the observed targets (not the features) with a learnable parametric monotonic function f(·) and to perform GP regression on the transformed targets. Importantly, the GP regression and the parametric function f(·) are learned jointly.
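For reference, the warp proposed in the warped-GP paper is a sum of tanh functions, f(y) = y + Σᵢ aᵢ tanh(bᵢ(y + cᵢ)) with aᵢ, bᵢ ≥ 0, which is monotonic by construction. A minimal NumPy sketch of the warp and its derivative (the derivative is needed for the Jacobian term in the likelihood); in a real implementation these would be torch parameters so they can be learned jointly with the GP:

```python
import numpy as np

def tanh_warp(y, a, b, c):
    """Monotonic 'tanh' warp: f(y) = y + sum_i a_i * tanh(b_i * (y + c_i)).
    Strictly increasing whenever all a_i, b_i >= 0."""
    y = np.asarray(y, dtype=float)
    return y + sum(ai * np.tanh(bi * (y + ci)) for ai, bi, ci in zip(a, b, c))

def tanh_warp_grad(y, a, b, c):
    """Derivative f'(y) = 1 + sum_i a_i * b_i * (1 - tanh(b_i * (y + c_i))**2),
    used for the log-Jacobian term of the warped likelihood."""
    y = np.asarray(y, dtype=float)
    return 1.0 + sum(ai * bi * (1.0 - np.tanh(bi * (y + ci)) ** 2)
                     for ai, bi, ci in zip(a, b, c))
```

In practice one would parameterize aᵢ and bᵢ as exponentials (or softplus) of unconstrained parameters so the positivity constraint holds automatically during optimization.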
Well, the negative log likelihood of basic Gaussian process regression with simple Gaussian noise looks as follows:

$$-\log p(\mathbf{y} \mid X) = \tfrac{1}{2}\,\mathbf{y}^\top (K + \sigma^2 I)^{-1} \mathbf{y} + \tfrac{1}{2}\log\lvert K + \sigma^2 I\rvert + \tfrac{N}{2}\log 2\pi$$
In the paper they use the change-of-variables formula to derive the adjusted negative log-likelihood (with the warped targets $f(\mathbf{y})$ in place of $\mathbf{y}$, plus a log-Jacobian term):

$$-\log p(\mathbf{y} \mid X) = \tfrac{1}{2}\,f(\mathbf{y})^\top (K + \sigma^2 I)^{-1} f(\mathbf{y}) + \tfrac{1}{2}\log\lvert K + \sigma^2 I\rvert - \sum_{n=1}^{N} \log f'(y_n) + \tfrac{N}{2}\log 2\pi$$
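As a sanity check for a custom warped likelihood, the adjusted objective can be computed directly: it is the standard NLL evaluated on the warped targets z = f(y), minus the sum of log-derivatives of the warp. A minimal NumPy sketch (function names are my own, not GPyTorch API):

```python
import numpy as np

def gp_nll(z, K, noise):
    """Standard exact-GP negative log marginal likelihood of targets z
    under covariance C = K + noise * I, via a Cholesky factorization."""
    n = len(z)
    C = K + noise * np.eye(n)
    L = np.linalg.cholesky(C)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, z))  # C^{-1} z
    return (0.5 * z @ alpha
            + np.sum(np.log(np.diag(L)))   # = 0.5 * log|C|
            + 0.5 * n * np.log(2.0 * np.pi))

def warped_gp_nll(y, K, noise, f, f_prime):
    """Warped-GP NLL: the standard NLL of the warped targets f(y),
    minus the log-Jacobian term from the change of variables."""
    return gp_nll(f(y), K, noise) - np.sum(np.log(f_prime(y)))
```

With the identity warp (f(y) = y, f'(y) = 1) the Jacobian term vanishes and the two objectives coincide, which is a useful unit test for any implementation.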
So my idea was to implement a custom likelihood (e.g. WarpedGaussianLikelihood) that captures this transformation. Then I planned to extend gpytorch.models.ExactGP as usual, but supply a custom prediction_strategy, since we need to transform the input targets with the learned transformation. Moreover, one could perhaps override ExactGP.__call__ to handle evaluation mode differently, e.g. first calling super().__call__ and then applying the inverse transformation, etc.

Before I try out my ideas, I wanted to make sure that warping of target variables is not already supported by GPyTorch. Or maybe you have better ideas for how one could implement this feature/model. I would be very thankful for any comments.