Integrating Variational Multi-output GPs with the GP-LVM #2173
AndreaBraschi asked this question in Q&A
Hi all,
I have been trying to integrate variational multi-output GPs with the GP-LVM. I have been using the intrinsic coregionalization model (ICM) from Bonilla et al. (2008) to establish correlations across tasks. At the moment I get an acceptable data fit, but the cross-task correlations and the latent variables are not what I expect.
Basically, the covariance of the GP prior should be a triple Kronecker product of the inter-task covariance, the covariance of the observed inputs, and the covariance of the latent variables. The noise model should likewise be a triple Kronecker product of an identity matrix, the task noises, and the latent-variable noises.
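To make the intended structure concrete, here is a small pure-PyTorch sketch of the triple-Kronecker prior and noise covariance described above. All sizes, the RBF stand-in kernel, and the ICM-style task matrix `B` are illustrative assumptions of mine, not the actual classes from my code:

```python
import torch

T, N, Q = 3, 4, 2  # assumed sizes: tasks, observed inputs, latent points

def rbf(x):
    # simple RBF Gram matrix as a stand-in for any base kernel
    d2 = torch.cdist(x, x) ** 2
    return torch.exp(-0.5 * d2)

# ICM-style inter-task covariance (placeholder for B = W W^T + diag(v))
B = torch.eye(T) + 0.1 * torch.ones(T, T)
K_x = rbf(torch.randn(N, 1))  # covariance of observed inputs
K_h = rbf(torch.randn(Q, 1))  # covariance of latent variables

# Prior covariance: triple Kronecker product, shape (T*N*Q, T*N*Q)
K = torch.kron(torch.kron(B, K_x), K_h)

# Noise model: identity ⊗ task noises ⊗ latent-variable noises.
# Note: in a real model the Kronecker ordering of the noise term must be
# made consistent with the ordering used for K; this only checks shapes.
sigma_task = torch.rand(T) + 0.1
sigma_h = torch.rand(Q) + 0.1
noise = torch.kron(torch.kron(torch.eye(N), torch.diag(sigma_task)),
                   torch.diag(sigma_h))
```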
I've tried to adapt the MultitaskKernel and the MultitaskGaussianLikelihood to a triple Kronecker structure. Below you can find the plots showing the data fit after 500 iterations, along with the code. Although the code is long, you can just run it and a number of plots will appear that show much better what I'm talking about.
I've also been thinking of using two separate sets of inducing points: one for the observed input domain and one for the latent space. However, I'm not sure what the best way would be to combine the means of the two resulting approximate posteriors q(f).
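One possibility for combining the two inducing sets, assuming the joint prior over them is Kronecker-structured (an assumption on my part, not something I have tested in the full model), is the vec-trick: if the interpolation matrix factors as A_h ⊗ A_x, then (A_h ⊗ A_x) vec(M) can be computed as A_h M A_x^T without ever building the Kronecker product. A toy sketch with hypothetical sizes:

```python
import torch

N, Mx = 5, 3   # observed inputs, inducing points in the input domain
Q, Mh = 4, 2   # latent points, inducing points in the latent space

# Stand-ins for the interpolation terms K_{f Z} K_{Z Z}^{-1} per domain
A_x = torch.randn(N, Mx)
A_h = torch.randn(Q, Mh)
m_u = torch.randn(Mh * Mx)  # variational mean over the joint inducing grid

# Reference: explicit Kronecker interpolation of the mean
mean_full = torch.kron(A_h, A_x) @ m_u

# Vec-trick equivalent, never forming the (Q*N, Mh*Mx) Kronecker matrix;
# row-major reshape makes m_u a (Mh, Mx) matrix over the inducing grid
M = m_u.reshape(Mh, Mx)
mean_vec = (A_h @ M @ A_x.T).reshape(-1)

assert torch.allclose(mean_full, mean_vec, atol=1e-5)
```

This only addresses the mean; the covariance of q(f) would need the analogous Kronecker identities for the quadratic terms, which is exactly the part I am unsure about.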
I am first uploading the classes that handle the kernel, the likelihood and the variational strategy, which are needed to run the main code.
Any suggestions or food for thought will be much appreciated.
Forgive me for the extremely long code, but I've had to change a lot of the source classes!
Noise Model
Kernel
Variational Strategy
Main code