Heterogeneous Multi-output Gaussian Process #2157
-
Hi all, new to GPyTorch here. Just wondering whether there has been any implementation of a heterogeneous multi-output GP using approximate inference? What I mean by this: the likelihoods associated with each output/task are not the same, and they are non-Gaussian. Task_1 could follow a Poisson distribution, and Task_2 a Bernoulli. Task_1 and Task_2 are correlated through their latent parameters. Any suggestions on how to implement this if no such code is available?
Replies: 1 comment 1 reply
-
This is probably best done using the low-level Pyro interface: https://docs.gpytorch.ai/en/v1.6.0/examples/07_Pyro_Integration/Pyro_GPyTorch_Low_Level.html

We don't currently have any examples of a composite likelihood (not one that we've published, anyways), but this should allow you to have a heterogeneous likelihood. Here's how to update that example: the `function_samples` term in the `model` method will be of size `num_samples x ... x N x num_tasks` (rather than `num_samples x ... x N` in the single-task example). You can then …

Sorry we don't have a better example available right now, but hope this helps!