I have found the issue here:
def predict(self, test_x):
    with torch.no_grad():
        # The output of the model is a multitask MVN, where both the data points
        # and the tasks are jointly distributed.
        # To compute the marginal predictive NLL of each data point,
        # we will call `to_data_independent_dist`,
        # which removes the data cross-covariance terms from the distribution.
        preds = model.likelihood(model(test_x)).to_data_independent_dist()

    return preds.mean.mean(0), preds.variance.mean(0)
The predict method in the Deep GP model from the GPyTorch tutorial on Deep GPs does not work correctly. In particular, .to_data_independent_dist() seems to do something wrong; I suspect the problem lies in how the output is reshaped.
With the following predict method, the uncertainties come out correct:
def predict(self, test_x):
    with torch.no_grad():
        preds = model.likelihood(model(test_x))
        preds_mean = preds.mean.mean(axis=0)
        preds_var = preds.covariance_matrix.mean(axis=0).diag().reshape(num_tasks, num_tasks * self.train_x_shape[0]).T
    return preds_mean, preds_var
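
One way to narrow this down might be to exercise to_data_independent_dist() on a small standalone MultitaskMultivariateNormal and compare its marginal variances against the .variance of the full multitask distribution. The sketch below is only illustrative and not part of the tutorial; n and num_tasks are made-up values.

import torch
from gpytorch.distributions import MultitaskMultivariateNormal

# Illustrative sizes, not the tutorial's.
n, num_tasks = 5, 3

# Build a random positive-definite joint covariance over n * num_tasks outputs.
root = torch.randn(n * num_tasks, n * num_tasks)
covar = root @ root.T + 1e-3 * torch.eye(n * num_tasks)
mean = torch.randn(n, num_tasks)

mtmvn = MultitaskMultivariateNormal(mean, covar)

# Marginal variances taken directly from the multitask MVN: shape (n, num_tasks).
var_reference = mtmvn.variance

# Marginal variances after dropping the inter-data covariances.
var_independent = mtmvn.to_data_independent_dist().variance

print(var_reference.shape, var_independent.shape)  # both should be (n, num_tasks)
print(torch.allclose(var_reference, var_independent, atol=1e-4))

If the two disagree in shape or values, that would point at the layout assumed inside to_data_independent_dist(), or at how the deep GP's sample/batch dimensions interact with it. It may also be worth comparing preds.variance (taken straight from the multitask MVN) against the manual diag/reshape above.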
Originally posted by @fweberling in #1862 (reply in thread)