Fix lazy kernel slicing when there are multiple outputs #2376


Merged: 7 commits, Jul 19, 2023
Changes from 2 commits
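For context, the change below is in LazyEvaluatedKernelTensor._getitem, the method that handles indexing into a kernel matrix that has not been evaluated yet. The case being fixed is slicing when the kernel produces multiple outputs per input, e.g. a multitask kernel. A hedged usage sketch of that scenario, assuming a standard GPyTorch MultitaskKernel; the kernel choice, data sizes, and slice bounds here are illustrative and not taken from this PR:

import torch
import gpytorch

# A multitask kernel yields num_tasks outputs per input, so the full
# covariance matrix of 5 inputs with 2 tasks has 10 rows and 10 columns.
kernel = gpytorch.kernels.MultitaskKernel(gpytorch.kernels.RBFKernel(), num_tasks=2)
x = torch.randn(5, 3)

with gpytorch.settings.lazily_evaluate_kernels(True):
    lazy_covar = kernel(x, x)  # a LazyEvaluatedKernelTensor rather than a dense matrix
    # An open-ended slice like this stores None for the missing start/stop,
    # which is exactly the case the defaults added in this PR account for.
    sub = lazy_covar[:, :6]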
Commits
File filter

Filter by extension

Filter by extension

Conversations
Failed to load comments.
Loading
Jump to
Jump to file
Failed to load files.
Loading
Diff view
Diff view
10 changes: 5 additions & 5 deletions gpytorch/lazy/lazy_evaluated_kernel_tensor.py
@@ -161,21 +161,21 @@ def _getitem(self, row_index, col_index, *batch_indices):
         # However - if we have multiple outputs per input, then the indices won't directly
         # correspond to the entries of row/col. We'll have to do a little pre-processing
         if num_outs_per_in_rows != 1 or num_outs_per_in_cols != 1:
-            if not isinstance(x1, slice) or not isinstance(x2, slice):
+            if not isinstance(row_index, slice) or not isinstance(col_index, slice):
                 # It's too complicated to deal with tensor indices in this case - we'll use the super method
                 return self.evaluate_kernel()._getitem(row_index, col_index, *batch_indices)

             # Now we know that x1 and x2 are slices
             # Let's make sure that the slice dimensions perfectly correspond with the number of
             # outputs per input that we have
             row_start, row_end, row_step = (
-                row_index.start,
-                row_index.stop,
+                row_index.start if row_index.start is not None else 0,
+                row_index.stop if row_index.stop is not None else self.shape[-2],
                 row_index.step,
             )
             col_start, col_end, col_step = (
-                col_index.start,
-                col_index.stop,
+                col_index.start if col_index.start is not None else 0,
+                col_index.stop if col_index.stop is not None else self.shape[-1],
                 col_index.step,
             )
             if row_step is not None or col_step is not None:
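The two replaced tuples are the substance of the fix: an open-ended slice such as [:] or [:, :6] stores None for a missing start or stop, and the old code forwarded those None values into the arithmetic that maps row/column positions back to inputs when there are multiple outputs per input. The added defaults substitute 0 and the full dimension size instead. A minimal standalone sketch of the same normalization outside GPyTorch; the helper name normalize_slice is illustrative only:

def normalize_slice(index, dim_size):
    # Replace a missing start with 0 and a missing stop with the full
    # dimension size, mirroring the defaults this PR adds, so the bounds
    # can safely be divided by the number of outputs per input.
    start = index.start if index.start is not None else 0
    stop = index.stop if index.stop is not None else dim_size
    return start, stop, index.step

# For a 10 x 10 multi-output kernel matrix:
assert normalize_slice(slice(None, None), 10) == (0, 10, None)
assert normalize_slice(slice(None, 6), 10) == (0, 6, None)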