
Bug: Exploit Structure in get_fantasy_strategy #2494


Merged

Conversation


@naefjo naefjo commented Mar 11, 2024

Hello :)

This PR is related to #2468 and cornellius-gp/linear_operator#93.

The DefaultPredictionStrategy's get_fantasy_model updates the Gram matrix with the new data points and updates the root_decomposition and root_inv_decomposition caches of lik_train_train_covar by passing them to the constructor. However, because to_dense is used in lines 214-215, the caches in __init__ on lines 69 and 72 are constructed from root and inv_root of type torch.Tensor; RootLinearOperator.__init__ then assigns a DenseLinearOperator to self.root, since to_linear_operator defaults to DenseLinearOperator when given a torch.Tensor.

As a result, the object E in LinearOperator.cat_rows is of type DenseLinearOperator, which in turn fails the check for triangular matrices here. This once again leads to a stable_pinverse with a QR decomposition instead of exploiting a fast triangular solve to compute the inverse.
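A minimal sketch of the wrapping behaviour described above (assuming the linear_operator package is installed; the small matrix and variable names are illustrative, not the actual code paths in gpytorch):

```python
import torch
from linear_operator.operators import RootLinearOperator, TriangularLinearOperator

# Build a small SPD matrix and its lower-triangular Cholesky factor.
A = torch.randn(4, 4)
K = A @ A.mT + 4 * torch.eye(4)
L = torch.linalg.cholesky(K)

# Passing a plain torch.Tensor (as happens after to_dense): the root is wrapped
# as a DenseLinearOperator, so the triangular structure is no longer visible to
# downstream code such as LinearOperator.cat_rows.
dense_cache = RootLinearOperator(L)
print(type(dense_cache.root).__name__)       # DenseLinearOperator

# Passing the structured operator instead preserves the triangular type, which
# allows cat_rows to use a cheap triangular solve rather than a QR-based
# stable_pinverse when the fantasy caches are updated.
structured_cache = RootLinearOperator(TriangularLinearOperator(L))
print(type(structured_cache.root).__name__)  # TriangularLinearOperator
```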

@fteufel fteufel commented Mar 13, 2024

Thanks for finding this! I'd been confused for a few weeks about why get_fantasy_model wasn't speeding things up compared to just recomputing the caches, but couldn't figure it out.

Can confirm this makes things faster.

naefjo added 2 commits March 19, 2024 11:52
DenseLinearOperator expects a torch.Tensor, so we convert the linear operator to dense.
Directly construct a BatchRepeatLinearOperator from new_root instead of converting it to a dense operator first.

@jacobrgardner jacobrgardner left a comment


Oof, thanks for catching this!

@jacobrgardner jacobrgardner enabled auto-merge June 20, 2024 20:52
@jacobrgardner jacobrgardner merged commit 2e7959d into cornellius-gp:master Jun 20, 2024
7 checks passed