Multi-step time series forecasting #2278
-
Thank you very much for providing this great tool. Consider the system y(t+1) = 2*y(t) - 1.03*y(t-1) + 0.05*x(t), where t is the time step and x(t) and y(t) are variables; in reality, x(t) might be a force and y(t) might be the humidity. In this toy example the relationship between x(t), y(t-1), y(t), and y(t+1) is known, but in reality the model parameters are unknown to me. First, I fit a model using Gaussian process regression, where y(t+1) is the output and y(t), y(t-1), and x(t) are the inputs. Since I can measure y(t) and x(t) at each time step, I can use these measurements to fit the hyperparameters of the Gaussian process model. I think this step works. Thanks for your help and feedback.
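A minimal sketch of this first step in GPyTorch, assuming the lagged inputs [y(t), y(t-1), x(t)] are stacked into a feature matrix; the random data, RBF kernel choice, learning rate, and iteration count below are illustrative placeholders, not from the original post:

```python
import torch
import gpytorch

# Illustrative training data: each row is [y(t), y(t-1), x(t)], target is y(t+1).
# Replace with real measurements; the sizes here are placeholders.
train_x = torch.randn(100, 3)
train_y = torch.randn(100)

class LaggedGPModel(gpytorch.models.ExactGP):
    def __init__(self, train_x, train_y, likelihood):
        super().__init__(train_x, train_y, likelihood)
        self.mean_module = gpytorch.means.ConstantMean()
        self.covar_module = gpytorch.kernels.ScaleKernel(
            gpytorch.kernels.RBFKernel(ard_num_dims=3)
        )

    def forward(self, x):
        return gpytorch.distributions.MultivariateNormal(
            self.mean_module(x), self.covar_module(x)
        )

likelihood = gpytorch.likelihoods.GaussianLikelihood()
model = LaggedGPModel(train_x, train_y, likelihood)

# Standard exact-GP hyperparameter fitting via the marginal log likelihood.
model.train()
likelihood.train()
optimizer = torch.optim.Adam(model.parameters(), lr=0.1)
mll = gpytorch.mlls.ExactMarginalLogLikelihood(likelihood, model)
for _ in range(100):
    optimizer.zero_grad()
    loss = -mll(model(train_x), train_y)
    loss.backward()
    optimizer.step()
```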
-
Yeah, you can "fantasize" values for y(t+1) by sampling from the model posterior, then condition your model on those fantasized observations (which yields a batched model with each element of the batch corresponding to one sample). You can telescope that out by sampling from that posterior, essentially building a tree of samples. This is implemented in gpytorch in gpytorch/models/exact_gp.py (line 138 at commit f12c56b). See https://proceedings.neurips.cc/paper/2020/hash/d1d5923fc822531bbfd9d87d4760914b-Abstract.html for a discussion of this technique. This is also implemented (using gpytorch under the hood) in https://github.com/pytorch/botorch/blob/main/botorch/acquisition/multi_step_lookahead.py#L49
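A rough sketch of one such fantasization step using `get_fantasy_model`, assuming the fitted `model` and `likelihood` from the sketch above; `n_samples`, the use of the last training row as the current input, and the placeholder x(t+1) are assumptions for illustration:

```python
# One rollout step: sample y(t+1), condition on the samples, predict y(t+2).
model.eval()
likelihood.eval()

n_samples = 8                      # number of fantasy branches (illustrative)
current_x = train_x[-1:].clone()   # current features [y(t), y(t-1), x(t)], shape (1, 3)

with torch.no_grad():
    # Posterior over y(t+1) at the current input, including observation noise.
    posterior = likelihood(model(current_x))
    fantasy_y = posterior.sample(torch.Size([n_samples]))  # shape (n_samples, 1)

# Condition the model on the fantasized observations. This yields a batched
# model with one batch element per posterior sample (the mechanism referenced
# in gpytorch/models/exact_gp.py above).
fantasy_model = model.get_fantasy_model(
    current_x.expand(n_samples, 1, 3), fantasy_y
)

# Build the next input [y(t+1), y(t), x(t+1)]; x(t+1) is a placeholder here,
# assumed known or controlled in practice.
x_next = torch.randn(n_samples, 1, 1)
next_x = torch.cat(
    [
        fantasy_y.unsqueeze(-1),                   # sampled y(t+1)
        current_x[:, :1].expand(n_samples, 1, 1),  # y(t) shifts into the lag slot
        x_next,
    ],
    dim=-1,
)
with torch.no_grad():
    # Batched posterior over y(t+2): one predictive distribution per branch.
    next_posterior = likelihood(fantasy_model(next_x))
```

Sampling from `next_posterior` and conditioning again telescopes this into the tree of samples described above.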
-
Dear @Balandat, thank you very much for your quick and precise answer. This is a very good starting point for the implementation.