Running the causal impact algorithm appears to leak memory.
Reproduction code:
```python
import gc

import pandas as pd
from causalimpact import CausalImpact

data = pd.read_csv('https://raw.githubusercontent.com/WillianFuks/tfcausalimpact/master/tests/fixtures/arma_data.csv')[['y', 'X']]
data.iloc[70:, 0] += 5
pre_period = [0, 69]
post_period = [70, 99]

def run_causal_impact():
    ci = CausalImpact(data, pre_period, post_period)
    print(ci.summary())
    print(ci.summary(output='report'))

for _ in range(10):
    run_causal_impact()
    gc.collect()
```
Dependencies:
```toml
[tool.poetry.dependencies]
python = "=3.11.4"
pandas = "=2.2"
tensorflow = "=2.16.1"
tfcausalimpact = "=0.0.15"
pyarrow = "=15.0.2"
```
I run the program while watching Activity Monitor, and each iteration leaks roughly 80 MB.
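For a more precise measurement than the OS activity monitor, one option (not part of the original report) is to log the process's resident set size after each iteration with `psutil`. This is only a measurement sketch; it assumes `psutil` is installed and reuses `run_causal_impact` from the reproduction code above.

```python
import gc
import os

import psutil

proc = psutil.Process(os.getpid())

for i in range(10):
    run_causal_impact()  # defined in the reproduction code above
    gc.collect()
    rss_mb = proc.memory_info().rss / 1024 ** 2
    print(f"iteration {i}: RSS = {rss_mb:.0f} MB")
```

If the leak described here is present, the printed RSS should grow by roughly the same amount on every iteration instead of plateauing.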
Can confirm this issue: running multiple causal impact analyses in a single script can lead to out-of-memory errors.
I've resorted to using multiprocessing and running each CausalImpact fit in a separate process. This ensures the memory is freed when the child process exits, so no more kernel crashes; a rough sketch of the workaround is below.
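A minimal sketch of that workaround, assuming the same data and periods as the reproduction code above; the helper names `_fit_and_summarize` and `run_causal_impact_isolated` are hypothetical, not part of the library.

```python
import multiprocessing as mp

import pandas as pd
from causalimpact import CausalImpact

def _fit_and_summarize(data, pre_period, post_period, queue):
    # Runs entirely inside the child process; all TensorFlow/CausalImpact
    # allocations are released to the OS when the process exits.
    ci = CausalImpact(data, pre_period, post_period)
    queue.put(ci.summary())

def run_causal_impact_isolated(data, pre_period, post_period):
    # Spawn a fresh process per fit so any leaked memory cannot
    # accumulate in the parent interpreter.
    queue = mp.Queue()
    proc = mp.Process(target=_fit_and_summarize,
                      args=(data, pre_period, post_period, queue))
    proc.start()
    summary = queue.get()  # read before join to avoid blocking on a full queue
    proc.join()
    return summary

if __name__ == '__main__':
    data = pd.read_csv('https://raw.githubusercontent.com/WillianFuks/tfcausalimpact/master/tests/fixtures/arma_data.csv')[['y', 'X']]
    data.iloc[70:, 0] += 5
    for _ in range(10):
        print(run_causal_impact_isolated(data, [0, 69], [70, 99]))
```

The trade-off is the per-run cost of spawning a process and re-importing TensorFlow in each child, but the parent process's memory stays flat across iterations.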