Replies: 4 comments 1 reply
-
I'm experiencing the same issue when using Storage::disk('s3')->get('path/to/file.ext'). A workaround is to set ini_set('memory_limit', '2048M'); above that code, but this is not memory efficient. Any thoughts? Something like Storage::putFileAs(...) but for the get() side could help — see the sketch below.
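For reference, a minimal sketch of the stream-based counterpart to get(): readStream() hands back a PHP resource, so the file is copied in chunks rather than buffered in memory. The disk names and paths are placeholders.

```php
<?php

use Illuminate\Support\Facades\Storage;

// Instead of Storage::disk('s3')->get(...), which loads the whole
// file into memory, open a read stream and hand it to writeStream():
$stream = Storage::disk('s3')->readStream('path/to/file.ext');

// Copies the file chunk by chunk; memory usage stays flat regardless
// of file size. 'local' and the target path are placeholders.
Storage::disk('local')->writeStream('downloads/file.ext', $stream);

// Some adapters close the stream themselves, so guard before closing.
if (is_resource($stream)) {
    fclose($stream);
}
```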
-
Consider streaming your download instead of loading the entire file into memory.
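If the goal is to send the file on to the browser, a minimal sketch of streaming it straight from the disk without buffering it in PHP (route and paths are placeholders), e.g. in routes/web.php:

```php
<?php

use Illuminate\Support\Facades\Route;
use Illuminate\Support\Facades\Storage;

Route::get('/download', function () {
    // Returns a StreamedResponse: the file is piped from S3 to the
    // client in chunks instead of being read into memory first.
    return Storage::disk('s3')->download('path/to/file.ext', 'file.ext');
});
```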
-
I do not think this is framework related, but you should investigate what the reason for the high memory usage is. If something is not being released properly, an alternative is to schedule a periodic restart of your workers (see the sketch below). If you are working with large files, you should consider streaming the downloads, as @georgeboot suggested.
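A minimal sketch of that, assuming the standard setup where a process monitor such as Supervisor keeps the workers alive (the hourly cadence is just an example):

```php
<?php

namespace App\Console;

use Illuminate\Console\Scheduling\Schedule;
use Illuminate\Foundation\Console\Kernel as ConsoleKernel;

class Kernel extends ConsoleKernel
{
    protected function schedule(Schedule $schedule)
    {
        // Signals all queue:work processes to exit gracefully after
        // finishing their current job; Supervisor then boots fresh
        // workers with a clean memory footprint.
        $schedule->command('queue:restart')->hourly();
    }
}
```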
-
You could try issuing gc_collect_cycles(), and if that does the trick, configure the garbage collector accordingly.
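A minimal sketch of forcing a collection inside a long-running loop; $items and processItem() are hypothetical stand-ins for the actual job work:

```php
<?php

foreach ($items as $item) {
    processItem($item); // placeholder for the real work

    // Force collection of circular references that plain refcounting
    // can't free; returns the number of cycles collected. Only worth
    // keeping if memory usage actually drops afterwards.
    $freed = gc_collect_cycles();

    logger()->debug("GC freed {$freed} cycles", [
        'memory' => memory_get_usage(true),
    ]);
}
```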
-
I'm using Laravel 7.4.0 with PHP 7.4.10.
I'm currently working on a project that dispatches about 10K jobs daily using Laravel's database queue driver. The problem is that the worker consumes about 2 GB of our server's RAM, and the error below occurs frequently.
I'm using the Laravel HTTP client, and it throws that error.
Any help here will be appreciated.