"terminate called after throwing an instance of std::bad_alloc" error when filling a bunch of large arrays #27715
Comments
I'd say neither error is unexpected. Your test exhausts available memory. At some point the OS refuses to grant node additional memory. Sometimes that point is reached in C++ land when calling malloc() or new, and that's when you get the `std::bad_alloc` error. Sometimes it happens in JS land, and then you get the FATAL ERROR. edit: or is your bug report more about the size of the heap being out of proportion to the amount of data your script generates?
Good to know! Cheers.
But why would that happen if my machine has 64GB, and node is only using 30GB or 40GB? Here's what
So the machine apparently doesn't want to give up buff/cache to the node process once the free memory runs out?
I also tried the same code on a 192GB machine and got basically identical results:
And here's the
In this case the … Interestingly, in one test on the 64GB machine it got to 30GB and then just stalled: no crash, no error messages, and no further log messages indicating that it was still pushing/shifting. Could there be some sort of memory-fragmentation issue causing all this? That's wild speculation, because I am a mere web developer with very little low-level/hardware experience.
Update: This code runs fine (indefinitely) on Firefox and never consumes more than about 15GB of my 64GB machine's memory. Edit: The …
Also had this issue on a recent test run: https://ci.nodejs.org/job/node-test-commit-smartos/nodes=smartos18-64/25899/console
Here's a very simple script which causes both Chrome and standalone V8 to crash (logs here) at 18GB, no matter how much more memory is actually available:

```js
const arrs = [];
let n = 0;
while (1) {
  arrs.push(new Array(1e3).fill(0));
  if (++n % 1e5 === 0) console.log(n);
}
```

Firefox/SpiderMonkey fills all 64GB of my machine's memory with no problems.
That can probably be explained by the fact that arrays have a hard size limit (512 MB on 32-bit platforms, 1 GB on 64-bit). Size == length * some constant factor, where the factor depends on the underlying representation of the array. It maxes out around 110 million elements in practice, assuming you don't run out of heap space before that.
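A back-of-the-envelope check of that ~110-million-element ceiling, assuming (as the comment above states) a 1 GB cap on 64-bit and roughly 8 bytes per element in a packed numeric array. The true constant factor depends on V8's internal representation, so this is only a rough upper bound:

```js
// Rough bound: how many ~8-byte elements fit under a 1 GB per-array cap.
const capBytes = 1 * 1024 ** 3; // 1 GB limit on 64-bit (per the comment above)
const bytesPerElement = 8;      // assumed: one double / tagged pointer
const maxElements = Math.floor(capBytes / bytesPerElement);
console.log(maxElements);       // 134217728 — same order as the observed ~110 million
```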
Thank you, @bnoordhuis! The above code runs fine if I change … For others hitting this thread via search engine in the future:
Hope V8 can catch up to SpiderMonkey in this regard at some point. It looks like we should have support for huge …
I think this issue has been answered/resolved as much as is possible, so I'll take the liberty of closing it out. Let me know if I should reopen.
This issue doesn't need to be re-opened, since the bug is upstream in V8, not in Node.js specifically, but in case others come across it in the future: the memory problems are back. I don't know what happened, but the following code snippet crashes V8 at only 4GB of RAM usage:

```js
const arrs = [];
let n = 0;
while (1) {
  arrs.push(new Array(1e6).fill(0));
  if (++n % 10 === 0) console.log(arrs.length);
}
```

I said in my last comment (above) that V8 happily fills 100GB of RAM with this snippet, but in trying to resolve a different error (not the …) I found that it no longer does. I have no idea what's going on! Hopefully that crbug report leads somewhere. If anyone knows what's going on here, I'd love to hear from you. I'll report back here if I learn anything.
Just added this answer on Stack Overflow that might be relevant to this thread. The problem might be that you reach the maximum number of memory mappings per process.
Minimal example code:
Started with this command:
Result: After about a minute or two, on a machine with 64GB of memory, the memory usage climbs to a bit over 30GB (according to htop), and then it crashes with this message:

Notes:

1. In Chrome (started with `--js-flags="--max-old-space-size=999999 --max-semi-space-size=999999"`) it crashes in less than a minute when its memory usage reaches ~18GB (quite consistently).
2. If I change the length of `metaArray` to 50, memory usage stays at around 17GB and it runs indefinitely. This is the case for both node.js and Chrome.
3. I've also gotten `FATAL ERROR: invalid table size Allocation failed - JavaScript heap out of memory` in node for other configurations of array sizes and array items (e.g. filling with objects rather than numbers), but that's a separate issue that I still need to look into.

I have cross-posted this to crbug V8, so please just close this issue if it's more relevant there (I figured it might be since it's happening in Chrome too).
Edit: In case anyone wants to see the full error for point number 3, above, here it is:
That error can be produced in node.js and Chrome with the following code:
Note that `metaArray` is only of length 50 here. In this case, memory tops out at somewhere around 18GB in both node.js and Chrome before crashing.

Also, in case it's relevant: the memory usage that htop reports matches that of Chrome's Task Manager.