I'm getting the following error even when I run node with a large heap, using the following command: node --max-old-space-size=8000 manipulateFiles.js
FATAL ERROR: invalid table size Allocation failed - JavaScript heap out of memory
1: 0x8dc510 node::Abort() [node]
2: 0x8dc55c [node]
3: 0xad9b5e v8::Utils::ReportOOMFailure(v8::internal::Isolate*, char const*, bool) [node]
4: 0xad9d94 v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, bool) [node]
5: 0xec7bf2 [node]
6: 0x102d7b2 v8::internal::OrderedHashTable<v8::internal::OrderedHashMap, 2>::Allocate(v8::internal::Isolate*, int, v8::internal::PretenureFlag) [node]
7: 0x1030946 v8::internal::OrderedHashTable<v8::internal::OrderedHashMap, 2>::Rehash(v8::internal::Handle<v8::internal::OrderedHashMap>, int) [node]
8: 0x1030f69 v8::internal::OrderedHashTable<v8::internal::OrderedHashMap, 2>::EnsureGrowable(v8::internal::Handle<v8::internal::OrderedHashMap>) [node]
9: 0x1114c7e v8::internal::Runtime_MapGrow(int, v8::internal::Object**, v8::internal::Isolate*) [node]
10: 0x908bb15be1d
Aborted (core dumped)
The heap memory used at the time of the crash is 1.79 GB, while the available heap is 6.15 GB. I used the v8 module and process to get those numbers.
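For reference, a minimal sketch of how those numbers can be read with those two modules (the exact snippet isn't in the question, so the reporting format here is an assumption):

const v8 = require('v8');

// Heap statistics straight from V8 (values are in bytes).
const stats = v8.getHeapStatistics();
console.log('heap used: ', (stats.used_heap_size / 1024 ** 3).toFixed(2), 'GB');
console.log('heap limit:', (stats.heap_size_limit / 1024 ** 3).toFixed(2), 'GB');

// process.memoryUsage() reports heapUsed/heapTotal and RSS as well.
console.log(process.memoryUsage());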
So apparently something other than the heap size is causing the issue. The module basically scans a large CSV file and builds a Map for later reference in the process. This Map can have up to 30 million keys. The module worked fine on smaller files (10 million keys), but it keeps failing with this error even though the available heap is far larger than what is being used.
What else could cause a problem like this?
asked Apr 2, 2019 at 1:47 by tito.300; edited Apr 2, 2019 at 7:24 by Bergi

Comments:
- Node version 10.15 and the OS is Linux Mint 19.1 Cinnamon. – tito.300, Apr 2, 2019 at 2:04
- Can you provide a snippet to reproduce this? My memory snippets are working fine using both --max-old-space-size=8000 and --max_old_space_size=8000. – Marcos Casagrande, Apr 2, 2019 at 2:05
- I tried that option too; it did not work. I will try to provide a repro when I get to the computer, probably tomorrow or late tonight. Thanks! – tito.300, Apr 2, 2019 at 2:35
- Okay, this is a sample code link. To work on the same data set you can download the file from this link; it's the us_class file under 2017 CSV. It's a CSV file, not JSON, my bad. Thanks! – tito.300, Apr 2, 2019 at 5:25
- Maybe you'll need to bite the bullet and just write this in C++ if you want more memory-efficient processing of large files... Unless you have a way to properly pipeline your processing in small chunks at a time, JavaScript has never been considered a memory-efficient language. – Patrick Roberts, Apr 2, 2019 at 5:31
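To illustrate the chunked approach from the last comment: a sketch that streams the CSV line by line with Node's built-in readline module instead of reading the whole file at once (the file name and the assumption that the key sits in the first column are mine, not from the question). Streaming lowers peak memory, but as the answer below explains, it cannot help with the per-Map size limit:

const fs = require('fs');
const readline = require('readline');

function buildMap(path, onDone) {
  const map = new Map();
  const rl = readline.createInterface({
    input: fs.createReadStream(path),
    crlfDelay: Infinity, // treat \r\n as a single line break
  });
  // Each CSV line is handled as it arrives, so the raw file
  // never has to fit in memory all at once.
  rl.on('line', (line) => {
    const [key, value] = line.split(','); // assumes the key is in the first column
    map.set(key, value);
  });
  rl.on('close', () => onDone(map));
}

buildMap('us_class.csv', (map) => console.log('entries:', map.size));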
1 Answer

V8 developer here. Despite the function names you see in the stack trace, this isn't about being out of memory overall, but about running into the maximum size of a single object (the Map in this case). See this answer explaining the details: https://stackoverflow.com/a/54466812/6036428
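Since the cap applies per Map object (in V8 at the time, a single Map topped out at around 2^24, roughly 16.7 million entries, which fits the pattern here: 10 million keys worked, 30 million did not), one practical workaround is to shard the entries across several Maps so that no single one crosses the limit. This sketch is not part of the original answer; the class name and hash function are illustrative:

class ShardedMap {
  constructor(shardCount = 4) {
    this.shards = Array.from({ length: shardCount }, () => new Map());
  }
  _shard(key) {
    // Simple deterministic string hash to pick a shard; any stable hash works.
    const s = String(key);
    let h = 0;
    for (let i = 0; i < s.length; i++) h = (h * 31 + s.charCodeAt(i)) | 0;
    return this.shards[Math.abs(h) % this.shards.length];
  }
  set(key, value) { this._shard(key).set(key, value); return this; }
  get(key) { return this._shard(key).get(key); }
  has(key) { return this._shard(key).has(key); }
  get size() { return this.shards.reduce((n, m) => n + m.size, 0); }
}

With 30 million keys and 4 shards, each underlying Map holds about 7.5 million entries, comfortably below the limit the crash ran into.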