Node.js - Allocation failed - process out of memory even with use of JSONStream parser


I'm trying to read and parse about 12 large JSON files (from 100 MB to 500 MB each) in Node.js, and I get: Fatal error: CALL_AND_RETRY_LAST Allocation failed - process out of memory

However, I am still getting this error even though I'm using a streaming file reader. This is the first time I have ever seen it while streaming, so I'm not sure what the problem might be. This is the code I have right now:

    for (var i = 0; i < jsonFileArray.length; i++) {
      if (jsonFileArray[i].match(/\.json$/)) {
        var stream = fs.createReadStream(dirPath + jsonFileArray[i])
          .pipe(JSONStream.parse(['Records', true, 'categories']));
        stream.on('data', function (data) {
          console.log('received:', data);
        });
      }
    }

Eventually I want to create a cumulative result object with all the parsed data from these files, but everything I have tried so far fails before all the files can be parsed. Any guidance?

You can try the "memory-leak" branch of the JSONStream repository. That branch fixes the memory leak, although with it you cannot use the ".." operator.
