Unhandled Promise rejection

I am trying to insert a lot of data into my database, but I am getting this error: RangeError: Maximum call stack size exceeded

Can you please help?


Thanks, I am not getting this error anymore. I am using try/catch before sending the data to the database. But I am unable to save big data in my database, only small files.

Can you post your code?

const fs = require('fs');

let line;
let resultFileConverted = [];
let headers;

const readStream = fs.createReadStream(filePath, 'utf-8');

readStream.on('data', async function (chunk) {
  line = chunk.split(/\r?\n/);
  headers = line[0].split(';');

  for (let i = 1; i < line.length; i++) {
    let obj = {};
    let currentline = line[i].split(';');

    for (let j = 0; j < headers.length; j++) {
      obj[headers[j]] = currentline[j];
    }

    resultFileConverted.push(obj);
  }

  try {
    console.log(resultFileConverted[0]);
    // pass the parsed rows to createMany and reset the buffer
    await StagingInput.createMany(resultFileConverted);
    resultFileConverted = [];
  } catch (error) {
    console.error(error);
  }
});

readStream.on('error', (error) => {
  console.error(error);
});

readStream.on('end', async (e) => {
  // nothing left to insert here yet
});

First of all, I think you do not need to make that function async, because that is done implicitly.

If I take it out, it will return that RangeError: Maximum call stack size exceeded again.


The problem is that readStream.on('data', async function () {}) will keep calling StagingInput.createMany() a lot of times, since the stream does not wait for the async handler to finish before emitting the next chunk.

What you can do is make your own Writable stream where you save the data and only call the callback once the data is saved. Then the stream will “wait” for the data to be saved.

Here’s the Node.js documentation:
https://nodejs.org/api/stream.html#stream_simplified_construction

And here’s a code example of how to do it: https://github.com/Keyang/node-csvtojson/issues/135#issuecomment-274561108
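
Roughly, such a Writable could look like this. This is only a minimal sketch, assuming an AdonisJS Lucid model called StagingInput with createMany and the same semicolon-separated file as in your code above:

const fs = require('fs');
const { Writable } = require('stream');

let headers = null;

const dbWriter = new Writable({
  // write() receives one chunk at a time; the next chunk is only pushed
  // after callback() is invoked, so inserts happen one batch after another
  write (chunk, encoding, callback) {
    const lines = chunk.toString('utf-8').split(/\r?\n/);
    let start = 0;

    if (!headers) {
      headers = lines[0].split(';');
      start = 1;
    }

    const rows = [];
    for (let i = start; i < lines.length; i++) {
      if (!lines[i]) continue; // skip empty lines
      const cols = lines[i].split(';');
      const obj = {};
      headers.forEach((h, j) => { obj[h] = cols[j]; });
      rows.push(obj);
    }

    StagingInput.createMany(rows)
      .then(() => callback())   // tell the stream we are ready for more
      .catch(callback);         // forward the error to the stream
  }
});

fs.createReadStream(filePath)
  .pipe(dbWriter)
  .on('error', (error) => console.error(error))
  .on('finish', () => console.log('all rows inserted'));

Note this sketch does not handle a line being split across two chunks; the csvtojson issue linked above shows a more complete version of the same idea.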

Also an interesting and possibly useful read about stream backpressure: https://nodejs.org/es/docs/guides/backpressuring-in-streams/

An easier but less efficient option would be to read the whole file into memory and then insert from there with .createMany(). Depending on your files, that might not work if a file is bigger than the memory you can allocate.
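
For example, a minimal sketch of that approach, again assuming the same StagingInput model and semicolon-separated format:

const fs = require('fs');

async function importWholeFile (filePath) {
  // load the entire file into memory at once
  const text = await fs.promises.readFile(filePath, 'utf-8');
  const lines = text.split(/\r?\n/);
  const headers = lines[0].split(';');

  const rows = [];
  for (let i = 1; i < lines.length; i++) {
    if (!lines[i]) continue; // skip empty lines
    const cols = lines[i].split(';');
    const obj = {};
    headers.forEach((h, j) => { obj[h] = cols[j]; });
    rows.push(obj);
  }

  // one single insert for the whole file
  await StagingInput.createMany(rows);
}

Depending on your database, you might still want to split rows into smaller batches for the insert itself.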

I will be facing the same thing in a few weeks when trying to read 10 GB log files row by row and insert them into the DB. If I remember, I can come back and post my implementation here.

Would be great, thanks a lot.