What are Streams?

Streams are collections of data that are sent in small chunks, so the data doesn’t have to fit in memory all at once. When reading or writing a large amount of data, this keeps memory usage low, which is extremely useful.

Let me show you two examples of how streams can optimize memory consumption.

writeStream.js

const fs = require('fs');

const file = fs.createWriteStream('file.txt');
for (let i = 0; i <= 2e6; i++) {
  file.write('this is some dummy text');
}
file.end();

readStream.js

const fs = require('fs');
const server = require('http').createServer();

server.on('request', (req, res) => {
  fs.readFile('file.txt', (err, data) => {
    if (err) throw err;
    res.end(data);
  });
});

server.listen(8000);

node readStream.js

Run the above code, then open up Activity Monitor (or your system’s process monitor) and watch the Node process’s memory usage.

When the server starts, memory usage is low, but once the file is read it jumps to a big number.

Why is that?

It’s because fs.readFile loads the entire file into memory, buffering all of it before we write a single byte to the response.

How can we optimize memory usage?

This is where createReadStream comes in. createReadStream gives us a readable stream for any file, no matter how big, and we can pipe it straight into the response object.

readStream.js

const fs = require('fs');
const server = require('http').createServer();

server.on('request', (req, res) => {
  const src = fs.createReadStream('file.txt');
  src.pipe(res);
});

server.listen(8000);

If you run this version, you’ll see that the file is streamed to the response one chunk at a time, so we never buffer the whole file in memory.

Try increasing the file size and you’ll see the difference.