Understanding Streams in Node.js

Node.js is known for its asynchronous nature and ships with many modules that we use in our daily code base without ever digging deeper into them. One of those core modules is Streams.

Streams allow us to handle data flow asynchronously. There are two data handling approaches in Node.js.

1) Buffered approach:
In the buffered approach, the receiver can read the data only once the whole payload has been written to the buffer.
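For instance, with fs.readFile the fs module buffers the entire file before handing it to us in a single callback. A minimal sketch (the file name is illustrative):

    const fs = require("fs");

    // Nothing is delivered until the WHOLE file has been read into memory.
    fs.readFile("./data.txt", (err, data) => {
      if (err) throw err;
      console.log(`received all ${data.length} bytes at once`);
    });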

2) Streams approach:
In the streams approach, data arrives in chunks and can also be read in chunks, each chunk being a single part of the overall data.
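With a read stream, the same file arrives piece by piece. A sketch (chunk sizes depend on the stream's highWaterMark, 64 KB by default for file streams):

    const fs = require("fs");

    const stream = fs.createReadStream("./data.txt");

    // Each "data" event delivers one chunk; we never hold the whole file.
    stream.on("data", (chunk) => {
      console.log(`received a chunk of ${chunk.length} bytes`);
    });

    stream.on("end", () => console.log("done"));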

Types of Streams available:

1) Readable: a stream from which data can be read, e.g. fs.createReadStream().
2) Writable: a stream to which data can be written, e.g. fs.createWriteStream().
3) Duplex: a stream that is both readable and writable, e.g. a TCP socket.
4) Transform: a duplex stream that modifies data as it passes through, e.g. zlib.createGzip().

  1. Let us do an experiment by creating a big file
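A sketch of such a script, assuming a file named big.file; the line content and loop count are illustrative, chosen so the output lands around 400 MB:

    const fs = require("fs");

    const file = fs.createWriteStream("./big.file");
    const line = "Lorem ipsum dolor sit amet, consectetur adipiscing elit.\n"; // ~58 bytes

    // ~7 million short lines comes to roughly 400 MB on disk.
    // For brevity we ignore the return value of write() (backpressure).
    for (let i = 0; i < 7e6; i++) {
      file.write(line);
    }

    file.end();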

We have created the file using a Writable Stream. The fs module in Node.js can be used to read from and write to files through a stream interface. Running the script above generates a file that is about ~400 MB.

2. Read the same big file using a Read Stream
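A sketch of serving that file over HTTP both ways (the port and file name are illustrative):

    const fs = require("fs");
    const http = require("http");

    const server = http.createServer((req, res) => {
      // Buffered alternative: reads the whole ~400 MB file into memory first.
      // fs.readFile("./big.file", (err, data) => {
      //   if (err) throw err;
      //   res.end(data);
      // });

      // Streams approach: pipe the file to the response chunk by chunk.
      fs.createReadStream("./big.file").pipe(res);
    });

    server.listen(8000, () => console.log("Listening on http://localhost:8000"));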

Then I connected to the server. Note what happened to the memory consumed: with the buffered fs.readFile version, memory usage climbs to roughly the size of the file (~400 MB) before the response goes out, while with the read stream it stays nearly flat, because the file is served chunk by chunk and never held in memory all at once.

Optimised Solution for data transformation
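For transforming data on the fly, Node.js provides Transform streams, duplex streams whose output is computed from their input. A minimal sketch (the upper-casing transform is illustrative):

    const { Transform } = require("stream");

    // A Transform stream that upper-cases every chunk passing through it.
    const upperCase = new Transform({
      transform(chunk, encoding, callback) {
        callback(null, chunk.toString().toUpperCase());
      },
    });

    // Example usage: pipe stdin through the transform to stdout.
    process.stdin.pipe(upperCase).pipe(process.stdout);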

Time Efficiency:
A great behaviour of streams is piping: you can pipe two streams together so that the output of one stream becomes the input of the other. A "data" chunk arrives at stream 1, which is piped to stream 2, which can in turn be piped to further streams.

With Pipes:
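A sketch of such a pipeline, reusing the big.file from earlier: each chunk flows from the read stream, through a gzip Transform stream, into a write stream.

    const fs = require("fs");
    const zlib = require("zlib");

    // read stream -> gzip transform -> write stream
    fs.createReadStream("./big.file")
      .pipe(zlib.createGzip())
      .pipe(fs.createWriteStream("./big.file.gz"));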

This is how we can overlap the multiple stages a data chunk might go through; this strategy is called pipelining. Node.js allows us to pipeline our tasks with the help of streams.

Hence Node.js works on a single thread, but this doesn't mean we can't do two tasks or processes at a time. This can be done via child processes in Node.js. Read my next article regarding child processes in Node.js.
