Streams

Node.js streams offer a powerful abstraction for managing data flow in your applications.

Streams process data in chunks, significantly reducing memory usage. All streams in Node.js inherit from the EventEmitter class, allowing them to emit events at various stages of data processing.

These streams can be readable, writable, or both, providing flexibility for different data-handling scenarios.

Why use Streams

  1. Memory Efficiency: Streams process data incrementally, consuming and processing data in chunks rather than loading the entire dataset into memory.

  2. Improved Response Time: Streams allow for immediate data processing. When a chunk of data arrives, it can be processed without waiting for the entire payload or dataset to be received. This reduces latency and improves your application's overall responsiveness.

  3. Scalability for Real-Time Processing: By handling data in chunks, Node.js streams can efficiently handle large amounts of data with limited resources. This scalability makes streams ideal for applications that process high volumes of data in real time.

Readable streams

Readable is the class that we use to sequentially read a source of data.

.on('data')

Triggered whenever data is available from the stream.

.on('end')

Emitted when there is no more data to read from the stream. (This event is only fired when all the data from the stream has been consumed)

.on('readable')

Triggered when there is data available to read from the stream or when the end of the stream has been reached.

.on('close')

Emitted when the stream and its underlying resources have been closed and indicates that no more events will be emitted.

.on('error')

Can be emitted at any point, signaling that there was an error processing.

Writable streams

Writable streams are useful for creating files, uploading data, or any task that involves sequentially outputting data.

.write()

Used to write a chunk of data to the stream. It handles the data by buffering it up to a defined limit (highWaterMark), and returns a boolean indicating whether more data can be written immediately.

.end()

Signals the end of the data writing process. It signals the stream to complete the write operation and potentially perform any necessary cleanup.

The .pipe() method connects a readable stream to a writable (or transform) stream, forwarding data from one to the other.

To avoid the pitfalls and low-level complexity of the .pipe() method, in most cases, it is recommended to use the pipeline() method.

This method is a safer and more robust way to pipe streams together, handling errors and cleanup automatically.

Async iterators are recommended as the standard way of interfacing with the Streams API.

In Node.js, all readable streams are asynchronous iterables. This means you can use the for await...of syntax to loop through the stream's data as it becomes available, handling each piece of data with the efficiency and simplicity of asynchronous code.
