
Node.js Streams

What are Streams?

In Node.js, streams are collections of data that might not be available all at once and don't have to fit in memory. Think of them as conveyor belts that move data from one place to another, letting you work with each piece as it arrives rather than waiting for the whole dataset. Streams are one of Node.js's most powerful features and are used extensively in:


File system operations (reading/writing files)

HTTP requests and responses

Data compression and decompression

Database operations

Real-time data processing

Getting Started with Streams

Streams are one of the fundamental concepts in Node.js for handling data efficiently. They allow you to process data in chunks as it becomes available, rather than loading everything into memory at once.

Basic Stream Example

const fs = require('fs');

// Create a readable stream from a file
const readableStream = fs.createReadStream('input.txt', 'utf8');

// Create a writable stream to a file
const writableStream = fs.createWriteStream('output.txt');

// Pipe the data from the readable to the writable stream
readableStream.pipe(writableStream);

// Handle completion and errors
writableStream.on('finish', () => {
  console.log('File copy completed!');
});

readableStream.on('error', (err) => {
  console.error('Error reading file:', err);
});

writableStream.on('error', (err) => {
  console.error('Error writing file:', err);
});

Why Use Streams?

There are several advantages to using streams:

Memory Efficiency:

Process large files without loading them entirely into memory

Time Efficiency:

Start processing data as soon as you have it, instead of waiting for all the data

Composability:

Build powerful data pipelines by connecting streams

Better User Experience:

Deliver data to users as it becomes available (e.g., video streaming)

Imagine reading a 1GB file on a server with 512MB of RAM:

Without streams:

You'd crash the process attempting to load the entire file into memory

With streams:

You process the file in small chunks (e.g., 64KB at a time)

Core Stream Types

Node.js provides four fundamental types of streams, each serving a specific purpose in data handling:

| Stream Type | Description | Common Examples |
| --- | --- | --- |
| Readable | Streams from which data can be read (data source) | fs.createReadStream(), HTTP responses, process.stdin |
| Writable | Streams to which data can be written (data destination) | fs.createWriteStream(), HTTP requests, process.stdout |
| Duplex | Streams that are both Readable and Writable | TCP sockets, Zlib streams |
| Transform | Duplex streams that can modify or transform data as it's written and read | Zlib streams, crypto streams |
