This content originally appeared on DEV Community and was authored by Rohit Singh
When working with large files (videos, images, logs, CSVs) or real-time data, loading everything into memory is inefficient. That's where streams in Node.js come in.
Instead of reading/writing the whole file at once, streams break the data into small chunks. This makes Node.js apps faster, memory-efficient, and scalable.
🔹 What Are Streams in Node.js?
Streams are built-in objects in Node.js that allow reading/writing data piece by piece.
There are four main types of streams:
Readable → a stream you can read from (e.g., fs.createReadStream)
Writable → a stream you can write to (e.g., fs.createWriteStream)
Duplex → both readable and writable (e.g., TCP sockets)
Transform → a stream that modifies data as it passes through (e.g., compressing files; a minimal sketch follows this list)
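To make the Transform type concrete, here is a minimal sketch of a custom Transform stream that uppercases whatever passes through it (the variable name upperCaseTransform is just illustrative):

const { Transform } = require("stream");

// A minimal Transform stream: receives chunks, modifies them, passes them on
const upperCaseTransform = new Transform({
  transform(chunk, encoding, callback) {
    // Convert each chunk to uppercase and push it downstream
    callback(null, chunk.toString().toUpperCase());
  }
});

// Anything piped through it comes out uppercased
process.stdin.pipe(upperCaseTransform).pipe(process.stdout);

Run it and type into the terminal: every line you enter comes back uppercased, chunk by chunk.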
🔹 Why Use Streams?
✅ Handle large files without crashing the server
✅ Improve performance & scalability
✅ Enable real-time processing (chat apps, video streaming, logging)
✅ Lower memory usage
🔹 Example 1: Reading a File Using Streams
Instead of reading an entire file with fs.readFile(), use streams:
const fs = require("fs");

// Create a readable stream
const readableStream = fs.createReadStream("bigfile.txt", {
  encoding: "utf-8",
  highWaterMark: 64 * 1024 // read in 64 KB chunks
});

// Listen to data events
readableStream.on("data", (chunk) => {
  console.log("Received chunk:", chunk.length);
});

readableStream.on("end", () => {
  console.log("File reading completed!");
});

// Always handle errors, e.g., when the file doesn't exist
readableStream.on("error", (err) => {
  console.error("Error reading file:", err);
});
👉 Here, the file is read in 64 KB chunks, not all at once.
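In modern Node.js versions, readable streams are also async iterable, so the same file can be consumed with a for await...of loop instead of event listeners. A minimal sketch of the equivalent logic:

const fs = require("fs");

async function readFileInChunks() {
  const stream = fs.createReadStream("bigfile.txt", {
    encoding: "utf-8",
    highWaterMark: 64 * 1024
  });

  // Each iteration yields one chunk; stream errors reject the loop
  // and can be caught with an ordinary try/catch
  for await (const chunk of stream) {
    console.log("Received chunk:", chunk.length);
  }
  console.log("File reading completed!");
}

readFileInChunks().catch(console.error);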
🔹 Example 2: File Copy Using Pipe
Node.js makes it super simple to pipe streams:
const fs = require("fs");

const readable = fs.createReadStream("input.txt");
const writable = fs.createWriteStream("output.txt");

// Pipe data from read → write
readable.pipe(writable);

// 'finish' fires only after all data has been flushed to output.txt
writable.on("finish", () => {
  console.log("File copied successfully using streams!");
});
✅ Efficiently copies large files without loading them fully into memory.
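One caveat worth knowing: .pipe() does not forward errors from the source stream to the destination. The built-in stream.pipeline() helper handles errors and cleanup for you, so a more robust version of the same copy might look like this:

const fs = require("fs");
const { pipeline } = require("stream");

pipeline(
  fs.createReadStream("input.txt"),
  fs.createWriteStream("output.txt"),
  (err) => {
    // The callback fires once: on success, or on the first error from either stream
    if (err) {
      console.error("Copy failed:", err);
    } else {
      console.log("File copied successfully using streams!");
    }
  }
);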
🔹 Example 3: Streaming an HTTP Response
Streams are great for serving large files in HTTP servers:
const http = require("http");
const fs = require("fs");

http.createServer((req, res) => {
  const stream = fs.createReadStream("video.mp4");
  res.writeHead(200, { "Content-Type": "video/mp4" });
  stream.pipe(res);
}).listen(3000, () => {
  console.log("Server running on http://localhost:3000");
});
👉 Instead of loading the full video into memory, the server sends chunks to the client as they're read.
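One thing this sketch glosses over: if video.mp4 is missing, the read stream emits an 'error' event that would crash the process if left unhandled. A slightly hardened variant of the same server (still a sketch, without production concerns like range requests) could look like this:

const http = require("http");
const fs = require("fs");

http.createServer((req, res) => {
  const stream = fs.createReadStream("video.mp4");

  // Fires if the file can't be opened, or if reading fails midway
  stream.on("error", (err) => {
    console.error("Stream error:", err.message);
    if (!res.headersSent) {
      res.writeHead(404, { "Content-Type": "text/plain" });
    }
    res.end("File not found");
  });

  // Only send the 200 once the file has actually been opened
  stream.on("open", () => {
    res.writeHead(200, { "Content-Type": "video/mp4" });
    stream.pipe(res);
  });
}).listen(3000);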
🔹 Example 4: Transform Streams (Compression)
You can also transform data on the fly:
const fs = require("fs");
const zlib = require("zlib");

const readable = fs.createReadStream("input.txt");
const compressed = fs.createWriteStream("input.txt.gz");

// Compress file using gzip
readable.pipe(zlib.createGzip()).pipe(compressed);

// Log only once the compressed file has been fully written
compressed.on("finish", () => {
  console.log("File compressed successfully!");
});
✅ Useful for logs, backups, and optimizing storage.
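Decompression is the same pattern in reverse with zlib.createGunzip(). A minimal sketch, combined with stream.pipeline() for error handling (the output name input-restored.txt is just illustrative):

const fs = require("fs");
const zlib = require("zlib");
const { pipeline } = require("stream");

// Decompress input.txt.gz back into a plain text file
pipeline(
  fs.createReadStream("input.txt.gz"),
  zlib.createGunzip(),
  fs.createWriteStream("input-restored.txt"),
  (err) => {
    if (err) {
      console.error("Decompression failed:", err);
    } else {
      console.log("File decompressed successfully!");
    }
  }
);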
🔹 Real-World Use Cases of Node.js Streams
Video/Audio streaming (Netflix, YouTube-like apps)
Large file uploads/downloads
Log processing in real-time (a small example follows this list)
Chat applications with WebSockets
Data pipelines (ETL jobs)
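As a small illustration of the log-processing use case above, the built-in readline module can wrap a read stream to walk a large log file line by line without loading it into memory (the file name access.log and the "ERROR" filter are assumptions for the example):

const fs = require("fs");
const readline = require("readline");

const rl = readline.createInterface({
  input: fs.createReadStream("access.log"),
  crlfDelay: Infinity // treat \r\n as a single line break
});

let errorCount = 0;

// Each 'line' event delivers one line, streamed in chunks under the hood
rl.on("line", (line) => {
  if (line.includes("ERROR")) errorCount++;
});

rl.on("close", () => {
  console.log(`Found ${errorCount} error lines`);
});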
🎯 Conclusion
Streams are a core strength of Node.js that let you handle large data efficiently without crashing servers. Whether you're copying files, serving videos, or compressing data, streams make your app scalable, memory-friendly, and performant.
If you're building high-performance apps in 2025, mastering Node.js streams is a must! 🚀