
Node.js Streams with TypeScript


Node.js is renowned for its ability to handle I/O operations efficiently, and at the heart of this capability lies the concept of streams. Streams let you process data piece by piece, rather than loading everything into memory at once, which makes them perfect for handling large files, network requests, or real-time data. When you pair streams with TypeScript's strong typing, you get a powerful combo: performance meets safety.

In this guide, we'll dive deep into Node.js streams, explore their types, and walk through practical examples using TypeScript. Whether you're a Node.js novice or a TypeScript enthusiast looking to level up, this post has you covered.

Why Streams Matter

Picture this: you're tasked with processing a 50GB log file. Loading it fully into memory would exhaust your server's resources, leading to crashes or slow performance. Streams solve this by letting you handle data as it flows, like sipping from a straw instead of chugging a gallon jug.

This efficiency is why streams are a cornerstone of Node.js, powering everything from file operations to HTTP servers. TypeScript enhances this by adding type definitions, catching errors at compile time, and improving code readability. Let's dive into the fundamentals and see how this synergy works in practice.

The Four Types of Streams

Node.js offers four fundamental stream types, each with a specific purpose:

  1. Readable Streams: Data sources you can read from (e.g., files, HTTP responses).
  2. Writable Streams: Destinations you can write to (e.g., files, HTTP requests).
  3. Duplex Streams: Both readable and writable (e.g., TCP sockets).
  4. Transform Streams: A special duplex stream that modifies data as it passes through (e.g., compression).

TypeScript enhances this by allowing us to define interfaces for the data flowing through them. Let's break them down with examples.

Setting Up Your TypeScript Environment

Before we dive into code, make sure you have Node.js and TypeScript installed.

Create a new project:

mkdir node-streams-typescript
cd node-streams-typescript
npm init -y
npm install typescript @types/node --save-dev
npx tsc --init

Update your tsconfig.json to include:

{
  "compilerOptions": {
    "goal": "ES2020",
    "module": "commonjs",
    "strict": true,
    "outDir": "./dist"
  },
  "embody": ["src/**/*"]
}

Create a src folder and let's start coding!

Example 1: Reading a File with a Readable Stream

Let's read a text file chunk by chunk. First, create a file named data.txt in the root directory of your project with some sample text (e.g., "Hello, streams!").

Now, in src/readStream.ts:

import { createReadStream } from 'fs';
import { Readable } from 'stream';

const readStream: Readable = createReadStream('data.txt', { encoding: 'utf8' });

readStream
  .on('data', (chunk: string) => {
    console.log('Chunk received:', chunk);
  })
  .on('end', () => {
    console.log('Finished reading the file.');
  })
  .on('error', (err: Error) => {
    console.error('Error:', err.message);
  });

Run it with:

npx tsc && node dist/readStream.js

Here, TypeScript lets us type each chunk as a string (possible because we set the utf8 encoding), and the error event handler expects an Error type. This stream reads data.txt in chunks (64KB by default for files) and logs them.
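If you want to control how much data arrives per chunk, you can tune the buffer size with the highWaterMark option; here's a minimal sketch (the 1KB figure is just an illustrative choice):

import { createReadStream } from 'fs';

// For file streams, highWaterMark is specified in bytes
const smallChunkStream = createReadStream('data.txt', {
  encoding: 'utf8',
  highWaterMark: 1024, // read roughly 1KB at a time
});

smallChunkStream.on('data', (chunk: string) => {
  console.log(`Got a chunk of ${chunk.length} characters`);
});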

Example 2: Writing Data with a Writable Stream

Now, let's write data to a new file. In src/writeStream.ts:

import { createWriteStream } from 'fs';
import { Writable } from 'stream';

const writeStream: Writable = createWriteStream('output.txt', { encoding: 'utf8' });

const data: string[] = ['Line 1\n', 'Line 2\n', 'Line 3\n'];

data.forEach((line: string) => {
  writeStream.write(line);
});

writeStream.end(() => {
  console.log('Finished writing to output.txt');
});

writeStream.on('error', (err: Error) => {
  console.error('Error:', err.message);
});

Compile and run:

npx tsc && node dist/writeStream.js

This creates output.txt with three lines. TypeScript ensures each line is a string and provides autocompletion for stream methods.

Example 3: Piping with a Transform Stream

Piping is where streams shine, connecting a readable stream to a writable stream. Let's add a twist with a Transform stream to uppercase our text.

In src/transformStream.ts:

import { createReadStream, createWriteStream } from 'fs';
import { Transform, TransformCallback } from 'stream';


class UppercaseTransform extends Transform {
  _transform(chunk: Buffer, encoding: string, callback: TransformCallback): void {
    const upperChunk = chunk.toString().toUpperCase();
    this.push(upperChunk);
    callback();
  }
}

const readStream = createReadStream('data.txt', { encoding: 'utf8' });
const writeStream = createWriteStream('output_upper.txt');
const transformStream = new UppercaseTransform();

readStream
  .pipe(transformStream)
  .pipe(writeStream)
  .on('finish', () => {
    console.log('Transform complete! Check output_upper.txt');
  })
  .on('error', (err: Error) => {
    console.error('Error:', err.message);
  });

Run it:

npx tsc && node dist/transformStream.js

This reads data.txt, transforms the text to uppercase, and writes it to output_upper.txt.

TypeScript's TransformCallback type ensures our _transform method is implemented correctly.

Example 4: Compressing Files with a Duplex Stream

Let's tackle a more advanced scenario: compressing a file using the built-in zlib module, whose gzip stream is a transform stream (a special kind of duplex). Its type definitions come with the @types/node package, which we installed earlier.

In src/compressStream.ts:

import { createReadStream, createWriteStream } from 'fs';
import { createGzip } from 'zlib';
import { pipeline } from 'stream';

const source = createReadStream('data.txt');
const destination = createWriteStream('data.txt.gz');
const gzip = createGzip();

pipeline(source, gzip, destination, (err: Error | null) => {
  if (err) {
    console.error('Compression failed:', err.message);
    return;
  }
  console.log('File compressed successfully! Check data.txt.gz');
});

Run it:

npx tsc && node dist/compressStream.js

Here, pipeline ensures proper error handling and cleanup. The gzip stream compresses data.txt into data.txt.gz. TypeScript's type inference keeps our code clean and safe.
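If you prefer async/await over callbacks, Node.js also provides a promise-based pipeline in the stream/promises module (available since Node 15). Here's a minimal sketch of the same compression using it:

import { createReadStream, createWriteStream } from 'fs';
import { createGzip } from 'zlib';
import { pipeline } from 'stream/promises';

async function compressFile(): Promise<void> {
  try {
    // The promise resolves once every stream in the chain has finished
    await pipeline(
      createReadStream('data.txt'),
      createGzip(),
      createWriteStream('data.txt.gz')
    );
    console.log('File compressed successfully!');
  } catch (err) {
    console.error('Compression failed:', (err as Error).message);
  }
}

compressFile();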

Example 5: Streaming HTTP Responses

Streams shine in network operations. Let's stream data from an HTTP server using axios, which ships with its own TypeScript definitions. Install it:

npm install axios

In src/httpStream.ts:

import axios from 'axios';
import { createWriteStream } from 'fs';
import { Writable } from 'stream';

async function streamHttpResponse(url: string, outputFile: string): Promise<void> {
  const response = await axios({
    method: 'get',
    url,
    responseType: 'stream',
  });

  const writeStream: Writable = createWriteStream(outputFile);
  response.data.pipe(writeStream);

  return new Promise((resolve, reject) => {
    writeStream.on('finish', () => {
      console.log(`Downloaded to ${outputFile}`);
      resolve();
    });
    writeStream.on('error', (err: Error) => {
      console.error('Download failed:', err.message);
      reject(err);
    });
  });
}

streamHttpResponse('https://example.com', 'example.html').catch(console.error);

Run it:

npx tsc && node dist/httpStream.js

This streams an HTTP response (e.g., a web page) to example.html. TypeScript ensures the url and outputFile parameters are strings, and the Promise typing adds clarity.

We can also use Node.js's built-in Fetch API (available since Node v18) or libraries like node-fetch, which also support streaming responses, although the stream types differ (Web Streams vs. Node.js streams).

Example:

import { Readable } from 'stream';

const response = await fetch('https://example.com');
const writeStream = createWriteStream(outputFile);
// response.body is a Web ReadableStream; the cast bridges the two stream type declarations
Readable.fromWeb(response.body as any).pipe(writeStream);

Example 6: Real-Time Data Processing with a Custom Readable Stream

Let's create a custom readable stream to simulate real-time data, such as sensor readings. In src/customReadable.ts:

import { Readable } from 'stream';

class SensorStream extends Readable {
  private count: number = 0;
  private max: number = 10;

  constructor(options?: any) {
    super(options);
  }

  _read(): void {
    if (this.count < this.max) {
      const data = `Sensor reading ${this.count}: ${Math.random() * 100}\n`;
      this.push(data);
      this.count++;
    } else {
      this.push(null); // signal the end of the stream
    }
  }
}

const sensor = new SensorStream({ encoding: 'utf8' });

sensor
  .on('data', (chunk: string) => {
    console.log('Received:', chunk.trim());
  })
  .on('end', () => {
    console.log('Sensor stream complete.');
  })
  .on('error', (err: Error) => {
    console.error('Error:', err.message);
  });

Run it:

npx tsc && node dist/customReadable.js

This generates 10 random "sensor readings" and streams them. TypeScript's class typing ensures our implementation aligns with the Readable interface.

Example 7: Chaining Multiple Transform Streams

Let's chain transforms to process text in stages: uppercase it, then prepend a timestamp. In src/chainTransform.ts:

import { createReadStream, createWriteStream } from 'fs';
import { Transform, TransformCallback } from 'stream';

class UppercaseTransform extends Transform {
  _transform(chunk: Buffer, encoding: string, callback: TransformCallback): void {
    this.push(chunk.toString().toUpperCase());
    callback();
  }
}

class TimestampTransform extends Transform {
  _transform(chunk: Buffer, encoding: string, callback: TransformCallback): void {
    const timestamp = new Date().toISOString();
    this.push(`[${timestamp}] ${chunk.toString()}`);
    callback();
  }
}

const readStream = createReadStream('data.txt', { encoding: 'utf8' });
const writeStream = createWriteStream('output_chain.txt');
const upper = new UppercaseTransform();
const timestamp = new TimestampTransform();

readStream
  .pipe(upper)
  .pipe(timestamp)
  .pipe(writeStream)
  .on('finish', () => {
    console.log('Chained transform complete! Check output_chain.txt');
  })
  .on('error', (err: Error) => {
    console.error('Error:', err.message);
  });

Run it:

npx tsc && node dist/chainTransform.js

This reads data.txt, uppercases the text, adds a timestamp, and writes the result to output_chain.txt. Chaining transforms showcases streams' modularity.

Best Practices for Streams in TypeScript

  1. Type Your Data: Define interfaces for chunks to catch type errors early (see the sketch after this list).
  2. Handle Errors: Always attach error event listeners to avoid unhandled exceptions.
  3. Use Pipes Wisely: Piping reduces manual event handling and improves readability.
  4. Respect Backpressure: For large files, check the return value of write() (its buffer limit is set by writableHighWaterMark) to avoid overwhelming the destination.
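As a minimal sketch of the first point, here's how an interface can describe the chunks flowing through an object-mode stream (the SensorReading shape is an illustrative assumption):

import { Readable } from 'stream';

// Hypothetical shape for the chunks we expect the stream to carry
interface SensorReading {
  id: number;
  value: number;
}

const readings: SensorReading[] = [
  { id: 1, value: 42.5 },
  { id: 2, value: 17.8 },
];

// Readable.from() produces an object-mode stream from any iterable
const readable = Readable.from(readings);

readable.on('data', (reading: SensorReading) => {
  console.log(`Reading ${reading.id}: ${reading.value}`);
});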

Real-World Use Case: Streaming API Responses

Imagine you're building an API that streams a large dataset. Using Express and streams:

import express from 'express';
import { Readable } from 'stream';

const app = express();

app.get('/stream-data', (req, res) => {
  const data = ['Item 1\n', 'Item 2\n', 'Item 3\n'];
  const stream = Readable.from(data);

  res.setHeader('Content-Type', 'text/plain');
  stream.pipe(res);
});

app.listen(3000, () => {
  console.log('Server running on port 3000');
});

Install dependencies (npm install express @types/express), then run it. Visit http://localhost:3000/stream-data to see the data stream in your browser!

Advanced Tips: Handling Backpressure

When a writable stream can't keep up with a readable stream, backpressure occurs. Node.js handles this automatically with pipes, but you can monitor it manually:

import { createWriteStream } from 'fs';

const writeStream = createWriteStream('large_output.txt');

if (!writeStream.write('data')) {
  console.log('Backpressure detected! Pausing...');
  writeStream.once('drain', () => {
    console.log('Resuming...');
  });
}

This ensures your app stays responsive under heavy load.

Precautions for handling backpressure: when writing large amounts of data, the readable stream may produce data faster than the writable stream can consume it. While pipe and pipeline handle this automatically, if you're writing manually, check whether write() returns false and wait for the 'drain' event before writing more, as shown below.
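Here's a minimal sketch of that manual pattern (the chunk count and file name are illustrative assumptions):

import { createWriteStream } from 'fs';
import { once } from 'events';

async function writeManyChunks(): Promise<void> {
  const writeStream = createWriteStream('large_output.txt');

  for (let i = 0; i < 1_000_000; i++) {
    // write() returns false once the internal buffer is full
    if (!writeStream.write(`chunk ${i}\n`)) {
      await once(writeStream, 'drain'); // wait until it is safe to write again
    }
  }

  writeStream.end();
}

writeManyChunks().catch(console.error);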

Additionally, async iterators (for await...of) are a modern alternative for consuming readable streams, and can often simplify code compared to using .on('data') and .on('end').

Example:

import { createReadStream } from 'fs';
import { Readable } from 'stream';

async function processStream(readable: Readable): Promise<void> {
  for await (const chunk of readable) {
    console.log('Chunk:', chunk);
  }
  console.log('Finished reading.');
}

processStream(createReadStream('data.txt', { encoding: 'utf8' })).catch(console.error);

Additional points:

Ensure resource cleanup: this is especially important in custom stream implementations or when wiring streams together by hand. Explicitly call stream.destroy() in error scenarios or when the stream is no longer needed, to release underlying resources and prevent leaks. stream.pipeline handles this automatically for piped streams.
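For example, here's a minimal sketch of destroying a stream you no longer need (abandoning the read after one second is just an illustrative scenario):

import { createReadStream } from 'fs';

const stream = createReadStream('data.txt');

stream.on('data', (chunk: Buffer) => {
  console.log(`Got ${chunk.length} bytes`);
});

// Suppose we decide we no longer need the rest of the file
setTimeout(() => {
  stream.destroy(); // stops reading and releases the file descriptor
}, 1000);

stream.on('close', () => {
  console.log('Stream closed, resources released.');
});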

Use Readable.from() for convenience: when you need to create a stream from an existing iterable (such as an array) or an async iterable, Readable.from() is often the simplest and most modern approach, requiring less boilerplate than writing a custom Readable class.
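As a minimal sketch, here's the sensor simulation from Example 6 rewritten with an async generator and Readable.from() (the generator is an illustrative stand-in for a real data source):

import { Readable } from 'stream';

// An async generator producing the same kind of fake sensor readings as Example 6
async function* sensorReadings(max: number): AsyncGenerator<string> {
  for (let count = 0; count < max; count++) {
    yield `Sensor reading ${count}: ${Math.random() * 100}\n`;
  }
}

const sensor = Readable.from(sensorReadings(10));

sensor.on('data', (reading: string) => {
  console.log('Received:', reading.trim());
});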

Conclusion

Streams are a game-changer in Node.js, and TypeScript enhances them further by adding type safety and clarity. From reading files to transforming data in real time, mastering streams opens up a world of efficient I/O possibilities. The examples here (reading, writing, transforming, compressing, and streaming over HTTP) only scratch the surface of what's possible.

Experiment with your own pipelines: try streaming logs, processing CSV files, or building a live chat system. The more you explore, the more you'll appreciate the versatility of streams.
