Monday, July 15, 2024

Useful Built-in Node.js APIs – SitePoint

We’ve compiled a list of the most used and useful APIs that are built into the standard Node.js runtime. For each module you’ll find plain English explanations and examples to help you understand.

This guide has been adapted from my course Node.js: Novice to Ninja. Check it out to follow the complete course and build your own multi-user real-time chat application. It also includes quizzes, videos, and code to run your own Docker containers.

When building your first Node.js application, it’s helpful to know what utilities and APIs Node offers out of the box to help with common use cases and development needs.

Useful Node.js APIs

  • Process: Retrieve information on environment variables, arguments, CPU usage, and reporting.
  • OS: Retrieve OS and system-related information that Node is running on: CPUs, operating system version, home directories, etc.
  • Util: A collection of useful and common methods that help with decoding text, type checking, and comparing objects.
  • URL: Easily create and parse URLs.
  • File System API: Interact with the file system to create, read, update, and delete files, directories, and permissions.
  • Events: For emitting and subscribing to events in Node.js. Works similarly to client-side event listeners.
  • Streams: Used to process large amounts of data in smaller, more manageable chunks to avoid memory issues.
  • Worker Threads: Used to separate the execution of functions on separate threads to avoid bottlenecks. Useful for CPU-intensive JavaScript operations.
  • Child Processes: Allows you to run sub-processes that you can monitor and terminate as necessary.
  • Clusters: Allow you to fork any number of identical processes across cores to handle the load more efficiently.

Process

The process object provides information about your Node.js application as well as control methods. Use it to get information like environment variables, and CPU and memory usage. process is available globally: you can use it without import, although the Node.js documentation recommends you explicitly reference it:

import process from 'process';
  • process.argv: returns an array where the first two items are the Node.js executable path and the script name. The item at index 2 is the first argument passed.
  • process.env: returns an object containing environment name/value pairs, such as process.env.NODE_ENV.
  • process.cwd(): returns the current working directory.
  • process.platform: returns a string identifying the operating system: 'aix', 'darwin' (macOS), 'freebsd', 'linux', 'openbsd', 'sunos', or 'win32' (Windows).
  • process.uptime(): returns the number of seconds the Node.js process has been running.
  • process.cpuUsage(): returns the user and system CPU time usage of the current process, such as { user: 12345, system: 9876 }. Pass the object back to the method to get a relative reading.
  • process.memoryUsage(): returns an object describing memory usage in bytes.
  • process.version: returns the Node.js version string, such as 18.0.0.
  • process.report: generates a diagnostic report.
  • process.exit(code): exits the current application. Use an exit code of 0 to indicate success, or an appropriate error code where necessary.
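As a quick illustration, the following sketch (assuming Node.js 14 or above with ES modules) prints several of these values and takes a relative CPU reading:

```javascript
// Print basic process information and take a relative CPU usage reading.
import process from 'process';

console.log(`Node.js version:   ${process.version}`);
console.log(`Platform:          ${process.platform}`);
console.log(`Working directory: ${process.cwd()}`);
console.log(`Uptime:            ${process.uptime()} seconds`);

// pass an earlier reading back to cpuUsage() for a relative measurement
const startUsage = process.cpuUsage();
const relative = process.cpuUsage(startUsage);
console.log(`CPU time since first reading: ${relative.user + relative.system} microseconds`);
```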


OS

The os API is similar to process (see the “Process” section above), but it can also return information about the operating system Node.js is running in. This provides information such as the OS version, CPUs, and uptime.

  • os.cpus(): returns an array of objects with information about each logical CPU core. The “Clusters” section below references os.cpus() to fork the process. On a 16-core CPU, you’d have 16 instances of your Node.js application running to improve performance.
  • os.hostname(): the OS host name.
  • os.version(): a string identifying the OS kernel version.
  • os.homedir(): the full path of the user’s home directory.
  • os.tmpdir(): the full path of the operating system’s default temporary file directory.
  • os.uptime(): the number of seconds the OS has been running.
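A short sketch showing several of these methods together (the output values will vary by machine):

```javascript
// Report basic information about the host operating system.
import os from 'os';

console.log(`Host:      ${os.hostname()}`);
console.log(`Kernel:    ${os.version()}`);
console.log(`Home dir:  ${os.homedir()}`);
console.log(`Temp dir:  ${os.tmpdir()}`);
console.log(`Uptime:    ${os.uptime()} seconds`);
console.log(`CPU cores: ${os.cpus().length}`);
```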


Util

The util module provides an assortment of useful JavaScript methods. One of the most useful is util.promisify(function), which takes an error-first callback-style function and returns a promise-based function. The Util module can also help with common patterns like decoding text, type checking, and inspecting objects.

import util from 'util';

util.types.isDate( new Date() );              // true
util.types.isMap( new Map() );                // true
util.types.isRegExp( /abc/ );                 // true
util.types.isAsyncFunction( async () => {} ); // true
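util.promisify() deserves its own example. The sketch below wraps a hypothetical error-first callback function (greet is not a real Node.js API) so it can be used with await:

```javascript
import util from 'util';

// a hypothetical error-first callback function
function greet(name, callback) {
  setTimeout(() => callback(null, `Hello, ${name}!`), 10);
}

// promisify() returns a function that resolves with the callback's result
const greetAsync = util.promisify(greet);

const message = await greetAsync('world');
console.log(message); // Hello, world!
```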


URL

URL is another global object that lets you safely create, parse, and modify web URLs. It’s really useful for quickly extracting protocols, ports, parameters, and hashes from URLs without resorting to regular expressions. For example (using example.org as a placeholder host):

const myURL = new URL('https://example.org:8000/path/?abc=123#target');

{
  href: 'https://example.org:8000/path/?abc=123#target',
  origin: 'https://example.org:8000',
  protocol: 'https:',
  username: '',
  password: '',
  host: 'example.org:8000',
  hostname: 'example.org',
  port: '8000',
  pathname: '/path/',
  search: '?abc=123',
  searchParams: URLSearchParams { 'abc' => '123' },
  hash: '#target'
}

You can view and change any property. For example:

myURL.port = 8001;
console.log( myURL.href );

You can then use the URLSearchParams API to modify query string values. For example:

myURL.searchParams.append('xyz', 987);
console.log( myURL.search ); // ?abc=123&xyz=987

There are also methods for converting file system paths to URLs and back again.
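For example, the url module’s pathToFileURL() and fileURLToPath() helpers round-trip between the two forms (the path below is illustrative):

```javascript
import { fileURLToPath, pathToFileURL } from 'url';

// convert a POSIX-style path to a file:// URL and back again
const fileUrl = pathToFileURL('/tmp/example.txt');
console.log(fileUrl.href);

const filePath = fileURLToPath(fileUrl);
console.log(filePath);
```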

The dns module provides name resolution so you can look up the IP address, name server, TXT records, and other domain information.
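A minimal sketch using the promise-based dns API (localhost is looked up here, so no network access is required):

```javascript
// Resolve a host name to an IP address with the promise-based dns API.
import { lookup } from 'dns/promises';

const { address, family } = await lookup('localhost');
console.log(`localhost resolves to ${address} (IPv${family})`);
```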

File System API

The fs API can create, read, update, and delete files, directories, and permissions. Recent releases of the Node.js runtime provide promise-based functions in fs/promises, which make it easier to manage asynchronous file operations.

You’ll often use fs in conjunction with path to resolve file names on different operating systems.

The following example module returns information about a file system object using the stat and access methods:

import { constants as fsConstants } from 'fs';
import { access, stat } from 'fs/promises';

export async function getFileInfo(file) {

  const fileInfo = {};

  try {
    const info = await stat(file);
    fileInfo.isFile = info.isFile();
    fileInfo.isDir = info.isDirectory();
  }
  catch (e) {
    return { new: true };
  }

  try {
    await access(file, fsConstants.R_OK);
    fileInfo.canRead = true;
  }
  catch (e) {}

  try {
    await access(file, fsConstants.W_OK);
    fileInfo.canWrite = true;
  }
  catch (e) {}

  return fileInfo;

}

When passed a filename, the function returns an object with information about that file. For example:

{
  isFile: true,
  isDir: false,
  canRead: true,
  canWrite: true
}

The main filecompress.js script uses path.resolve() to resolve input and output filenames passed on the command line into absolute file paths, then fetches information using getFileInfo() above:

#!/usr/bin/env node
import path from 'path';
import { readFile, writeFile } from 'fs/promises';
import { getFileInfo } from './lib/fileinfo.js';

let
  input = path.resolve(process.argv[2] || ''),
  output = path.resolve(process.argv[3] || ''),
  [ inputInfo, outputInfo ] = await Promise.all([ getFileInfo(input), getFileInfo(output) ]),
  error = [];

The code validates the paths and terminates with error messages if necessary:

// use the input file name when the output is a writable directory
if (outputInfo.isDir && outputInfo.canWrite && inputInfo.isFile) {
  output = path.resolve(output, path.basename(input));
}

if (!inputInfo.isFile || !inputInfo.canRead) error.push(`cannot read input file ${ input }`);
if (input === output) error.push('input and output files cannot be the same');

if (error.length) {
  console.log('Usage: ./filecompress.js [input file] [output file|dir]');
  console.error('\n  ' + error.join('\n  '));
  process.exit(1);
}

The entire file is then read into a string named content using readFile():

console.log(`processing ${ input }`);
let content;

try {
  content = await readFile(input, { encoding: 'utf8' });
}
catch (e) {
  console.log(e);
  process.exit(1);
}

let lengthOrig = content.length;
console.log(`file size  ${ lengthOrig }`);

JavaScript regular expressions then remove comments and whitespace:

content = content
  .replace(/\n\s+/g, '\n')                  // trim leading line spaces
  .replace(/\/\/.*?\n/g, '')                // remove // comments
  .replace(/\s+/g, ' ')                     // collapse whitespace runs
  .replace(/\/\*.*?\*\//g, '')              // remove /* comments */
  .replace(/<!--.*?-->/g, '')               // remove <!-- comments -->
  .replace(/\s*([<>(){}}[\]])\s*/g, '$1');  // remove space around brackets

let lengthNew = content.length;

The resulting string is output to a file using writeFile(), and a status message shows the saving:


console.log(`outputting ${ output }`);
console.log(`file size  ${ lengthNew } - saved ${ Math.round((lengthOrig - lengthNew) / lengthOrig * 100) }%`);

try {
  content = await writeFile(output, content);
}
catch (e) {
  console.log(e);
  process.exit(1);
}

Run the project code with an example HTML file:

node filecompress.js ./test/example.html ./test/output.html


Events

You often need to execute multiple functions when something happens. For example, a user registers on your app, so the code must add their details to a database, start a new logged-in session, and send a welcome email. The Events module can help you structure code like this. First, consider implementing it as a series of direct calls:

async function userRegister(name, email, password) {

  try {

    await dbAddUser(name, email, password);
    await new UserSession(email);
    await emailRegister(name, email);

  }
  catch (e) {
    // handle the error
  }

}

This series of function calls is tightly coupled to user registration. Further actions incur further function calls. For example:

try {

  await dbAddUser(name, email, password);
  await new UserSession(email);
  await emailRegister(name, email);

  await crmRegister(name, email);  // register user on CRM system
  await emailSales(name, email);   // notify the sales team

}
catch (e) {
  // handle the error
}


You could have dozens of calls managed in this single, ever-growing code block.

The Node.js Events API provides an alternative way to structure the code using a publish-subscribe pattern. The userRegister() function can emit an event, perhaps named newuser, after the user’s database record is created.

Any number of event handler functions can subscribe and react to newuser events; there’s no need to change the userRegister() function. Each handler runs independently of the others, so they could execute in any order.
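The pattern can be sketched in a few lines (dbAddUser and the handler bodies here are hypothetical placeholders):

```javascript
import EventEmitter from 'events';

const emitter = new EventEmitter();

async function userRegister(name, email) {
  // ...add the user to the database here...
  emitter.emit('newuser', { name, email });
}

// handlers subscribe independently; userRegister() never changes
emitter.on('newuser', user => console.log(`session started for ${user.name}`));
emitter.on('newuser', user => console.log(`welcome email queued for ${user.email}`));

await userRegister('Alice', 'alice@example.com');
```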

Events in Client-side JavaScript

Events and handler functions are frequently used in client-side JavaScript, for example, to run a function when the user clicks an element:

  document.getElementById('myelement').addEventListener('click', e => {
    // do something when the element is clicked
  });

In most situations, you’re attaching handlers for user or browser events, although you can raise your own custom events. Event handling in Node.js is conceptually similar, but the API is different.

Objects that emit events must be instances of the Node.js EventEmitter class. These have an emit() method to raise new events and an on() method for attaching handlers.

The event example project provides a class that triggers a tick event at predefined intervals. The ./lib/ticker.js module exports a default class that extends EventEmitter:

import EventEmitter from 'events';
import { setInterval, clearInterval } from 'timers';

export default class extends EventEmitter {

Its constructor must call the parent constructor. It then passes the delay argument to a start() method:

constructor(delay) {
  super();
  this.start(delay);
}

The start() method checks delay is valid, resets the current timer if necessary, and sets the new delay property:

start(delay) {

  if (!delay || delay == this.delay) return;

  if (this.interval) {
    clearInterval(this.interval);
  }

  this.delay = delay;

It then starts a new interval timer that runs the emit() method with the event name "tick". Subscribers to this event receive an object with the delay value and the number of seconds since the Node.js application started:

  this.interval = setInterval(() => {

    this.emit('tick', {
      delay: this.delay,
      time:  process.uptime()  // seconds since the app started
    });

  }, this.delay);

}
The main event.js entry script imports the module and sets a delay period of one second (1000 milliseconds):

import Ticker from './lib/ticker.js';

const ticker = new Ticker(1000);

It attaches handler functions triggered every time a tick event occurs:

ticker.on('tick', e => {
  console.log('handler 1 tick!', e);
});

ticker.on('tick', e => {
  console.log('handler 2 tick!', e);
});

A third handler triggers on the first tick event only, using the once() method:

ticker.once('tick', e => {
  console.log('handler 3 tick!', e);
});
Finally, the current number of listeners is output:

console.log(`listeners: ${ ticker.listenerCount('tick') }`);

Run the project code with node event.js.

The output shows handler 3 triggering once, while handlers 1 and 2 run on every tick until the app is terminated.


Streams

The file system example code above (in the “File System API” section) reads an entire file into memory before outputting the minified result. What if the file were larger than the available RAM? The Node.js application would fail with an “out of memory” error.

The solution is streaming. This processes incoming data in smaller, more manageable chunks. A stream can be:

  • readable: from a file, an HTTP request, a TCP socket, stdin, etc.
  • writable: to a file, an HTTP response, a TCP socket, stdout, etc.
  • duplex: a stream that’s both readable and writable
  • transform: a duplex stream that transforms data

Each chunk of data is returned as a Buffer object, which represents a fixed-length sequence of bytes. You may need to convert this to a string or another appropriate type for processing.

The example code has a filestream project which uses a transform stream to handle the file size problem in the filecompress project. As before, it accepts and validates input and output filenames before declaring a Compress class, which extends Transform:

import { createReadStream, createWriteStream } from 'fs';
import { Transform } from 'stream';

class Compress extends Transform {

  constructor(opts) {
    super(opts);
    this.chunks = 0;
    this.lengthOrig = 0;
    this.lengthNew = 0;
  }

  _transform(chunk, encoding, callback) {

    const
      data = chunk.toString(),
      content = data
        .replace(/\n\s+/g, '\n')                  // trim leading line spaces
        .replace(/\/\/.*?\n/g, '')                // remove // comments
        .replace(/\s+/g, ' ')                     // collapse whitespace runs
        .replace(/\/\*.*?\*\//g, '')              // remove /* comments */
        .replace(/<!--.*?-->/g, '')               // remove <!-- comments -->
        .replace(/\s*([<>(){}}[\]])\s*/g, '$1');  // remove space around brackets

    this.chunks++;
    this.lengthOrig += data.length;
    this.lengthNew += content.length;

    this.push( content );
    callback();

  }

}

The _transform method is called when a new chunk of data is ready. It’s received as a Buffer object that’s converted to a string, minified, and output using the push() method. A callback() function is called once chunk processing is complete.

The application initiates file read and write streams and instantiates a new Compress object:

const
  readStream = createReadStream(input),
  writeStream = createWriteStream(output),
  compress = new Compress();

console.log(`processing ${ input }`);

The incoming file read stream has .pipe() methods defined, which feed the incoming data through a series of functions that may (or may not) alter the contents. The data is piped through the compress transform before that output is piped to the writable file. A final on('finish') event handler function executes once the stream has ended:

readStream.pipe(compress).pipe(writeStream).on('finish', () => {

  console.log(`file size  ${ compress.lengthOrig }`);
  console.log(`output     ${ output }`);
  console.log(`chunks     ${ compress.chunks }`);
  console.log(`file size  ${ compress.lengthNew } - saved ${ Math.round((compress.lengthOrig - compress.lengthNew) / compress.lengthOrig * 100) }%`);

});

Run the project code with an example HTML file of any size:

node filestream.js ./test/example.html ./test/output.html
filestream.js output

This is a small demonstration of Node.js streams. Stream handling is a complex topic, and you may not use them often. In some cases, a module such as Express uses streaming under the hood but abstracts the complexity from you.

You should also be aware of data chunking challenges. A chunk could be any size and split the incoming data in inconvenient ways. Consider minifying this code:

<script type="module">
  // example script
  console.log('loaded');
</script>

Two chunks could arrive in sequence, with the boundary splitting the comment text. The first:

<script type="module">
  // example

…and the second:

script
  console.log('loaded');
</script>

Processing each chunk independently results in the following invalid minified script:

<script type="module">script console.log('loaded');</script>

The solution is to pre-parse each chunk and split it into whole sections that can be processed. In some cases, chunks (or parts of chunks) will be added to the start of the next chunk.

Minification is best applied to whole lines, although an extra complication occurs because <!-- --> and /* */ comments can span more than one line. Here’s a possible algorithm for each incoming chunk:

  1. Append any data saved from the previous chunk to the start of the new chunk.
  2. Remove any whole <!-- to --> and /* to */ sections from the chunk.
  3. Split the remaining chunk into two parts, where part2 starts at the first <!-- or /* found. If either exists, remove any further content from part2 other than that opening symbol. If neither is found, split at the last carriage return character. If none is found, set part1 to an empty string and part2 to the whole chunk. If part2 becomes significantly large (perhaps more than 100,000 characters because there are no carriage returns), append part2 to part1 and set part2 to an empty string. This ensures saved parts cannot grow indefinitely.
  4. Minify and output part1.
  5. Save part2 (which is added to the start of the next chunk).

The process runs again for each incoming chunk.
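The steps above can be sketched as a transform stream that carries unprocessed data over to the next chunk. This simplified version splits only at the last newline and skips the multi-line comment handling:

```javascript
import { Transform } from 'stream';

class LineChunker extends Transform {

  constructor(opts) {
    super(opts);
    this.saved = '';  // part2, carried over to the next chunk
  }

  _transform(chunk, encoding, callback) {
    // step 1: append saved data to the start of the new chunk
    const data = this.saved + chunk.toString();

    // step 3 (simplified): split at the last newline
    const cut = data.lastIndexOf('\n');
    const part1 = cut >= 0 ? data.slice(0, cut + 1) : '';
    this.saved = cut >= 0 ? data.slice(cut + 1) : data;

    // steps 4 and 5: output whole lines, keep the remainder
    if (part1) this.push(part1);  // minification would happen here

    // safety valve: never let the saved part grow indefinitely
    if (this.saved.length > 100_000) {
      this.push(this.saved);
      this.saved = '';
    }

    callback();
  }

  _flush(callback) {
    // output any remaining saved data when the stream ends
    if (this.saved) this.push(this.saved);
    callback();
  }

}
```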

That’s your next coding challenge, if you’re prepared to accept it!

Employee Threads

From the docs: “Workers (threads) are useful for performing CPU-intensive JavaScript operations. They do not help much with I/O-intensive work. The Node.js built-in asynchronous I/O operations are more efficient than Workers can be.”

Assume a user could trigger a complex, ten-second JavaScript calculation in your Express application. The calculation would become a bottleneck that halted processing for all users. Your application can’t handle any requests or run other functions until it completes.

Asynchronous Calculations

Complex calculations that process data from a file or database may be less problematic, because each stage runs asynchronously as it waits for data to arrive. Processing occurs on separate iterations of the event loop.

However, long-running calculations written in JavaScript alone, such as image processing or machine-learning algorithms, will hog the current iteration of the event loop.

One solution is worker threads. These are similar to browser web workers and launch a JavaScript process on a separate thread. The main and worker threads can exchange messages to trigger or terminate processing.

Employees and Occasion Loops

Workers are useful for CPU-intensive JavaScript operations, although the main Node.js event loop should still be used for asynchronous I/O activities.

The example code has a worker project that exports a diceRun() function in lib/dice.js. This throws any number of N-sided dice a number of times and records a count of the total score (which should result in a normal distribution curve):

export function diceRun(runs = 1, dice = 2, sides = 6) {

  const stat = [];

  while (runs > 0) {

    let sum = 0;
    for (let d = dice; d > 0; d--) {
      sum += Math.floor( Math.random() * sides ) + 1;
    }

    stat[sum] = (stat[sum] || 0) + 1;
    runs--;

  }

  return stat;

}

The code in index.js starts a process that runs every second and outputs a message:

const timer = setInterval(() => {
  console.log('  another process');
}, 1000);

Two dice are then thrown one billion times using a standard call to the diceRun() function:

import { diceRun } from './lib/dice.js';

const
  numberOfDice = 2,
  runs = 999_999_999;

const stat1 = diceRun(runs, numberOfDice);

This halts the timer, because the Node.js event loop can’t proceed to the next iteration until the calculation completes.

The code then tries the same calculation in a new Worker. This loads a script named worker.js and passes the calculation parameters in the workerData property of an options object:

import { Worker } from 'worker_threads';

const worker = new Worker('./worker.js', { workerData: { runs, numberOfDice } });

Event handlers are attached to the worker object running the worker.js script so it can receive incoming results:

// result returned
worker.on('message', result => {
  console.log(result);
});

… and handle errors:

worker.on('error', e => {
  console.log(e);
});

… and tidy up once processing has completed:

worker.on('exit', code => {
  // tidy up here if necessary
});

The worker.js script starts the diceRun() calculation and posts a message to the parent when it’s complete, which is received by the "message" handler above:

import { workerData, parentPort } from 'worker_threads';
import { diceRun } from './lib/dice.js';

const stat = diceRun( workerData.runs, workerData.numberOfDice );

parentPort.postMessage( stat );

The timer isn’t paused while the worker runs, because it executes on another CPU thread. In other words, the Node.js event loop continues to iterate without long delays.

Run the project code with node index.js.

The worker output

You should note that the worker-based calculation runs slightly faster because the thread is fully dedicated to that process. Consider using workers if you encounter performance bottlenecks in your application.

Little one Processes

It’s sometimes necessary to call applications that are either not written in Node.js or have a risk of failure.

A Actual-world Instance

I worked on an Express application that generated a fuzzy image hash used to identify similar graphics. It ran asynchronously and worked well, until someone uploaded a malformed GIF containing a circular reference (animation frameA referenced frameB which referenced frameA).

The hash calculation never ended. The user gave up and tried uploading again. And again. And again. The whole application eventually crashed with memory errors.

The problem was fixed by running the hashing algorithm in a child process. The Express application remained stable because it launched, monitored, and terminated the calculation when it took too long.

The child process API allows you to run sub-processes that you can monitor and terminate as necessary. There are three options:

  • spawn: spawns a child process.
  • fork: a special type of spawn that launches a new Node.js process.
  • exec: spawns a shell and runs a command. The result is buffered and returned to a callback function when the process ends.
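A brief sketch of the exec option, promisified via util.promisify and running a harmless command (node --version) in a shell:

```javascript
import { exec } from 'child_process';
import { promisify } from 'util';

const execAsync = promisify(exec);

// run a command in a shell; stdout and stderr are buffered strings
const { stdout } = await execAsync('node --version');
console.log(`Node.js version: ${stdout.trim()}`);
```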

Unlike worker threads, child processes are independent from the main Node.js script and can’t access the same memory.


Clusters

Is your 64-core server CPU under-utilized when your Node.js application runs on a single core? Clusters allow you to fork any number of identical processes to handle the load more efficiently.

The initial primary process can fork itself, perhaps once for each CPU returned by os.cpus(). It can also handle restarts when a process fails, and broker communication messages between forked processes.
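A minimal sketch of the idea, forking one short-lived worker per core (a real application would run an HTTP server or similar workload in each worker, and would restart workers that fail):

```javascript
import cluster from 'cluster';
import { cpus } from 'os';

if (cluster.isPrimary) {

  // the primary process forks one worker per logical CPU core
  for (let i = 0; i < cpus().length; i++) {
    cluster.fork();
  }

  // a real application could restart failed workers here
  cluster.on('exit', worker => {
    console.log(`worker ${worker.process.pid} exited`);
  });

}
else {

  // each worker re-runs this script with isPrimary set to false
  console.log(`worker ${process.pid} started`);
  process.exit(0);

}
```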

Clusters work amazingly well, but your code can become complex. Simpler and more robust options include process managers and container systems. Both can start, monitor, and restart multiple isolated instances of the same Node.js application. The application will remain active even if one fails.

Write Stateless Applications

It’s worth mentioning: make your application stateless to ensure it can scale and be more resilient. It should be possible to start any number of instances and share the processing load.


Conclusion

This article has provided a sample of the more useful Node.js APIs, but I encourage you to browse the documentation and discover them for yourself. The documentation is generally good and shows simple examples, but it can be terse in places.

As mentioned, this guide is based on my course Node.js: Novice to Ninja, which is available on SitePoint Premium.


