Thursday, October 9, 2025

Token.js


Import the Token.js client and call the create function with a prompt in OpenAI's format. Specify the model and LLM provider using their respective fields.

OPENAI_API_KEY=<openai api key>

import { TokenJS } from 'token.js'

// Create the Token.js client
const tokenjs = new TokenJS()

async function main() {
  // Create a model response
  const completion = await tokenjs.chat.completions.create({
    // Specify the provider and model
    provider: 'openai',
    model: 'gpt-4o',
    // Define your message
    messages: [
      {
        role: 'user',
        content: 'Hello!',
      },
    ],
  })
  console.log(completion.choices[0])
}
main()

We recommend using environment variables to configure the credentials for each LLM provider.

# OpenAI
OPENAI_API_KEY=
# AI21
AI21_API_KEY=
# Anthropic
ANTHROPIC_API_KEY=
# Cohere
COHERE_API_KEY=
# Gemini
GEMINI_API_KEY=
# Groq
GROQ_API_KEY=
# Mistral
MISTRAL_API_KEY=
# Perplexity
PERPLEXITY_API_KEY=
# AWS Bedrock
AWS_REGION_NAME=
AWS_ACCESS_KEY_ID=
AWS_SECRET_ACCESS_KEY=
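Token.js reads these variables from process.env at runtime, so a quick startup check can catch a missing key before any request is made. A minimal sketch (the helper name is illustrative, not part of the Token.js API):

```javascript
// Return the names from `required` that are absent from the environment.
// Hypothetical helper, not part of Token.js itself.
function missingCredentials(required, env = process.env) {
  return required.filter((name) => !env[name])
}

// Fail fast before creating the client if a key is missing.
const missing = missingCredentials(['OPENAI_API_KEY'])
if (missing.length > 0) {
  console.log(`Missing credentials: ${missing.join(', ')}`)
}
```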

Token.js supports streaming responses for all providers that offer it.

import { TokenJS } from 'token.js'

const tokenjs = new TokenJS()

async function main() {
  const result = await tokenjs.chat.completions.create({
    stream: true,
    provider: 'openai',
    model: 'gpt-4o',
    messages: [
      {
        role: 'user',
        content: `Tell me about yourself.`,
      },
    ],
  })

  for await (const part of result) {
    process.stdout.write(part.choices[0]?.delta?.content || '')
  }
}
main()
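Because the stream yields chunks in OpenAI's format, the per-chunk deltas can be collected into a complete reply. A sketch of a collector that works on any async iterable of such chunks, shown here against a mock stream (the chunk shape is the only assumption):

```javascript
// Concatenate the delta content of every chunk in an OpenAI-format stream.
async function collectStream(stream) {
  let text = ''
  for await (const part of stream) {
    text += part.choices[0]?.delta?.content || ''
  }
  return text
}

// Mock stream emitting the same chunk shape as the real one.
async function* mockStream() {
  yield { choices: [{ delta: { content: 'Hello, ' } }] }
  yield { choices: [{ delta: { content: 'world!' } }] }
  yield { choices: [{ delta: {} }] } // final chunks may carry no content
}

collectStream(mockStream()).then((text) => console.log(text))
```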

Token.js supports function calling tools for all providers and models that offer them.

import { TokenJS, ChatCompletionTool } from 'token.js'

const tokenjs = new TokenJS()

async function main() {
  const tools: ChatCompletionTool[] = [
    {
      type: 'function',
      function: {
        name: 'get_current_weather',
        description: 'Get the current weather in a given location',
        parameters: {
          type: 'object',
          properties: {
            location: {
              type: 'string',
              description: 'The city and state, e.g. San Francisco, CA',
            },
          },
          required: ['location'],
        },
      },
    },
  ]

  const result = await tokenjs.chat.completions.create({
    provider: 'gemini',
    model: 'gemini-1.5-pro',
    messages: [
      {
        role: 'user',
        content: `What's the weather like in San Francisco?`,
      },
    ],
    tools,
    tool_choice: 'auto',
  })

  console.log(result.choices[0].message.tool_calls)
}
main()
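Each entry in tool_calls names a function and carries JSON-encoded arguments, which the application must parse and dispatch itself. A minimal dispatch sketch, with a hypothetical local implementation of get_current_weather:

```javascript
// Hypothetical local implementations keyed by tool name.
const localTools = {
  get_current_weather: ({ location }) => `It is sunny in ${location}.`,
}

// Parse each call's JSON arguments and run the matching local function.
function runToolCalls(toolCalls) {
  return toolCalls.map((call) =>
    localTools[call.function.name](JSON.parse(call.function.arguments))
  )
}

// Example with a tool call in the shape logged above:
const results = runToolCalls([
  { function: { name: 'get_current_weather', arguments: '{"location": "San Francisco, CA"}' } },
])
console.log(results[0]) // → It is sunny in San Francisco, CA.
```

In a full loop, each result would be sent back to the model as a message with role 'tool' so it can compose a final answer.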

This table provides an overview of the features that Token.js supports from each LLM provider.

Note: Certain LLMs, particularly older or weaker models, do not support some of the features in this table. For details about these restrictions, see our LLM provider documentation.
