Overview

The Chatbase REST API enables you to integrate AI-powered conversations into any application or workflow. Build custom chat experiences, automate customer interactions, and manage your AI agents programmatically.

Send Messages

Chat with your AI agents and handle real-time streaming responses

Manage Agents

Create, configure, and update AI agents with custom training data

Access Data

Retrieve conversations, leads, and analytics from your AI interactions

Quick Start

1. Get Your API Key

(Screenshot: creating an API key in the Chatbase dashboard)
  1. Visit your Chatbase Dashboard
  2. Navigate to Workspace Settings → API Keys
  3. Click Create API Key and copy the generated key
Store your API key securely and never expose it in client-side code.
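One common way to keep the key out of your source code is to read it from an environment variable at startup. A minimal sketch, assuming the variable is named CHATBASE_API_KEY (the name is our choice, not prescribed by Chatbase):

```javascript
// Hypothetical helper: load the API key from the environment instead of
// hard-coding it. CHATBASE_API_KEY is an assumed variable name.
function loadApiKey() {
  const key = process.env.CHATBASE_API_KEY
  if (!key) {
    // Fail fast so a missing key is caught at startup, not on the first request
    throw new Error('CHATBASE_API_KEY is not set')
  }
  return key
}
```

Set the variable in your shell (`export CHATBASE_API_KEY=...`) or through your deployment platform's secret manager, and call `loadApiKey()` wherever a request needs the key.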
2. Get Your Agent ID

(Screenshot: finding the Agent ID in Chatbase settings)
  1. Select your AI Agent in the dashboard
  2. Go to Settings → General
  3. Copy the Chatbot ID (UUID format)
3. Send Your First Message

Test your integration with a simple chat request:
curl -X POST 'https://www.chatbase.co/api/v1/chat' \
  -H 'Authorization: Bearer YOUR_API_KEY' \
  -H 'Content-Type: application/json' \
  -d '{
    "messages": [{"content": "Hello! How can you help me?", "role": "user"}],
    "chatbotId": "your-chatbot-id-here"
  }'
Expected Response:
{
  "text": "Hello! I'm here to help answer your questions and assist with any information you need. What can I help you with today?"
}
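The same request can be made from Node.js. A minimal sketch using the built-in fetch available in Node 18+ (the placeholder key and chatbot ID are the same as in the curl example above):

```javascript
// Send a single user message to a Chatbase agent and return the reply text.
// apiKey and chatbotId are the credentials from the Quick Start steps above.
async function sendMessage(apiKey, chatbotId, content) {
  const response = await fetch('https://www.chatbase.co/api/v1/chat', {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${apiKey}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      messages: [{content, role: 'user'}],
      chatbotId,
    }),
  })
  if (!response.ok) {
    throw new Error(`Request failed with status ${response.status}`)
  }
  const data = await response.json()
  // The non-streaming response carries the reply in a "text" field,
  // as shown in the expected response above
  return data.text
}
```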

Chat API Streaming

The chat API supports real-time streaming responses for better user experience.
// streamer.js

const axios = require('axios')

const apiKey = '<Your-Secret-Key>'
const chatbotId = '<Your Chatbot ID>'
const apiUrl = 'https://www.chatbase.co/api/v1/chat'

const messages = [{content: '<Your query here>', role: 'user'}]

async function readChatbotReply() {
  try {
    const response = await axios.post(
      apiUrl,
      {
        messages,
        chatbotId,
        stream: true,
        temperature: 0,
      },
      {
        headers: {
          Authorization: `Bearer ${apiKey}`,
          'Content-Type': 'application/json',
        },
        responseType: 'stream',
      }
    )

    const decoder = new TextDecoder()

    response.data.on('data', (chunk) => {
      const chunkValue = decoder.decode(chunk, {stream: true})

      // Process the chunkValue as desired
      // Here we just output it as it comes in, without adding newlines
      process.stdout.write(chunkValue)
    })

    response.data.on('end', () => {
      process.stdout.write('\n')
    })
  } catch (error) {
    console.error('Error:', error.message)
  }
}

readChatbotReply()

Performance Best Practices

Optimization Strategies:
  • Use streaming for chat responses to improve perceived performance
  • Cache agent responses when appropriate
  • Batch multiple operations when possible
  • Monitor and optimize conversation context length
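For the last point, one simple approach is to trim the message history before each request so the context stays under a fixed budget. A minimal sketch (the helper name and the character-based budget are our assumptions; a real implementation would count tokens rather than characters):

```javascript
// Hypothetical helper: keep only the most recent messages that fit
// within a character budget, walking backwards from the newest one.
function trimContext(messages, maxChars) {
  const kept = []
  let total = 0
  for (let i = messages.length - 1; i >= 0; i--) {
    const len = messages[i].content.length
    // Stop once the budget is exceeded, but always keep at least
    // the newest message so the request is never empty
    if (total + len > maxChars && kept.length > 0) break
    kept.unshift(messages[i])
    total += len
  }
  return kept
}
```

Call `trimContext(messages, budget)` on the conversation history before passing it to the chat endpoint; older turns drop off first while the most recent exchange is always preserved.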

🚀 Try It Live!

Ready to see the magic in action? Dive straight into our interactive playground where you can test every API endpoint, experiment with real responses, and build your integration in real-time.

Key API Endpoints
