
5.1. Continuum Router & API

Backend.AI GO isn't just a chat app; it's a powerful AI API Gateway for your local network.

What is Continuum Router?

The Continuum Router is an internal component that acts as a traffic controller. It provides a single, unified entry point for all your AI models—whether they are running locally on your machine or in the cloud.

For more detailed technical documentation, please refer to docs.continuum.lablup.ai.

OpenAI-Compatible API

[Image: API endpoints]

Backend.AI GO exposes an API that follows the OpenAI API standard. This means you can use Backend.AI GO as a drop-in replacement in any application that supports the OpenAI API.

  • Endpoint: http://localhost:8000/v1 (Default)
  • Authentication: Optional (configurable in Settings)

Example: Using with curl

curl http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama-3",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
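The same request can be issued from a script. A minimal sketch using only the Python standard library (the endpoint is the default shown above; `llama-3` is just an example model name, substitute one you have loaded):

```python
import json
from urllib import request

# Default local endpoint; change the port if you reconfigured the router.
BASE_URL = "http://localhost:8000/v1"

def build_chat_request(prompt: str, model: str = "llama-3") -> request.Request:
    """Build (but do not send) an OpenAI-style chat completion request."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("Hello!")
# Send with request.urlopen(req) once the router is running.
```

Because the payload and path follow the OpenAI convention, existing OpenAI client libraries also work here by pointing their base URL at `http://localhost:8000/v1`.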

[Image: API mesh]

External Access

By default, the API is only accessible from your own computer (127.0.0.1). If you want to access your models from other devices on your local network (like a tablet or another laptop):

  1. Go to Settings > Network.
  2. Enable Allow External Access.
  3. Restart the router.
  4. Use your computer's local IP address (e.g., http://192.168.1.10:8000/v1) in your other apps.
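After enabling external access, a quick way to verify connectivity from another device is to list the available models via the standard `/models` endpoint. A minimal Python sketch (the `192.168.1.10` address is the hypothetical example from step 4; substitute your computer's actual IP):

```python
import json
from urllib import error, request

# Hypothetical LAN address from step 4; replace with your computer's IP.
BASE_URL = "http://192.168.1.10:8000/v1"

def list_models(base_url: str = BASE_URL):
    """Return the model list from the OpenAI-compatible /models endpoint,
    or None if the router is unreachable (external access disabled,
    firewall blocking the port, wrong IP, etc.)."""
    try:
        with request.urlopen(f"{base_url}/models", timeout=5) as resp:
            return json.load(resp).get("data")
    except (error.URLError, TimeoutError):
        return None
```

A `None` result usually means the router is not reachable from that device; re-check the Allow External Access setting and your firewall rules.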

Security Tip

Only enable external access if you trust all devices on your local network.