Cloud AI Integration

Backend.AI GO allows you to seamlessly mix and match local models with powerful cloud-based APIs.

Why Integrate Cloud APIs?

  • Hybrid Workflow: Handle sensitive data locally with Llama 3, but switch to GPT-5.2 or Claude 4.5 Sonnet for complex reasoning tasks.

  • Unified Interface: Chat with all your models—local or cloud—in one place. No need to switch tabs or apps.

  • Tools & Agents: Cloud models can use Backend.AI GO's agent system and tools just like local models.

Supported Providers

Backend.AI GO supports a wide range of providers:

  • OpenAI: GPT-5.2, GPT-5.1, GPT-4o (legacy).

  • Anthropic: Claude 4.5 Sonnet, Claude 4.5 Opus, Claude 4.5 Haiku.

  • Google Gemini: Gemini 1.5 Pro, Gemini 1.5 Flash.

  • OpenAI Compatible: Connect to any service that speaks the OpenAI API dialect (Ollama, LocalAI, DeepSeek, Groq, etc.).

  • Remote vLLM: Connect to your own high-performance inference server.
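The last two entries work because the OpenAI dialect is a de facto standard: every compatible server exposes a `POST <base_url>/chat/completions` endpoint that accepts the same JSON payload, so only the base URL, model name, and API key differ per provider. A minimal sketch of that request shape (the base URL and model name below are illustrative; `http://localhost:11434/v1` is Ollama's default OpenAI-compatible endpoint, not something Backend.AI GO requires):

```python
import json

def build_chat_request(base_url: str, model: str, messages: list[dict]) -> tuple[str, bytes]:
    """Build the URL and JSON body for an OpenAI-compatible chat completion.

    The same shape targets OpenAI, Ollama, Groq, DeepSeek, or a remote vLLM
    server -- only base_url and model change per provider.
    """
    url = base_url.rstrip("/") + "/chat/completions"
    body = json.dumps({"model": model, "messages": messages}).encode("utf-8")
    return url, body

url, body = build_chat_request(
    "http://localhost:11434/v1",             # e.g. Ollama's OpenAI-compatible endpoint
    "llama3",                                # model name as the provider exposes it
    [{"role": "user", "content": "Hello"}],
)
# Send with any HTTP client, adding headers:
#   Content-Type: application/json
#   Authorization: Bearer <API_KEY>   (if the provider requires one)
```

This shared dialect is what lets a single chat interface swap providers without changing client logic.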

Security

Your API keys are stored securely in your operating system's native keychain (macOS Keychain, Windows Credential Manager). Backend.AI GO communicates directly with the provider's API; your keys and data are never sent to Lablup servers.