Developer Docs
VIP Multi-Model API Gateway
Private capacity for priority AI workloads.
Ling.AI VIP runs as an isolated English instance with its own API endpoint, Redis runtime, database schema, and outbox workers for priority workloads.
Three compatible entry points
All examples on this VIP site target api.vip.lingapi.ai, keeping SDK configuration separate from the standard English instance.
Chat Completions (OpenAI SDK):
from openai import OpenAI

client = OpenAI(
    base_url="https://api.vip.lingapi.ai/v1",
    api_key="<YOUR_API_KEY>"
)
response = client.chat.completions.create(
    model="<model id from /v1/models>",
    messages=[{"role": "user", "content": "Introduce the Ling.AI API Gateway in three sentences."}],
    stream=False
)
print(response.choices[0].message.content)
Responses (OpenAI SDK):
from openai import OpenAI

client = OpenAI(
    base_url="https://api.vip.lingapi.ai/v1",
    api_key="<YOUR_API_KEY>"
)
response = client.responses.create(
    model="<model id from /v1/models>",
    input="Summarize the public endpoints and tool capabilities available in this system."
)
print(response.output_text)
Raw HTTP (curl):
curl -sS https://api.vip.lingapi.ai/v1/models \
  -H "x-api-key: <YOUR_API_KEY>"
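The Chat Completions example above sets stream=False; with stream=True the OpenAI SDK instead yields incremental chunks. A minimal sketch of assembling a streamed reply, assuming OpenAI-style chunk objects; the collect_stream helper and the stand-in chunks are illustrative, not part of this gateway's documented surface:

```python
from types import SimpleNamespace

def collect_stream(chunks):
    """Join the incremental delta fragments of a streamed chat completion.

    Works with any iterable of OpenAI-style chunk objects, i.e. objects
    exposing chunk.choices[0].delta.content (which may be None).
    """
    parts = []
    for chunk in chunks:
        delta = chunk.choices[0].delta.content
        if delta:
            parts.append(delta)
    return "".join(parts)

# With a live client this would be:
#   stream = client.chat.completions.create(model=..., messages=..., stream=True)
#   text = collect_stream(stream)
# Here the helper is exercised with stand-in chunks:
def _fake_chunk(text):
    return SimpleNamespace(
        choices=[SimpleNamespace(delta=SimpleNamespace(content=text))]
    )

demo = [_fake_chunk("Hel"), _fake_chunk(None), _fake_chunk("lo")]
print(collect_stream(demo))  # → Hello
```

Keeping the accumulation logic in a small helper makes it easy to swap between printing tokens as they arrive and collecting the full reply.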
Explore the docs
Browse public APIs, model capabilities, tool integrations, account usage, and async task flows based on the capabilities enabled in this instance.
Quickstart
Start with account setup, API keys, model discovery, a first request, and task polling.
Open docs →
API Reference
Chat, Responses, images, video, embeddings, audio, Models, Health, Cache Health, and async tasks.
Open docs →
Tool Integrations
Connect Claude Code, Zed, Cline, Cherry Studio, OpenCode, and other clients to one OpenAI-compatible gateway.
Open docs →
Model Catalog
Browse public models, capability tags, protocol compatibility, operations, and pricing summaries.
Open docs →
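The Quickstart above mentions task polling for async flows. The task endpoint's exact shape is not shown on this page, so the sketch below keeps the status fetcher injectable; poll_until_done, the status names, and the fake fetcher are assumptions for illustration only:

```python
import time

def poll_until_done(fetch_status, interval=1.0, timeout=60.0,
                    done_states=("succeeded", "failed")):
    """Call fetch_status() until it returns a terminal state or timeout expires.

    fetch_status: zero-argument callable returning a status string, e.g. a
    closure around a GET of the task's status endpoint (the endpoint path
    and status names here are assumptions, not documented values).
    """
    deadline = time.monotonic() + timeout
    while True:
        status = fetch_status()
        if status in done_states:
            return status
        if time.monotonic() >= deadline:
            raise TimeoutError(f"task still '{status}' after {timeout}s")
        time.sleep(interval)

# Demo with a fake fetcher that reaches a terminal state on the third call:
states = iter(["queued", "running", "succeeded"])
result = poll_until_done(lambda: next(states), interval=0.01)
print(result)  # → succeeded
```

Injecting the fetcher keeps the backoff loop independent of any one task endpoint, so the same helper covers image, video, and other async task types once their status URLs are known.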