For AI agents and developers
LangBot integration guide
LangBot is open-source AI application delivery infrastructure for instant messaging. It connects AI apps and agents built with Dify, Coze, n8n, OpenAI-compatible models, Claude, DeepSeek, Gemini, Qwen, Ollama, and custom tools to Discord, Slack, Telegram, WeChat, WeCom, QQ, Lark, DingTalk, LINE, KOOK, and more.
Integration surfaces
- Self-hosted LangBot: operator-controlled runtime, adapters, models, plugins, and deployment configuration.
- LangBot Space / Cloud: hosted account, OAuth/device authorization, cloud instance, marketplace, billing, and telemetry surfaces at https://space.langbot.app.
- Plugin runtime: process-isolated Python SDK for custom capabilities, tools, commands, and event listeners.
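For orientation, here is a minimal sketch of what a plugin event listener might look like. The real SDK entry points vary by LangBot version; the Event stand-in and the on_message name below are illustrative assumptions, not the published API, and the stub exists only so the snippet runs on its own.

import asyncio
from dataclasses import dataclass

# Local stand-in for an incoming-message event; the real SDK provides
# its own event types and registration mechanism.
@dataclass
class Event:
    text: str

    async def reply(self, message: str) -> None:
        print(f"reply -> {message}")

class EchoPlugin:
    # In a real plugin this handler would be registered through the
    # SDK's event system; 'on_message' is an assumed name.
    async def on_message(self, event: Event) -> None:
        if event.text:
            await event.reply(f"echo: {event.text}")

asyncio.run(EchoPlugin().on_message(Event(text="hello")))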
Authentication
- Browser users authenticate through LangBot Space account login.
- Devices and LangBot instances can use OAuth/device authorization flows (a sketch follows this list).
- API-enabled surfaces use bearer tokens or API keys where configured.
- Self-hosted deployments may add their own network or app-level access controls.
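The device authorization flow follows the usual RFC 8628 shape: request a device code, show the user a verification URI, then poll for a token. A minimal sketch, in which the client ID and endpoint paths are placeholders rather than LangBot's actual routes:

import time
import requests

AUTH_BASE = "https://space.langbot.app"   # assumed base URL
CLIENT_ID = "your-client-id"              # placeholder

def device_authorize() -> str:
    # Step 1: request a device code and user verification URI.
    resp = requests.post(f"{AUTH_BASE}/oauth/device/code",  # hypothetical path
                         data={"client_id": CLIENT_ID})
    resp.raise_for_status()
    grant = resp.json()
    print(f"Visit {grant['verification_uri']} and enter {grant['user_code']}")

    # Step 2: poll the token endpoint until the user approves.
    while True:
        time.sleep(grant.get("interval", 5))
        token = requests.post(
            f"{AUTH_BASE}/oauth/token",                     # hypothetical path
            data={"client_id": CLIENT_ID,
                  "device_code": grant["device_code"],
                  "grant_type": "urn:ietf:params:oauth:grant-type:device_code"})
        if token.status_code == 200:
            return token.json()["access_token"]
        err = token.json().get("error")
        if err == "slow_down":
            grant["interval"] = grant.get("interval", 5) + 5
        elif err != "authorization_pending":
            token.raise_for_status()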
Rate limits
Self-hosted LangBot has no universal public rate limit by default; limits may come from the operator, configured model providers, messaging adapters, or plugins. LangBot Cloud applies fair-use and plan-based resource limits. Agents should back off on HTTP 429 and respect Retry-After when present.
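Concretely, backing off on 429 can look like the helper below: a sketch using the requests library, where the URL is a placeholder and the LangBot-specific guidance is honoring Retry-After before falling back to exponential delay.

import time
import requests

def get_with_backoff(url: str, headers=None, max_retries: int = 5) -> requests.Response:
    delay = 1.0
    for attempt in range(max_retries):
        resp = requests.get(url, headers=headers, timeout=30)
        if resp.status_code != 429:
            return resp
        # Prefer the server-provided Retry-After; otherwise back off exponentially.
        retry_after = resp.headers.get("Retry-After")
        wait = float(retry_after) if retry_after else delay
        time.sleep(wait)
        delay *= 2
    raise RuntimeError(f"rate limited after {max_retries} attempts: {url}")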
Streaming behavior
LangBot can forward streaming model/provider output when the runner, provider, and messaging adapter support it. Integrations should also support non-streaming final responses because some providers, workflows, and messaging platforms only deliver complete messages.
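An integration that tolerates both delivery modes can branch on the response's content type. A sketch, assuming the URL, payload shape, and line-based chunk framing rather than any specific LangBot endpoint:

import requests

def read_reply(url: str, payload: dict, stream: bool = True):
    resp = requests.post(url, json=payload, stream=stream, timeout=120)
    resp.raise_for_status()
    if stream and "text/event-stream" in resp.headers.get("Content-Type", ""):
        # Streamed path: accumulate chunks as they arrive.
        parts = []
        for line in resp.iter_lines(decode_unicode=True):
            if line:
                parts.append(line)
        return "\n".join(parts)
    # Non-streamed path: a single complete JSON message.
    return resp.json()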
Error response format
HTTP APIs return JSON error responses. Agents should handle common status classes such as 400, 401, 403, 404, 409, 429, and 500.
{
"code": 400,
"message": "human-readable error message",
"data": null
}
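A small helper can translate that envelope into exceptions, retrying only the transient status classes. The "code", "message", and "data" fields come from the example above; the retryable set is a policy choice, not something the API mandates.

import requests

RETRYABLE = {429, 500}

def check(resp: requests.Response):
    if resp.ok:
        return resp.json()
    try:
        # Parse the JSON error envelope shown above.
        message = resp.json().get("message", resp.reason)
    except ValueError:
        message = resp.reason
    if resp.status_code in RETRYABLE:
        raise RuntimeError(f"transient error {resp.status_code}: {message}")
    raise ValueError(f"client error {resp.status_code}: {message}")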