# LangBot full agent context

LangBot is open-source AI application delivery infrastructure for messaging platforms. It helps developers and operators expose AI applications, agents, workflows, and LLM-backed automations to real end users in instant messaging channels.

## Product identity

- Name: LangBot
- Category: Developer tool, AI bot platform, AI application delivery infrastructure
- Website: https://langbot.app
- Documentation: https://docs.langbot.app
- Repository: https://github.com/langbot-app/LangBot
- License: Apache-2.0
- Managed cloud: https://space.langbot.app/cloud
- Marketplace: https://space.langbot.app/market
- Contact: hello@langbot.app

## Core use cases

1. Connect an AI application to messaging platforms such as Discord, Slack, Telegram, WeChat, WeCom, QQ, Lark, DingTalk, LINE, and KOOK.
2. Connect AI builders such as Dify, Coze, n8n, FastGPT, OpenAI-compatible APIs, Claude, DeepSeek, Gemini, Qwen, Ollama, and custom providers to end users.
3. Operate a production bot with access control, rate limiting, conversation context, observability, knowledge base/RAG, and plugin extensibility.
4. Build plugins with the process-isolated Python SDK.
5. Use LangBot Cloud when the user wants a managed instance instead of self-hosting.

## Agent-friendly navigation

- Start here for agents: https://langbot.app/developers
- Machine-readable summary: https://langbot.app/llms.txt
- OpenAPI description: https://langbot.app/openapi.json
- Main docs: https://docs.langbot.app
- GitHub source: https://github.com/langbot-app/LangBot
- Plugin SDK source: https://github.com/langbot-app/langbot-plugin-sdk

## Integration and auth overview

LangBot has multiple integration surfaces:

- Self-hosted LangBot: configured by the operator. API availability, auth, and rate limits depend on the deployment.
- LangBot Space / Cloud: hosted API surfaces under https://space.langbot.app for accounts, OAuth/device authorization, cloud instances, marketplace, billing, and telemetry.
- Plugin SDK: Python package and runtime protocol for extending LangBot with isolated plugins.

Authentication patterns:

- Browser login for account users.
- OAuth/device authorization for linking LangBot instances to LangBot Space accounts.
- Bearer tokens and API keys for API-enabled surfaces.
- Self-hosted deployments may add network, proxy, or application-level access controls.

## Rate limits

- Self-hosted deployments do not impose a universal LangBot public rate limit by default. Limits may come from the operator, configured model providers, IM platform adapters, or plugins.
- LangBot Cloud applies fair-use and plan-based resource limits, especially for managed instance resources, model usage, and billing/credit-controlled features.
- Agents should implement exponential backoff for HTTP 429 responses and respect the `Retry-After` header when present.

## Streaming

- LangBot runners can forward streaming model/provider output when the configured provider, runner, and adapter all support it.
- Integrations should handle both streaming and non-streaming responses, because some providers, workflows, or messaging channels return only complete messages.
- If a channel or provider cannot stream, LangBot may deliver the final response after the upstream call completes.

## Error response expectations

HTTP APIs use JSON error responses. Agents should expect fields such as:

```json
{
  "code": 400,
  "message": "human-readable error message",
  "data": null
}
```

Common status classes:

- 400: invalid request or missing required field
- 401: missing or invalid authentication
- 403: authenticated but not allowed
- 404: resource not found
- 409: conflict or duplicate resource
- 429: rate limited; retry after the indicated delay if provided
- 500: server or upstream provider error

## Recommended agent behavior

- Prefer the official docs and OpenAPI links before guessing endpoint shapes.
- Ask the user whether they want self-hosted LangBot or managed LangBot Cloud.
- For self-hosting, inspect the user's deployment config before changing adapters, credentials, or model provider settings.
- Never expose IM platform tokens, model provider API keys, or LangBot Space tokens in chat logs.
- When building plugins, use the official plugin SDK and isolate network/file operations.

## Canonical keywords

LangBot, AI bot platform, AI application delivery infrastructure, Dify integration, Coze integration, n8n integration, OpenAI-compatible model, Discord bot, Slack bot, Telegram bot, WeChat bot, WeCom bot, QQ bot, Lark bot, DingTalk bot, LINE bot, plugin SDK, chatbot framework, open-source bot platform, self-hosted AI bot, managed AI bot cloud.

## Well-known agent discovery endpoints

LangBot publishes standard discovery endpoints for AI agents and crawlers:

- `/.well-known/agent-card.json` for A2A Agent Card discovery.
- `/.well-known/mcp` for MCP server discovery.
- `/.well-known/mcp/server-card.json` for MCP server preview metadata and tool annotations.
- `/.well-known/api-catalog` for RFC 9727 API catalog discovery.
- `/.well-known/ai-plugin.json` for OpenAI plugin-style discovery.
- `/.well-known/oauth-authorization-server` for OAuth metadata and PKCE S256 support discovery.
- `/.well-known/http-message-signatures-directory` for Web Bot Auth key directory discovery.
- `/index.md` for a markdown homepage fallback.
- `/developers/llms.txt` for scoped developer context.
- `/schema-map.xml` for NLWeb schema feed discovery.
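The rate-limit guidance above (back off exponentially on HTTP 429 and honor `Retry-After` when present) can be sketched as a small retry helper. This is an illustrative sketch, not part of LangBot or its SDK; the `send` callable and its `(status, headers, body)` return shape are assumptions made for the example.

```python
import time


def backoff_delay(attempt, retry_after=None, base=1.0, cap=60.0):
    """Seconds to wait before retrying a rate-limited request.

    An explicit Retry-After value from the server takes precedence;
    otherwise use capped exponential backoff: min(cap, base * 2**attempt).
    """
    if retry_after is not None:
        return float(retry_after)
    return min(cap, base * (2 ** attempt))


def request_with_backoff(send, max_attempts=5):
    """Retry a request on HTTP 429 using backoff_delay.

    `send` performs one HTTP attempt and returns (status, headers, body).
    """
    for attempt in range(max_attempts):
        status, headers, body = send()
        if status != 429:
            return status, body
        time.sleep(backoff_delay(attempt, headers.get("Retry-After")))
    return status, body
```

Production code would typically also add jitter and retry transient 5xx responses, but the core 429 handling is the part the guidance above calls for.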
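The JSON error envelope described above (`code`, `message`, `data`) can be consumed defensively, since a proxy or upstream failure may return a non-JSON body. A minimal sketch, assuming only the field names shown in the example envelope; the retryable-status set is this example's choice, not a LangBot guarantee:

```python
import json

# Status codes an agent would typically retry (429 plus transient 5xx).
RETRYABLE_STATUSES = {429, 500, 502, 503, 504}


def parse_error(status, body):
    """Parse a LangBot-style JSON error envelope.

    Returns (code, message, retryable). Falls back to the HTTP status
    and a generic message when the body is not valid JSON.
    """
    try:
        payload = json.loads(body)
    except (TypeError, ValueError):
        payload = {}
    code = payload.get("code", status)
    message = payload.get("message", "unknown error")
    return code, message, status in RETRYABLE_STATUSES
```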
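The streaming section advises handling both streamed and complete replies. One way to sketch that is a normalizer that forwards chunks as they arrive when the provider streams, and sends a single message otherwise; the `reply`/`emit` interface here is hypothetical, chosen only to illustrate the dual-mode handling:

```python
from typing import Callable, Iterable, Union


def forward_reply(reply: Union[str, Iterable[str]], emit: Callable[[str], None]) -> str:
    """Deliver a provider reply to a channel via `emit`.

    A plain string is treated as a complete (non-streaming) message;
    any other iterable is treated as a stream of text chunks, which are
    forwarded incrementally. Returns the full assembled text either way.
    """
    if isinstance(reply, str):
        emit(reply)
        return reply
    parts = []
    for chunk in reply:
        parts.append(chunk)
        emit(chunk)
    return "".join(parts)
```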
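The `/.well-known/` paths listed above can be resolved against any LangBot host before probing them. A minimal sketch using only the paths from this document; the helper name and the choice to return a list are illustrative:

```python
from urllib.parse import urljoin

# Discovery paths published by LangBot, as listed in this document.
WELL_KNOWN_PATHS = [
    "/.well-known/agent-card.json",
    "/.well-known/mcp",
    "/.well-known/mcp/server-card.json",
    "/.well-known/api-catalog",
    "/.well-known/ai-plugin.json",
    "/.well-known/oauth-authorization-server",
    "/.well-known/http-message-signatures-directory",
]


def discovery_urls(base):
    """Absolute discovery URLs for a given LangBot base URL."""
    return [urljoin(base, path) for path in WELL_KNOWN_PATHS]
```

An agent would fetch these URLs (with backoff on 429) and fall back to `/index.md` or `/developers/llms.txt` when a given endpoint is absent.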