# LangBot

> LangBot is an open-source AI application delivery infrastructure for instant messaging platforms. It connects AI applications and agents built with Dify, Coze, n8n, OpenAI-compatible models, Claude, DeepSeek, Gemini, Qwen, Ollama, and custom tools to Discord, Slack, Telegram, WeChat, WeCom, QQ, Lark, DingTalk, LINE, KOOK, and more.

## Primary links

- Website: https://langbot.app
- Agent and developer guide: https://langbot.app/developers
- Full agent context: https://langbot.app/llms-full.txt
- OpenAPI description: https://langbot.app/openapi.json
- Documentation: https://docs.langbot.app
- GitHub repository: https://github.com/langbot-app/LangBot
- Plugin SDK: https://github.com/langbot-app/langbot-plugin-sdk
- Plugin marketplace: https://space.langbot.app/market
- Managed cloud instances: https://space.langbot.app/cloud
- Blog: https://blog.langbot.app

## What agents can do with LangBot

- Deploy or guide deployment of self-hosted LangBot with Docker.
- Create managed LangBot Cloud instances for users who do not want to operate servers.
- Connect AI apps, agents, and workflows to real messaging channels.
- Build custom plugins with the Python plugin SDK.
- Inspect integration surfaces, auth methods, rate-limit expectations, streaming support, and error response formats in the developer guide.

## Integration surfaces

- Self-hosted LangBot admin/API surface: controlled by the operator's deployment and configuration.
- LangBot Space / Cloud API: account, OAuth/device authorization, cloud instance, marketplace, billing, and telemetry surfaces hosted at https://space.langbot.app.
- Plugin runtime API: process-isolated Python SDK for extending LangBot behavior.

## Authentication summary

- Browser users authenticate through LangBot Space account login.
- Devices and LangBot instances can use OAuth/device authorization flows.
- API access uses bearer tokens or API keys where enabled.
- Self-hosted deployments may configure their own access controls.
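As an illustration of the token-based access described above, the sketch below shows how an integration might attach a bearer token when calling a LangBot API surface that has token auth enabled. This is a minimal, hypothetical example: the endpoint path and token value are placeholders, not documented routes.

```python
# Sketch only: attach a bearer token (or API key, where enabled) to a GET
# request against a LangBot API surface. Path and token are hypothetical.
import urllib.request


def authed_request(base_url: str, path: str, token: str) -> urllib.request.Request:
    """Build a GET request carrying the token in the Authorization header."""
    return urllib.request.Request(
        url=f"{base_url.rstrip('/')}/{path.lstrip('/')}",
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/json",
        },
    )
```

Self-hosted deployments may enforce different access controls, so integrations should treat 401/403 responses as a signal to re-check which auth method the operator has configured.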
## Rate limits and streaming summary

- Self-hosted LangBot has no global public SaaS rate limit by default; operators define limits through their deployment, model providers, and plugin configuration.
- LangBot Cloud applies fair-use and plan-based limits to managed resources.
- Model/provider calls may stream responses when the configured provider and runner support streaming; integrations should tolerate non-streaming responses and provider-specific errors.

## Support

- Support email: hello@langbot.app
- GitHub Discussions: https://github.com/langbot-app/LangBot/discussions
- Issues: https://github.com/langbot-app/LangBot/issues

## Agent discovery endpoints

- Markdown homepage: https://langbot.app/index.md
- Section-specific developer context: https://langbot.app/developers/llms.txt
- A2A agent card: https://langbot.app/.well-known/agent-card.json
- MCP discovery: https://langbot.app/.well-known/mcp
- MCP server card: https://langbot.app/.well-known/mcp/server-card.json
- API catalog: https://langbot.app/.well-known/api-catalog
- OpenAI plugin manifest: https://langbot.app/.well-known/ai-plugin.json
- OAuth authorization server metadata: https://langbot.app/.well-known/oauth-authorization-server
- Web Bot Auth HTTP message signatures directory: https://langbot.app/.well-known/http-message-signatures-directory
- NLWeb schema map: https://langbot.app/schema-map.xml
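An agent resolving the discovery endpoints above might do so as in the following sketch. The well-known paths are taken directly from this document; only the helper function and the name-to-path mapping are our own illustrative additions.

```python
# Sketch: map the well-known discovery surfaces listed above to absolute
# URLs under a base. Paths come from the document; the helper is illustrative.
WELL_KNOWN_PATHS = {
    "agent_card": "/.well-known/agent-card.json",
    "mcp": "/.well-known/mcp",
    "mcp_server_card": "/.well-known/mcp/server-card.json",
    "api_catalog": "/.well-known/api-catalog",
    "ai_plugin": "/.well-known/ai-plugin.json",
    "oauth_metadata": "/.well-known/oauth-authorization-server",
    "http_message_signatures": "/.well-known/http-message-signatures-directory",
}


def discovery_urls(base_url: str = "https://langbot.app") -> dict:
    """Return each discovery surface as an absolute URL under the given base."""
    base = base_url.rstrip("/")
    return {name: base + path for name, path in WELL_KNOWN_PATHS.items()}
```

Keeping the base URL as a parameter lets the same lookup work against a self-hosted instance that mirrors these well-known paths, if the operator exposes them.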