Build with AI

Use AI coding assistants to build with Agora faster. With access to Agora's full documentation, your AI assistant can generate working code examples, suggest API implementations, and answer platform-specific questions in real time.

Agora MCP server

Agora provides a free Model Context Protocol (MCP) server with tools for AI coding assistants to browse and search the docs site.

The Agora MCP server is available at:


https://mcp.agora.io
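To check that the server is reachable and see which tools it exposes, you can connect to it from a script. The following is a minimal sketch, assuming the official MCP Python SDK (installed with pip install mcp); the streamablehttp_client helper and import paths reflect recent SDK releases and may differ in other versions.

import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

# The endpoint comes from this page; the SDK usage below is an assumption,
# not part of the Agora documentation.
AGORA_MCP_URL = "https://mcp.agora.io"

async def main() -> None:
    # Open a streamable HTTP connection to the Agora MCP server.
    async with streamablehttp_client(AGORA_MCP_URL) as (read_stream, write_stream, _):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()
            # Ask the server for its documentation search tools.
            tools = await session.list_tools()
            for tool in tools.tools:
                print(f"{tool.name}: {tool.description}")

if __name__ == "__main__":
    asyncio.run(main())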

Installation

Refer to the installation instructions for your coding assistant.

Cursor

Click the button below to install the MCP server in Cursor:

Install Agora MCP Server in Cursor

Alternatively, add it manually with the following JSON:


{
  "mcpServers": {
    "agora-docs": {
      "url": "https://mcp.agora.io"
    }
  }
}

Claude

Claude Code

Run the following command in your terminal to install the MCP server in Claude Code:


claude mcp add --transport http agora-docs https://mcp.agora.io

Claude Desktop

In Settings, select Connectors and then choose Add custom connector. Enter the following values and click Add:

  • Name: agora-docs
  • Remote MCP server URL: https://mcp.agora.io

Codex

Run the following command in your terminal to install the server in OpenAI Codex:


codex mcp add --url https://mcp.agora.io agora-docs

Gemini CLI

Run the following command in your terminal to install the server in Gemini CLI:


gemini mcp add --transport http agora-docs https://mcp.agora.io

Manual installation

Add the server URL https://mcp.agora.io to your MCP client of choice. If prompted, set the transport to http or "Streamable HTTP".

Getting started

Once installed, your coding assistant has access to Agora's documentation through the MCP server. The assistant will intelligently use this resource when relevant to your questions. For more targeted results, mention Agora along with your target product and platform, such as 'iOS', 'Web', 'Conversational AI', or 'Video Calling', in your prompts.

System Prompt

This MCP server works with all LLMs that support MCP, but performs best when the assistant understands facet-based exploration. Add the following prompt to your LLM's custom instructions:

System prompt for LLMs

# Agora MCP Markdown - System Prompt

You have access to Agora's documentation search via three tools:
- `algolia_search_index_docs_platform_aware_markdown` - Full-text search with facets
- `algolia_search_for_facet_values` - Browse products/platforms
- `algolia_recommendations` - Find related documentation

## Key Behaviors

**1. Use facets for discovery**
- When users ask about "what's available", explore facets first
- Example: User asks "what video products exist?"
  → Use `algolia_search_for_facet_values(facetName: "product", facetQuery: "video")`
  → Shows video-calling is the main product

**2. Platform-aware searching**
- Include target platform in search queries (iOS, Android, Web, etc.)
- Results are intelligently ranked by platform relevance

**3. Interpret facet results for refinement**
- Search results include facet breakdowns by product and platform
- Use these to suggest filtering: "15 results are for video-calling, 8 for voice-calling"

**4. Maintain session continuity**
- Use same sessionId (UUID) across related searches in a conversation
- Improves ranking for follow-up queries

## Available Products & Platforms

**Key Products:** conversational-ai, video-calling, broadcast-streaming, interactive-live-streaming, agora-chat, voice-calling, signaling, extensions-marketplace, interactive-whiteboard

**Key Platforms:** android, web, ios, windows, macos, unity, flutter, react-native, electron, unreal

## Workflow Example

User: "I'm building an iOS app with Video Calling. Where do I start?"

1. Search: "iOS video calling quickstart"
2. Results show facets indicating video-calling is the best match
3. Suggest: "Most results are for Video Calling, here's the quickstart"
4. Follow-up searches: Keep same sessionId for context

What This Enables

  • Intelligent facet exploration before deep searches
  • Platform-aware result refinement suggestions
  • Cross-search session continuity for better context

Without the prompt, searches still work but won't leverage faceted browsing patterns.

AGENTS.md

To get the most out of the MCP server, Agora recommends that you include an AGENTS.md or similar file in your repository, with instructions specific to the Agora Docs MCP Server. This file ensures that your agent always consults Agora docs to produce reliable, working code.

Add the following instructions to your own AGENTS.md file:


## Agora Documentation

Agora documentation is updated frequently. You should always refer to the latest documentation when working with Agora products. Agora offers an MCP server that you can use to browse and search its documentation. If the developer has not yet installed this server, you should recommend that they install it at https://mcp.agora.io.

Markdown docs

Each page on the Agora docs site is available in Markdown format, optimized for pasting into AI assistants when you can't use MCP.

To access the Markdown version, modify the HTML page URL as follows (a helper script that automates these steps is shown after the list):

  1. Change docs.agora.io to docs-md.agora.io

  2. Replace any platform query parameter, such as ?platform=web, with an underscore suffix like _web

  3. Add the .md extension to the end of the path. For example:

    • HTML: https://docs.agora.io/en/video-calling/get-started/get-started-sdk?platform=web
    • Markdown: https://docs-md.agora.io/en/video-calling/get-started/get-started-sdk_web.md
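
As a worked example of this rewrite, the following hypothetical helper (not part of any Agora SDK) applies the three steps using only Python's standard library:

from urllib.parse import parse_qs, urlparse, urlunparse

def to_markdown_url(html_url: str) -> str:
    parts = urlparse(html_url)
    # 1. Switch the host to the Markdown mirror.
    host = parts.netloc.replace("docs.agora.io", "docs-md.agora.io")
    # 2. Turn the platform query parameter into an underscore suffix.
    platform = parse_qs(parts.query).get("platform", [None])[0]
    path = parts.path.rstrip("/")
    if platform:
        path += f"_{platform}"
    # 3. Add the .md extension.
    path += ".md"
    return urlunparse((parts.scheme, host, path, "", "", ""))

print(to_markdown_url(
    "https://docs.agora.io/en/video-calling/get-started/get-started-sdk?platform=web"
))
# https://docs-md.agora.io/en/video-calling/get-started/get-started-sdk_web.md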

You can also use the Markdown dropdown on the HTML page to copy, download, or view the .md file. Choose Open in ChatGPT or Open in Claude to start a chat with the page content as context.

LLMs.txt

A Markdown-based index of the docs site is available at https://docs.agora.io/llms.txt. This file includes links to all product overview pages along with brief page descriptions. It also provides instructions on how to access LLM-friendly Markdown versions of all documentation pages.
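
As a minimal illustration, the sketch below downloads the index with Python's standard library so you can paste it into an assistant's context window:

from urllib.request import urlopen

# Fetch the LLM-friendly index of the Agora docs site.
with urlopen("https://docs.agora.io/llms.txt") as response:
    llms_index = response.read().decode("utf-8")

# Show the first few lines as a quick sanity check before pasting elsewhere.
print("\n".join(llms_index.splitlines()[:20]))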

For more about how to use LLMs.txt files, see llmstxt.org.