Overview

Large Language Models (LLMs) power the conversational intelligence in your AI agents. They process user input, invoke tools, and generate responses. Agora supports multiple LLM providers, allowing you to choose the best model for your specific requirements.

Integration steps

To integrate the LLM of your choice, follow these steps:

  1. Choose your LLM provider from the Supported LLMs table
  2. Obtain an API key from the provider's console
  3. Copy the sample configuration for your chosen provider
  4. Replace <api_key> with your actual API key
  5. Customize the system_messages for your use case
  6. Pass the configuration in the request body under properties > llm when Starting a conversational AI agent
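The steps above can be sketched as follows. This is a minimal, hedged illustration of assembling the request body: the api_key placeholder, system_messages field, and properties > llm nesting come from the steps above, while the exact field names and overall schema are assumptions, not the authoritative Agora request format.

```python
def build_agent_request(api_key: str, style: str = "openai") -> dict:
    """Assemble a Start-agent request body carrying the LLM configuration.

    Illustrative sketch only: consult your provider's sample configuration
    for the real set of fields.
    """
    llm_config = {
        "style": style,                      # provider style from the Supported LLMs table
        "api_key": api_key,                  # replaces the <api_key> placeholder (step 4)
        "system_messages": [                 # customize for your use case (step 5)
            {"role": "system", "content": "You are a helpful voice assistant."}
        ],
    }
    # Step 6: the LLM configuration sits under properties > llm
    return {"properties": {"llm": llm_config}}

body = build_agent_request("<api_key>")
```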

Supported LLMs

Conversational AI Engine currently supports the following LLMs:

| Provider         | Style     | Documentation                                                                        |
|------------------|-----------|--------------------------------------------------------------------------------------|
| OpenAI           | openai    | https://platform.openai.com/docs/api-reference/responses/create                      |
| Azure OpenAI     | openai    | https://learn.microsoft.com/en-us/azure/ai-services/openai/reference                 |
| Google Gemini    | gemini    | https://cloud.google.com/docs/authentication/rest                                    |
| Google Vertex AI | gemini    | https://cloud.google.com/vertex-ai/generative-ai/docs/learn/prompts/system-instructions |
| Claude           | anthropic | https://docs.anthropic.com/en/api/messages                                           |
| Dify             | dify      | Agora Conversational AI - Dify Marketplace                                           |
| Custom LLM       | openai    | https://docs.agora.io/en/conversational-ai/develop/custom-llm                        |

Custom LLMs

You can integrate any LLM that provides an OpenAI-compatible REST API interface. Your custom service must handle requests and responses in the OpenAI API format. For implementation details and requirements, see the Custom LLM integration guide.
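As a rough sketch of what "OpenAI API format" means for responses, the snippet below wraps generated text in the common OpenAI chat-completion envelope. The exact fields your service must return are defined by the Custom LLM integration guide; this shape mirrors the public OpenAI format and is an assumption, not Agora's specification.

```python
import time
import uuid

def make_chat_completion(content: str, model: str = "custom-llm") -> dict:
    """Wrap generated text in an OpenAI-compatible chat completion response.

    Sketch of the common OpenAI response envelope; verify required fields
    against the Custom LLM integration guide.
    """
    return {
        "id": f"chatcmpl-{uuid.uuid4().hex[:12]}",  # arbitrary unique ID
        "object": "chat.completion",
        "created": int(time.time()),                 # Unix timestamp
        "model": model,
        "choices": [
            {
                "index": 0,
                "message": {"role": "assistant", "content": content},
                "finish_reason": "stop",
            }
        ],
    }
```

Your custom service would return a body like this (or the equivalent streamed chunks) from its completion endpoint so the engine can parse it exactly as it parses an OpenAI response.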