
Google Vertex AI

Google Vertex AI provides enterprise-grade access to Google's generative AI models with enhanced security, scaling capabilities, and integration with Google Cloud services.

Sample configuration

The following example shows a starting llm parameter configuration you can use when you Start a conversational AI agent.

"llm": {
  "url": "https://{region}-aiplatform.googleapis.com/v1/projects/{project}/locations/{region}/publishers/google/models/{model}:{resource}",
  "api_key": "$(gcloud auth print-access-token)",
  "system_messages": [
    {
      "role": "user",
      "parts": [ {"text": "You are a helpful chatbot."} ]
    }
  ],
  "max_history": 32,
  "greeting_message": "Good to see you!",
  "failure_message": "Hold on a second.",
  "params": {
    "model": "gemini-2.0-flash-001"
  },
  "style": "gemini"
}
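
For reference, here is how the url placeholders might resolve. The project ID (my-gcp-project), region (us-central1), and resource (generateContent) below are illustrative values, not defaults; substitute your own project, region, and the resource your agent calls.

"url": "https://us-central1-aiplatform.googleapis.com/v1/projects/my-gcp-project/locations/us-central1/publishers/google/models/gemini-2.0-flash-001:generateContent"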

Key parameters

llm (required)
  • api_key string (required)

    Refer to Google Cloud REST authentication to get your GCP credentials. The sample configuration above passes a short-lived access token obtained with gcloud auth print-access-token.

  • url string (required)

    Use the Vertex AI endpoint with your Google Cloud project ID and region in the URL path; a resolved example appears after the sample configuration above. Refer to Google Vertex AI API documentation for details.

  • system_messages array[object] (nullable)

    Use a parts array containing text objects instead of a simple content string; see the example after this list.

  • style string (required)

    Set to gemini to use Gemini's message format.

  • params object (required)

    Model parameters, such as the model name shown in the sample configuration above.
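
The following is a minimal sketch of the Gemini-style message shape expected in system_messages; the instruction text is illustrative. It uses a parts array of text objects rather than the flat content string other LLM styles typically use (shown second, for contrast only):

"system_messages": [
  {
    "role": "user",
    "parts": [ { "text": "You are a helpful chatbot." } ]
  }
]

{ "role": "system", "content": "You are a helpful chatbot." }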

For advanced configuration options, model capabilities, and detailed parameter descriptions, see the Google Vertex AI API documentation.