
Google Vertex AI

Google Vertex AI provides enterprise-grade access to Google's generative AI models with enhanced security, scaling capabilities, and integration with Google Cloud services.

Sample configuration

The following example shows a starting llm parameter configuration you can use when you Start a conversational AI agent.


"llm": {
    "url": "https://{region}-aiplatform.googleapis.com/v1/projects/{project}/locations/{region}/publishers/google/models/{model}:{resource}",
    "api_key": "$(gcloud auth print-access-token)",
    "system_messages": [
        {
            "role": "user",
            "parts": [ {"text": "You are a helpful chatbot."} ]
        }
    ],
    "max_history": 32,
    "greeting_message": "Good to see you!",
    "failure_message": "Hold on a second.",
    "params": {
        "model": "gemini-2.0-flash-001"
    },
    "style": "gemini"
}
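
For example, with the placeholders filled in, the url resolves to an endpoint like the following. The region, project ID, and streamGenerateContent resource shown here are illustrative; substitute your own values.

    https://us-central1-aiplatform.googleapis.com/v1/projects/my-project/locations/us-central1/publishers/google/models/gemini-2.0-flash-001:streamGenerateContent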

Key parameters

  • api_key: Refer to Google Cloud REST authentication to get your GCP credentials. The sample configuration uses a short-lived access token from the gcloud CLI; see the sketch after this list for a programmatic alternative.
  • url: Use the Vertex AI endpoint with your Google Cloud project ID and region in the URL path. Refer to Google Vertex AI API documentation for details.
  • model: Refer to Vertex AI models for available models.
  • system_messages: Use a parts array containing text objects instead of a simple content string.
  • style: Set to "gemini" to use Gemini's message format.
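
If you prefer to mint the access token programmatically rather than shelling out to the gcloud CLI, the following Python sketch shows one way to do it with the google-auth library and assemble the corresponding pieces of the llm configuration. The variable names and the streamGenerateContent resource are illustrative assumptions, not part of the product API.

    import google.auth
    import google.auth.transport.requests

    # Obtain Application Default Credentials scoped for Google Cloud APIs.
    credentials, project_id = google.auth.default(
        scopes=["https://www.googleapis.com/auth/cloud-platform"]
    )

    # Refresh to mint a short-lived OAuth 2.0 access token, equivalent to
    # running `gcloud auth print-access-token` on the command line.
    credentials.refresh(google.auth.transport.requests.Request())

    # Illustrative values; replace with your own region, model, and resource.
    region = "us-central1"
    model = "gemini-2.0-flash-001"
    resource = "streamGenerateContent"

    llm_config = {
        "url": (
            f"https://{region}-aiplatform.googleapis.com/v1/projects/{project_id}"
            f"/locations/{region}/publishers/google/models/{model}:{resource}"
        ),
        "api_key": credentials.token,
        "params": {"model": model},
        "style": "gemini",
    }

Note that the access token is short-lived (typically about an hour), so refresh it shortly before you start the agent.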

For advanced configuration options, model capabilities, and detailed parameter descriptions, see the Google Vertex AI API documentation.