
ADK 0.5.0: LlmAgent with Gemini model fails Vertex AI ADC auth despite GOOGLE_GENAI_USE_VERTEXAI=TRUE #718

@ilteris

Description


Describe the bug

When using google-adk==0.5.0, an LlmAgent configured with google.adk.models.google_llm.Gemini (by passing a model name string as the model parameter) fails to authenticate with Vertex AI using Application Default Credentials (ADC) in a Google Cloud Run environment. This happens even when the environment variables GOOGLE_GENAI_USE_VERTEXAI=TRUE, GOOGLE_CLOUD_PROJECT, and GOOGLE_CLOUD_LOCATION are correctly set. The agent's call to the Gemini model fails with ValueError: Project and location or API key must be set when using the Vertex AI API. The error originates from the underlying google-genai SDK (the google/genai/_api_client.py frame in the traceback), suggesting it is not recognizing or using the ADC and environment context for Vertex AI mode.

Passing project and location arguments directly to the Gemini() constructor does not resolve the issue either. The problem prevents the use of Gemini models via Vertex AI with service-account authentication for LlmAgent in this ADK version.
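
For reference, the validation that raises this error can be sketched as pure logic. This is a hypothetical mirror of the check in google/genai/_api_client.py written from the error message and the `_api_backend` property in the traceback, not the actual SDK source:

```python
def resolve_backend(env: dict) -> str:
    """Hypothetical mirror of the google-genai client's backend check:
    Vertex mode requires project + location (or an API key)."""
    use_vertex = env.get("GOOGLE_GENAI_USE_VERTEXAI", "").lower() in ("true", "1")
    project = env.get("GOOGLE_CLOUD_PROJECT")
    location = env.get("GOOGLE_CLOUD_LOCATION")
    api_key = env.get("GOOGLE_API_KEY")

    if use_vertex:
        if (project and location) or api_key:
            return "vertex"
        # This is the failure seen in the Cloud Run logs, even though
        # the env vars were set on the service.
        raise ValueError(
            "Project and location or API key must be set when using the Vertex AI API."
        )
    if api_key:
        return "ml_dev"
    raise ValueError("API key must be set when not using Vertex AI.")


# With all three env vars set, Vertex mode should resolve cleanly:
env = {
    "GOOGLE_GENAI_USE_VERTEXAI": "TRUE",
    "GOOGLE_CLOUD_PROJECT": "my-project",       # placeholder
    "GOOGLE_CLOUD_LOCATION": "us-central1",     # placeholder
}
print(resolve_backend(env))  # → vertex
```

By this logic the reported configuration should resolve to the Vertex backend, which is why the runtime ValueError suggests the env vars are not reaching the client.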

To Reproduce

Steps to reproduce the behavior:

  1. Install google-adk==0.5.0, which pulls in the google-genai SDK it depends on (the google.genai package seen in the traceback).
  2. Create an LlmAgent (e.g., QuestionAnsweringAgent in our case) within a FastAPI application.
    • In the agent's __init__, pass a Gemini model name string (e.g., "gemini-1.0-pro" or a Google AI Studio model name like "gemini-2.5-pro-preview-05-06") as the model parameter to super().__init__().
    # Example from QuestionAnsweringAgent.__init__
    model_name_from_config = os.environ.get("GEMINI_MODEL_NAME")
    super().__init__(model=model_name_from_config, ...)
  3. Deploy this FastAPI application to Google Cloud Run.
  4. Set the following environment variables for the Cloud Run service:
    • GOOGLE_GENAI_USE_VERTEXAI=TRUE
    • GOOGLE_CLOUD_PROJECT="[YOUR_GCP_PROJECT_ID]"
    • GOOGLE_CLOUD_LOCATION="[YOUR_GCP_REGION_FOR_VERTEX_AI]" (e.g., us-central1)
    • GEMINI_MODEL_NAME="[MODEL_NAME_STRING_AS_ABOVE]"
    • Ensure GOOGLE_API_KEY is not set.
  5. Invoke an endpoint in the FastAPI application that causes the LlmAgent to make a call to the Gemini model (e.g., agent_runner.run_async(...)).
  6. Observe the Cloud Run logs for the error.
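
The environment diagnostics shown in the log snippets below can be reproduced with a small stdlib helper (a sketch; the logger name mirrors the one in the logs, and the returned dict is just for convenience):

```python
import logging
import os

logger = logging.getLogger("agents.question_answering_agent")

def log_vertex_env() -> dict:
    """Log the env vars the Vertex AI ADC path depends on, and return
    them so startup code can assert on the configuration."""
    keys = (
        "GOOGLE_GENAI_USE_VERTEXAI",
        "GOOGLE_CLOUD_PROJECT",
        "GOOGLE_CLOUD_LOCATION",
        "GOOGLE_API_KEY",
        "GEMINI_MODEL_NAME",
    )
    values = {k: os.environ.get(k) for k in keys}
    for k, v in values.items():
        logger.info("%s from env: %s", k, v if v is not None else "Not set")
    return values
```

Running this inside the Cloud Run container confirms whether the variables are visible to the Python process at all, which separates "env vars not set" from "env vars set but ignored by the SDK".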

Expected behavior

The LlmAgent, configured with a Gemini model string and the specified environment variables (GOOGLE_GENAI_USE_VERTEXAI=TRUE, GOOGLE_CLOUD_PROJECT, GOOGLE_CLOUD_LOCATION), should successfully authenticate to Vertex AI using the Cloud Run service's Application Default Credentials. It should make calls to the specified Gemini model via the Vertex AI backend without requiring an explicit API key.

Screenshots

  • Cloud Run Log Snippet showing the error:
    ERROR:main_api:Error during QA agent runner execution or streaming for video_id ...: Project and location or API key must be set when using the Vertex AI API.
    Traceback (most recent call last): 
      File "/app/main_api.py", line 203, in event_generator 
        async for event in qa_agent_runner.run_async( 
      File "/usr/local/lib/python3.11/site-packages/google/adk/runners.py", line 197, in run_async 
        async for event in invocation_context.agent.run_async(invocation_context): 
      File "/usr/local/lib/python3.11/site-packages/google/adk/agents/base_agent.py", line 133, in run_async 
        async for event in self._run_async_impl(ctx): 
      File "/usr/local/lib/python3.11/site-packages/google/adk/agents/llm_agent.py", line 246, in _run_async_impl 
        async for event in self._llm_flow.run_async(ctx): 
      # ... further ADK internal calls ...
      File "/usr/local/lib/python3.11/site-packages/google/adk/models/google_llm.py", line 86, in generate_content_async 
        self._api_backend,
      File "/usr/local/lib/python3.11/functools.py", line 1001, in __get__
        val = self.func(instance)
      File "/usr/local/lib/python3.11/site-packages/google/adk/models/google_llm.py", line 161, in _api_backend
        return 'vertex' if self.api_client.vertexai else 'ml_dev' # Indicates it knows it should be 'vertex'
      # ... calls leading to google.genai._api_client.py ...
      File "/usr/local/lib/python3.11/site-packages/google/genai/_api_client.py", line 424, in __init__ # Or similar line in your version
        raise ValueError("Project and location or API key must be set when using the Vertex AI API.")
    ValueError: Project and location or API key must be set when using the Vertex AI API.
    
  • Cloud Run Log Snippet showing QuestionAnsweringAgent.__init__ diagnostic logs (confirming env vars are read):
    INFO:agents.question_answering_agent:Initializing QuestionAnsweringAgent: QuestionAnsweringAgent
    INFO:agents.question_answering_agent: Attempting to use GEMINI_MODEL_NAME from config: [YOUR_MODEL_NAME]
    INFO:agents.question_answering_agent: GCP_PROJECT from env: [YOUR_GCP_PROJECT_ID]
    INFO:agents.question_answering_agent: GCP_REGION (GOOGLE_CLOUD_LOCATION) from env: [YOUR_GCP_REGION]
    INFO:agents.question_answering_agent: GOOGLE_GENAI_USE_VERTEXAI from env: TRUE
    INFO:agents.question_answering_agent: GOOGLE_API_KEY from env: Not set
    INFO:agents.question_answering_agent:QuestionAnsweringAgent initialized. LlmAgent was configured with model string: '[YOUR_MODEL_NAME]' and env GOOGLE_GENAI_USE_VERTEXAI='TRUE'
    

Desktop (please complete the following information):

  • OS: macOS (development); Google Cloud Run (deployment)
  • Python version (python -V): Python 3.11
  • ADK version (pip show google-adk): google-adk==0.5.0

Additional context
The issue seems specific to how google.adk.models.google_llm.Gemini (in ADK 0.5.0) interacts with the underlying google-genai SDK when GOOGLE_GENAI_USE_VERTEXAI=TRUE is set. Attempts to pass project and location directly to the Gemini() constructor did not resolve the issue, and neither did explicitly calling google.generativeai.configure(project=...). The failure occurs during the LlmAgent's internal call to the model, not when the Gemini object is instantiated. Other agents in the same project, which may use Gemini models in a different context (e.g., not as a top-level agent run by the ADK Runner in a FastAPI app, or initialized differently), appear to work with ADC, which makes this behavior particularly confusing. The documentation implies that setting GOOGLE_GENAI_USE_VERTEXAI, GOOGLE_CLOUD_PROJECT, and GOOGLE_CLOUD_LOCATION should be sufficient for ADC to work with Vertex AI when passing a model string to LlmAgent.
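
Until the env-var handling is fixed, a preflight check at app startup can fail fast with a clearer message than the deep ValueError above. This is a stdlib-only sketch based on the requirements implied by the SDK's error message:

```python
import os

def preflight_vertex_adc() -> None:
    """Fail fast at startup if the Vertex AI ADC env contract is
    incomplete, instead of failing on the first model call."""
    if os.environ.get("GOOGLE_GENAI_USE_VERTEXAI", "").upper() != "TRUE":
        return  # not in Vertex mode; nothing to check
    missing = [
        key
        for key in ("GOOGLE_CLOUD_PROJECT", "GOOGLE_CLOUD_LOCATION")
        if not os.environ.get(key)
    ]
    if missing and not os.environ.get("GOOGLE_API_KEY"):
        raise RuntimeError(
            "Vertex AI mode is enabled but missing: " + ", ".join(missing)
        )
```

Calling this before constructing the agent does not work around the bug, but it surfaces misconfiguration at deploy time rather than inside the request path.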

Labels

core [Component] This issue is related to the core interface and implementation