Claude models: max tokens not fetched from generation config #1584

@kishorekamaleshn

Description


Describe the bug
The LLM registry for Claude has max tokens hardcoded instead of fetching it from the generation config, so output is limited to 1024 tokens, which is far too low. Streaming also needs to be enabled to take full advantage of Claude models.
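A minimal sketch of the fix being requested: prefer the caller's generation config over the hardcoded default. The names `GenerationConfig` and `resolve_max_tokens` are illustrative, not the actual ADK API.

```python
# Hypothetical sketch: resolve max tokens from the generation config
# instead of always using a hardcoded constant. Not the real ADK code.
from dataclasses import dataclass
from typing import Optional

DEFAULT_MAX_TOKENS = 1024  # the hardcoded limit reported in this issue


@dataclass
class GenerationConfig:
    max_output_tokens: Optional[int] = None


def resolve_max_tokens(config: Optional[GenerationConfig]) -> int:
    """Use max_output_tokens from the generation config when set,
    falling back to the hardcoded default only when it is absent."""
    if config is not None and config.max_output_tokens:
        return config.max_output_tokens
    return DEFAULT_MAX_TOKENS
```

With this resolution order, setting `max_output_tokens` in the agent's generation config would take effect instead of being silently capped at 1024.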

To Reproduce
Steps to reproduce the behavior:
Set any value for max output tokens in the generation config of an LLM agent; the Claude model's output is still limited to 1024 tokens.

Expected behavior
The number of output tokens produced by Claude models should respect the max output tokens value set in the generation config.

Desktop (please complete the following information):

  • OS: Windows 11
  • Python version (python -V): 3.12.10
  • ADK version (pip show google-adk): 1.4.1


Labels

bot triaged: [Bot] This issue is triaged by ADK bot
models: [Component] Issues related to model support
