Description
**Please make sure you read the contribution guide and file the issue in the right place.**
Describe the bug
The LLM registry for Claude has `max_tokens` hardcoded instead of being fetched from the generation config, so output is limited to 1024 tokens, which is far too low. Streaming also needs to be enabled to take full advantage of Claude models.
To Reproduce
Steps to reproduce the behavior:
1. Set any value for max output tokens in the generation config for an LLM agent.
2. Run the agent against a Claude model.
3. Observe that at most 1024 output tokens are returned, regardless of the configured value.
Expected behavior
The max output tokens value set in the generation config should be honored by Claude models instead of being capped at the hardcoded 1024-token limit.
Desktop (please complete the following information):
- OS: Windows 11
- Python version (`python -V`): 3.12.10
- ADK version (`pip show google-adk`): 1.4.1