Insights: crmne/ruby_llm
Overview
- 0 Merged pull requests
- 2 Open pull requests
- 5 Closed issues
- 1 New issue
2 Pull requests opened by 2 people
- Fix array parameter handling in tool declarations for Gemini and OpenAI (#358, opened Aug 21, 2025)
- Adding option for configuring custom log Regexp timeout (#364, opened Aug 23, 2025)
5 Issues closed by 2 people
- [BUG] OpenAI Reasoning effort parameter not working with .with_params (#357, closed Aug 20, 2025)
- [BUG] Schema Issues with Gemini (#354, closed Aug 20, 2025)
- [FEATURE] Config default temperature (#348, closed Aug 17, 2025)
- [BUG] Use model's default temperature if not set (#349, closed Aug 17, 2025)
1 Issue opened by 1 person
- [BUG] RubyLLM flattens references in schema (RubyLLM::Schema integration) (#362, opened Aug 22, 2025)
6 Unresolved conversations
Sometimes conversations happen on old items that aren’t yet closed. Here is a list of all the Issues and Pull Requests with unresolved conversations.
- [BUG] Faraday Retry incompatible with OpenAI (#341, commented on Aug 19, 2025 • 0 new comments)
- [FEATURE] Add support for xAI (#308, commented on Aug 22, 2025 • 0 new comments)
- Azure OpenAI support (#15, commented on Aug 22, 2025 • 0 new comments)
- [FEATURE] Have custom Regexp timeout configuration for logging (#331, commented on Aug 23, 2025 • 0 new comments)
- Improve `RubyLLM::Chat#with_params` (#265) by allowing to override default params (#303, commented on Aug 19, 2025 • 0 new comments)
- Allow RubyLLM::Schema to be used for Tool schema (OpenAI, Anthropic and Gemini) (#337, commented on Aug 19, 2025 • 0 new comments)