
release: 3.2.0 #590

Closed
2 changes: 1 addition & 1 deletion .github/workflows/create-releases.yml
@@ -73,7 +73,7 @@ jobs:
export -- GPG_SIGNING_KEY_ID
printenv -- GPG_SIGNING_KEY | gpg --batch --passphrase-fd 3 --import 3<<< "$GPG_SIGNING_PASSWORD"
GPG_SIGNING_KEY_ID="$(gpg --with-colons --list-keys | awk -F : -- '/^pub:/ { getline; print "0x" substr($10, length($10) - 7) }')"
./gradlew publishAndReleaseToMavenCentral -Dorg.gradle.jvmargs="-Xmx8g" --stacktrace -PmavenCentralUsername="$SONATYPE_USERNAME" -PmavenCentralPassword="$SONATYPE_PASSWORD" --no-configuration-cache
./gradlew publishAndReleaseToMavenCentral --stacktrace -PmavenCentralUsername="$SONATYPE_USERNAME" -PmavenCentralPassword="$SONATYPE_PASSWORD" --no-configuration-cache
env:
SONATYPE_USERNAME: ${{ secrets.OPENAI_SONATYPE_USERNAME || secrets.SONATYPE_USERNAME }}
SONATYPE_PASSWORD: ${{ secrets.OPENAI_SONATYPE_PASSWORD || secrets.SONATYPE_PASSWORD }}
2 changes: 1 addition & 1 deletion .github/workflows/publish-sonatype.yml
@@ -60,7 +60,7 @@ jobs:
export -- GPG_SIGNING_KEY_ID
printenv -- GPG_SIGNING_KEY | gpg --batch --passphrase-fd 3 --import 3<<< "$GPG_SIGNING_PASSWORD"
GPG_SIGNING_KEY_ID="$(gpg --with-colons --list-keys | awk -F : -- '/^pub:/ { getline; print "0x" substr($10, length($10) - 7) }')"
./gradlew publishAndReleaseToMavenCentral -Dorg.gradle.jvmargs="-Xmx8g" --stacktrace -PmavenCentralUsername="$SONATYPE_USERNAME" -PmavenCentralPassword="$SONATYPE_PASSWORD" --no-configuration-cache
./gradlew publishAndReleaseToMavenCentral --stacktrace -PmavenCentralUsername="$SONATYPE_USERNAME" -PmavenCentralPassword="$SONATYPE_PASSWORD" --no-configuration-cache
env:
SONATYPE_USERNAME: ${{ secrets.OPENAI_SONATYPE_USERNAME || secrets.SONATYPE_USERNAME }}
SONATYPE_PASSWORD: ${{ secrets.OPENAI_SONATYPE_PASSWORD || secrets.SONATYPE_PASSWORD }}
2 changes: 1 addition & 1 deletion .release-please-manifest.json
@@ -1,3 +1,3 @@
{
".": "3.1.2"
".": "3.2.0"
}
8 changes: 4 additions & 4 deletions .stats.yml
@@ -1,4 +1,4 @@
configured_endpoints: 111
openapi_spec_url: https://storage.googleapis.com/stainless-sdk-openapi-specs/openai%2Fopenai-7ef7a457c3bf05364e66e48c9ca34f31bfef1f6c9b7c15b1812346105e0abb16.yml
openapi_spec_hash: a2b1f5d8fbb62175c93b0ebea9f10063
config_hash: 4870312b04f48fd717ea4151053e7fb9
configured_endpoints: 119
openapi_spec_url: https://storage.googleapis.com/stainless-sdk-openapi-specs/openai%2Fopenai-ddbdf9343316047e8a773c54fb24e4a8d225955e202a1888fde6f9c8898ebf98.yml
openapi_spec_hash: 9802f6dd381558466c897f6e387e06ca
config_hash: fe0ea26680ac2075a6cd66416aefe7db
18 changes: 18 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,23 @@
# Changelog

## 3.2.0 (2025-08-22)

Full Changelog: [v3.1.2...v3.2.0](https://github.com/openai/openai-java/compare/v3.1.2...v3.2.0)

### Features

* **api:** Add connectors support for MCP tool ([ee175a9](https://github.com/openai/openai-java/commit/ee175a98a1ed4f1e939c301dd5bf62f10e367c6c))
* **api:** adding support for /v1/conversations to the API ([9d088c5](https://github.com/openai/openai-java/commit/9d088c5d77c450a2e1371a8293e9c05ebf102937))


### Chores

* add missing delegate methods ([557e9ee](https://github.com/openai/openai-java/commit/557e9ee897e11c95788300922e8094383eb1dd97))
* **ci:** reduce log noise ([9e91952](https://github.com/openai/openai-java/commit/9e9195269a225cdbaba81e2dd6ee26caba407d00))
* **client:** refactor closing / shutdown ([94cdfcd](https://github.com/openai/openai-java/commit/94cdfcdf57225ee606acf51ffe3e0b6a4321f7e6))
* **internal:** support running formatters directly ([6242da5](https://github.com/openai/openai-java/commit/6242da594134e736470c006bc1af6b38ce757705))
* remove memory upper bound from publishing step ([fdc5fdd](https://github.com/openai/openai-java/commit/fdc5fdd3969d2ecc3d6df1546b34167b63074991))

## 3.1.2 (2025-08-20)

Full Changelog: [v3.1.1...v3.1.2](https://github.com/openai/openai-java/compare/v3.1.1...v3.1.2)
14 changes: 7 additions & 7 deletions README.md
@@ -2,16 +2,16 @@

<!-- x-release-please-start-version -->

[![Maven Central](https://img.shields.io/maven-central/v/com.openai/openai-java)](https://central.sonatype.com/artifact/com.openai/openai-java/3.1.2)
[![javadoc](https://javadoc.io/badge2/com.openai/openai-java/3.1.2/javadoc.svg)](https://javadoc.io/doc/com.openai/openai-java/3.1.2)
[![Maven Central](https://img.shields.io/maven-central/v/com.openai/openai-java)](https://central.sonatype.com/artifact/com.openai/openai-java/3.2.0)
[![javadoc](https://javadoc.io/badge2/com.openai/openai-java/3.2.0/javadoc.svg)](https://javadoc.io/doc/com.openai/openai-java/3.2.0)

<!-- x-release-please-end -->

The OpenAI Java SDK provides convenient access to the [OpenAI REST API](https://platform.openai.com/docs) from applications written in Java.

<!-- x-release-please-start-version -->

The REST API documentation can be found on [platform.openai.com](https://platform.openai.com/docs). Javadocs are available on [javadoc.io](https://javadoc.io/doc/com.openai/openai-java/3.1.2).
The REST API documentation can be found on [platform.openai.com](https://platform.openai.com/docs). Javadocs are available on [javadoc.io](https://javadoc.io/doc/com.openai/openai-java/3.2.0).

<!-- x-release-please-end -->

@@ -24,7 +24,7 @@ The REST API documentation can be found on [platform.openai.com](https://platfor
### Gradle

```kotlin
implementation("com.openai:openai-java:3.1.2")
implementation("com.openai:openai-java:3.2.0")
```

### Maven
@@ -33,7 +33,7 @@ implementation("com.openai:openai-java:3.1.2")
<dependency>
<groupId>com.openai</groupId>
<artifactId>openai-java</artifactId>
<version>3.1.2</version>
<version>3.2.0</version>
</dependency>
```

@@ -1330,7 +1330,7 @@ If you're using Spring Boot, then you can use the SDK's [Spring Boot starter](ht
#### Gradle

```kotlin
implementation("com.openai:openai-java-spring-boot-starter:3.1.2")
implementation("com.openai:openai-java-spring-boot-starter:3.2.0")
```

#### Maven
@@ -1339,7 +1339,7 @@ implementation("com.openai:openai-java-spring-boot-starter:3.1.2")
<dependency>
<groupId>com.openai</groupId>
<artifactId>openai-java-spring-boot-starter</artifactId>
<version>3.1.2</version>
<version>3.2.0</version>
</dependency>
```

2 changes: 1 addition & 1 deletion build.gradle.kts
@@ -8,7 +8,7 @@ repositories {

allprojects {
group = "com.openai"
version = "3.1.2" // x-release-please-version
version = "3.2.0" // x-release-please-version
}

subprojects {
OpenAIOkHttpClient.kt
@@ -128,6 +128,8 @@ class OpenAIOkHttpClient private constructor() {
* The executor to use for running [AsyncStreamResponse.Handler] callbacks.
*
* Defaults to a dedicated cached thread pool.
*
* This class takes ownership of the executor and shuts it down, if possible, when closed.
*/
fun streamHandlerExecutor(streamHandlerExecutor: Executor) = apply {
clientOptions.streamHandlerExecutor(streamHandlerExecutor)
OpenAIOkHttpClientAsync.kt
@@ -128,6 +128,8 @@ class OpenAIOkHttpClientAsync private constructor() {
* The executor to use for running [AsyncStreamResponse.Handler] callbacks.
*
* Defaults to a dedicated cached thread pool.
*
* This class takes ownership of the executor and shuts it down, if possible, when closed.
*/
fun streamHandlerExecutor(streamHandlerExecutor: Executor) = apply {
clientOptions.streamHandlerExecutor(streamHandlerExecutor)
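Reviewer note: both builders above now document that the client takes ownership of a caller-supplied stream handler executor. A minimal sketch of the user-facing effect, assuming the usual `fromEnv()` builder setup (the handler work itself is elided):

```kotlin
import com.openai.client.okhttp.OpenAIOkHttpClient
import java.util.concurrent.Executors

fun main() {
    // Caller-supplied executor for AsyncStreamResponse.Handler callbacks.
    val handlerExecutor = Executors.newFixedThreadPool(2)

    val client = OpenAIOkHttpClient.builder()
        .fromEnv() // reads OPENAI_API_KEY etc. from the environment
        .streamHandlerExecutor(handlerExecutor)
        .build()

    try {
        // ... issue streaming requests; handler callbacks run on handlerExecutor ...
    } finally {
        // With this change, closing the client also shuts down the executor it now
        // owns, so a separate handlerExecutor.shutdown() is no longer needed.
        client.close()
    }
}
```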
OpenAIClient.kt
@@ -9,6 +9,7 @@ import com.openai.services.blocking.BetaService
import com.openai.services.blocking.ChatService
import com.openai.services.blocking.CompletionService
import com.openai.services.blocking.ContainerService
import com.openai.services.blocking.ConversationService
import com.openai.services.blocking.EmbeddingService
import com.openai.services.blocking.EvalService
import com.openai.services.blocking.FileService
@@ -91,6 +92,8 @@ interface OpenAIClient {

fun responses(): ResponseService

fun conversations(): ConversationService

fun evals(): EvalService

fun containers(): ContainerService
@@ -150,6 +153,8 @@ interface OpenAIClient {

fun responses(): ResponseService.WithRawResponse

fun conversations(): ConversationService.WithRawResponse

fun evals(): EvalService.WithRawResponse

fun containers(): ContainerService.WithRawResponse
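With `conversations()` now exposed on both `OpenAIClient` and `OpenAIClientAsync`, callers reach the new `/v1/conversations` endpoints through the client. A rough sketch of the blocking accessor — the commented `create` call and `ConversationCreateParams` name are assumptions based on how the other generated services are shaped, not confirmed API:

```kotlin
import com.openai.client.OpenAIClient
import com.openai.client.okhttp.OpenAIOkHttpClient

fun main() {
    // Configured from OPENAI_API_KEY / OPENAI_ORG_ID etc. in the environment.
    val client: OpenAIClient = OpenAIOkHttpClient.fromEnv()

    // New top-level accessor added in this release.
    val conversations = client.conversations()

    // Hypothetical call, assuming the service follows the usual
    // create(<Resource>CreateParams) convention used by the other services:
    // val conversation = conversations.create(ConversationCreateParams.builder().build())

    client.close()
}
```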
OpenAIClientAsync.kt
@@ -9,6 +9,7 @@ import com.openai.services.async.BetaServiceAsync
import com.openai.services.async.ChatServiceAsync
import com.openai.services.async.CompletionServiceAsync
import com.openai.services.async.ContainerServiceAsync
import com.openai.services.async.ConversationServiceAsync
import com.openai.services.async.EmbeddingServiceAsync
import com.openai.services.async.EvalServiceAsync
import com.openai.services.async.FileServiceAsync
@@ -91,6 +92,8 @@ interface OpenAIClientAsync {

fun responses(): ResponseServiceAsync

fun conversations(): ConversationServiceAsync

fun evals(): EvalServiceAsync

fun containers(): ContainerServiceAsync
@@ -152,6 +155,8 @@ interface OpenAIClientAsync {

fun responses(): ResponseServiceAsync.WithRawResponse

fun conversations(): ConversationServiceAsync.WithRawResponse

fun evals(): EvalServiceAsync.WithRawResponse

fun containers(): ContainerServiceAsync.WithRawResponse
OpenAIClientAsyncImpl.kt
@@ -16,6 +16,8 @@ import com.openai.services.async.CompletionServiceAsync
import com.openai.services.async.CompletionServiceAsyncImpl
import com.openai.services.async.ContainerServiceAsync
import com.openai.services.async.ContainerServiceAsyncImpl
import com.openai.services.async.ConversationServiceAsync
import com.openai.services.async.ConversationServiceAsyncImpl
import com.openai.services.async.EmbeddingServiceAsync
import com.openai.services.async.EmbeddingServiceAsyncImpl
import com.openai.services.async.EvalServiceAsync
@@ -117,6 +119,10 @@ class OpenAIClientAsyncImpl(private val clientOptions: ClientOptions) : OpenAICl
ResponseServiceAsyncImpl(clientOptionsWithUserAgent)
}

private val conversations: ConversationServiceAsync by lazy {
ConversationServiceAsyncImpl(clientOptionsWithUserAgent)
}

private val evals: EvalServiceAsync by lazy { EvalServiceAsyncImpl(clientOptionsWithUserAgent) }

private val containers: ContainerServiceAsync by lazy {
@@ -162,11 +168,13 @@ class OpenAIClientAsyncImpl(private val clientOptions: ClientOptions) : OpenAICl

override fun responses(): ResponseServiceAsync = responses

override fun conversations(): ConversationServiceAsync = conversations

override fun evals(): EvalServiceAsync = evals

override fun containers(): ContainerServiceAsync = containers

override fun close() = clientOptions.httpClient.close()
override fun close() = clientOptions.close()

class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) :
OpenAIClientAsync.WithRawResponse {
@@ -235,6 +243,10 @@ class OpenAIClientAsyncImpl(private val clientOptions: ClientOptions) : OpenAICl
ResponseServiceAsyncImpl.WithRawResponseImpl(clientOptions)
}

private val conversations: ConversationServiceAsync.WithRawResponse by lazy {
ConversationServiceAsyncImpl.WithRawResponseImpl(clientOptions)
}

private val evals: EvalServiceAsync.WithRawResponse by lazy {
EvalServiceAsyncImpl.WithRawResponseImpl(clientOptions)
}
@@ -282,6 +294,8 @@ class OpenAIClientAsyncImpl(private val clientOptions: ClientOptions) : OpenAICl

override fun responses(): ResponseServiceAsync.WithRawResponse = responses

override fun conversations(): ConversationServiceAsync.WithRawResponse = conversations

override fun evals(): EvalServiceAsync.WithRawResponse = evals

override fun containers(): ContainerServiceAsync.WithRawResponse = containers
OpenAIClientImpl.kt
@@ -16,6 +16,8 @@ import com.openai.services.blocking.CompletionService
import com.openai.services.blocking.CompletionServiceImpl
import com.openai.services.blocking.ContainerService
import com.openai.services.blocking.ContainerServiceImpl
import com.openai.services.blocking.ConversationService
import com.openai.services.blocking.ConversationServiceImpl
import com.openai.services.blocking.EmbeddingService
import com.openai.services.blocking.EmbeddingServiceImpl
import com.openai.services.blocking.EvalService
@@ -103,6 +105,10 @@ class OpenAIClientImpl(private val clientOptions: ClientOptions) : OpenAIClient
ResponseServiceImpl(clientOptionsWithUserAgent)
}

private val conversations: ConversationService by lazy {
ConversationServiceImpl(clientOptionsWithUserAgent)
}

private val evals: EvalService by lazy { EvalServiceImpl(clientOptionsWithUserAgent) }

private val containers: ContainerService by lazy {
@@ -148,11 +154,13 @@ class OpenAIClientImpl(private val clientOptions: ClientOptions) : OpenAIClient

override fun responses(): ResponseService = responses

override fun conversations(): ConversationService = conversations

override fun evals(): EvalService = evals

override fun containers(): ContainerService = containers

override fun close() = clientOptions.httpClient.close()
override fun close() = clientOptions.close()

class WithRawResponseImpl internal constructor(private val clientOptions: ClientOptions) :
OpenAIClient.WithRawResponse {
@@ -221,6 +229,10 @@ class OpenAIClientImpl(private val clientOptions: ClientOptions) : OpenAIClient
ResponseServiceImpl.WithRawResponseImpl(clientOptions)
}

private val conversations: ConversationService.WithRawResponse by lazy {
ConversationServiceImpl.WithRawResponseImpl(clientOptions)
}

private val evals: EvalService.WithRawResponse by lazy {
EvalServiceImpl.WithRawResponseImpl(clientOptions)
}
@@ -268,6 +280,8 @@ class OpenAIClientImpl(private val clientOptions: ClientOptions) : OpenAIClient

override fun responses(): ResponseService.WithRawResponse = responses

override fun conversations(): ConversationService.WithRawResponse = conversations

override fun evals(): EvalService.WithRawResponse = evals

override fun containers(): ContainerService.WithRawResponse = containers
ClientOptions.kt
@@ -19,6 +19,7 @@ import java.time.Clock
import java.time.Duration
import java.util.Optional
import java.util.concurrent.Executor
import java.util.concurrent.ExecutorService
import java.util.concurrent.Executors
import java.util.concurrent.ThreadFactory
import java.util.concurrent.atomic.AtomicLong
@@ -32,6 +33,8 @@
* The HTTP client to use in the SDK.
*
* Use the one published in `openai-java-client-okhttp` or implement your own.
*
* This class takes ownership of the client and closes it when closed.
*/
@get:JvmName("httpClient") val httpClient: HttpClient,
/**
@@ -53,6 +56,8 @@
* The executor to use for running [AsyncStreamResponse.Handler] callbacks.
*
* Defaults to a dedicated cached thread pool.
*
* This class takes ownership of the executor and shuts it down, if possible, when closed.
*/
@get:JvmName("streamHandlerExecutor") val streamHandlerExecutor: Executor,
/**
@@ -196,6 +201,8 @@ private constructor(
* The HTTP client to use in the SDK.
*
* Use the one published in `openai-java-client-okhttp` or implement your own.
*
* This class takes ownership of the client and closes it when closed.
*/
fun httpClient(httpClient: HttpClient) = apply {
this.httpClient = PhantomReachableClosingHttpClient(httpClient)
@@ -224,9 +231,14 @@ private constructor(
* The executor to use for running [AsyncStreamResponse.Handler] callbacks.
*
* Defaults to a dedicated cached thread pool.
*
* This class takes ownership of the executor and shuts it down, if possible, when closed.
*/
fun streamHandlerExecutor(streamHandlerExecutor: Executor) = apply {
this.streamHandlerExecutor = streamHandlerExecutor
this.streamHandlerExecutor =
if (streamHandlerExecutor is ExecutorService)
PhantomReachableExecutorService(streamHandlerExecutor)
else streamHandlerExecutor
}

/**
@@ -554,4 +566,19 @@
)
}
}

/**
* Closes these client options, relinquishing any underlying resources.
*
* This is purposefully not inherited from [AutoCloseable] because the client options are
* long-lived and usually should not be synchronously closed via try-with-resources.
*
* It's also usually not necessary to call this method at all. The default client automatically
* releases threads and connections if they remain idle, but if you are writing an application
* that needs to aggressively release unused resources, then you may call this method.
*/
fun close() {
httpClient.close()
(streamHandlerExecutor as? ExecutorService)?.shutdown()
}
}
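The new `ClientOptions.close()` above is what `OpenAIClientImpl.close()` and `OpenAIClientAsyncImpl.close()` now delegate to: it closes the owned `HttpClient` and, when the stream handler executor is an `ExecutorService`, shuts that down as well. A small sketch of the user-facing effect, with the builder setup assumed as usual:

```kotlin
import com.openai.client.okhttp.OpenAIOkHttpClient

fun main() {
    val client = OpenAIOkHttpClient.fromEnv()

    // ... use the client for as long as the application needs it ...

    // Optional: per the new doc comment, idle threads and connections are released
    // automatically, so an explicit close() is only worthwhile for applications that
    // want to free resources aggressively (for example, on shutdown). After this call
    // the owned HTTP client and stream handler executor are released.
    client.close()
}
```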