147 changes: 75 additions & 72 deletions docs/platforms/javascript/common/configuration/integrations/openai.mdx
@@ -1,6 +1,6 @@
---
title: OpenAI
description: "Adds instrumentation for the OpenAI SDK."
supported:
- javascript.node
- javascript.aws-lambda
@@ -25,66 +25,65 @@ supported:
- javascript.tanstackstart-react
- javascript.cloudflare
- javascript
- javascript.react
- javascript.angular
- javascript.vue
- javascript.svelte
- javascript.solid
- javascript.ember
- javascript.gatsby
---

<PlatformSection notSupported={["javascript", "javascript.react", "javascript.angular", "javascript.vue", "javascript.svelte", "javascript.solid", "javascript.ember", "javascript.gatsby"]}>

## Server-Side Usage

_Import name: `Sentry.openAIIntegration`_

The `openAIIntegration` adds instrumentation for the `openai` SDK to capture spans by wrapping OpenAI client calls and recording LLM interactions.
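Because the integration is enabled by default, it usually needs no integration-specific setup once tracing is on. A minimal sketch, assuming the Node SDK (`@sentry/node`) and a placeholder DSN:

```javascript
import * as Sentry from "@sentry/node";

Sentry.init({
  dsn: "____PUBLIC_DSN____",
  // Tracing must be enabled for OpenAI spans to be captured.
  tracesSampleRate: 1.0,
});
```

Any `openai` client created after `Sentry.init` runs is then instrumented automatically.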

<PlatformSection notSupported={["javascript.nextjs", "javascript.nuxt", "javascript.solidstart", "javascript.sveltekit", "javascript.react-router", "javascript.remix", "javascript.astro", "javascript.tanstackstart-react", "javascript.bun", "javascript.cloudflare"]}>

<Alert>

This integration is enabled by default and automatically captures spans for OpenAI SDK calls. It requires SDK version `10.2.0` or higher.

</Alert>

</PlatformSection>

<PlatformSection supported={["javascript.nextjs", "javascript.nuxt", "javascript.solidstart", "javascript.sveltekit", "javascript.react-router", "javascript.remix", "javascript.astro", "javascript.tanstackstart-react", "javascript.bun", "javascript.cloudflare"]}>

<Alert>
This integration is enabled by default and automatically captures spans for OpenAI SDK calls. It requires SDK version `10.2.0` or higher.

**For browser-side usage**, manual instrumentation is required. See the [setup instructions below](#browser-side-usage).

</Alert>

</PlatformSection>

To customize what data is captured (such as inputs and outputs), see the [Options](#options) in the Configuration section.

</PlatformSection>

<PlatformSection supported={["javascript", "javascript.react", "javascript.angular", "javascript.vue", "javascript.svelte", "javascript.solid", "javascript.ember", "javascript.gatsby", "javascript.nextjs", "javascript.nuxt", "javascript.solidstart", "javascript.sveltekit", "javascript.react-router", "javascript.remix", "javascript.astro", "javascript.tanstackstart-react", "javascript.electron", "javascript.cloudflare", "javascript.bun"]}>

## Browser-Side Usage

_Import name: `Sentry.instrumentOpenAiClient`_

The `instrumentOpenAiClient` helper adds instrumentation for the `openai` SDK, capturing spans by wrapping OpenAI client calls and recording LLM interactions with configurable input and output capture. You need to manually wrap your OpenAI client instance with this helper:

```javascript
import * as Sentry from "@sentry/nextjs";
import OpenAI from "openai";

const openai = new OpenAI({
  apiKey: "your-api-key", // Warning: API key will be exposed in the browser!
});

const client = Sentry.instrumentOpenAiClient(openai, {
  recordInputs: true,
  recordOutputs: true,
});

// Use the wrapped client instead of the original openai instance
const response = await client.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: "Hello!" }],
});
```

To customize what data is captured (such as inputs and outputs), see the [Options](#options) in the Configuration section.

</PlatformSection>

## Configuration

### Options

The following options control what data is captured from OpenAI SDK calls:

#### `recordInputs`

_Type: `boolean` (optional)_

Records inputs to OpenAI SDK calls (such as prompts and messages).

Defaults to `true` if `sendDefaultPii` is `true`.

#### `recordOutputs`

_Type: `boolean` (optional)_

Records outputs from OpenAI SDK calls (such as generated text and responses).

Defaults to `true` if `sendDefaultPii` is `true`.
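Put differently: with `sendDefaultPii: true` both options are on without further configuration, while with PII disabled you can still opt in per integration. A sketch of the two alternative setups (assuming a server-side SDK where `openAIIntegration` is available):

```javascript
// Option A: rely on the PII default — recordInputs and recordOutputs
// both default to true when sendDefaultPii is enabled.
Sentry.init({
  dsn: "____PUBLIC_DSN____",
  tracesSampleRate: 1.0,
  sendDefaultPii: true,
});

// Option B: keep PII off globally, but opt in for OpenAI spans only.
Sentry.init({
  dsn: "____PUBLIC_DSN____",
  tracesSampleRate: 1.0,
  integrations: [
    Sentry.openAIIntegration({ recordInputs: true, recordOutputs: true }),
  ],
});
```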

**Usage**

<PlatformSection notSupported={["javascript", "javascript.react", "javascript.angular", "javascript.vue", "javascript.svelte", "javascript.solid", "javascript.ember", "javascript.gatsby"]}>

**Automatic Instrumentation**

```javascript
Sentry.init({
  dsn: "____PUBLIC_DSN____",
  tracesSampleRate: 1.0, // Required for AI observability
  integrations: [
    Sentry.openAIIntegration({
      // your options here
    }),
  ],
});
```

</PlatformSection>

<PlatformSection supported={["javascript", "javascript.react", "javascript.angular", "javascript.vue", "javascript.svelte", "javascript.solid", "javascript.ember", "javascript.gatsby", "javascript.nextjs", "javascript.nuxt", "javascript.solidstart", "javascript.sveltekit", "javascript.react-router", "javascript.remix", "javascript.astro", "javascript.tanstackstart-react", "javascript.electron", "javascript.cloudflare", "javascript.bun"]}>

**Manual Instrumentation**

```javascript
const client = Sentry.instrumentOpenAiClient(openai, {
  // your options here
});
```

</PlatformSection>

## Supported Operations

By default, tracing support is added to the following OpenAI SDK calls:

- `chat.completions.create()` - Chat completion requests
- `responses.create()` - Responses API requests

Streaming and non-streaming requests are automatically detected and handled appropriately.
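The distinction matters because a streamed response only finishes when the caller consumes the stream. A simplified, self-contained sketch of how a wrapper might handle both cases (an illustration, not Sentry's actual logic — `callWithSpan` and `fakeCreate` are hypothetical names):

```javascript
// Illustrative only: end the "span" immediately for normal calls, but
// only after the stream is fully consumed for streaming calls.
async function callWithSpan(createFn, params, endSpan) {
  const result = await createFn(params);
  if (!params.stream) {
    endSpan({ streaming: false });
    return result;
  }
  // Wrap the async iterable so the span ends when iteration completes.
  return (async function* () {
    for await (const chunk of result) yield chunk;
    endSpan({ streaming: true });
  })();
}

// Stand-in "API" so the sketch runs without any external service:
async function fakeCreate(params) {
  if (!params.stream) return { text: "full response" };
  return (async function* () {
    yield "partial ";
    yield "response";
  })();
}

(async () => {
  const events = [];
  const full = await callWithSpan(fakeCreate, { stream: false }, (e) => events.push(e));
  console.log(full.text); // "full response"

  let text = "";
  const stream = await callWithSpan(fakeCreate, { stream: true }, (e) => events.push(e));
  for await (const chunk of stream) text += chunk;
  console.log(text, events.length); // "partial response" 2
})();
```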

## Supported Versions
