Sentry breaking OpenAI streaming #18962

@murbanowicz

Description

Is there an existing issue for this?

How do you use Sentry?

Sentry Saas (sentry.io)

Which SDK are you using?

@sentry/nestjs

SDK Version

latest 10+

Framework Version

No response

Link to Sentry event

No response

Reproduction Example/SDK Setup

No response

Steps to Reproduce

Importing instrument.ts breaks OpenAI streaming (only the last, complete response comes through).
I am using OpenAI through the Langchain wrapper, if it matters.
I lost a few hours before discovering that it was Sentry causing my streaming not to work.

import * as Sentry from '@sentry/nestjs';

import { isSentryTracingEnabled } from './common/utils/sentry-tracing';

// DSN is safe to hardcode - it only allows sending data TO Sentry
// Can be overridden via SENTRY_DSN env var for per-environment routing
const DEFAULT_DSN = 'xxx';

const dsn = process.env.SENTRY_DSN ?? DEFAULT_DSN;
// Use SENTRY_ENVIRONMENT if set, otherwise infer from NODE_ENV
const environment =
  process.env.SENTRY_ENVIRONMENT ??
  (process.env.NODE_ENV === 'production' ? 'production' : 'local');
const release = process.env.RELEASE_TAG ?? 'local';
const tracingEnabled = isSentryTracingEnabled();
const normalizedEnvironment = environment.toLowerCase();
const isLocalEnvironment = normalizedEnvironment === 'local';
const isProductionEnvironment = normalizedEnvironment === 'production';
type LogLevel = 'info' | 'warn' | 'error' | 'fatal';
const logLevels: LogLevel[] = isProductionEnvironment
  ? ['warn', 'error', 'fatal']
  : ['info', 'warn', 'error', 'fatal'];

// Only initialize Sentry if DSN is provided and not in local development
if (dsn) {
  //  && environment !== 'local') {
  Sentry.init({
    dsn,
    environment,
    release,
    sendDefaultPii: true,
    tracesSampleRate: tracingEnabled ? 1.0 : 0,
    // Enable logs to be sent to Sentry
    enableLogs: !isLocalEnvironment,
    integrations: (integrations) => {
      const filtered = integrations.filter((integration) => {
        const name = integration?.name ?? '';
        const normalized = name.toLowerCase();
        // Filter out Redis (noisy) and OpenAI (breaks streaming)
        return !(
          normalized.includes('redis') ||
          normalized.includes('ioredis') ||
          normalized.includes('openai')
        );
      });

      return [
        // Automatically capture Pino logs
        Sentry.pinoIntegration({
          log: {
            levels: logLevels,
          },
          error: {
            levels: ['error', 'fatal'],
          },
        }),
        ...filtered,
      ];
    },
    // Scrub sensitive data from events before sending to Sentry
    beforeSend(event) {
      // Remove sensitive headers (Pino redaction doesn't cover Sentry's request context)
      if (event.request?.headers) {
        const sensitiveHeaders = [
          'authorization',
          'cookie',
          'x-api-key',
          'x-auth-token',
          'x-access-token',
        ];
        for (const header of sensitiveHeaders) {
          if (event.request.headers[header]) {
            event.request.headers[header] = '[REDACTED]';
          }
        }
      }

      // Remove sensitive user data if present
      if (event.user) {
        // Keep user.id but remove other PII
        const userId = event.user.id;
        event.user = userId ? { id: userId } : undefined;
      }

      // Remove sensitive tags/context
      if (event.tags) {
        delete event.tags.password;
        delete event.tags.token;
        delete event.tags.secret;
      }

      // Scrub sensitive data from extra context
      if (event.contexts) {
        const sensitiveKeys = ['password', 'token', 'secret', 'apiKey', 'auth'];
        for (const key of sensitiveKeys) {
          if (key in event.contexts) {
            delete event.contexts[key as keyof typeof event.contexts];
          }
        }
      }

      return event;
    },
    // Filter logs: keep info+ and drop trace/debug noise
    beforeSendLog(log) {
      if (isLocalEnvironment) {
        return null;
      }
      if (log.level === 'trace' || log.level === 'debug') {
        return null;
      }
      if (isProductionEnvironment && log.level === 'info') {
        return null;
      }
      return log;
    },
  });
}
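For what it's worth, the integration-filter callback from the config above can be exercised in isolation to confirm it really drops the OpenAI integration by name. A minimal sketch (the sample integration names below are made up for illustration, not the SDK's exact names):

```typescript
// The filter predicate from the Sentry config, extracted as a pure function.
interface IntegrationLike {
  name?: string;
}

function keepIntegration(integration: IntegrationLike): boolean {
  const normalized = (integration?.name ?? '').toLowerCase();
  // Drop Redis (noisy) and OpenAI (suspected of breaking streaming)
  return !(
    normalized.includes('redis') ||
    normalized.includes('ioredis') ||
    normalized.includes('openai')
  );
}

const sample: IntegrationLike[] = [
  { name: 'Http' },
  { name: 'OpenAI' },
  { name: 'Redis' },
  { name: 'Console' },
];

const kept = sample.filter(keepIntegration).map((i) => i.name);
console.log(kept); // [ 'Http', 'Console' ]
```

The predicate behaves as intended, which suggests the streaming breakage comes from the module being patched at import time, before `Sentry.init` ever runs this filter.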

Expected Result

OpenAI / Langchain streaming to work.

Actual Result

Streaming is broken.

No matter whether I added the integration filtering above, set process.env.OTEL_NODE_DISABLED_INSTRUMENTATIONS = 'openai'; as per #17159, or tried many other things - only completely removing the instrument.ts import allows streaming to work.
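One ordering pitfall worth ruling out for anyone trying the env-var route: static ESM imports are hoisted, so setting the variable in the same file that statically imports instrument.ts runs too late - the instrumentation has already patched the openai module. A sketch of an entry file that sets the variable first and loads everything else via dynamic import (the './instrument' and './main'/bootstrap names are assumptions from a typical NestJS layout):

```typescript
// Set the OTel kill switch BEFORE any Sentry/OTel code is evaluated.
// Static `import` statements are hoisted above this line, which is why
// the rest of the app is loaded with dynamic import() instead.
process.env.OTEL_NODE_DISABLED_INSTRUMENTATIONS = 'openai';

await import('./instrument'); // Sentry.init and friends
const { bootstrap } = await import('./main'); // hypothetical Nest bootstrap
await bootstrap();
```

Even with this ordering enforced, I would expect the variable to be honored only if the SDK reads it at instrumentation-registration time; in my testing the streaming stayed broken regardless.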

Additional Context

I can live without OpenAI instrumentation until this is fixed, but please give me/us some workaround.

Priority

React with 👍 to help prioritize this issue. Please use comments to provide useful context, avoiding +1 or "me too", to help us triage it.
