
Sentry LLM monitoring incorrectly logging prompts, despite sendDefaultPii defaulting to false #17414

@RichardJECooke

Description

Environment

SaaS (https://sentry.io/)

Steps to Reproduce

Create index.ts with the contents below:

import express from 'express';
import type Types from 'express';
import { createOllama } from 'ollama-ai-provider';
import { generateText, experimental_createMCPClient as createMCPClient } from 'ai';
import { Experimental_StdioMCPTransport as StdioMCPTransport } from 'ai/mcp-stdio';
import * as Sentry from '@sentry/node';

//=== SENTRY

Sentry.init({
  dsn: "https://dsn@o13.ingest.us.sentry.io/435",
  tracesSampleRate: 1.0,
  integrations: [Sentry.vercelAIIntegration()],
  sendDefaultPii: false, // false by default
});

//=== WEATHER MCP SERVER

const mcpClient = await createMCPClient({
  transport: new StdioMCPTransport({
    command: 'npx',
    args: ['open-meteo-mcp-server'],
  }),
});
const tools = await mcpClient.tools();
// console.log('Available MCP Tools:', JSON.stringify(tools, null, 2)); // Uncomment this to see what tools OpenMeteo offers the LLM

//=== OLLAMA

const ollama = createOllama({ baseURL: 'http://host.docker.internal:11434/api' });

async function generateWithOllama(prompt: string): Promise<string> {
  try {
    const { text, toolCalls, toolResults } = await generateText({
      model: ollama('qwen3:1.7b'),
      tools: tools,
      maxSteps: 2,
      prompt: prompt,
      experimental_telemetry: {
        isEnabled: true,
        functionId: 'ai-agent-mains',
        // recordInputs: true,
        // recordOutputs: true,
      },
    });

    if (toolCalls && toolCalls.length > 0) {
      console.log('--- Tool Calls ---');
      toolCalls.forEach((call) => {
        console.log(`Tool Name: ${call.toolName}`);
        console.log(`Args: ${JSON.stringify(call.args, null, 2)}`);
      });
    }

    if (toolResults && toolResults.length > 0) {
      console.log('--- Tool Results ---');
      toolResults.forEach((result) => {
        console.log(`Tool Call ID: ${result.toolCallId}`);
        console.log(`Result: ${JSON.stringify(result.result, null, 2)}`);
      });
    }

    return text.replace(/<think>.*?<\/think>/gs, '').trim();
  }
  catch (error) {
    console.error('Error calling Ollama:', error);
    throw error;
  }
}

//=== EXPRESS.JS

const app = express();
app.use(express.json());

app.post('/', async (request: Types.Request, response: Types.Response) => {
  try {
    if (!request.body || !request.body.prompt)
      return response.status(400).send('Invalid or missing "prompt" in request body.');
    const responseText = await generateWithOllama(request.body.prompt);
    response.send(responseText + '\r\n');
  }
  catch (error) {
    Sentry.captureException(error);
    response.status(500).send('Error generating text from Ollama: ' + error);
  }
});

const server = app.listen(3000, '0.0.0.0', () => { console.log(`Express server is running`); });
server.timeout = 30000;

Run the commands below:

docker run --init -it --rm --name "app" -v ".:/app" -w "/app" node:24.0.1-alpine3.21 sh -c "npm install ai@4.3.19 ollama-ai-provider@1.2.0 express@5.1.0 open-meteo-mcp-server@1.1.1 @sentry/node"

docker run --init -it --rm --add-host=host.docker.internal:host-gateway --name "app" -v ".:/app" -w "/app" -p 7777:3000 node:24.0.1-alpine3.21 sh -c "node index.ts"

curl --max-time 300 -X POST -H "Content-Type: application/json" -d '{"prompt": "Provide the current temperature for Oslo, Norway."}' http://localhost:7777

Expected Result

No prompts or responses are sent to Sentry, because that is the default according to your docs. I even explicitly set sendDefaultPii to false in the code.
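If the SDK default is not being honored, the AI SDK's own telemetry flags can force the opt-out. A minimal sketch of the same call with recording explicitly disabled, using the documented `experimental_telemetry` options from the Vercel AI SDK (same model and functionId as above):

```typescript
import { generateText } from 'ai';
import { createOllama } from 'ollama-ai-provider';

const ollama = createOllama({ baseURL: 'http://host.docker.internal:11434/api' });

const { text } = await generateText({
  model: ollama('qwen3:1.7b'),
  prompt: 'Provide the current temperature for Oslo, Norway.',
  experimental_telemetry: {
    isEnabled: true,
    functionId: 'ai-agent-mains',
    recordInputs: false,  // keep prompt text out of telemetry spans
    recordOutputs: false, // keep completion text out of telemetry spans
  },
});
```

This is the explicit per-call opt-out (the commented-out lines in the repro show the opposite setting); it should make the behavior independent of whatever `sendDefaultPii` resolves to.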

Actual Result

All prompts are sent to Sentry.
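Until this is fixed, spans can also be scrubbed client-side before they leave the process. A defensive sketch using `beforeSendSpan`; the attribute prefixes `ai.` / `gen_ai.` are an assumption here, so inspect a captured span's JSON to confirm which keys actually carry the prompt:

```typescript
import * as Sentry from '@sentry/node';

Sentry.init({
  dsn: 'https://dsn@o13.ingest.us.sentry.io/435',
  tracesSampleRate: 1.0,
  integrations: [Sentry.vercelAIIntegration()],
  sendDefaultPii: false,
  // Defensive scrubbing: drop AI-related attributes from every outgoing
  // span. Coarse (it also drops non-PII attributes such as the model
  // name), but it guarantees prompt text never leaves the process.
  beforeSendSpan(span) {
    if (span.data) {
      for (const key of Object.keys(span.data)) {
        if (key.startsWith('ai.') || key.startsWith('gen_ai.')) {
          delete span.data[key];
        }
      }
    }
    return span;
  },
});
```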

Product Area

Explore

Link

No response

DSN

No response

Version

No response
