Anthropic Integration

Trace every Anthropic Claude API call with a single wrapper function. Glassbrain automatically captures messages, responses, token usage, latency, tool use, and errors across all Claude models.

Installation

Install the Glassbrain SDK alongside the Anthropic client library for your language.

JavaScript / TypeScript

Terminal
npm install @glassbrain/js @anthropic-ai/sdk

Python

Terminal
pip install glassbrain anthropic

Quick Start

Wrap your Anthropic client with Glassbrain to start tracing. The wrapped client is a drop-in replacement; no other code changes are needed.

JavaScript / TypeScript

index.ts
import Anthropic from "@anthropic-ai/sdk";
import { wrapAnthropic } from "@glassbrain/js";

// Initialize the Anthropic client as usual
const anthropic = new Anthropic({
  apiKey: process.env.ANTHROPIC_API_KEY,
});

// Wrap it with Glassbrain
const tracedAnthropic = wrapAnthropic(anthropic, {
  projectKey: process.env.GLASSBRAIN_PROJECT_KEY,
});

// Use tracedAnthropic exactly like the original client
const message = await tracedAnthropic.messages.create({
  model: "claude-sonnet-4-20250514",
  max_tokens: 1024,
  messages: [
    { role: "user", content: "What is the capital of France?" },
  ],
});

console.log(message.content[0].text);

Python

main.py
import os
from anthropic import Anthropic
from glassbrain import wrap_anthropic

# Initialize the Anthropic client as usual
anthropic = Anthropic(api_key=os.environ["ANTHROPIC_API_KEY"])

# Wrap it with Glassbrain
traced_anthropic = wrap_anthropic(anthropic, project_key=os.environ["GLASSBRAIN_PROJECT_KEY"])

# Use traced_anthropic exactly like the original client
message = traced_anthropic.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    messages=[
        {"role": "user", "content": "What is the capital of France?"},
    ],
)

print(message.content[0].text)

How It Works

The wrapAnthropic() function creates a transparent proxy around the Anthropic client. Every call to messages.create() is intercepted to capture the input messages, model configuration, and the full response. Trace data is sent asynchronously to Glassbrain and does not block your application.

The following Anthropic methods are traced automatically:

  • messages.create() - Message creation (including streaming)
  • messages.create() with tools - Tool use calls
  • messages.batches.create() - Batch message processing
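The interception pattern described above can be sketched with JavaScript's built-in Proxy. This is an illustrative stand-in, not Glassbrain's actual implementation; the `traceCalls` helper and `onCall` callback are hypothetical names:

```typescript
// Illustrative sketch: wrap an object so method calls are observed
// before being forwarded unchanged. Not Glassbrain's real internals.
function traceCalls<T extends object>(
  target: T,
  onCall: (name: string, args: unknown[]) => void
): T {
  return new Proxy(target, {
    get(obj, prop, receiver) {
      const value = Reflect.get(obj, prop, receiver);
      if (typeof value === "function") {
        return (...args: unknown[]) => {
          onCall(String(prop), args); // record the call
          return value.apply(obj, args); // forward to the original method
        };
      }
      return value;
    },
  });
}

// Usage: every method call on `traced` is reported; behavior is unchanged
const calls: string[] = [];
const traced = traceCalls(
  { greet: (name: string) => `hello ${name}` },
  (name) => calls.push(name)
);
traced.greet("world"); // returns "hello world"; calls === ["greet"]
```

Because the proxy forwards everything it does not intercept, the wrapped client keeps the full surface of the original, which is what makes it a drop-in replacement.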

What Gets Traced

Each traced Anthropic call produces a span with the following data structure. You can view and search this data in the Glassbrain dashboard.

Trace span structure
{
  "span_id": "sp_def456",
  "trace_id": "tr_uvw012",
  "provider": "anthropic",
  "operation": "messages.create",
  "timestamp": "2026-04-03T12:00:00.000Z",
  "duration_ms": 2150,
  "status": "success",
  "model": {
    "name": "claude-sonnet-4-20250514",
    "provider": "anthropic"
  },
  "input": {
    "messages": [
      { "role": "user", "content": "What is the capital of France?" }
    ],
    "system": null,
    "max_tokens": 1024,
    "temperature": 1.0,
    "top_p": null,
    "top_k": null
  },
  "output": {
    "content": [
      { "type": "text", "text": "The capital of France is Paris." }
    ],
    "stop_reason": "end_turn",
    "role": "assistant"
  },
  "usage": {
    "input_tokens": 14,
    "output_tokens": 10,
    "cache_creation_input_tokens": 0,
    "cache_read_input_tokens": 0
  },
  "cost": {
    "input_cost_usd": 0.000042,
    "output_cost_usd": 0.00015,
    "total_cost_usd": 0.000192
  },
  "error": null
}

Cache token fields are populated when using Anthropic prompt caching. Cost calculations are updated automatically as Anthropic adjusts their pricing.
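The cost figures are plain per-token arithmetic over the `usage` block. A minimal sketch, using illustrative per-million-token rates (not Glassbrain's pricing source of truth):

```typescript
// Sketch of deriving per-call cost from the usage block.
// Rates are illustrative USD-per-million-token values.
const RATES = { inputPerMTok: 3.0, outputPerMTok: 15.0 };

function costUsd(usage: { input_tokens: number; output_tokens: number }) {
  const input = (usage.input_tokens * RATES.inputPerMTok) / 1_000_000;
  const output = (usage.output_tokens * RATES.outputPerMTok) / 1_000_000;
  return { input, output, total: input + output };
}

const c = costUsd({ input_tokens: 14, output_tokens: 10 });
// c.total ≈ 0.000192, consistent with the example span above
```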

Streaming Support

Glassbrain supports streaming responses from the Anthropic API. The wrapper captures each streamed event and reconstructs the full response for tracing while forwarding events to your application immediately.

streaming.ts
const stream = tracedAnthropic.messages.stream({
  model: "claude-sonnet-4-20250514",
  max_tokens: 1024,
  messages: [{ role: "user", content: "Write a short poem about code." }],
});

stream.on("text", (text) => {
  process.stdout.write(text);
});

const finalMessage = await stream.finalMessage();
// The complete interaction is traced automatically
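The reconstruction step amounts to accumulating text deltas into a buffer while each event is also forwarded to the caller. A minimal sketch, using a simplified event shape rather than the SDK's exact types:

```typescript
// Sketch: rebuild the full response text from streamed delta events.
// The DeltaEvent shape is a simplified stand-in for illustration.
type DeltaEvent = { type: "text_delta"; text: string };

function reconstruct(events: DeltaEvent[]): string {
  let full = "";
  for (const e of events) {
    if (e.type === "text_delta") full += e.text; // accumulate each chunk
  }
  return full;
}

reconstruct([
  { type: "text_delta", text: "Hello, " },
  { type: "text_delta", text: "world." },
]); // → "Hello, world."
```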

Tool Use Tracing

When Claude uses tools, Glassbrain captures the full tool use flow including the tool definition, the tool call arguments, and the tool result. Each tool call is represented as a child span within the parent message span.

tool-use.ts
const message = await tracedAnthropic.messages.create({
  model: "claude-sonnet-4-20250514",
  max_tokens: 1024,
  tools: [
    {
      name: "get_weather",
      description: "Get the current weather for a city",
      input_schema: {
        type: "object",
        properties: {
          city: { type: "string", description: "The city name" },
        },
        required: ["city"],
      },
    },
  ],
  messages: [
    { role: "user", content: "What's the weather in Tokyo?" },
  ],
});

// Tool calls, inputs, and results are all traced automatically

Advanced Configuration

Customize the wrapper behavior with additional options.

config.ts
const tracedAnthropic = wrapAnthropic(anthropic, {
  projectKey: process.env.GLASSBRAIN_PROJECT_KEY,

  // Add custom metadata to every trace
  metadata: {
    environment: "production",
    service: "assistant-api",
    version: "2.1.0",
  },

  // Control what gets captured
  captureInput: true,    // Set to false to skip logging messages
  captureOutput: true,   // Set to false to skip logging responses

  // Sampling rate (0.0 to 1.0)
  sampleRate: 1.0,

  // Custom base URL for self-hosted Glassbrain
  baseUrl: "https://glassbrain.dev/api/traces",
});
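A `sampleRate` of 0.25 keeps roughly a quarter of traces. The decision is typically a per-call coin flip, sketched here as an assumption about the semantics, not Glassbrain's internals:

```typescript
// Illustrative per-call sampling decision (assumed semantics).
// An injectable `rand` makes the coin flip deterministic for testing.
function shouldSample(
  sampleRate: number,
  rand: () => number = Math.random
): boolean {
  return rand() < sampleRate; // rand() is in [0, 1)
}

shouldSample(1.0); // always true: every call is traced
shouldSample(0.0); // always false: tracing is effectively off
```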

Troubleshooting

Traces are not appearing in the dashboard

Confirm that your GLASSBRAIN_PROJECT_KEY is correct and that you are using the wrapped client instance for your API calls. Check the browser console or server logs for any error messages from the Glassbrain SDK.

Tool use spans are missing

Tool use tracing requires @glassbrain/js version 0.5.0 or later. Run npm ls @glassbrain/js to verify your installed version and upgrade if needed with npm install @glassbrain/js@latest.

Streaming events arrive but trace is incomplete

Ensure you are awaiting the stream to completion. If the stream is interrupted or the process exits before the stream finishes, the trace may be partial. Call await stream.finalMessage() to guarantee the stream completes and the full trace is sent.

SDK version compatibility

The Glassbrain Anthropic wrapper supports @anthropic-ai/sdk version 0.20.0 and above. If you are using an older version, upgrade with npm install @anthropic-ai/sdk@latest. For Python, the minimum supported version is anthropic>=0.25.0.