
[BOT ISSUE] AWS Bedrock Runtime SDK (@aws-sdk/client-bedrock-runtime) not instrumented — no tracing for direct Converse or InvokeModel calls #1741

@braintrust-bot

Description


Summary

The AWS SDK for JavaScript v3 Bedrock Runtime client (@aws-sdk/client-bedrock-runtime) provides the execution APIs for model invocation: ConverseCommand, ConverseStreamCommand, InvokeModelCommand, and InvokeModelWithResponseStreamCommand. This repository has no direct instrumentation for any Bedrock Runtime SDK surface: no wrapper, no channels, no plugin, no auto-instrumentation config. Users who call the AWS Bedrock Runtime SDK directly get no Braintrust spans.

What instrumentation is missing

The @aws-sdk/client-bedrock-runtime package exposes these execution surfaces, none of which are instrumented:

| SDK method | Description |
| --- | --- |
| `client.send(new ConverseCommand(...))` | Unified chat completions across all Bedrock models |
| `client.send(new ConverseStreamCommand(...))` | Streaming chat completions |
| `client.send(new InvokeModelCommand(...))` | Direct model invocation with provider-specific payloads |
| `client.send(new InvokeModelWithResponseStreamCommand(...))` | Streaming direct model invocation |

The Converse API is AWS's unified interface for chat completions across all Bedrock-hosted models (Claude, Titan, Llama, Mistral, Cohere, etc.). It has its own request/response format distinct from any provider's native API. Users cannot use wrapOpenAI() or any other existing wrapper with this SDK.
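For illustration, here is the same single-turn request in the two formats (plain object sketches; both model IDs are arbitrary examples):

```typescript
// Bedrock Converse request shape: the model is addressed by modelId and
// message content is an array of typed blocks.
const converseRequest = {
  modelId: "anthropic.claude-3-5-sonnet-20240620-v1:0", // illustrative
  messages: [{ role: "user", content: [{ text: "Hello" }] }],
  inferenceConfig: { maxTokens: 256 },
};

// The OpenAI-style equivalent: content is a plain string, and the
// model and token settings live under different keys.
const openAIRequest = {
  model: "gpt-4o", // illustrative
  messages: [{ role: "user", content: "Hello" }],
  max_tokens: 256,
};
```

Because the top-level keys and the message content types differ, a wrapper built for the OpenAI shape cannot extract inputs, outputs, or token usage from a Converse payload.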

No coverage in any instrumentation layer:

  • No wrapper function (e.g. wrapBedrockRuntime())
  • No diagnostics channels for Bedrock methods
  • No plugin handler in js/src/instrumentation/plugins/
  • No auto-instrumentation config in js/src/auto-instrumentations/configs/
  • No vendor SDK types in js/src/vendor-sdk-types/
  • No e2e test scenarios
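A wrapper would presumably mirror the existing wrapOpenAI() pattern by intercepting client.send(). A minimal generic sketch — the wrapBedrockRuntime name, the onSpan callback, and the command-name inspection are all hypothetical, not existing Braintrust APIs:

```typescript
// Hypothetical sketch of a wrapBedrockRuntime() helper. It proxies
// client.send() so each command can be timed and reported as a span.
// Nothing here is an existing Braintrust API.
type Sendable = { send: (command: any) => Promise<any> };

function wrapBedrockRuntime<T extends Sendable>(
  client: T,
  onSpan: (name: string, input: unknown, output: unknown, ms: number) => void
): T {
  return new Proxy(client, {
    get(target, prop, receiver) {
      if (prop !== "send") return Reflect.get(target, prop, receiver);
      return async (command: any) => {
        const start = Date.now();
        const output = await target.send(command);
        // The command class name (e.g. "ConverseCommand") identifies
        // which SDK surface was invoked.
        onSpan(command.constructor.name, command.input, output, Date.now() - start);
        return output;
      };
    },
  });
}
```

With such a helper, `const traced = wrapBedrockRuntime(client, logSpan)` would leave every other client method untouched while recording each Converse or InvokeModel call.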

A grep for bedrock (case-insensitive) across js/src/ returns zero matches. The only repo references are in an example project's package.json (@ai-sdk/amazon-bedrock) and a user bug report (issue #928) showing Bedrock usage via LangChain.
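The claim is easy to reproduce from a checkout (the `|| echo` fallback just makes an empty result explicit):

```shell
# Search the JS source tree for any Bedrock reference, case-insensitively.
# Prints "no matches" when grep finds nothing (or when the path is absent).
grep -ri "bedrock" js/src/ || echo "no matches"
```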

Indirect coverage exists but is limited:

Users can access Bedrock models through the Vercel AI SDK (@ai-sdk/amazon-bedrock) or LangChain (@langchain/aws / ChatBedrockConverse), both of which are instrumented. However, many enterprise users use the AWS SDK directly for tighter control over IAM, regions, and model parameters. Issue #928 demonstrates real user demand — a user was already using Bedrock via LangChain and encountered tracing issues.

Context

The Braintrust tracing docs at https://www.braintrust.dev/docs/guides/tracing list "AWS Bedrock" as a supported AI provider, but this JS SDK has no direct wrapper for the AWS Bedrock Runtime SDK. The listed support likely refers to indirect coverage through frameworks (LangChain, AI SDK) or Python SDK support.

Braintrust docs status

Unclear — Braintrust's tracing overview page lists "AWS Bedrock" as a supported provider, but the wrap-providers documentation includes no Bedrock-specific setup instructions, and this JS SDK has no direct instrumentation.

Upstream references

Local files inspected

  • js/src/wrappers/ — no Bedrock wrapper
  • js/src/instrumentation/plugins/ — no Bedrock channels or plugin
  • js/src/auto-instrumentations/configs/ — no Bedrock config
  • js/src/vendor-sdk-types/ — no Bedrock types
  • e2e/scenarios/ — no Bedrock test scenarios
  • js/examples/ai-sdk/next-openai-app/package.json — only reference is @ai-sdk/amazon-bedrock (AI SDK adapter, not direct SDK)
  • GitHub issue #928 ("Eval (js) concurrency groups all spans under the same eval task") — user using Bedrock via LangChain, demonstrating real-world usage
