Overview
The BeeAI Framework provides comprehensive observability through OpenInference instrumentation, enabling you to trace and monitor your AI applications with industry-standard telemetry. This allows you to debug issues, optimize performance, and understand how your agents, tools, and workflows are performing in production.

Currently supported in Python only.
Quickstart
1. Install the package
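Assuming the instrumentor is published on PyPI under the usual OpenInference naming scheme (the package name here is an assumption):

```shell
pip install openinference-instrumentation-beeai
```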
This package provides the OpenInference instrumentor specifically designed for the BeeAI Framework.
2. Set up observability
Configure OpenTelemetry to create and export spans. This example sets up an OTLP HTTP exporter, a tracer provider, and the BeeAI instrumentor, with the endpoint pointing to a local Arize Phoenix instance.

To override the default traces endpoint (http://localhost:4318/v1/traces), set the OTEL_EXPORTER_OTLP_TRACES_ENDPOINT environment variable. For Arize Phoenix, set OTEL_EXPORTER_OTLP_TRACES_ENDPOINT to http://localhost:6006/v1/traces.
3. Enable instrumentation
Call the setup function before running any BeeAI Framework code.
What Gets Instrumented
When instrumentation is enabled, BeeAI emits spans and attributes for core runtime operations.
Agents
- Agent execution start/stop times
- Input prompts and output responses
- Tool usage within agent workflows
- Memory operations and state changes
Tools
- Tool invocation details
- Input parameters and return values
- Execution time and success/failure status
- Error details when tools fail
Chat Models
- Model inference requests (including streaming)
- Token usage statistics
- Model parameters (temperature, max tokens, etc.)
- Response timing and latency
Embedding Models
- Text embedding requests
- Input text and embedding dimensions
- Processing time and batch sizes
Workflows
- Workflow step execution
- State transitions and data flow
- Step dependencies and execution order
Observability Backends
Arize Phoenix
Open-source observability for LLM applications.
Documentation: Arize Phoenix
LangFuse
Production-ready LLMOps platform with advanced analytics.
Documentation: LangFuse OpenTelemetry Integration
LangSmith
Comprehensive LLM development platform by LangChain.
Documentation: LangSmith OpenTelemetry Guide