The BeeAI Framework provides comprehensive observability through OpenInference instrumentation, enabling you to trace and monitor your AI applications with industry-standard telemetry. This allows you to debug issues, optimize performance, and understand how your agents, tools, and workflows are performing in production.
Configure OpenTelemetry to create and export spans. This example sets up an OTLP HTTP exporter, a tracer provider, and the BeeAI instrumentor, with the endpoint pointing to a local Arize Phoenix instance.
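A minimal sketch of that setup might look like the following. The `BeeAIInstrumentor` import path is assumed from OpenInference naming conventions, and `http://localhost:6006/v1/traces` is Phoenix's default local OTLP/HTTP collector endpoint; adjust both for your installation.

```python
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import SimpleSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Assumed import path for the BeeAI OpenInference instrumentor.
from openinference.instrumentation.beeai import BeeAIInstrumentor

# Export spans over OTLP/HTTP to a local Arize Phoenix instance.
exporter = OTLPSpanExporter(endpoint="http://localhost:6006/v1/traces")

# Register a tracer provider that hands each finished span to the exporter.
provider = TracerProvider()
provider.add_span_processor(SimpleSpanProcessor(exporter))
trace.set_tracer_provider(provider)

# Instrument the BeeAI Framework so agent, tool, and workflow
# activity is emitted as OpenInference spans.
BeeAIInstrumentor().instrument()
```

Run this once at application startup, before creating any agents; subsequent agent runs then appear as traces in the Phoenix UI. `SimpleSpanProcessor` exports each span as it ends, which is convenient for local debugging; for production, `BatchSpanProcessor` reduces export overhead.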