### Patch Changes
- #94 `004df81` Thanks @omeraplak! - feat: Add Langfuse Observability Exporter

  This introduces a new package, `@voltagent/langfuse-exporter`, that allows you to export OpenTelemetry traces generated by `@voltagent/core` directly to [Langfuse](https://langfuse.com/) for detailed observability into your agent's operations.

  **How to Use:**
  **Installation**

  Install the necessary package:

  ```bash
  npm install @voltagent/langfuse-exporter
  ```
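The exporter reads its credentials from the environment. A minimal setup sketch, using placeholder key values (substitute the public/secret key pair from your Langfuse project settings):

```shell
# Langfuse API credentials (placeholders, not real keys)
export LANGFUSE_PUBLIC_KEY="pk-lf-your-public-key"
export LANGFUSE_SECRET_KEY="sk-lf-your-secret-key"
# Optional: only needed for self-hosted Langfuse; defaults to Langfuse Cloud
export LANGFUSE_BASE_URL="https://cloud.langfuse.com"
```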
  **Configuration**

  Configure the `LangfuseExporter` and pass it to `VoltAgent`:

  ```typescript
  import { Agent, VoltAgent } from "@voltagent/core";
  import { VercelAIProvider } from "@voltagent/vercel-ai";
  import { openai } from "@ai-sdk/openai";
  import { LangfuseExporter } from "@voltagent/langfuse-exporter";

  // Ensure LANGFUSE_SECRET_KEY and LANGFUSE_PUBLIC_KEY are set in your environment

  // Define your agent(s)
  const agent = new Agent({
    name: "my-voltagent-app",
    description: "A helpful assistant that answers questions without using tools",
    llm: new VercelAIProvider(),
    model: openai("gpt-4o-mini"),
  });

  // Configure the Langfuse Exporter
  const langfuseExporter = new LangfuseExporter({
    publicKey: process.env.LANGFUSE_PUBLIC_KEY,
    secretKey: process.env.LANGFUSE_SECRET_KEY,
    baseUrl: process.env.LANGFUSE_BASE_URL, // Optional: Defaults to Langfuse Cloud
    // debug: true, // Optional: Enable exporter logging
  });

  // Initialize VoltAgent with the exporter
  // This automatically sets up OpenTelemetry tracing
  new VoltAgent({
    agents: {
      agent, // Register your agent(s)
    },
    telemetryExporter: langfuseExporter, // Pass the exporter instance
  });

  console.log("VoltAgent initialized with Langfuse exporter.");

  // Now, any operations performed by 'agent' (e.g., agent.generateText(...))
  // will automatically generate traces and send them to Langfuse.
  ```
  By providing the `telemetryExporter` to `VoltAgent`, OpenTelemetry is automatically configured, and detailed traces, including LLM interactions, tool usage, and agent metadata, will appear in your Langfuse project.

- Updated dependencies [`004df81`]:
  - @voltagent/core@0.1.12