From langfuse-pack
Installs Langfuse SDK and configures authentication for LLM observability and tracing in JavaScript/TypeScript or Python projects.
npx claudepluginhub jeremylongshore/claude-code-plugins-plus-skills --plugin langfuse-pack
Install the Langfuse SDK and configure authentication for LLM observability. Covers both the legacy `langfuse` package (v3) and the modern modular SDK (v4+/v5) built on OpenTelemetry.
Get your Public Key (`pk-lf-...`) and Secret Key (`sk-lf-...`) from your project settings.

TypeScript/JavaScript (v4+ modular SDK -- recommended):
set -euo pipefail
# Core client for prompt management, datasets, scores
npm install @langfuse/client
# Tracing (observe, startActiveObservation)
npm install @langfuse/tracing @langfuse/otel @opentelemetry/sdk-node
# OpenAI integration (drop-in wrapper)
npm install @langfuse/openai
# LangChain integration
npm install @langfuse/langchain
TypeScript/JavaScript (v3 legacy -- single package):
npm install langfuse
Python:
pip install langfuse
You need three values: the Public Key `pk-lf-...` (identifies your project), the Secret Key `sk-lf-...` (grants write access -- keep secret), and the base URL (Langfuse Cloud is `https://cloud.langfuse.com`).

# Set environment variables
export LANGFUSE_PUBLIC_KEY="pk-lf-..."
export LANGFUSE_SECRET_KEY="sk-lf-..."
export LANGFUSE_BASE_URL="https://cloud.langfuse.com"
# Or create .env file
cat >> .env << 'EOF'
LANGFUSE_PUBLIC_KEY=pk-lf-your-public-key
LANGFUSE_SECRET_KEY=sk-lf-your-secret-key
LANGFUSE_BASE_URL=https://cloud.langfuse.com
EOF
Note: v4+ uses `LANGFUSE_BASE_URL`. Legacy v3 uses `LANGFUSE_HOST` or `LANGFUSE_BASEURL`.
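Because the variable name changed between SDK versions, code that may run against either version can resolve the base URL defensively. A minimal sketch; the helper name and fallback order are our own, not part of the SDK:

```typescript
// Illustrative helper (not an SDK utility): resolve the Langfuse base URL by
// checking the v4+ variable first, then the legacy v3 names, then falling
// back to the Langfuse Cloud default.
function resolveBaseUrl(env: Record<string, string | undefined>): string {
  return (
    env.LANGFUSE_BASE_URL ?? // v4+
    env.LANGFUSE_HOST ??     // v3 legacy
    env.LANGFUSE_BASEURL ??  // v3 alternate name
    "https://cloud.langfuse.com"
  );
}
```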
// src/lib/langfuse.ts
import { LangfuseClient } from "@langfuse/client";
import { startActiveObservation } from "@langfuse/tracing";
import { LangfuseSpanProcessor } from "@langfuse/otel";
import { NodeSDK } from "@opentelemetry/sdk-node";

// 1. Register the OpenTelemetry span processor (once at app startup)
const langfuseSpanProcessor = new LangfuseSpanProcessor();
const sdk = new NodeSDK({
  spanProcessors: [langfuseSpanProcessor],
});
sdk.start();

// 2. Create the Langfuse client for prompt/dataset/score operations
export const langfuse = new LangfuseClient({
  publicKey: process.env.LANGFUSE_PUBLIC_KEY,
  secretKey: process.env.LANGFUSE_SECRET_KEY,
  baseUrl: process.env.LANGFUSE_BASE_URL,
});

// 3. Verify connection
async function verify() {
  await startActiveObservation("connection-test", async (span) => {
    span.update({ input: { test: true } });
    span.update({ output: { status: "connected" } });
  });
  // Force-export buffered spans before the script exits
  await langfuseSpanProcessor.forceFlush();
  console.log("Langfuse connection verified. Check dashboard for trace.");
}

verify();
TypeScript/JavaScript (v3 legacy):

import { Langfuse } from "langfuse";

const langfuse = new Langfuse({
  publicKey: process.env.LANGFUSE_PUBLIC_KEY,
  secretKey: process.env.LANGFUSE_SECRET_KEY,
  baseUrl: process.env.LANGFUSE_HOST,
});

// Verify with a test trace
const trace = langfuse.trace({
  name: "connection-test",
  metadata: { test: true },
});
await langfuse.flushAsync();
console.log("Connected. Trace URL:", trace.getTraceUrl());

// Clean shutdown
process.on("beforeExit", async () => {
  await langfuse.shutdownAsync();
});
Python (v2-style API):

from langfuse import Langfuse
import os

langfuse = Langfuse(
    public_key=os.environ["LANGFUSE_PUBLIC_KEY"],
    secret_key=os.environ["LANGFUSE_SECRET_KEY"],
    host=os.environ.get("LANGFUSE_HOST", "https://cloud.langfuse.com"),
)

# Test trace
trace = langfuse.trace(name="connection-test", metadata={"test": True})
langfuse.flush()
print(f"Connected. Trace: {trace.get_trace_url()}")
| Feature | v3 (langfuse) | v4+ (@langfuse/*) |
|---|---|---|
| Package | Single langfuse | Modular: @langfuse/client, @langfuse/tracing, @langfuse/otel |
| Base URL env var | LANGFUSE_HOST | LANGFUSE_BASE_URL |
| Tracing | langfuse.trace() | startActiveObservation() / observe() |
| Client class | Langfuse | LangfuseClient |
| OpenAI wrapper | observeOpenAI() from langfuse | observeOpenAI() from @langfuse/openai |
| Foundation | Custom | OpenTelemetry |
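The env var rename in the table above can be handled mechanically when migrating a project's configuration. A hedged sketch; the function and its name are hypothetical, not an SDK utility:

```typescript
// Hypothetical migration helper: rename legacy v3 env var keys to their v4+
// equivalent in a parsed .env mapping. Both legacy spellings map to the new
// name; all other keys pass through unchanged.
function migrateEnvVars(env: Record<string, string>): Record<string, string> {
  const renames: Record<string, string> = {
    LANGFUSE_HOST: "LANGFUSE_BASE_URL",
    LANGFUSE_BASEURL: "LANGFUSE_BASE_URL",
  };
  const out: Record<string, string> = {};
  for (const [key, value] of Object.entries(env)) {
    out[renames[key] ?? key] = value;
  }
  return out;
}
```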
| Error | Cause | Solution |
|---|---|---|
| 401 Unauthorized | Invalid or expired API key | Re-check keys in Langfuse dashboard Settings > API Keys |
| ECONNREFUSED | Wrong host URL or server down | Verify LANGFUSE_BASE_URL / LANGFUSE_HOST |
| Missing required configuration | Env vars not loaded | Ensure dotenv/config is imported at the entry point |
| Module not found | Package not installed | Run npm install or pip install again |
| Using pk- key as secret | Keys swapped | Public key starts with pk-lf-, secret with sk-lf- |
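The last row (swapped keys) is easy to catch programmatically before constructing a client, since the key prefixes are documented. A minimal pre-flight check; the helper is hypothetical, not part of the SDK:

```typescript
// Hypothetical pre-flight check: verify Langfuse key prefixes before creating
// a client, catching swapped or malformed keys early instead of at first
// API call.
function checkLangfuseKeys(publicKey: string, secretKey: string): string[] {
  const problems: string[] = [];
  if (!publicKey.startsWith("pk-lf-")) {
    problems.push("public key should start with pk-lf-");
  }
  if (!secretKey.startsWith("sk-lf-")) {
    problems.push("secret key should start with sk-lf-");
  }
  return problems;
}
```

An empty result means the prefixes look right; anything returned is worth fixing before debugging 401 errors.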
After auth is working, proceed to langfuse-hello-world for your first traced LLM call.