
Langfuse SDKs

Langfuse offers two SDKs, one for Python and one for JS/TS. The Langfuse SDKs are the recommended way to integrate custom instrumentation, evaluations, and prompt tooling with Langfuse. Both SDKs are OpenTelemetry-based, async by default, and interoperate with Langfuse integrations.

Legacy documentation

This documentation is for the Python SDK v3. Documentation for the legacy Python SDK v2 can be found here.

Requirements for self-hosted Langfuse

If you are self-hosting Langfuse, the Python SDK v3 requires Langfuse platform version >= 3.63.0 for traces to be correctly processed.

Key benefits

  • Based on OpenTelemetry, so you can use any OTEL-based instrumentation library for your LLM stack.
  • Fully async requests, meaning Langfuse adds almost no latency.
  • Accurate latency tracking via synchronous timestamps.
  • IDs available for downstream use.
  • Great DX when nesting observations.
  • Cannot break your application—SDK errors are caught and logged.
  • Interoperable with Langfuse integrations.

Quickstart

Follow the steps below to get your first trace into Langfuse.

Install package

Install the Langfuse Python SDK.

pip install langfuse

Add credentials

Add your Langfuse credentials as environment variables so the SDK knows which project to send your traces to.

.env
LANGFUSE_SECRET_KEY = "sk-lf-..."
LANGFUSE_PUBLIC_KEY = "pk-lf-..."
LANGFUSE_BASE_URL = "https://cloud.langfuse.com" # 🇪🇺 EU region
# LANGFUSE_BASE_URL = "https://us.cloud.langfuse.com" # 🇺🇸 US region
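
If you keep these variables in a .env file rather than exporting them in your shell, load them into the process environment before the SDK is initialized. A minimal sketch, assuming the python-dotenv package is installed:

from dotenv import load_dotenv

# Reads LANGFUSE_* variables from .env into os.environ
# before any Langfuse client is created.
load_dotenv()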

Choose your instrumentation approach

Instrumentation means adding code that records what’s happening in your application so it can be sent to Langfuse.

There are three main ways of instrumenting your code with the Python SDK: the @observe decorator, context managers, and manual observations.

The @observe decorator is the simplest way to instrument your application. It is a function decorator that can be applied to any function.

The decorator automatically starts and ends a trace around the function and automatically captures the function name, arguments, and return value.

from langfuse import observe, get_client
 
@observe
def my_function():
    return "Hello, world!" # Input/output and timings are automatically captured
 
my_function()
 
# Flush events in short-lived applications
langfuse = get_client()
langfuse.flush()

When should I call langfuse.flush()?

The SDK batches events and sends them in the background. In short-lived applications such as scripts, cron jobs, or serverless functions, call langfuse.flush() before the process exits so buffered events are not lost. Long-running services flush automatically and do not need this call.
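
Besides the decorator, the SDK also provides context managers for code that is awkward to wrap in a single function. A minimal sketch, assuming the v3 context-manager API (start_as_current_span / start_as_current_generation) and an illustrative model name:

from langfuse import get_client

langfuse = get_client()

# The span stays active for everything inside the "with" block,
# so nested observations attach to it automatically.
with langfuse.start_as_current_span(name="process-request") as span:
    with langfuse.start_as_current_generation(name="llm-call", model="gpt-4o") as generation:
        # ... call your LLM here ...
        generation.update(output="Hello, world!")
    span.update(output="done")

langfuse.flush()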

Initialize tracing

Initialize client
from langfuse import get_client
 
langfuse = get_client()
 
# Verify connection
if langfuse.auth_check():
    print("Langfuse client is authenticated and ready!")
else:
    print("Authentication failed. Please check your credentials and host.")
Key configuration options

All key configuration options are listed in the Python SDK reference.

Constructor argument (environment variable): description and default value.

  • public_key (LANGFUSE_PUBLIC_KEY): Your Langfuse project’s public API key. Required.
  • secret_key (LANGFUSE_SECRET_KEY): Your Langfuse project’s secret API key. Required.
  • base_url (LANGFUSE_BASE_URL): API host for your Langfuse instance. Default: "https://cloud.langfuse.com"
  • timeout (LANGFUSE_TIMEOUT): Timeout in seconds for API requests. Default: 5
  • httpx_client (no environment variable): Custom httpx.Client for non-tracing HTTP requests.
  • debug (LANGFUSE_DEBUG): Enables verbose logging. Default: False
  • tracing_enabled (LANGFUSE_TRACING_ENABLED): Toggles Langfuse instrumentation; if False, tracing calls become no-ops. Default: True
  • flush_at (LANGFUSE_FLUSH_AT): Number of spans to batch before sending. Default: 512
  • flush_interval (LANGFUSE_FLUSH_INTERVAL): Seconds between batch flushes. Default: 5
  • environment (LANGFUSE_TRACING_ENVIRONMENT): Environment name (lowercase alphanumeric, hyphen/underscore). Default: "default"
  • release (LANGFUSE_RELEASE): Release identifier for grouping analytics.
  • media_upload_thread_count (LANGFUSE_MEDIA_UPLOAD_THREAD_COUNT): Background threads for media uploads. Default: 1
  • sample_rate (LANGFUSE_SAMPLE_RATE): Sampling rate between 0.0 and 1.0. Default: 1.0
  • mask (no environment variable): Mask sensitive data before export.
  • LANGFUSE_MEDIA_UPLOAD_ENABLED (environment variable only): Whether to upload media files to Langfuse storage (useful to disable when self-hosting). Default: True
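
Most of these options can be set either via environment variables or directly on the constructor. A minimal sketch with illustrative values; any option you omit falls back to its environment variable or default:

from langfuse import Langfuse

langfuse = Langfuse(
    public_key="pk-lf-...",
    secret_key="sk-lf-...",
    base_url="https://cloud.langfuse.com",
    environment="production",  # lowercase alphanumeric, hyphen/underscore
    sample_rate=0.5,           # keep 50% of traces
    flush_at=100,              # send batches of 100 spans
    flush_interval=2,          # or flush every 2 seconds
)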

Access the client globally

The Langfuse client is a singleton. It can be accessed anywhere in your application using the get_client() function.

Optionally, you can initialize the client via Langfuse() to manually pass in the Langfuse credentials. Otherwise, it is created automatically when you call get_client() based on environment variables.

from langfuse import get_client
 
# Get the default client
client = get_client()
Alternative: configure via constructor
Initialize client
from langfuse import Langfuse
 
langfuse = Langfuse(
  public_key="pk-lf-...", 
  secret_key="sk-lf-...",
  base_url="https://cloud.langfuse.com"
)

OpenTelemetry foundation

Building on OpenTelemetry provides:

  • Standardization with the wider observability ecosystem and tooling.
  • Robust context propagation so nested spans stay connected, even across async workloads.
  • Attribute propagation to keep userId, sessionId, metadata, version, and tags aligned across observations.
  • Ecosystem interoperability meaning third-party instrumentations automatically appear inside Langfuse traces.
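
For example, identifiers set once on the current trace propagate to every nested observation. A minimal sketch, assuming the v3 update_current_trace method and illustrative IDs:

from langfuse import observe, get_client

langfuse = get_client()

@observe
def retrieve_context(query):
    # Joins the caller's trace automatically via OTel context propagation
    return f"context for {query}"

@observe
def handle_request(query):
    # Applies to the whole trace, not just this observation
    langfuse.update_current_trace(user_id="user-123", session_id="session-abc", tags=["demo"])
    return retrieve_context(query)

handle_request("hello")
langfuse.flush()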

In this model, a Langfuse trace corresponds to an OpenTelemetry trace, and Langfuse observations (spans, generations, and events) are OpenTelemetry spans carrying Langfuse-specific attributes.

Other languages

Via the public API you can integrate Langfuse from any runtime. For tracing specifically, send OpenTelemetry spans from your preferred instrumentation (Java, Go, etc.) to the Langfuse OTel endpoint.
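
For example, any OTLP/HTTP exporter can point at the Langfuse OTel endpoint using Basic auth built from your public and secret keys. A minimal sketch, shown in Python for brevity and assuming the standard /api/public/otel endpoint; the same configuration applies to OTLP exporters in other languages:

import base64
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Basic auth is "public_key:secret_key", base64-encoded (keys are placeholders)
auth = base64.b64encode(b"pk-lf-...:sk-lf-...").decode()

exporter = OTLPSpanExporter(
    endpoint="https://cloud.langfuse.com/api/public/otel/v1/traces",
    headers={"Authorization": f"Basic {auth}"},
)

provider = TracerProvider()
provider.add_span_processor(BatchSpanProcessor(exporter))
trace.set_tracer_provider(provider)

tracer = trace.get_tracer("my-app")
with tracer.start_as_current_span("hello-from-otel"):
    pass

provider.shutdown()  # flush spans before the process exits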
