
Getting Started with TraceCraft

Welcome to TraceCraft! This guide will help you get started with instrumenting your LLM applications for observability.

What is TraceCraft?

TraceCraft is a vendor-neutral observability SDK for LLM applications. It provides:

  • Unified Instrumentation: Single API that works across different frameworks
  • Flexible Export: Send traces to multiple backends simultaneously
  • Privacy First: Built-in PII redaction and sampling
  • Local Development: Works offline with beautiful console output
  • OpenTelemetry Native: Built on industry-standard OTel
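To make "Flexible Export" concrete, here is a minimal sketch of the fan-out pattern such an exporter layer implies: every finished span is forwarded to several backends at once. This is an illustration of the concept only, not TraceCraft's internals; all names in it are hypothetical.

```python
# Conceptual sketch (NOT TraceCraft's implementation): a fan-out exporter
# that forwards each finished span to every registered backend, the
# pattern behind sending traces to multiple backends simultaneously.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Span:
    name: str
    attributes: dict = field(default_factory=dict)

class FanOutExporter:
    """Forwards every exported span to each backend callable."""
    def __init__(self, *exporters: Callable[[Span], None]):
        self.exporters = exporters

    def export(self, span: Span) -> None:
        for exporter in self.exporters:
            exporter(span)  # e.g. console, JSONL file, OTLP endpoint

# Two toy backends receive the same span.
console_log, jsonl_log = [], []
fanout = FanOutExporter(console_log.append, jsonl_log.append)
fanout.export(Span(name="llm.call", attributes={"model": "gpt-4"}))
```

In a real SDK each backend would serialize and ship the span differently; the point is that instrumentation code emits a span once and the export layer fans it out.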

Learning Path

Follow this learning path to master TraceCraft:

1. Installation

Start by installing TraceCraft with the features you need.

Installation Guide

2. Quick Start

Build your first instrumented application in 5 minutes.

Quick Start

3. Core Concepts

Understand the key concepts behind TraceCraft.

Core Concepts

Quick Example

The fastest way to get traces into the TUI requires zero code changes to your existing application:

# Terminal 1 — start the receiver + TUI
pip install "tracecraft[receiver,tui]"
tracecraft serve --tui
 
# Terminal 2 — run your existing app, unchanged
export OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4318
python your_app.py

Any OTLP-instrumented app (OpenLLMetry, LangChain, LlamaIndex, DSPy, or standard OTel SDK) sends traces directly to the TUI — no code changes needed.

Prefer a config file? Create .tracecraft/config.yaml and add a single tracecraft.init() call to your app:

# .tracecraft/config.yaml
default:
  exporters:
    receiver: true
  instrumentation:
    auto_instrument: true

# your_app.py
import tracecraft
tracecraft.init()   # reads from .tracecraft/config.yaml

# Terminal 1: start the receiver + TUI
tracecraft serve --tui

# Terminal 2: run your app
python your_app.py

Auto-instrumentation and decorators add rich structured spans; see the SDK Guide.

Key Features at a Glance

Decorators

Simple decorators for different trace types:

@trace_agent(name="agent")      # For agent/workflow functions
@trace_tool(name="tool")        # For tool/utility functions
@trace_llm(model="gpt-4")       # For LLM calls
@trace_retrieval(name="rag")    # For retrieval operations
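To see what a decorator of this kind conceptually does, here is a plain-Python sketch: wrap the function and record a timed span around each call. This is illustrative only, not TraceCraft's actual implementation, and the SPANS list stands in for a real exporter.

```python
# Illustrative sketch only: what a decorator like @trace_tool conceptually
# does. It wraps the function and records a timed span per call, even
# when the call raises. Not TraceCraft's real code.
import functools
import time

SPANS = []  # stand-in for a real span exporter

def trace_tool(name):
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            try:
                return fn(*args, **kwargs)
            finally:
                # Record the span whether the call succeeded or raised.
                SPANS.append({
                    "name": name,
                    "kind": "tool",
                    "duration_s": time.perf_counter() - start,
                })
        return wrapper
    return decorator

@trace_tool(name="search")
def search(query):
    return f"results for {query}"

search("observability")  # leaves one span in SPANS
```

The try/finally is the important detail: failed calls still produce spans, which is exactly when you want trace data.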

Configuration

Flexible configuration via code or environment variables:

tracecraft.init(
    service_name="my-app",
    console=True,
    jsonl=True,
    otlp_endpoint="http://localhost:4317"
)
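A common convention in SDKs that accept both sources is a layered resolution: defaults, then environment variables, then explicit code arguments. The sketch below illustrates that pattern using the standard OpenTelemetry variable names; it is an assumption about precedence for illustration, not TraceCraft's verified resolution order, and resolve_config is a hypothetical helper.

```python
# Hypothetical sketch of code-vs-environment precedence: explicit
# arguments override environment variables, which override defaults.
# This mirrors a common SDK convention; not TraceCraft's verified order.
import os

DEFAULTS = {"service_name": "unnamed-service", "otlp_endpoint": None}

def resolve_config(**overrides):
    config = dict(DEFAULTS)
    # Environment layer (standard OpenTelemetry variable names).
    if "OTEL_SERVICE_NAME" in os.environ:
        config["service_name"] = os.environ["OTEL_SERVICE_NAME"]
    if "OTEL_EXPORTER_OTLP_ENDPOINT" in os.environ:
        config["otlp_endpoint"] = os.environ["OTEL_EXPORTER_OTLP_ENDPOINT"]
    # Explicit code arguments win over everything.
    config.update({k: v for k, v in overrides.items() if v is not None})
    return config

os.environ["OTEL_SERVICE_NAME"] = "env-app"
env_only = resolve_config()                      # picks up the env var
explicit = resolve_config(service_name="my-app") # code wins over env
```

Layering this way lets the same code run unchanged in local development (env vars unset, defaults apply) and in deployment (env vars injected by the platform).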

Terminal UI

Explore your traces with the interactive terminal UI:

tracecraft tui

Next Steps

Ready to dive deeper? Start with the installation guide:

Install TraceCraft