TraceCraft

Vendor-neutral LLM observability — instrument once, observe anywhere.

TraceCraft is a Python observability SDK with a built-in Terminal UI (TUI) that lets you visually explore, debug, and analyze your agent traces right in your terminal — no browser, no cloud dashboard, no waiting.


The fastest path: zero code changes

If your app already uses OpenAI, Anthropic, LangChain, LlamaIndex, or any OpenTelemetry-compatible framework, TraceCraft can observe it without touching a single line of application code.

Step 1 — Install and set one environment variable:

pip install "tracecraft[receiver,tui]"
 
export OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4318

Step 2 — Start the receiver and TUI together:

tracecraft serve --tui

Step 3 — Run your existing app unchanged:

python your_app.py

Traces from any OTLP-compatible framework (OpenLLMetry, LangChain, LlamaIndex, DSPy, or any standard OpenTelemetry SDK) stream live into the TUI the moment they arrive. No init() call. No decorators. No code changes.
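
"OTLP-compatible" here means the framework POSTs spans to the receiver's /v1/traces endpoint in the OTLP/HTTP encoding. As a rough sketch of what arrives on the wire, the snippet below builds a minimal OTLP JSON payload by hand; the field names follow the OTLP spec, while the span values (trace/span IDs, names) are illustrative:

```python
import json
import time

def otlp_json_payload(service_name, span_name, duration_s):
    """Build a minimal OTLP/HTTP JSON trace payload: one resource, one span."""
    end = time.time_ns()
    start = end - int(duration_s * 1e9)
    return {
        "resourceSpans": [{
            "resource": {"attributes": [
                {"key": "service.name", "value": {"stringValue": service_name}},
            ]},
            "scopeSpans": [{
                "scope": {"name": "demo-instrumentation"},
                "spans": [{
                    "traceId": "0af7651916cd43dd8448eb211c80319c",  # 16-byte hex
                    "spanId": "b7ad6b7169203331",                   # 8-byte hex
                    "name": span_name,
                    "kind": 1,  # SPAN_KIND_INTERNAL
                    "startTimeUnixNano": str(start),
                    "endTimeUnixNano": str(end),
                    "attributes": [],
                }],
            }],
        }],
    }

payload = otlp_json_payload("your_app", "llm.completion", 1.2)
body = json.dumps(payload)  # an OTLP SDK would POST this to http://localhost:4318/v1/traces
```

In practice you never build this by hand; the OpenTelemetry SDK's OTLP exporter does it for you once OTEL_EXPORTER_OTLP_ENDPOINT is set.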


The Terminal UI — Your Agent’s Black Box Recorder

After traces are flowing in, the TUI gives you complete visibility into every agent run:

TraceCraft TUI - Main View

All your agent runs at a glance — name, duration, token usage, and status. Select any trace to drill down.

Select any trace to expand the full call hierarchy with timing bars. Navigate to any LLM step and press i for the prompt, o for the response, or a for all span attributes and metadata.

TraceCraft TUI - Waterfall and Detail View

Hierarchical waterfall view — agents, tools, and LLM calls with precise timing. See exactly where your agent spends its time.


Path 2 — Config file + one init() call

When you want a persistent local setup — custom service name, JSONL export, PII redaction — drop a config file into your project and add one line to your app:

.tracecraft/config.yaml:

# .tracecraft/config.yaml
default:
  exporters:
    receiver: true         # stream to tracecraft serve --tui
  instrumentation:
    auto_instrument: true  # patches OpenAI, Anthropic, LangChain, LlamaIndex
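
A fuller config covering the custom service name, JSONL export, and PII redaction mentioned above might look like the sketch below. Apart from the two keys already shown, the key names (service_name, jsonl, redaction) are illustrative guesses at the schema, not confirmed options; see the Full Configuration Reference for the real names.

```yaml
# .tracecraft/config.yaml (extended sketch; keys marked "assumed" are
# illustrative guesses, not confirmed options)
default:
  service_name: my-agent          # assumed key: custom service name
  exporters:
    receiver: true                # stream to `tracecraft serve --tui`
    jsonl: traces/run.jsonl       # assumed key: JSONL file export
  instrumentation:
    auto_instrument: true         # patches OpenAI, Anthropic, LangChain, LlamaIndex
  redaction:
    enabled: true                 # assumed key: SDK-level PII redaction
```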

Your app:

import tracecraft
 
tracecraft.init()  # reads .tracecraft/config.yaml automatically

Then start the TUI:

tracecraft serve --tui

Or, if you prefer to write traces to a file and open the TUI separately:

tracecraft tui

Call tracecraft.init() before importing any LLM SDK

TraceCraft patches SDKs at import time. Import your LLM libraries after calling init() so the patches apply correctly.
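
The reason for the ordering can be seen with a toy stand-in (generic import-time patching, not TraceCraft's actual internals): patching replaces a function attribute on the SDK module, so any reference bound before the patch keeps pointing at the unwrapped function.

```python
import types

# Toy module standing in for an LLM SDK; fake_init() stands in for
# tracecraft.init()'s auto-instrumentation. Illustrative only.
sdk = types.ModuleType("fake_sdk")
sdk.complete = lambda prompt: f"echo:{prompt}"

recorded = []  # stands in for captured spans

def fake_init():
    original = sdk.complete
    def traced(prompt):
        recorded.append(prompt)   # "emit a span"
        return original(prompt)
    sdk.complete = traced

early_ref = sdk.complete  # like `from fake_sdk import complete` BEFORE init()
fake_init()
late_ref = sdk.complete   # like importing AFTER init()

early_ref("untraced")  # bypasses the wrapper: no span recorded
late_ref("traced")     # goes through the wrapper

# recorded == ["traced"]
```

The same logic applies to real imports: bind names from your LLM SDK only after init() has run.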

Full Configuration Reference


Path 3 — SDK decorators and custom tracing

For fine-grained control — custom span names, explicit inputs/outputs, structured step hierarchies — TraceCraft provides @trace_agent, @trace_tool, @trace_llm, and @trace_retrieval decorators, plus a step() context manager for inline instrumentation.
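
Conceptually, such a decorator wraps your function, times the call, records name and output as a span, and preserves the wrapped function's metadata. The sketch below is a generic illustration of the pattern, not TraceCraft's implementation:

```python
import functools
import time

spans = []  # stand-in for wherever real spans would be exported

def trace_tool(name=None):
    """Sketch of a @trace_tool-style decorator (illustrative only)."""
    def decorator(fn):
        @functools.wraps(fn)  # keep the wrapped function's name and docstring
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            result = fn(*args, **kwargs)
            spans.append({
                "name": name or fn.__name__,
                "duration_s": time.perf_counter() - start,
                "output": result,
            })
            return result
        return wrapper
    return decorator

@trace_tool(name="search")
def search_docs(query):
    return [f"doc about {query}"]

result = search_docs("tracing")
# spans[0]["name"] == "search"; result == ["doc about tracing"]
```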

SDK Guide


Why TraceCraft?

Feature                      TraceCraft            LangSmith        Langfuse      Phoenix
Terminal UI                  Yes (built-in)        No               No            No
Zero-Code Instrumentation    Yes                   No               No            No
Vendor Lock-in               None                  LangChain        Langfuse      Arize
Local Development            Full offline          Cloud required   Self-host     Self-host
OpenTelemetry Native         Built on OTel         Proprietary      Proprietary   Compatible
PII Redaction                SDK-level             Backend only     Backend only  Backend only
Cost                         Free & Open Source    Paid tiers       Paid tiers    Paid tiers

What the TUI Shows You

View          What You See
Trace List    All agent runs: name, duration, tokens, status, timestamp
Waterfall     Full call hierarchy with timing bars (agent → tool → LLM)
Input View    Exact prompts, system messages, and context sent to the model
Output View   Model responses with token counts and cost estimates
Attributes    Model parameters, custom metadata, error details

Keyboard shortcuts:

Key        Action
↑ / ↓      Navigate traces
Enter      Expand waterfall for selected trace
i          View input/prompt
o          View output/response
a          View attributes
/          Filter traces
m + C      Mark and compare two traces
p          Open playground for prompt editing
q          Quit

Full TUI Guide


How It Works

Path 1 — Zero code changes (OTLP env var):

  1. Set OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4318
  2. tracecraft serve --tui — starts receiver on :4318 and opens TUI
  3. Run your existing app — traces appear live as they arrive

Path 2 — Config file + init() call:

  1. Add .tracecraft/config.yaml and call tracecraft.init() in your app
  2. Run your agent — traces go to JSONL/SQLite
  3. tracecraft serve --tui or tracecraft tui to explore
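
Because each JSONL line is a self-contained JSON record, exported trace files are easy to analyze ad hoc with standard tools. The field names below (name, duration_ms, tokens) are illustrative assumptions, not the documented record schema:

```python
import io
import json

# Illustrative JSONL content; real field names may differ.
jsonl = io.StringIO(
    '{"name": "agent.run", "duration_ms": 1840, "tokens": 512}\n'
    '{"name": "tool.search", "duration_ms": 230, "tokens": 0}\n'
)

records = [json.loads(line) for line in jsonl]
total_tokens = sum(r["tokens"] for r in records)
slowest = max(records, key=lambda r: r["duration_ms"])
# total_tokens == 512; slowest["name"] == "agent.run"
```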

Installation

pip install "tracecraft[tui]"

Includes the Terminal UI for local trace exploration.


License

TraceCraft is licensed under the Apache-2.0 License. See LICENSE for details.