
Exporters

Exporters send trace data to different backends. TraceCraft supports multiple exporters simultaneously.

Available Exporters

Exporter                  Purpose                                  Installation
ConsoleExporter           Rich terminal output                     Built-in
JSONLExporter             Local file storage                       Built-in
OTLPExporter              OpenTelemetry Protocol                   tracecraft[otlp]
MLflowExporter            MLflow tracking                          tracecraft[mlflow]
HTMLExporter              HTML reports                             Built-in
TUI Receiver (receiver=)  Stream live to tracecraft serve --tui    Built-in

TUI Receiver

The TUI receiver is the fastest way to get traces into the TUI during development. Use the receiver= shorthand in tracecraft.init(); no manual exporter setup is needed.

# receiver=True → connects to http://localhost:4318 (default TUI receiver address)
tracecraft.init(
    auto_instrument=True,
    receiver=True,
    service_name="my-agent",
)
Then start the receiver and TUI in one command:

tracecraft serve --tui

You can also use a custom URL:

tracecraft.init(
    receiver="http://remote-host:4318",
    service_name="my-agent",
)

Or configure it in .tracecraft/config.yaml:

default:
  exporters:
    receiver: true
    receiver_endpoint: http://localhost:4318

The receiver uses OTLPExporter internally with protocol="http". You can combine it with other exporters:

from tracecraft.exporters.otlp import OTLPExporter
 
tracecraft.init(
    receiver=True,                                          # stream to TUI
    exporters=[OTLPExporter(endpoint="http://jaeger:4317")],  # also send to Jaeger
    console=False,
)

Console Exporter

Beautiful terminal output with Rich formatting.

from tracecraft.exporters import ConsoleExporter
 
tracecraft.init(
    exporters=[ConsoleExporter(verbose=True)]
)

Options:

  • verbose (bool): Show all span attributes. Default: False.

JSONL Exporter

Write traces to newline-delimited JSON files.

from tracecraft.exporters import JSONLExporter
 
tracecraft.init(
    exporters=[
        JSONLExporter(
            filepath="traces/app.jsonl",
            append=True,
        )
    ]
)

Options:

  • filepath (str | Path): Output file path
  • append (bool): Append to existing file. Default: True.
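Because JSONL is newline-delimited JSON, exported traces can be read back with nothing but the standard library. A minimal sketch, assuming each line holds one serialized run:

```python
import json
from pathlib import Path

def read_traces(filepath):
    """Yield one parsed trace per line of a JSONL file, skipping blank lines."""
    with Path(filepath).open() as f:
        for line in f:
            line = line.strip()
            if line:
                yield json.loads(line)

# Example: write two records, then read them back
Path("demo.jsonl").write_text('{"name": "run-1"}\n{"name": "run-2"}\n')
runs = list(read_traces("demo.jsonl"))
print([r["name"] for r in runs])  # -> ['run-1', 'run-2']
```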

OTLP Exporter

Export to any OTLP-compatible backend (Jaeger, Grafana Tempo, Datadog, etc.).

from tracecraft.exporters import OTLPExporter
 
tracecraft.init(
    exporters=[
        OTLPExporter(
            endpoint="http://localhost:4317",
            insecure=True,
            headers={"Authorization": "Bearer token"},
        )
    ]
)

Options:

  • endpoint (str): OTLP gRPC endpoint
  • insecure (bool): Disable TLS. Default: True for localhost.
  • headers (dict): Custom HTTP headers

Supported Backends:

  • Jaeger
  • Grafana Tempo
  • Datadog
  • Honeycomb
  • New Relic
  • Any OTLP-compatible system

MLflow Exporter

Export traces to MLflow for experiment tracking.

from tracecraft.exporters.mlflow import MLflowExporter
 
tracecraft.init(
    exporters=[
        MLflowExporter(
            tracking_uri="http://localhost:5000",
            experiment_name="my-agent",
        )
    ]
)

Options:

  • tracking_uri (str): MLflow tracking server URL
  • experiment_name (str): MLflow experiment name

HTML Exporter

Generate standalone HTML reports.

from tracecraft.exporters import HTMLExporter
 
tracecraft.init(
    exporters=[
        HTMLExporter(
            output_dir="reports/",
            include_source=True,
        )
    ]
)

Options:

  • output_dir (str | Path): Report output directory
  • include_source (bool): Include source code snippets. Default: True.

Using Multiple Exporters

Send traces to multiple destinations:

from tracecraft.exporters import (
    ConsoleExporter,
    JSONLExporter,
    OTLPExporter,
)
 
tracecraft.init(
    exporters=[
        ConsoleExporter(),  # Development
        JSONLExporter(filepath="traces.jsonl"),  # Local storage
        OTLPExporter(endpoint="http://jaeger:4317"),  # Observability platform
    ]
)

Async Exporters

For high-throughput scenarios, use async exporters:

from tracecraft.exporters import AsyncBatchExporter, OTLPExporter
 
# Wrap exporter in async batch processor
async_exporter = AsyncBatchExporter(
    exporter=OTLPExporter(endpoint="http://localhost:4317"),
    batch_size=100,
    flush_interval_ms=5000,
)
 
tracecraft.init(exporters=[async_exporter])
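Under the hood, batch exporters typically buffer spans and flush when the buffer fills or a timer fires. A minimal, synchronous sketch of the size-triggered half of that pattern (illustrative only, not TraceCraft's actual implementation):

```python
class BatchBuffer:
    """Buffers items and hands them to a flush callback once batch_size is reached."""
    def __init__(self, batch_size, on_flush):
        self.batch_size = batch_size
        self.on_flush = on_flush
        self.buffer = []

    def add(self, item):
        self.buffer.append(item)
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self):
        if self.buffer:
            batch, self.buffer = self.buffer, []
            self.on_flush(batch)

flushed = []
buf = BatchBuffer(batch_size=3, on_flush=flushed.append)
for i in range(7):
    buf.add(i)
buf.flush()  # drain the remainder on shutdown
print(flushed)  # -> [[0, 1, 2], [3, 4, 5], [6]]
```

In the real AsyncBatchExporter, flush_interval_ms adds the time-based trigger so partial batches are not held indefinitely.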

Retry and Rate Limiting

Wrap exporters for reliability:

from tracecraft.exporters import RetryingExporter, RateLimitedExporter, OTLPExporter
 
# Add retry logic
reliable_exporter = RetryingExporter(
    exporter=OTLPExporter(endpoint="http://localhost:4317"),
    max_retries=3,
    backoff_factor=2.0,
)
 
# Add rate limiting
rate_limited = RateLimitedExporter(
    exporter=reliable_exporter,
    max_requests_per_second=10,
)
 
tracecraft.init(exporters=[rate_limited])
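The backoff_factor above presumably scales the wait between attempts geometrically; the initial one-second delay in this sketch is an assumption, not documented behavior. The resulting schedule for max_retries=3 and backoff_factor=2.0:

```python
def backoff_delays(max_retries, backoff_factor, initial_delay=1.0):
    """Delay (in seconds) before each retry attempt, growing geometrically."""
    return [initial_delay * backoff_factor ** attempt for attempt in range(max_retries)]

print(backoff_delays(3, 2.0))  # -> [1.0, 2.0, 4.0]
```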

Custom Exporters

Create your own exporter:

from tracecraft.exporters import BaseExporter
from tracecraft.core.models import AgentRun
 
class MyCustomExporter(BaseExporter):
    def export(self, run: AgentRun) -> None:
        """Export trace to custom backend."""
        # Your export logic here
        print(f"Exporting run: {run.name}")
        # Send to your backend
        self.send_to_backend(run)
 
    def send_to_backend(self, run: AgentRun) -> None:
        # Implementation
        pass
 
# Use it
tracecraft.init(exporters=[MyCustomExporter()])
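For unit tests it can be handy to capture runs in memory instead of shipping them anywhere. Sketched here without the BaseExporter import so it runs standalone; in real code you would subclass BaseExporter as above:

```python
class InMemoryExporter:
    """Collects every exported run in a list so tests can assert on traces."""
    def __init__(self):
        self.runs = []

    def export(self, run) -> None:
        self.runs.append(run)

exporter = InMemoryExporter()
exporter.export({"name": "test-run"})
print(exporter.runs[0]["name"])  # -> test-run
```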

Environment-Based Configuration

Different exporters per environment:

import os
 
env = os.getenv("ENV", "dev")
 
if env == "production":
    exporters = [
        OTLPExporter(endpoint=os.getenv("OTLP_ENDPOINT")),
        JSONLExporter(filepath="/var/log/traces/prod.jsonl"),
    ]
elif env == "staging":
    exporters = [
        OTLPExporter(endpoint=os.getenv("OTLP_ENDPOINT")),
        ConsoleExporter(verbose=True),
    ]
else:  # development
    exporters = [
        ConsoleExporter(verbose=True),
        JSONLExporter(filepath="traces/dev.jsonl"),
    ]
 
tracecraft.init(exporters=exporters)

Best Practices

1. Use Console for Development

# Development
tracecraft.init(
    exporters=[ConsoleExporter(verbose=True)]
)

2. Use JSONL for Debugging

# Keep local copy for debugging
tracecraft.init(
    exporters=[
        OTLPExporter(endpoint="http://prod:4317"),
        JSONLExporter(filepath="backup.jsonl"),  # Local backup
    ]
)

3. Disable Console in Production

# Production - no console output
tracecraft.init(
    exporters=[
        OTLPExporter(endpoint=os.getenv("OTLP_ENDPOINT"))
    ]
)

4. Use Async for High Volume

from tracecraft.exporters import AsyncBatchExporter
 
# High-throughput production
tracecraft.init(
    exporters=[
        AsyncBatchExporter(
            exporter=OTLPExporter(...),
            batch_size=500,
            flush_interval_ms=1000,
        )
    ]
)

Next Steps