# Azure AI Foundry Deployment Guide
Deploy TraceCraft-instrumented applications to Azure with AI Foundry observability.
## Architecture

```
┌─────────────────────────────────────────────────────────────┐
│                         Azure Cloud                         │
│                                                             │
│  ┌─────────────────┐      ┌──────────────────────────────┐  │
│  │   Your Agent    │─────▶│    Application Insights      │  │
│  │  (TraceCraft    │      │  (AI Foundry Observability)  │  │
│  │   enabled)      │      └──────────────────────────────┘  │
│  └─────────────────┘                      │                 │
│                                           ▼                 │
│                           ┌──────────────────────────────┐  │
│                           │   Azure AI Foundry Portal    │  │
│                           │    (Trace Visualization)     │  │
│                           └──────────────────────────────┘  │
└─────────────────────────────────────────────────────────────┘
```

## Prerequisites
- Azure subscription
- Application Insights resource
- Python 3.11+
## Quick Start

### 1. Get Connection String

- Go to the Azure Portal
- Navigate to your Application Insights resource
- Click **Overview** > **Connection String**
- Copy the connection string

### 2. Install TraceCraft

```bash
pip install tracecraft[azure-foundry]
```

### 3. Configure Exporter
```python
import tracecraft
from tracecraft.contrib.azure import create_foundry_exporter

# Create exporter with AI Foundry features
exporter = create_foundry_exporter(
    # Get from Azure Portal (or set APPLICATIONINSIGHTS_CONNECTION_STRING env var)
    connection_string="InstrumentationKey=...;IngestionEndpoint=https://...",
    # Service name appears in Azure traces
    service_name="my-agent-service",
    # Enable content recording (prompts/responses)
    # WARNING: Only enable in dev or when data policies allow
    enable_content_recording=False,
    # Agent metadata
    agent_name="research-agent",
    agent_id="agent-001",
    agent_description="Researches topics and synthesizes information",
)

# Initialize TraceCraft
tracecraft.init(
    exporters=[exporter],
    console=False,  # Disable console in production
    jsonl=False,    # Disable local JSONL
)
```

## Environment Variables
| Variable | Description | Required |
|---|---|---|
| `APPLICATIONINSIGHTS_CONNECTION_STRING` | Azure connection string | Yes |
| `TRACECRAFT_AZURE_FOUNDRY_ENABLED` | Enable Azure export | No |
| `TRACECRAFT_AZURE_CONTENT_RECORDING` | Record prompts/responses | No |
| `TRACECRAFT_AZURE_AGENT_NAME` | Agent name for traces | No |
| `TRACECRAFT_AZURE_AGENT_ID` | Agent ID for traces | No |
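Connection strings are semicolon-separated `Key=value` pairs. If you configure through the environment, a small parser can fail fast on malformed values before the exporter starts. A sketch (the `parse_connection_string` helper is our illustration, not part of TraceCraft):

```python
import os

def parse_connection_string(value: str) -> dict:
    """Split a 'Key=value;Key=value' Application Insights connection string."""
    parts = {}
    for segment in value.split(";"):
        key, sep, val = segment.partition("=")
        if sep:  # skip empty or malformed segments
            parts[key.strip()] = val.strip()
    return parts

def connection_string_from_env() -> str:
    """Read and sanity-check the variable the exporter also falls back to."""
    value = os.environ.get("APPLICATIONINSIGHTS_CONNECTION_STRING", "")
    if "InstrumentationKey" not in parse_connection_string(value):
        raise RuntimeError("connection string is unset or missing an InstrumentationKey")
    return value
```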
## Azure Functions Deployment

```python
# function_app.py
import azure.functions as func
import tracecraft
from tracecraft.contrib.azure import configure_for_azure_functions

# Configure at function app startup
exporter = configure_for_azure_functions(service_name="my-function")
tracecraft.init(exporters=[exporter], console=False, jsonl=False)

app = func.FunctionApp()

@app.function_name("ProcessAgent")
@app.route(route="agent")
def process_agent(req: func.HttpRequest) -> func.HttpResponse:
    # Your traced agent code here
    return func.HttpResponse("OK", status_code=200)
```

## Azure Kubernetes Service (AKS)
### 1. Store Connection String as Secret

```yaml
# azure-secret.yaml
apiVersion: v1
kind: Secret
metadata:
  name: azure-appinsights
  namespace: default
type: Opaque
stringData:
  connection-string: "InstrumentationKey=...;IngestionEndpoint=..."
```

### 2. Deploy Application
```yaml
# deployment.yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: my-agent
spec:
  replicas: 3
  selector:
    matchLabels:
      app: my-agent
  template:
    metadata:
      labels:
        app: my-agent
    spec:
      containers:
        - name: agent
          image: my-agent:latest
          env:
            - name: APPLICATIONINSIGHTS_CONNECTION_STRING
              valueFrom:
                secretKeyRef:
                  name: azure-appinsights
                  key: connection-string
            - name: TRACECRAFT_AZURE_AGENT_NAME
              value: "aks-research-agent"
```

### 3. Application Code
```python
from tracecraft.contrib.azure import configure_for_aks

exporter = configure_for_aks(service_name="my-aks-agent")
tracecraft.init(exporters=[exporter])
```

## OTel GenAI Semantic Conventions
TraceCraft exports traces following OTel GenAI semantic conventions:
### Agent Spans

| Attribute | Description |
|---|---|
| `gen_ai.agent.name` | Human-readable agent name |
| `gen_ai.agent.id` | Unique agent identifier |
| `gen_ai.agent.description` | Agent description |
| `gen_ai.operation.name` | `"invoke_agent"` |
### LLM Spans

| Attribute | Description |
|---|---|
| `gen_ai.request.model` | Model name (e.g., `gpt-4`) |
| `gen_ai.system` | Provider (e.g., `openai`) |
| `gen_ai.usage.input_tokens` | Input token count |
| `gen_ai.usage.output_tokens` | Output token count |
### Content Recording

When content recording is enabled, spans also include:

| Attribute | Description |
|---|---|
| `gen_ai.request.messages` | Prompt content (JSON) |
| `gen_ai.response.messages` | Response content (JSON) |
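If you do enable content recording, consider scrubbing obvious PII before messages leave the process. A minimal sketch, assuming messages are role/content dicts as in the common chat format (TraceCraft does not ship a redaction hook; wire this in wherever you build the message list):

```python
import re

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def redact_messages(messages: list) -> list:
    """Mask email addresses in message content before it is recorded on spans."""
    redacted = []
    for msg in messages:
        content = EMAIL_RE.sub("[REDACTED_EMAIL]", msg.get("content", ""))
        redacted.append({**msg, "content": content})
    return redacted
```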
## Viewing Traces

- Go to Azure Portal > Application Insights
- Click **Transaction Search**
- Filter by:
  - Service name
  - Operation name
  - Time range
- Click a trace to see the full call tree
## LangChain Integration

Use `AzureAITracerAdapter` for LangChain compatibility:

```python
from tracecraft.contrib.azure import AzureAITracerAdapter

adapter = AzureAITracerAdapter(
    enable_content_recording=True,
    agent_name="langchain-agent",
)

# Use as a LangChain callback
chain.invoke({"input": "..."}, config={"callbacks": [adapter]})
```

## Best Practices
- **Never enable content recording in production** unless your data handling policies explicitly allow it
- **Use environment variables** for connection strings (never hardcode them)
- **Set meaningful agent names** to make traces discoverable
- **Use session IDs** to correlate multi-turn conversations
- **Monitor costs**: Application Insights bills for ingested telemetry
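The session-ID recommendation can be as simple as minting one UUID per conversation and stamping it on every span. A sketch using the OTel GenAI attribute `gen_ai.conversation.id` (how you attach attributes depends on TraceCraft's span API, so treat this as illustrative):

```python
import uuid

def new_session_attributes() -> dict:
    """Mint a per-conversation ID to attach to every span in a multi-turn session."""
    return {"gen_ai.conversation.id": str(uuid.uuid4())}

# Create once per conversation, then reuse the same dict for every turn
session_attrs = new_session_attributes()
```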
## Troubleshooting

### Traces Not Appearing

- Verify the connection string is correct
- Check network connectivity to Azure
- Ensure the Application Insights resource is in the correct region
- Wait 2-5 minutes; ingestion is not instantaneous
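Part of this checklist can be automated. A quick DNS-level check on the ingestion endpoint (this only verifies that the hostname resolves; it does not confirm ingestion succeeds):

```python
import socket
from urllib.parse import urlparse

def endpoint_resolves(endpoint: str) -> bool:
    """Return True if the ingestion endpoint's hostname resolves via DNS."""
    host = urlparse(endpoint).hostname
    if not host:
        return False
    try:
        socket.getaddrinfo(host, 443)
        return True
    except socket.gaierror:
        return False
```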
### Missing Attributes

- Verify OTel GenAI attributes are enabled
- Check that `agent_name`/`agent_id` are set
- Ensure `enable_content_recording` is enabled if you need prompts