Datadog LLM Observability with Gemini
About This Session
Hands-on workshop exploring Datadog LLM Observability using a pre-instrumented SwagBot application on Vertex AI (Gemini). Participants will analyze LLM traces, prompts and responses, evaluations, token usage, and latency, and correlate them with APM, logs, metrics, and infrastructure telemetry.
Learning Objectives
- Navigate LLM Observability applications, traces, prompts, and evaluations
- Correlate LLM traces with APM, logs, metrics, and infrastructure context
- Measure token usage, latency, and errors to assess performance and cost
- Enable LLM-as-a-Judge and managed evaluations with Vertex AI to assess quality and security
- Pinpoint prompt and workflow inefficiencies using trace data to reduce latency and cost
- Use LLM Experiments to compare models and prompt variations against production-like datasets
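As a taste of the cost analysis in the third objective, here is a minimal sketch of turning the token counts surfaced in an LLM trace into an estimated per-call cost. The prices are hypothetical placeholders, not Gemini's actual rates:

```python
# Hypothetical per-1M-token prices (placeholders, not actual Gemini pricing).
PRICE_PER_M_INPUT = 0.10
PRICE_PER_M_OUTPUT = 0.40

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimate call cost in USD from the token counts reported on an LLM span."""
    return (input_tokens * PRICE_PER_M_INPUT
            + output_tokens * PRICE_PER_M_OUTPUT) / 1_000_000

# Example: a call with 1,200 prompt tokens and 350 completion tokens
cost = estimate_cost(1200, 350)
```

Summing this estimate across the spans of a workflow trace is one way to attribute cost to individual prompts or steps when hunting for inefficiencies.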
Language
English
Availability
1 available
Length
60 minutes