AI applications are quickly moving from experimentation to production, and with that comes new challenges for engineering teams. How do you monitor LLM performance, track token usage and costs, and troubleshoot issues across AI-driven services?
Join us for a conversation on observability in the AI era, where we’ll explore how Datadog LLM Observability helps teams gain visibility into AI applications while connecting those insights with their infrastructure and application data.
We’ll also share a customer case study from Malaysia’s fintech industry, highlighting how the customer’s team approaches observability today, the challenges of running modern AI workloads, and key lessons learned along the way.
By joining this session, you will learn:

Sales Engineer, Datadog

Account Executive, Datadog

Account Executive, Datadog