Observability Pipelines

Take control of your observability data with the freedom to collect, transform, and route data anywhere

We want to empower our teams to use all of our systems and tools while staying within budget. Observability Pipelines gives us flexibility over how our data is processed and where it ends up, so our teams are taking action on the right information as we continue to scale.

Andreas Kasprzok
Observability Tech Lead at BlockFi

Datadog Observability Pipelines enables IT and security teams to cost-effectively collect, transform, and route logs, metrics, and traces from any source to any destination at petabyte scale. With full control over their observability data, organizations can manage and scale observability affordably.


Control costs, improve visibility

  • Cost-effectively ingest and process all of your logs, metrics, and traces, with the flexibility to route noisy data to low-cost storage and rehydrate it as needed (see the sketch below)
  • Reduce your total data volumes through rule-based sampling and aggregation without losing access to mission-critical KPIs and trends
  • Impose rule-based throttles and reactive routing strategies for spike protection
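
To make the first two bullets concrete, here is a minimal sketch in the TOML configuration of open source Vector, the engine behind Observability Pipelines. It routes debug-level noise to low-cost object storage at a sampled rate while everything else flows to Datadog; the source, field names, bucket, and sampling rate are all illustrative assumptions.

    [sources.app_logs]
    type = "file"
    include = ["/var/log/app/*.log"]   # illustrative path

    # Split noisy debug events away from actionable ones (VRL condition).
    [transforms.split]
    type = "route"
    inputs = ["app_logs"]
    route.noisy = '.level == "debug"'

    # Keep 1 in 10 noisy events; drop the rest before they cost anything.
    [transforms.sample_noise]
    type = "sample"
    inputs = ["split.noisy"]
    rate = 10

    # Sampled noise lands in cheap object storage for later rehydration.
    [sinks.archive]
    type = "aws_s3"
    inputs = ["sample_noise"]
    bucket = "logs-archive"            # illustrative bucket
    region = "us-east-1"
    compression = "gzip"
    encoding.codec = "json"

    # Everything that did not match the noisy route goes to Datadog.
    [sinks.datadog]
    type = "datadog_logs"
    inputs = ["split._unmatched"]
    default_api_key = "${DD_API_KEY}"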

Simplify migrations, reduce lock-in

  • Orchestrate and monitor data delivery from any source to any destination, including on-prem locations
  • Adopt new technologies at your own pace without vendor lock-in (see the dual-shipping sketch below)
  • Automatically parse, enrich, and map data to the right schema
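
In practice, a migration can start by dual-shipping one stream to both the legacy and the new backend, then retiring the legacy sink once the cutover is validated. A minimal Vector sketch, with illustrative listener and endpoint values:

    [sources.syslog_in]
    type = "syslog"
    address = "0.0.0.0:514"                    # illustrative listener
    mode = "tcp"

    # The existing destination keeps receiving data during the migration.
    [sinks.legacy_es]
    type = "elasticsearch"
    inputs = ["syslog_in"]
    endpoints = ["https://es.internal:9200"]   # illustrative endpoint

    # The new destination receives the same stream in parallel.
    [sinks.datadog]
    type = "datadog_logs"
    inputs = ["syslog_in"]
    default_api_key = "${DD_API_KEY}"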

Protect sensitive data

  • Redact sensitive data before it leaves your infrastructure (see the sketch below)
  • Stay compliant with data residency laws through full control over data routing
  • Centralized and multi-region deployments give you flexibility to choose the right strategy for your sensitive data
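
Redaction can run in a remap transform on the machine where data is collected, before any network hop. A minimal sketch in Vector's VRL, assuming an upstream source named app_logs; the SSN-shaped regex is an illustrative stand-in for your own patterns.

    [transforms.scrub_pii]
    type = "remap"
    inputs = ["app_logs"]
    source = '''
    # Replace SSN-shaped substrings with [REDACTED] before shipping.
    .message = redact(string!(.message), filters: [r'\d{3}-\d{2}-\d{4}'])
    '''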

Enforce data quality

  • Format, transform, and enrich all observability data using built-in processors to get more insights out of your existing systems (see the sketch below)
  • Adopt schemas to enforce consistent data quality and improve investigation and resolution times
  • Use type-safe transforms to manage and preserve data quality when changes are made
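
As an illustration of schema enforcement, the sketch below parses a raw JSON payload and maps it onto one consistent, typed shape in Vector's VRL. The incoming field names (svc_name, ts) and the target schema are assumptions for the example.

    [transforms.normalize]
    type = "remap"
    inputs = ["app_logs"]   # illustrative upstream source
    source = '''
    # Parse the raw JSON payload; `!` surfaces an error for malformed events.
    . = parse_json!(string!(.message))

    # Map source-specific field names onto the shared schema.
    if exists(.svc_name) { .service = del(.svc_name) }
    if !exists(.service) { .service = "unknown" }

    # Enforce a typed, ISO 8601 timestamp.
    .timestamp = parse_timestamp!(string!(.ts), format: "%+")
    del(.ts)
    '''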

Superior performance at petabyte scale

  • Process petabytes of data with Vector, a vendor-agnostic open source project with millions of monthly downloads
  • Built using an open source, secure, type- and memory-safe core
  • Prevent data loss with features like disk buffers and adaptive request concurrency, creating pipelines designed for reliability and low latency (see the sketch below)
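
In Vector terms, both reliability features are per-sink settings. A minimal sketch, with an illustrative 1 GiB buffer; when the downstream destination slows or fails, events spill to disk instead of being dropped:

    [sinks.datadog]
    type = "datadog_logs"
    inputs = ["normalize"]   # illustrative upstream transform
    default_api_key = "${DD_API_KEY}"

    # Buffer to disk under back-pressure rather than dropping events.
    [sinks.datadog.buffer]
    type = "disk"
    max_size = 1073741824    # 1 GiB
    when_full = "block"

    # Let the sink adapt in-flight request concurrency to downstream capacity.
    [sinks.datadog.request]
    concurrency = "adaptive"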
