Observability Pipelines

Control log volume, reduce vendor lock-in, and secure sensitive data at scale.

Datadog Observability Pipelines enables you to make value-based decisions about your logs by aggregating, processing, and routing them within your own infrastructure. You can easily dual-ship, filter, redact, and enrich logs, enforce quotas, and more to control log volumes, reduce vendor lock-in, maintain compliance, and enhance downstream analytics.


Cost-effective processing and routing of all your logs

  • Filter logs before you route them, reduce the size of your logs, and retain relevant fields
  • Stay within budget by imposing rule-based daily quotas or sampling strategies without losing access to mission-critical KPIs and trends
  • Select log destination based on priority and needs, routing noisy logs to an archive
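The volume controls above can be sketched in a few lines of plain Python. This is an illustrative model only — the function, field names, and thresholds are hypothetical, not Datadog's actual configuration or API:

```python
import random

def process(event, drop_levels={"debug"}, sample_rate=0.1):
    """Illustrative filter-and-sample step (not Datadog's actual API).

    Drops noisy levels outright, samples routine logs at a fixed rate,
    and trims each surviving event down to the fields worth retaining.
    """
    level = event.get("level", "info").lower()
    if level in drop_levels:
        return None                      # filter: drop noisy logs entirely
    if level == "info" and random.random() > sample_rate:
        return None                      # sample: keep ~10% of info logs
    # retain only relevant fields to reduce event size
    return {k: event[k] for k in ("timestamp", "level", "message") if k in event}
```

In a real pipeline these decisions are rule-based and configurable per destination, so high-priority logs can bypass sampling while noisy sources are quota-limited or archived.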

Simplify migrations and reduce vendor lock-in

  • Dual-ship logs easily to ensure business continuity when migrating between vendors or using multiple solutions
  • Onboard new log data sources and destinations at your own pace without disrupting your existing workflows or sacrificing visibility
  • Take advantage of best-of-breed solutions with easy routing for different logging use cases
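At its core, dual-shipping is a fan-out: the same event goes to every configured destination. A minimal sketch, assuming hypothetical sink callables (a real pipeline would batch, retry, and buffer):

```python
def route(event, sinks):
    """Hypothetical fan-out: send each event to every configured sink.

    `sinks` maps a destination name to a send callable. Dual-shipping
    during a migration is just two entries in this mapping.
    """
    for name, send in sinks.items():
        send(event)

# usage: ship the same event to two vendors during a migration
received = {"vendor_a": [], "vendor_b": []}
route({"message": "hello"}, {
    "vendor_a": received["vendor_a"].append,
    "vendor_b": received["vendor_b"].append,
})
```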

Transform and enrich logs for efficient analysis

  • Automatically identify and convert over 150 common log types into an easily queryable, structured format using the Grok parser
  • Enhance security and improve data analysis using the GeoIP Parser, which helps you identify an IP address’s geographical location
  • Enrich logs with contextual data using custom tables to enhance usability for downstream analysis
  • Simplify debugging, increase searchability, and speed up Root Cause Analysis (RCA) by inserting a hostname within log messages
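To make the parse-and-enrich steps concrete, here is a small sketch using a regular expression standing in for a Grok pattern such as `%{COMBINEDAPACHELOG}`. The field names are illustrative, not Datadog's schema:

```python
import re
import socket

# a regex standing in for a Grok pattern; captures become structured fields
ACCESS_LOG = re.compile(
    r'(?P<client_ip>\S+) \S+ \S+ \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3})'
)

def parse(line):
    """Turn an unstructured access-log line into a queryable event."""
    m = ACCESS_LOG.match(line)
    if not m:
        return {"message": line}          # leave unmatched lines untouched
    event = m.groupdict()
    event["status"] = int(event["status"])
    event["host"] = socket.gethostname()  # enrichment: insert the hostname
    return event
```

Once fields like `status` and `client_ip` are first-class attributes, downstream search, debugging, and RCA no longer depend on free-text matching.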

Meet your organization’s security and compliance requirements

  • Fully or partially redact sensitive data, such as credit card numbers, email addresses, and IP addresses, before it leaves your environment using Datadog Sensitive Data Scanner
  • Comply with data residency laws and support Data Loss Prevention (DLP) initiatives with full control over data routing, using built-in or user-defined rules to stay compliant with PCI, GDPR, HIPAA, CCPA, and more
  • Eliminate gaps in access control and ensure schema standardization by easily adding, copying, and dropping relevant attributes and tags
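The redaction idea can be illustrated with two simple patterns. These regexes are deliberately crude stand-ins — production scanners like Sensitive Data Scanner ship validated rule libraries rather than one-line patterns:

```python
import re

# illustrative patterns only; real scanners use validated rule libraries
EMAIL = re.compile(r'[\w.+-]+@[\w-]+\.[\w.]+')
CARD = re.compile(r'\b(?:\d[ -]?){13,16}\b')

def redact(message):
    """Mask sensitive values before the log leaves your environment."""
    message = EMAIL.sub('[REDACTED_EMAIL]', message)
    message = CARD.sub('[REDACTED_CARD]', message)
    return message
```

Because this runs inside your own infrastructure, the raw values never reach a downstream vendor, which is the point of redacting on-prem.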

Build, monitor, and manage log pipelines through a single control plane

  • Get started with easy-to-use templates for common use cases such as dual shipping logs, reducing log volume, and archiving data
  • Easily create, deploy, and manage your pipeline instances remotely using a point-and-click UI
  • Monitor the health and performance of all pipelines deployed in your infrastructure in one centralized view, and get alerted on key issues

Recognized by G2 for Log Analysis and Log Monitoring

Resources


BLOG

Aggregate, process, and route logs easily with Datadog Observability Pipelines

BLOG

Control your log volumes with Datadog Observability Pipelines

BLOG

Dual ship logs with Datadog Observability Pipelines

BLOG

Redact sensitive data from your logs on-prem by using Observability Pipelines
Get started with Observability Pipelines today with a 14-day free trial.