
Take enhanced control of your log data with Datadog Log Workspaces

By Aaron Kaplan and Jason Manson-Hing

Published: June 26, 2024

Security, operations, and development teams increasingly rely on the ability to query logs efficiently. As these teams investigate incidents, troubleshoot issues, and monitor the health, performance, and usage of their systems, delving into log data is often a matter of urgency. But it can also be a cumbersome task: modern, distributed systems and applications churn out logs from innumerable components, and teams must frequently cross-reference logs from multiple sources to assemble scattered microservice events into a cohesive picture. The sheer volume of log data, combined with the inconsistent and often unpredictable ways in which it is structured, complicates analysis. Many teams resort to scattered, highly specialized tools to comb through mountains of log data and tease out insights. Others adopt proprietary query languages, turning anything beyond the simplest search into a specialist skill.

To address these challenges and help organizations take greater control over their log data, we’re pleased to introduce Datadog Log Workspaces. Log Workspaces builds on the capabilities of the Datadog Log Explorer, which helps teams swiftly and easily navigate enormous volumes of log data in a point-and-click interface, by adding an expanded suite of tools in a fluid, collaborative environment. With Log Workspaces, teams can dive deep into their log data by flexibly and collaboratively querying, joining, and transforming it.

This post will explore how Log Workspaces lets you seamlessly parse and enrich log data from any number of sources, helping anyone in your organization easily analyze that data in clear and declarative terms using SQL, natural language, and Datadog’s visualizations.

Seamlessly parse and enrich log data from any number of sources

As software architectures grow in complexity and organizations collect ever-expanding volumes of logs, the ability to quickly and flexibly draw complex correlations between log data from many different sources becomes increasingly important. But this type of analysis often entails composing lengthy, convoluted queries—a time-consuming process that can create confusion as teams work together to capture insights from their data.

Log Workspaces helps teams query their logs flexibly and in depth, without constraint. It empowers virtually anyone in your organization to home in on log data from any number of sources, enrich it, and mold it into easily queryable forms for open-ended and declarative analysis.

Let’s say you’re overseeing transactions for a financial trading platform. You want to take a closer look at failed transactions and understand the business impact, in dollars, of these failures. To do so, you need to compare logs from two separate services: trade-receiver, which is responsible for fielding trade requests, and trade-finalizer, which fulfills them. By examining logs from trade-receiver in the Log Explorer, you discover that there is a log for each trade request with a message that begins Streaming trade request…. By querying trade-receiver logs containing this pattern, you’re able to isolate a record of all trade requests.
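For reference, a search along the following lines in the Log Explorer would isolate those logs. The exact message text is illustrative, but the service: facet and quoted phrase matching are standard Log Explorer search syntax:

    service:trade-receiver "Streaming trade request"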

Using the Log Explorer to obtain a record of streaming events from a microservice.

From here, you open your query in Log Workspaces to continue your investigation. Log Workspaces provides a fluid, declarative, and collaboration-friendly interface for delving deeply into your log data. You can create a new Workspace from scratch or by exporting a query from the Log Explorer.

In each Workspace, you begin by defining one or more data sources, starting from either a log query or a reference table. Data source cells use the results of your log queries to generate data schemas that support further querying, and you can customize these schemas using log facets to streamline analysis in SQL or natural language. You can then build on and process your data sources using up to 20 analysis, transformation, visualization, and text cells.

Data source, transformation, and analysis cells help you dynamically parse your data. Each of these cells yields a discrete dataset that can be individually tweaked, queried, and visualized. You can also give each cell a descriptive name, which helps keep complex, collaborative analyses clear.

Let’s continue with our example from above. Having imported the logs from trade-receiver, you add the logs from trade-finalizer as a second data source to round out your visibility into the trade fulfillment pipeline.

Creating data sources based on log queries in Log Workspaces.

We can see that the results of these queries are broken down into data schemas with values for timestamp, host, service, and message. Next, to correlate the data from these sources, you need to extract the ID of each transaction. You can select any log from your query results to inspect it in greater detail in a side panel, and from there select any attribute to add it as a column to your dataset, making it easily queryable during subsequent analysis.

Adding a log attribute as a column in a dataset in order to simplify subsequent querying.

Often, however, you may need to extract information from your logs that isn’t neatly nested under tags or attributes. In the case of our example, the transaction IDs you need are buried in the unstructured data of the log messages from the trade-finalizer service.

Log Workspaces makes it easy to add granularity and structure to your log data at query time, helping to facilitate any type of analysis. You can add transformation cells to your Workspace to:

  • Extract values from your log content at query time
  • Filter your datasets using log facets
  • Aggregate datasets based on specific column values

In our example, you isolate the transaction IDs from the trade execution logs by creating a new transformation cell and adding a column extraction. In column extractions, you write custom grok rules that are applied at query time. As shown in the following screenshot, your extraction adds a transaction_id column to the resulting dataset.
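For example, a rule along these lines would pull an ID out of a message such as "Executed trade txn-91042 for account 118". The message format, rule name, and field names here are hypothetical, but %{notSpace:...} is a standard Datadog grok matcher:

    extract_txn_id Executed trade %{notSpace:transaction_id} for account %{notSpace:account_id}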

Using a transformation to add structure to log data by extracting variables from messages.

So far, we’ve seen how Log Workspaces lets you dynamically refine the structure and granularity of your log data at query time, helping you create clear-cut foundations for analysis by defining data sources and transforming your data. With these foundations in place, anyone in your organization—regardless of their level of experience—can use Log Workspaces to seamlessly correlate and delve deeply into log data from many different sources.

Conduct complex analysis of log data expressively and declaratively

Let’s continue with our example from above—you’re overseeing transactions for a financial trading platform, and now you want to dig deeper into the failed transactions you’ve isolated. In order to establish a clearer picture of their business impact, you want to determine where these failures have had the greatest effect on customers. To do so, you need to identify the highest-value failed trades, the customer accounts associated with those trades, and the countries in which those accounts are registered.

For flexible querying of your data sources, you can add analysis cells to your Workspaces. By letting you query your data using either SQL or Bits AI-powered natural language queries, analysis cells put deep insights into log data within reach for any Datadog user. The following screenshot illustrates how SQL queries give you fine-grained control over your data. With this query, you join the datasets you’ve created, extract exactly the information you need, and save the result as the failed_transaction_record dataset.
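A query along these lines would produce that dataset; the dataset and column names (trade_requests, trade_executions, trade_status, trade_value_usd) are hypothetical stand-ins for the columns extracted in the earlier steps:

    -- Join trade requests to trade executions on the extracted
    -- transaction_id, keeping only failed trades, highest value first.
    SELECT
      req.transaction_id,
      req.account_id,
      fin.trade_value_usd
    FROM trade_requests AS req
    JOIN trade_executions AS fin
      ON req.transaction_id = fin.transaction_id
    WHERE fin.trade_status = 'FAILED'
    ORDER BY fin.trade_value_usd DESC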

Analyzing log data in depth by querying bespoke datasets using SQL.

From here, you can build a clearer picture of the business impact of these failures in a few quick steps. To get the names associated with the accounts behind these trades, you add a reference table with all of your customer data as a new data source and perform a join with your failed_transaction_record dataset.
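The join itself might look like the following sketch, where the customer_accounts reference table and its columns (account_id, customer_name, country) are hypothetical:

    -- Enrich each failed trade with customer details from the
    -- reference table, producing failed_transactions_by_customer.
    SELECT
      f.transaction_id,
      f.trade_value_usd,
      c.customer_name,
      c.country
    FROM failed_transaction_record AS f
    JOIN customer_accounts AS c
      ON f.account_id = c.account_id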

Joining a reference table of customer data with the failed_transaction_record dataset.

Finally, you can visualize your data to make the impact clear. Log Workspaces provides several options for visualization, including tables, toplists, timeseries, treemaps, pie charts, and scatterplots, and allows you to filter your datasets by status, environment, and other variables.
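For instance, to feed a toplist of the countries with the greatest dollar impact, you might aggregate the joined dataset from the previous step (again, the names are hypothetical):

    -- Total failed trade value by country, for a toplist visualization.
    SELECT
      country,
      SUM(trade_value_usd) AS total_failed_value_usd
    FROM failed_transactions_by_customer
    GROUP BY country
    ORDER BY total_failed_value_usd DESC
    LIMIT 10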

Visualizing the business impact of failed transactions in Log Workspaces.

Dive deeper into your logs

Datadog Log Workspaces provides a flexible toolkit that enables anyone in your organization to construct complex queries, and extract critical insights, using clear, declarative building blocks. By surfacing critical information faster, it can help you derive richer insights from your logs and speed up investigations into incidents, security issues, performance trends, and more.

By coupling Log Workspaces with Datadog Logging without Limits™ and Flex Logs—which allow you to ingest and index large volumes of logs without multiplying your storage costs—you can collect, store, archive, forward, and query all of your logs without restrictions.

Datadog users can sign up for Log Workspaces, currently in Preview, here. If you’re new to Datadog, you can get started with a 14-day free trial.