

Datadog is a cloud-scale application observability solution that monitors servers, databases, tools, and services. Nobl9 connects to Datadog to collect SLI measurements and compare them to SLO targets. Because Nobl9 calculates error budgets against acceptable thresholds, it can trigger processes and notifications when the error budget burn rate is too high or the budget has been exceeded.

Using the Datadog integration, Nobl9 users can pass business context through monitoring data, develop and measure reliability targets, and align their activities against the priorities set by the error budget.


When deploying the Nobl9 Agent, you need to provide an API Key and an Application Key through the DD_API_KEY and DD_APPLICATION_KEY environment variables. Alternatively, credentials can be passed in a local config file, using the api_key and application_key keys under the n9datadog (or n9datadog_v2) section.
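As a sketch of the config-file option, the file could look like the fragment below. The section and key names come from the description above; the exact file layout may vary between Agent versions, so verify it against your Agent's documentation:

```yaml
# Local Agent config file (illustrative sketch).
# Use the n9datadog_v2 section instead of n9datadog where applicable.
n9datadog:
  api_key: "<DATADOG_API_KEY>"
  application_key: "<DATADOG_APPLICATION_KEY>"
```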

The procedure for obtaining both keys is documented in API and Application Keys | Datadog Documentation.

Adding Datadog as a Data Source in the UI

To add Datadog as a data source in Nobl9 using the Agent or Direct connection method, follow these steps:

  1. Navigate to Integrations > Sources.
  2. Click the button.
  3. Click the relevant Source icon.
  4. Choose a relevant connection method (Agent or Direct), then configure the source as described below.

Datadog Direct

Direct Configuration in the UI

Direct configuration for Datadog requires users to enter their credentials, which Nobl9 stores securely. To set up this type of connection:

  1. Enter the Datadog API endpoint to connect to your data source.

  2. Enter the API Key.

  3. Enter the Application Key.

  4. Select a Project.
    Specifying a Project is helpful when multiple users are spread across multiple teams or projects. If the Project field is left blank, the object is assigned to the default project.
  5. Enter a Display Name.
    You can enter a friendly name with spaces in this field.
  6. Enter a Name.
    The name is mandatory and can contain only lowercase alphanumeric characters and dashes (for example, my-project-name). This field is populated automatically when you enter a display name, but you can edit the result.
  7. Enter a Description.
    Here you can add details such as who is responsible for the integration (team/owner) and the purpose of creating it.
  8. Enter a Maximum Period for Historical Data Retrieval.
    • This value defines how far back in the past your data will be retrieved.
    • The maximum period of data retrieval for this source cannot exceed 30 days.
    • Entering a longer period might slow down the loading time when creating an SLO.
    • The value must be a positive integer.
  9. Enter a Default Period for Historical Data Retrieval.
    • This is the period that will be used by SLOs connected to this data source.
    • The value must be a positive integer.
  10. Click the Add Data Source button.
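A Direct connection can also be defined in YAML and applied with sloctl. The sketch below follows the Nobl9 n9/v1alpha schema; the apiKey and applicationKey field names are assumptions based on the credentials described above, so check them against your sloctl version before applying:

```yaml
# Sketch of a Direct connection definition (verify field names for your version)
apiVersion: n9/v1alpha
kind: Direct
metadata:
  name: datadog-direct
  project: datadog
spec:
  sourceOf:
    - Metrics
    - Services
  datadog:
    site: com # com or eu
    apiKey: "<DATADOG_API_KEY>"
    applicationKey: "<DATADOG_APPLICATION_KEY>"
```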

Datadog Agent

Agent Configuration in the UI

Follow the instructions below to create your Datadog Agent configuration. Refer to the section above for descriptions of the fields.

  1. Enter the Datadog API endpoint to connect to your data source.
  2. Enter a Project.
  3. Enter a Name.
  4. Create a Description.
  5. In the Advanced Settings you can:
    1. Enter a Maximum Period for Historical Data Retrieval.
    2. Enter a Default Period for Historical Data Retrieval.
  6. Click the Add Data Source button.

Agent Using CLI - YAML

The YAML for setting up an Agent connection to Datadog looks like this:

apiVersion: n9/v1alpha
kind: Agent
metadata:
  name: datadog
  project: datadog
spec:
  sourceOf:
    - Metrics
    - Services
  datadog:
    site: com
  historicalDataRetrieval:
    maxDuration:
      value: 30 # integer greater than or equal to 0
      unit: Day # accepted values: Minute, Hour, Day
    defaultDuration: # value must be less than or equal to the value of maxDuration
      value: 7 # integer greater than or equal to 0
      unit: Day # accepted values: Minute, Hour, Day

Important notes:

Agent specification from Datadog has the following fields:

  • spec[n].datadog.site - string: com or eu; the Datadog SaaS instance, corresponding to one of Datadog's two locations (in the U.S. or in the European Union)
  • spec[n].historicalDataRetrieval - refer to Replay Documentation | Nobl9 Documentation for more details.

You can deploy only one Agent per YAML file, using the sloctl apply command.
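For example, assuming the Agent definition above is saved as datadog-agent.yaml (a hypothetical filename):

```shell
sloctl apply -f datadog-agent.yaml
```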

Deploying Datadog Agent

When you add the data source, Nobl9 automatically generates a Kubernetes configuration and a Docker command line for you to use to deploy the Agent. Both of these are available in the web UI, under the Agent Configuration section. Be sure to swap in your credentials (e.g., replace the <DATADOG_API_KEY> and <DATADOG_APPLICATION_KEY> with your organization keys).
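The generated Docker command looks roughly like the sketch below; the exact flags and credential values come from the Agent Configuration section in your UI, and the client values shown here are placeholders:

```shell
# Sketch of the Agent container invocation (substitute your own credentials)
docker run -d --restart on-failure \
  --name nobl9-agent \
  -e N9_CLIENT_ID="unique_client_id" \
  -e N9_CLIENT_SECRET="unique_client_secret" \
  -e DD_API_KEY="<DATADOG_API_KEY>" \
  -e DD_APPLICATION_KEY="<DATADOG_APPLICATION_KEY>" \
  nobl9/agent:latest
```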

If you use Kubernetes, you can apply the supplied YAML config file to a Kubernetes cluster to deploy the Agent. It will look something like this:

# DISCLAIMER: This deployment description contains only the fields necessary for the purpose of this demo.
# It is not a ready-to-apply k8s deployment description, and the client_id and client_secret are only exemplary values.

apiVersion: v1
kind: Secret
metadata:
  name: nobl9-agent-nobl9-dev-datadog-month-datadogagent
  namespace: default
type: Opaque
stringData:
  datadog_api_key: "<DATADOG_API_KEY>"
  datadog_application_key: "<DATADOG_APPLICATION_KEY>"
  client_id: "unique_client_id"
  client_secret: "unique_client_secret"
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: nobl9-agent-nobl9-dev-datadog-month-datadogagent
  namespace: default
spec:
  replicas: 1
  selector:
    matchLabels:
      nobl9-agent-name: "datadogagent"
      nobl9-agent-project: "datadog-month"
      nobl9-agent-organization: "nobl9-dev"
  template:
    metadata:
      labels:
        nobl9-agent-name: "datadogagent"
        nobl9-agent-project: "datadog-month"
        nobl9-agent-organization: "nobl9-dev"
    spec:
      containers:
        - name: agent-container
          image: nobl9/agent:latest
          resources:
            requests:
              memory: "350Mi"
              cpu: "0.1"
          env:
            - name: N9_CLIENT_ID
              valueFrom:
                secretKeyRef:
                  key: client_id
                  name: nobl9-agent-nobl9-dev-datadog-month-datadogagent
            - name: N9_CLIENT_SECRET
              valueFrom:
                secretKeyRef:
                  key: client_secret
                  name: nobl9-agent-nobl9-dev-datadog-month-datadogagent
            - name: DD_API_KEY
              valueFrom:
                secretKeyRef:
                  key: datadog_api_key
                  name: nobl9-agent-nobl9-dev-datadog-month-datadogagent
            - name: DD_APPLICATION_KEY
              valueFrom:
                secretKeyRef:
                  key: datadog_application_key
                  name: nobl9-agent-nobl9-dev-datadog-month-datadogagent
            # The N9_METRICS_PORT is a variable specifying the port to which the /metrics and /health endpoints are exposed.
            # The 9090 is the default value and can be changed.
            # If you don't want the metrics to be exposed, comment out or delete the N9_METRICS_PORT variable.
            - name: N9_METRICS_PORT
              value: "9090"

Creating SLOs with Datadog

Creating SLOs in the UI

Follow the instructions below to create your SLOs with Datadog in the UI:

  1. Navigate to Service Level Objectives.

  2. Click the button.
  3. In step 1 of the SLO wizard, select the Service the SLO will be associated with.

  4. In step 2, select Datadog as the Data Source for your SLO, then specify the Metric. You can choose either a Threshold Metric, where a single time series is evaluated against a threshold, or a Ratio Metric, which allows you to enter two time series to compare (for example, a count of good requests and total requests).


    For the Ratio Metric, you can choose the Data Count Method:

    • For the Non-incremental method, the metric values are expected to be individual components of the sum.
    • For the Incremental method, the metric value is expected to be a running sum (the current total of the numerator).

    For more information, refer to the SLO Calculations Guide.

  5. Enter a Query or Good Query and Total Query for the metric you selected. The following are query examples:

    • Threshold metric for Datadog:
      Query: avg:trace.http.request.duration{service:my-service}.as_count()

    • Ratio metric for Datadog:
      Good Query: avg:trace.http.request.hits.by_http_status{service:my-service,!http.status_class:5xx}.as_count()

      Total Query: avg:trace.http.request.hits.by_http_status{service:my-service}.as_count()

  6. In step 3, define a Time Window for the SLO.

  7. In step 4, specify the Error Budget Calculation Method and your Objective(s).

  8. In step 5, add a Name, Description, and other details about your SLO. You can also select Alert Policies and Labels on this screen.

  9. When you’re done, click Create SLO.

SLOs using Datadog - YAML samples

Here’s an example of Datadog using a rawMetric (Threshold metric):

apiVersion: n9/v1alpha
kind: SLO
metadata:
  displayName: datadog-calendar-occurrences-threshold
  name: datadog-calendar-occurrences-threshold
  project: datadog
spec:
  budgetingMethod: Occurrences
  description: ""
  indicator:
    metricSource:
      name: datadog
  service: datadog-n9
  objectives:
    - target: 0.8
      op: lte
      rawMetric:
        query:
          datadog:
            query: avg:trace.http.request.duration{*}
      displayName: awesome
      value: 0.04
    - target: 0.99
      op: lte
      rawMetric:
        query:
          datadog:
            query: avg:trace.http.request.duration{*}
      displayName: so-so
      value: 0.1
  timeWindows:
    - calendar:
        startTime: "2020-11-14 12:30:00"
        timeZone: Etc/UTC
      count: 1
      isRolling: false
      unit: Day
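For comparison, a rolling time window (instead of the calendar-aligned one above) can be expressed in the same schema roughly as follows:

```yaml
# Rolling 1-day time window (sketch; replaces the calendar-based
# timeWindows section in the SLO above)
timeWindows:
  - unit: Day
    count: 1
    isRolling: true
```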

Important notes:

Metric queries in Datadog are described in Querying metrics | Datadog Documentation.


It is important to define queries in such a way that they return only one time series.


❌ Grouping metrics will often result in multiple time series:

  • avg:system.load.1{*} by {host}
Image 1: Sample query with grouping metrics

✔ Same query without grouping

  • avg:system.load.1{*}
Image 2: Same query without grouping metrics

We strongly suggest not using the .rollup() or .moving_rollup() functions in your queries (see Rollup | Datadog Documentation).

The Nobl9 Agent uses the enforced rollup described in Rollup Interval: Enforced vs Custom | Datadog Documentation to control the number of points returned from queries. Using .rollup() or .moving_rollup() can affect the number of returned points or the way they are aggregated. This, in conjunction with the time range of each query the Nobl9 Agent makes, can skew calculated error budgets.

Querying the Datadog Server

The Nobl9 Agent calls the Query Timeseries API (see Datadog Documentation) at a two-minute interval.

Nobl9 sends an API request containing a batch of queries against the Datadog API. The API request can contain multiple queries separated by a comma with a limit of 1024 characters per request. If the character limit is exceeded, the Nobl9 Agent will create another API request.


One incorrectly defined query impacts other SLOs with correct query definitions: providing an invalid query causes a loss of results for all other queries batched in the same API request.
For example, when you define an SLO with an incorrect Datadog query, the results for other SLOs using Datadog as a data source are lost.

The Nobl9 Agent tries to optimize the number of requests made to Datadog. In an optimistic scenario, it uses 60 requests per hour per data source; in a pessimistic scenario, it can use 60 requests per hour per unique query.

Datadog API Rate Limits

Requests to Datadog’s API are rate limited. For more information, refer to the Rate Limits | Datadog Documentation.

The default rate limit for the Query Timeseries API call is 1,600 per hour per organization. At a 1-minute query interval (60 requests per hour per query), this limits a single-query integration to querying up to 26 metrics (26 × 60 = 1,560 requests per hour, just under the limit).

Nobl9 Integration with Datadog | Datadog Documentation

Rollup Interval: Enforced vs Custom | Datadog Documentation

Rate Limits | Datadog Documentation

Query Timeseries API | Datadog Documentation

Agent Metrics | Nobl9 Documentation

Creating SLOs via Terraform | Nobl9 Terraform Documentation

Creating Agents via Terraform | Nobl9 Terraform Documentation