Data Export

Nobl9 Enterprise users can export their SLO and SLI metrics in CSV format to an AWS S3 or Google Cloud Storage bucket.

The Data Export feature allows Nobl9 Enterprise users to combine their SLO data with business metrics such as revenue, conversions, or other KPIs to quantify the impact of the reliability of their services.

Feature Overview

You can leverage the Data Export feature to:

  • Merge Nobl9 data with data from in-house telemetry systems to enable holistic business metrics reporting.

  • Conduct audits and post-mortem analyses to investigate when a Service started to break, when the Error Budget was exhausted, or whether anyone reacted when an Alert was triggered.

  • Improve internal reporting by recreating reports, graphs, or historical data.

  • Enhance debugging by analyzing whether your system generates correct data on its end and whether that data is sent correctly to Nobl9.

The Data Export job runs every 60 minutes (half past every hour) and takes about one minute to run, so you may need to wait up to 61 minutes to see the output.
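Given the half-past-the-hour schedule described above, a small helper can estimate when the next export batch should appear. This is only an illustrative sketch (the function name is ours, and timezone handling is left to the caller):

```python
from datetime import datetime, timedelta

def next_export_run(now: datetime) -> datetime:
    """Return the next half-past-the-hour run of the Data Export job."""
    candidate = now.replace(minute=30, second=0, microsecond=0)
    if candidate <= now:
        candidate += timedelta(hours=1)
    return candidate

print(next_export_run(datetime(2022, 3, 10, 16, 45)))  # 2022-03-10 17:30:00
```

Remember that the job itself takes about a minute to run, so the file may land slightly after this time.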

Nobl9 outputs time-series data and SLO details into a single CSV file. The files are compressed with gzip, and no additional encryption is applied to them. At the storage level, S3 buckets use Server-Side Encryption with Amazon S3-Managed Keys (SSE-S3).
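Because the exports are plain gzipped CSV, a downloaded file can be inspected with nothing more than Python's standard library. A minimal sketch (the file name is hypothetical; column names follow the Output Schema section below):

```python
import csv
import gzip

def read_export(path):
    """Yield rows from a gzipped Nobl9 CSV export as dicts keyed by column name."""
    with gzip.open(path, mode="rt", newline="") as f:
        yield from csv.DictReader(f)

# Example usage (hypothetical file name):
# for row in read_export("nobl9-export.csv.gz"):
#     print(row["timestamp"], row["slo_name"], row["measurement"], row["value"])
```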

Prerequisites

To use Data Export, you need to own an S3 bucket or a Google Cloud Storage bucket. Because the bucket is yours, you retain full access to your Nobl9 data and can manage your own data retention policies and storage costs.

Configuration

To connect Nobl9 to your S3 bucket, you must configure an IAM role that Nobl9 can assume to gain the permissions needed to write data to the bucket.

To do that:

  1. Obtain the AWS external ID for your organization in Nobl9 UI (in Settings > Account) or with the sloctl command-line tool.

  2. Run the following command in sloctl:
    sloctl get dataexport --aws-external-id
    Output:
    <EXTERNAL_ID_FOR_ORGANIZATION>

  3. Download the AWS Terraform module (it is hosted in a private repository, available upon request).

  4. Enter the variables for Terraform. For example, create the file input.auto.tfvars in the root module with the following content:

    aws_region = "<AWS_REGION_WHERE_TO_DEPLOY_RESOURCES>"  # Region where Terraform provisions the S3 bucket
    external_id_provided_by_nobl9 = "<EXTERNAL_ID_FOR_ORGANIZATION>" # You can obtain the ID from sloctl; see the section above the snippet for details
    s3_bucket_name = "<S3_BUCKET_FOR_N9_DATA_NAME>" # Specify the desired name for the bucket. If omitted, Nobl9 will generate a random bucket name

    # Optionally, you can add tags to every created Terraform resource

    tags = {
      "key" = "value"
    }

    # Other available variables

    # Specify the desired name for the IAM role, which gives Nobl9 access to the created bucket
    # When omitted, Nobl9 will use the default name "nobl9-exporter"

    iam_role_to_assume_by_nobl9_name = "<NAME_OF_CREATED_ROLE_FOR_N9>"

    # Specify whether all objects should be deleted from the previously created S3 bucket when using terraform destroy
    # This will allow destroying the non-empty S3 bucket without errors
    # When omitted, Nobl9 will use the default value "false"

    s3_bucket_force_destroy = <S3_BUCKET_FOR_N9_FORCE_DESTROY>
  5. Create a YAML file that defines an object of kind: DataExport, named e.g. nobl9-export.yaml:

    apiVersion: n9/v1alpha
    kind: DataExport
    metadata:
      name: "kube permissible name"
      displayName: S3 data export
      project: default
    spec:
      exportType: S3
      spec:
        bucketName: "bucket name"
        roleArn: arn:aws:iam::<AWS_ACCOUNT_ID>:role/<NAME_OF_CREATED_ROLE_FOR_N9>
  6. Apply the YAML file with the sloctl apply command:

    sloctl apply -f nobl9-export.yaml

Constraints

A single organization can configure up to two data exports (one per export type): one export to AWS S3 and one to Google Cloud Storage.

Output Schema

Schema of the exported data

| Column name | Data type | Description |
| --- | --- | --- |
| timestamp | DATETIME | Event time in UTC |
| organization | STRING | Organization identifier |
| project | STRING | Project name |
| measurement | STRING | Measurement type. One of the following values: raw_metric, good_total_ratio, burn_rate, good_count_metric, total_count_metric, remaining_budget_duration, remaining_budget, ratio_extrapolation_window. For exact usage, see Sample queries |
| value | DOUBLE | Numeric value; its meaning depends on the measurement |
| time_window_start | DATETIME | Time window start time in UTC, precalculated for the measurement |
| time_window_end | DATETIME | Time window end time in UTC, precalculated for the measurement |
| slo_name | STRING | SLO name |
| slo_description | STRING | SLO description |
| error_budgeting_method | STRING | Error budget calculation method |
| budget_target | DOUBLE | Objective's target value (a percentage in its decimal form) or composite's target value |
| objective_display_name | STRING | Objective's display name |
| objective_value | DOUBLE | Objective value or composite's burn rate condition value |
| objective_operator | STRING | Operator used with raw metrics, or composite's burn rate condition operator. One of the following values: lte (less than or equal), lt (less than), gte (greater than or equal), gt (greater than) |
| service | STRING | Service name |
| service_display_name | STRING | Service display name |
| service_description | STRING | Service description |
| slo_time_window_type | STRING | Type of time window. One of the following values: Rolling, Calendar |
| slo_time_window_duration_unit | STRING | Time window duration unit. One of the following values: Second, Minute, Hour, Day, Week, Month, Quarter, Year |
| slo_time_window_duration_count | INT | Time window duration |
| slo_time_window_start_time | TIMESTAMP_TZ | Start time of the time window from the SLO definition, a DateTime with the timezone defined in the SLO. Only for Calendar-Aligned time windows |
| composite | BOOLEAN | Indicates whether the row contains composite-related data (true means the row contains composite data) |
| objective_name | STRING | Objective's name |
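The sample queries in the next section assume the exports have been loaded into a warehouse table named nobl9_data. For local experimentation, a minimal sketch using Python's sqlite3 (the column subset and sample rows are illustrative, not real export data):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# A reduced version of the export schema, enough for experimenting with queries
conn.execute("""
    create table nobl9_data (
        timestamp text, project text, slo_name text,
        measurement text, value real,
        objective_name text, objective_value real, objective_operator text,
        service text, service_display_name text, budget_target real
    )
""")
rows = [
    ("2022-03-10 16:00:00", "default", "streaming-latency-slo",
     "good_count_metric", 98.0, "objective-1", 1.0, None,
     "webapp-service", "WebApp Service", 0.95),
    ("2022-03-10 16:00:00", "default", "streaming-latency-slo",
     "total_count_metric", 100.0, "objective-1", 1.0, None,
     "webapp-service", "WebApp Service", 0.95),
]
conn.executemany("insert into nobl9_data values (?,?,?,?,?,?,?,?,?,?,?)", rows)

# e.g. the good/total ratio for the objective at this timestamp
(good, total), = conn.execute("""
    select
      sum(case when measurement = 'good_count_metric' then value end),
      sum(case when measurement = 'total_count_metric' then value end)
    from nobl9_data where slo_name = 'streaming-latency-slo'
""").fetchall()
print(good / total)  # 0.98
```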

Sample queries

Service Level Objectives details

Let's say we have a sample service WebApp Service with two SLOs (streaming-latency-slo and streaming-other-slo), where each SLO contains two objectives:

Image 1: SLO example

Once the CSV data is imported into a data warehouse (e.g., Snowflake), we can retrieve similar information with the following query:

select distinct
service_display_name,
service,
project,
slo_name,
objective_name,
objective_value,
budget_target * 100 as target
from nobl9_data
order by service, slo_name

Service Level Indicator query

Count metric example

select timestamp, value, measurement from nobl9_data where
slo_name = 'streaming-latency-slo'
and (measurement = 'good_count_metric' or measurement = 'total_count_metric')
and objective_value = 1
and timestamp >= '2022-03-10 16:00:00'
and timestamp <= '2022-03-10 16:30:00'
and project = 'default'
order by timestamp, measurement

Raw metric example

select timestamp, value from nobl9_data where
slo_name = 'newrelic-server-requests-slo'
and measurement = 'raw_metric'
and objective_value = 7.5
and objective_operator = 'lte'
and timestamp >= '2022-03-10 16:00:00'
and timestamp <= '2022-03-10 16:30:00'
and project = 'default'
order by timestamp

Reliability Burn Down query

select timestamp, value from nobl9_data where
measurement = 'good_total_ratio'
and slo_name = 'streaming-latency-slo'
and objective_value = 7.5
and timestamp >= '2022-03-10 15:00:00'
and timestamp <= '2022-03-10 17:00:00'
and project = 'default'
order by timestamp

Error Budget Burn Rate query

select timestamp, value, objective_value from nobl9_data where
measurement = 'burn_rate'
and slo_name = 'streaming-latency-slo'
and timestamp >= '2022-03-10 15:00:00'
and timestamp <= '2022-03-10 17:00:00'
and project = 'default'
order by timestamp

Error Budget Remaining

Remaining budget duration in seconds

select timestamp, value from nobl9_data where
measurement = 'remaining_budget_duration'
and slo_name = 'streaming-latency-slo'
and objective_value = 9.5
and project = 'default'
order by timestamp desc
limit 1
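The value returned by the remaining_budget_duration query is expressed in seconds. A small, illustrative helper (the function name and output format are ours) for rendering it in a report:

```python
def format_budget_duration(seconds: float) -> str:
    """Render a remaining error-budget duration in seconds as 'Xd Yh Zm'."""
    minutes, _ = divmod(int(seconds), 60)
    hours, minutes = divmod(minutes, 60)
    days, hours = divmod(hours, 24)
    return f"{days}d {hours}h {minutes}m"

print(format_budget_duration(93_600))  # 1d 2h 0m
```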

Remaining budget percentage (percentage value in decimal form)

select timestamp, value from nobl9_data where
measurement = 'remaining_budget'
and slo_name = 'streaming-latency-slo'
and objective_value = 9.5
and project = 'default'
order by timestamp desc
limit 1

Burn Rate (last x minutes)

select
value as burn_rate,
(select value / 60 from nobl9_data where
measurement = 'ratio_extrapolation_window'
and slo_name = 'streaming-latency-slo'
and objective_value = 9.5
and project = 'default'
order by timestamp desc
limit 1) as extrapolation_window_in_minutes
from nobl9_data where
measurement = 'burn_rate'
and slo_name = 'streaming-latency-slo'
and objective_value = 9.5
and project = 'default'
order by timestamp desc
limit 1