Datadog Webhooks Integration: 2 Easy Steps

Last Modified: December 29th, 2022


Datadog is a SaaS-based monitoring and analytics platform for large-scale applications and infrastructure. Combining real-time metrics from servers, containers, databases, and applications with end-to-end tracing, Datadog delivers actionable alerts and powerful visualizations to provide full-stack observability. Datadog includes over 200 vendor-supported integrations and APM libraries for several languages. Webhooks is one such supported integration.

A webhook (also called a web callback or HTTP push API) is a way for an app to provide other applications with real-time information. A webhook delivers data to other applications as events happen, meaning you get the data immediately. With typical APIs, by contrast, you would need to poll for data very frequently to get it in near real time. This makes webhooks much more efficient for both provider and consumer. The main drawback of webhooks is the difficulty of initially setting them up.

This article gives a step-by-step guide to setting up the Datadog Webhooks Integration.


What is Datadog?


Datadog is a monitoring and analytics tool for information technology (IT) and DevOps teams, used to track performance metrics and monitor events for infrastructure and cloud services. The software can monitor services such as servers, databases, and tools.

Datadog monitoring software is available for deployment on-premises or as software as a service (SaaS). Datadog supports Windows, Linux, and Mac operating systems. Support for cloud service providers includes AWS, Microsoft Azure, Red Hat OpenShift, and Google Cloud Platform.

Datadog uses a Go-based agent, and its backend is built on Apache Cassandra, PostgreSQL, and Kafka. A REST application programming interface (API) allows Datadog to integrate with numerous services, tools, and programming languages, with integrations such as Kubernetes, Chef, Puppet, Ansible, Ubuntu, and Bitbucket.

The user interface includes customizable dashboards that can show graphs composed of multiple data sources in real-time. Datadog can also send users notifications of performance issues on any set metric, such as compute rates. Users are notified through means such as email, Slack, or PagerDuty.

Features of Datadog

The features that Datadog offers include:

  • Provides an IT/DevOps team with a single view of their infrastructure (including servers, apps, metrics, and other services).
  • Customizable dashboards.
  • Alerts based on critical issues.
  • Support for over 250 product integrations.
  • Automatically collects and analyzes logs, latency, and error rates.
  • Allows for access to the API.
  • Supports applications written in languages such as Java, Python, PHP, .NET, Go, Node.js, and Ruby.

Simplify Data Analysis with Hevo’s No-code Data Pipeline

Hevo Data, a No-code Data Pipeline, helps to load data from any data source such as Databases, SaaS applications, Cloud Storage, SDKs, and Streaming Services, and simplifies the ETL process. It supports 100+ data sources (including 30+ free data sources) like Asana, and setup is a 3-step process: select the data source, provide valid credentials, and choose the destination. Hevo not only loads the data onto the desired Data Warehouse/destination but also enriches the data and transforms it into an analysis-ready form, all without your having to write a single line of code.

GET STARTED WITH HEVO FOR FREE

Its completely automated pipeline delivers data in real time without any loss from source to destination. Its fault-tolerant and scalable architecture ensures that the data is handled in a secure, consistent manner with zero data loss, and it supports different forms of data. The solutions provided are consistent and work with different BI tools as well.

Check out why Hevo is the Best:

  • Secure: Hevo has a fault-tolerant architecture that ensures that the data is handled in a secure, consistent manner with zero data loss.
  • Schema Management: Hevo takes away the tedious task of schema management & automatically detects the schema of incoming data and maps it to the destination schema.
  • Minimal Learning: Hevo, with its simple and interactive UI, is extremely simple for new customers to work on and perform operations.
  • Hevo Is Built To Scale: As the number of sources and the volume of your data grows, Hevo scales horizontally, handling millions of records per minute with very little latency.
  • Incremental Data Load: Hevo allows the transfer of data that has been modified in real-time. This ensures efficient utilization of bandwidth on both ends.
  • Live Support: The Hevo team is available round the clock to extend exceptional support to its customers through chat, email, and support calls.
  • Live Monitoring: Hevo allows you to monitor the data flow and check where your data is at a particular point in time.
SIGN UP HERE FOR A 14-DAY FREE TRIAL

What are Webhooks?


Webhooks are automated messages sent from apps when something gets triggered inside the application. They carry a message, or payload, and are sent to a unique URL, which is essentially the app's phone number or address. Webhooks are almost always faster than polling and require less work on your end. They are much like SMS notifications.

A webhook is a software architecture approach that allows applications and services to submit a web-based notification to other applications whenever a specific event occurs. The application provides a way for users to register or connect application programming interface calls to certain events under specific conditions, such as when a new user, account, or order is created or an order ships out of a warehouse.

Webhooks use requirements similar to If This Then That (IFTTT) statements. When an event triggers the webhook to “fire,” it calls another external API. This allows the application to “push” the knowledge of the event to any third party, such as an enterprise architecture integration (EAI) platform or directly to other applications.

Webhooks are transmitted via HTTP or HTTPS, usually as a POST request over a specific URL. The POST data is interpreted by the receiving application’s API, which triggers the requested action and sends a message to the original application to confirm the task is complete. The data sent is commonly formatted using JSON or XML.
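To make the delivery mechanism concrete, here is a minimal sketch of the sending side using only Python's standard library. The helper names and the endpoint URL in the usage comment are hypothetical, not part of any particular product's API; the point is simply that a webhook delivery is an ordinary HTTP POST carrying a JSON body.

```python
import json
import urllib.request


def build_webhook_request(url: str, payload: dict) -> urllib.request.Request:
    """Build a POST request that carries the payload as a JSON body."""
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


def send_webhook(url: str, payload: dict) -> int:
    """Deliver the webhook and return the HTTP status code."""
    with urllib.request.urlopen(build_webhook_request(url, payload)) as resp:
        return resp.status


# Example (hypothetical endpoint):
# send_webhook("https://example.com/hooks/orders",
#              {"event": "order.shipped", "order_id": 42})
```

A real provider would add retries and timeouts on top of this, as Datadog does, but the wire format is just this: one POST per event.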

Webhooks are user-defined HTTP callbacks. They are usually triggered by some event, such as receiving an SMS message or an incoming phone call. When that event occurs, the source application makes an HTTP request (usually a POST or a GET) to the URL configured for the webhook.

To handle a webhook, you only need to build a small web application that can accept HTTP requests. Almost all server-side programming languages offer some framework for you to do this. Examples across languages include ASP.NET MVC for C#, Servlets and Spark for Java, Express for Node.js, Django and Flask for Python, and Rails and Sinatra for Ruby. PHP has its web app framework built-in, although frameworks like Laravel, Symfony, and Yii are also popular.
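As an illustration of how small such a web application can be, the sketch below uses only Python's standard-library http.server (Flask or any of the frameworks above would work equally well). The handler class name, the port, and the last_event attribute are arbitrary choices for this example.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


class WebhookHandler(BaseHTTPRequestHandler):
    """Accept a webhook POST, parse its JSON body, and acknowledge it."""

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        try:
            event = json.loads(self.rfile.read(length))
        except json.JSONDecodeError:
            self.send_response(400)  # reject a malformed payload
            self.end_headers()
            return
        self.server.last_event = event  # stash for the application to consume
        # Respond quickly so the sender does not time out and retry;
        # do any heavy processing asynchronously.
        self.send_response(200)
        self.end_headers()


def serve(port: int = 8080) -> None:
    """Block forever, handling incoming webhooks on the given port."""
    HTTPServer(("0.0.0.0", port), WebhookHandler).serve_forever()
```

In production you would put this behind HTTPS and verify the sender (for example, with a shared secret header) before trusting the payload.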

Benefits of webhooks

Because webhooks notify other systems when an event occurs, changes are event-driven and made in near-real-time. A continuous integration system using webhooks might send a message to the security team only when a build failure is due to a security problem, for example. This can “push” information onto the screens of the security team, instead of causing them to search for it periodically in other systems. It can also allow systems to synchronize in near-real-time, instead of overnight or in a batch.

An alternative to a webhook is a polling process: a program waits five or ten minutes, then calls an API for a list of recent transactions, adds any new ones to a list, and processes them. This assumes the API provides a list of recent transactions.
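That polling loop can be sketched as follows; fetch_recent and handle are hypothetical callables standing in for the provider's API and your own processing logic.

```python
import time


def process_new(fetch_recent, seen_ids, handle):
    """One polling pass: fetch recent transactions, handle any unseen ones.

    fetch_recent: callable returning a list of {"id": ...} dicts
    seen_ids:     set of transaction IDs already processed (mutated in place)
    handle:       callback invoked once per new transaction
    Returns the number of new transactions handled.
    """
    new = 0
    for tx in fetch_recent():
        if tx["id"] not in seen_ids:
            seen_ids.add(tx["id"])
            handle(tx)
            new += 1
    return new


def poll_forever(fetch_recent, handle, interval_seconds=300):
    """Wait, ask, repeat: the consumer does the work a webhook would push."""
    seen_ids = set()
    while True:
        process_new(fetch_recent, seen_ids, handle)
        time.sleep(interval_seconds)
```

Notice that most passes find nothing new, which is exactly the wasted work that webhooks eliminate.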

Uses and examples of webhooks

Webhooks are a simple way to make an API accessible for making or receiving calls, or for sending text-based responses to users, when specific events occur within the application. Platforms that support webhooks include Facebook, GitHub, Trello, Confluence, and Google Calendar.

Practical uses of webhooks can include:

  • Sending daily automatic email reminders for meetings
  • Confirming processed and completed payments
  • Syncing changes in customer data between applications

Webhooks are very common on the internet of things (IoT), where a sensor might notice a change in temperature and call the air conditioning system. A motion sensor could send a message to a security system or generate noise to simulate a barking dog, for example.

Datadog Webhooks Integration


Step 1: Enable Webhooks in Datadog

Datadog Webhooks enable you to:

  • Connect to your services.
  • Alert your services when a metric alert is triggered.

Setup

Go to the Webhooks integration tile and enter the URL and name of the Datadog webhook you want to use.

Usage

To use your Datadog webhook, add @webhook-<WEBHOOK_NAME> to the text of the metric alert that you want to trigger the webhook. This triggers a POST request to the URL you set, with the content in JSON format. The timeout for any individual request is 15 seconds. Datadog issues a retry only if there is an internal error (for example, a badly formed notification message) or if it receives a 5XX response from the webhook endpoint. Missed connections are retried 5 times.

Note: Custom headers must be in JSON format.

To add your own custom fields to the request, you can also specify your own payload in the Payload field. If you want your payload to be URL-encoded, check the Encode as a form checkbox and specify your payload in JSON format. The following variables are available for Datadog Webhooks:

$AGGREG_KEY
    ID to aggregate events belonging together.
    Example: 9bd4ac313a4d1e8fae2482df7b77628

$ALERT_CYCLE_KEY
    ID to link events from the time an alert triggers until it resolves.

$ALERT_ID
    ID of alert.
    Example: 1234

$ALERT_METRIC
    Name of the metric if it’s an alert.
    Example: system.load.1

$ALERT_PRIORITY
    Priority of the alerting monitor.
    Example: P1, P2

$ALERT_QUERY
    Query of the monitor that triggered the webhook.

$ALERT_SCOPE
    Comma-separated list of tags triggering the alert.
    Example: availability-zone:us-east-1a, role:computing-node

$ALERT_STATUS
    Summary of the alert status.
    Example: system.load.1 over host:my-host was > 0 at least once during the last 1m

$ALERT_TITLE
    Title of the alert.

$ALERT_TRANSITION
    Type of alert notification.
    Example: Recovered, Triggered/Re-Triggered, No Data/Re-No Data, Warn/Re-Warn, Renotify

$ALERT_TYPE
    Type of the alert.

$DATE
    Date (epoch) where the event happened.
    Example: 1406662672000

$EMAIL
    Email of the user posting the event that triggered the webhook.

$EVENT_MSG
    Text of the event.
    Example: @webhook-url Sending to the webhook

$EVENT_TITLE
    Title of the event.
    Example: [Triggered] [Memory Alert]

$EVENT_TYPE
    Type of the event.
    Example: metric_alert_monitor, event_alert, or service_check.

$HOSTNAME
    The hostname of the server associated with the event, if there is one.

$ID
    ID of the event.
    Example: 1234567

$INCIDENT_ATTACHMENTS
    List of JSON objects with the incident’s attachments, such as postmortem and documents.
    Example: [{"attachment_type": "postmortem", "attachment": {"url": "https://app.datadoghq.com/notebook/123","title": "Postmortem IR-1"}}]

$INCIDENT_COMMANDER
    JSON object with the incident commander’s handle, uuid, name, email, and icon.

$INCIDENT_CUSTOMER_IMPACT
    JSON object with an incident’s customer impact status, duration, and scope.
    Example: {"customer_impacted": true, "customer_impact_duration": 300 ,"customer_impact_scope": "scope here"}

$INCIDENT_FIELDS
    JSON object mapping each of an incident’s fields to its values.
    Example: {"state": "active", "datacenter": ["eu1", "us1"]}

$INCIDENT_PUBLIC_ID
    Public ID of the associated incident.
    Example: 123

$INCIDENT_TITLE
    Title of the incident.

$INCIDENT_URL
    URL of the incident.
    Example: https://app.datadoghq.com/incidents/1

$INCIDENT_MSG
    The message of the incident notification.

$LAST_UPDATED
    Date when the event was last updated.

$LINK
    URL of the event.
    Example: https://app.datadoghq.com/event/jump_to?event_id=123456

$LOGS_SAMPLE
    Logs sample from log monitor alerts.

$METRIC_NAMESPACE
    Namespace of the metric if it’s an alert.

$ORG_ID
    ID of your organization.
    Example: 11023

$ORG_NAME
    Name of your organization.
    Example: Datadog

$PRIORITY
    Priority of the event.
    Example: normal or low

$SECURITY_RULE_NAME
    The name of the security rule.

$SECURITY_SIGNAL_ID
    The unique identifier of the signal.
    Example: AAAAA-AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA

$SECURITY_SIGNAL_SEVERITY
    The severity of the security signal.
    Example: medium

$SECURITY_SIGNAL_TITLE
    The title of the security signal.

$SECURITY_SIGNAL_MSG
    The message of the security signal.

$SECURITY_SIGNAL_ATTRIBUTES
    The security signal attributes.
    Example: {"network":{"client":{"ip":"1.2.3.4"}}}

$SECURITY_RULE_ID
    The security rule ID.
    Example: aaa-aaa-aaa

$SECURITY_RULE_QUERY
    The query or queries associated with the security rule.
    Example: ["@evt.name:authentication"]

$SECURITY_RULE_GROUP_BY_FIELDS
    The security group by key value pairs.
    Example: {"@usr.name":"john.doe@your_domain.com"}

$SECURITY_RULE_TYPE
    The security rule type.
    Example: log_detection

$SNAPSHOT
    URL of the image if the event contains a snapshot.
    Example: https://p.datadoghq.com/path-to-snapshot

$SYNTHETICS_TEST_NAME
    Name of the Synthetics test.

$SYNTHETICS_FIRST_FAILING_STEP_NAME
    Name of the first failing step of the Synthetics test.

$TAGS
    Comma-separated list of the event tags.
    Example: monitor, name:myService, role:computing-node

$TEXT_ONLY_MSG
    Text of the event without Markdown formatting.

$USER
    User posting the event that triggered the webhook.
    Example: rudy

$USERNAME
    Username of the user posting the event that triggered the webhook.
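Pulling a few of these together: a custom Payload that forwards the essentials of an alert might look like the sketch below. The field names on the left are arbitrary choices for this example; the $-variables are the documented placeholders above, which Datadog substitutes at delivery time.

```json
{
  "alert_id": "$ALERT_ID",
  "title": "$EVENT_TITLE",
  "transition": "$ALERT_TRANSITION",
  "host": "$HOSTNAME",
  "link": "$LINK"
}
```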

Authentication

If you want to post your Datadog webhooks to a service requiring authentication, you can use basic HTTP authentication by modifying your URL from https://my.service.example.com to https://<USERNAME>:<PASSWORD>@my.service.example.com.
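One detail worth noting: if the username or password contains reserved characters such as @ or :, they must be percent-encoded before being embedded in the URL. A small hypothetical helper (not part of Datadog) illustrates this:

```python
from urllib.parse import quote


def basic_auth_url(url: str, username: str, password: str) -> str:
    """Embed basic-auth credentials in a webhook URL, percent-encoding
    reserved characters (such as '@' and ':') inside the credentials."""
    scheme, rest = url.split("://", 1)
    return f"{scheme}://{quote(username, safe='')}:{quote(password, safe='')}@{rest}"
```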

Multiple webhooks

In a monitor alert, if two or more Datadog webhook endpoints are notified, a webhook queue is created per service. For instance, if you notify both PagerDuty and Slack, a retry on the Slack webhook does not affect the PagerDuty one.

However, in the PagerDuty scope, certain events always go before others; specifically, an “Acknowledge” payload always goes before “Resolution”. If an “Acknowledge” ping fails, the “Resolution” ping is queued due to the retry logic.

Step 2: Configure the Webhook in Datadog

Configure the Datadog webhook endpoint in Datadog so that SR Ops can communicate with Datadog using the endpoint.

Prerequisites

Ensure you have the required role: evt_mgmt_integration

Steps to perform the Datadog Webhooks Integration

  1. Log in to the Datadog console.
  2. From the left navigation pane, click Integrations > Integrations.
  3. Search for Webhook and click to open the Webhook card.
  4. Click the Configuration tab.
  5. To create a Datadog webhook, scroll down to the Webhooks section and click New.
  6. On the New Webhook form, fill in the fields:
       • Name: Name of the webhook.
       • URL: Enter the URL that you have already created. Note: You need to enter the username and password at the beginning of the webhook URL. For example, https://event:admin@kgsrontop.service-now.com/api/sn_em_connector/em/inbound_event?source=datadog&integration_id=f9a3f75a1b16e090e96c631abc4bcbbb
       • Payload: A JSON object that describes the file or folder (target) that triggered the webhook, as well as the event that was triggered. Note: The payload must be in JSON format.
       • Custom Headers: Option to send custom header fields along with the outgoing request. Note: Custom headers must be in JSON format.
       • Encode as form: Option to convert the payload into a valid URL-encoded format.
  7. Click Save.
  8. From the left navigation pane, click Monitors > Manage Monitors.
  9. Click New Monitor.
  10. Under the Custom Monitors tab, select the monitor type as Metric.
  11. Choose the detection method, define the metric, and set the alert conditions.
  12. In the Say what’s happening section, under the Edit tab, enter a name for the monitor.
  13. In the Notify your team section, in the first field, select the Datadog webhook integration that you created.
  14. To verify that the Datadog webhook integration is working with SR Ops, click Test Notifications.
  15. Click Save.

Conclusion

This article gave a comprehensive guide to Datadog and Webhooks. It also gave a 2-step process to perform the Datadog Webhooks Integration.

While the Datadog Webhooks Integration is insightful, setting up the proper environment is a hectic task. To make things easier, Hevo comes into the picture. Hevo Data is a No-code Data Pipeline with 100+ pre-built integrations that you can choose from.

VISIT OUR WEBSITE TO EXPLORE HEVO

Hevo can help you integrate your data from numerous sources like Datadog and load it into a destination to analyze real-time data with a BI tool such as Tableau. It will make your life easier and make data migration hassle-free. It is user-friendly, reliable, and secure.

SIGN UP for a 14-day free trial and see the difference!

Share your experience of learning about Datadogs Webhooks Integration in the comments section below.

Former Research Analyst, Hevo Data

Arsalan is a data science enthusiast with a keen interest in data analysis and architecture, and he enjoys writing highly technical content. He has written around 100 articles on various topics related to the data industry.
