
Overview

When sending alerts to CauseFlow via the Receive alert endpoint, the request body must match the expected format for your provider. This page documents the exact payload schema for each supported integration. All payloads must be sent as application/json with valid X-API-Key and X-Webhook-Signature headers.
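The exact signing scheme behind X-Webhook-Signature is not specified on this page. As an illustration only, assuming the signature is a hex HMAC-SHA256 digest of the raw request body keyed with a shared webhook secret, a sender could compute the headers like this (the key and secret values are placeholders):

```python
import hashlib
import hmac
import json

def sign_payload(body: bytes, webhook_secret: str) -> str:
    """Hypothetical X-Webhook-Signature: hex HMAC-SHA256 of the raw body bytes."""
    return hmac.new(webhook_secret.encode(), body, hashlib.sha256).hexdigest()

# Serialize once and sign the exact bytes you send, so the
# signature matches the body the server receives.
payload = {"title": "Checkout error rate elevated", "severity": "high"}
body = json.dumps(payload, separators=(",", ":")).encode()

headers = {
    "Content-Type": "application/json",
    "X-API-Key": "YOUR_API_KEY",                                  # placeholder
    "X-Webhook-Signature": sign_payload(body, "YOUR_SECRET"),     # placeholder secret
}
```

If CauseFlow uses a different canonicalization (e.g. signing a timestamp plus body), adjust the input bytes accordingly.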

Datadog

Datadog webhooks are configured via Integrations → Webhooks in the Datadog UI. Use the payload template below and set the webhook URL to https://api.causeflow.ai/v1/webhooks/{tenantId}/datadog.
{
  "alert_id": "1234567890",
  "title": "P1 - Checkout service error rate above 5% threshold",
  "text": "The error rate for checkout-service has exceeded 5% for 5 consecutive minutes. Current value: 7.3%. Threshold: 5.0%.",
  "alert_type": "error",
  "priority": "P1",
  "tags": ["env:production", "service:checkout", "team:payments"]
}
| Field | Type | Required | Description |
| --- | --- | --- | --- |
| alert_id | string | Yes | Unique Datadog alert ID used for deduplication |
| title | string | Yes | Alert title as configured in the Datadog monitor |
| text | string | Yes | Full alert message body including threshold values and context |
| alert_type | string | Yes | One of: error, warning, info, success |
| priority | string | No | Datadog priority: P1–P4. Used to seed severity classification |
| tags | array | No | Array of Datadog tag strings (e.g. ["env:production", "service:checkout"]) |
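Before wiring up the webhook, the required-field rules above can be checked in a few lines. This is a sketch of a sender-side validator, not an official client; the field names and allowed values come from the table above:

```python
DATADOG_REQUIRED = ("alert_id", "title", "text", "alert_type")
DATADOG_ALERT_TYPES = {"error", "warning", "info", "success"}

def validate_datadog_payload(payload: dict) -> list[str]:
    """Return a list of schema problems; an empty list means the payload looks valid."""
    problems = [f"missing required field: {f}"
                for f in DATADOG_REQUIRED if f not in payload]
    if payload.get("alert_type") not in DATADOG_ALERT_TYPES:
        problems.append("alert_type must be one of: error, warning, info, success")
    if "tags" in payload and not isinstance(payload["tags"], list):
        problems.append("tags must be an array of strings")
    return problems
```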

Grafana

Grafana alert webhooks are configured under Alerting → Contact points in Grafana. Select Webhook as the contact point type and set the URL to https://api.causeflow.ai/v1/webhooks/{tenantId}/grafana.
{
  "evalMatches": [
    {
      "metric": "error_rate",
      "value": 7.3,
      "tags": {
        "env": "production",
        "service": "checkout"
      }
    }
  ],
  "ruleId": 42,
  "ruleName": "Checkout error rate elevated",
  "ruleUrl": "https://grafana.example.com/alerting/42",
  "state": "alerting",
  "message": "Error rate has exceeded 5% threshold for checkout service in production.",
  "tags": {
    "env": "production",
    "team": "payments"
  }
}
| Field | Type | Required | Description |
| --- | --- | --- | --- |
| evalMatches | array | No | Metrics that triggered the alert, with current values and tag dimensions |
| ruleId | integer | Yes | Grafana rule identifier used for deduplication |
| ruleName | string | Yes | Human-readable name of the alerting rule |
| ruleUrl | string | No | Link back to the rule in Grafana |
| state | string | Yes | One of: alerting, no_data, ok, paused |
| message | string | No | Additional context message from the rule annotation |
| tags | object | No | Key-value tag pairs for environment and service routing |
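The four state values drive whether an alert should open or clear an incident on the receiving side. The mapping below is illustrative only, not CauseFlow's documented behavior:

```python
def grafana_state_action(state: str) -> str:
    """Map a Grafana webhook state to a hypothetical incident action.

    This mapping is an assumption for illustration, not CauseFlow's
    actual ingestion logic.
    """
    actions = {
        "alerting": "open",    # start or update an incident
        "no_data": "open",     # missing data is often treated as actionable
        "ok": "resolve",       # condition cleared
        "paused": "ignore",    # rule disabled; take no action
    }
    try:
        return actions[state]
    except KeyError:
        raise ValueError(f"unknown Grafana state: {state!r}")
```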

CloudWatch

CloudWatch alerts are delivered as SNS notifications. Configure an SNS topic subscription to forward to https://api.causeflow.ai/v1/webhooks/{tenantId}/cloudwatch. CauseFlow handles both the SNS subscription confirmation handshake and alarm state change notifications.
{
  "Type": "Notification",
  "MessageId": "a1b2c3d4-e5f6-7890-abcd-ef1234567890",
  "TopicArn": "arn:aws:sns:us-east-1:123456789012:causeflow-alarms",
  "Subject": "ALARM: CheckoutServiceErrorRate in US East (N. Virginia)",
  "Message": "{\"AlarmName\":\"CheckoutServiceErrorRate\",\"AlarmDescription\":\"Error rate exceeds 5% threshold\",\"NewStateValue\":\"ALARM\",\"NewStateReason\":\"Threshold Crossed: 1 out of the last 1 datapoints [7.3 (01/04/24 14:30:00)] was greater than the threshold (5.0).\",\"StateChangeTime\":\"2024-04-01T14:31:00Z\",\"Region\":\"US East (N. Virginia)\",\"AlarmArn\":\"arn:aws:cloudwatch:us-east-1:123456789012:alarm:CheckoutServiceErrorRate\",\"OldStateValue\":\"OK\",\"Trigger\":{\"MetricName\":\"ErrorRate\",\"Namespace\":\"CauseFlow/Checkout\",\"Dimensions\":[{\"value\":\"checkout-service\",\"name\":\"ServiceName\"}]}}",
  "Timestamp": "2024-04-01T14:31:05Z",
  "SignatureVersion": "1",
  "Signature": "EXAMPLE...",
  "SigningCertURL": "https://sns.us-east-1.amazonaws.com/...",
  "UnsubscribeURL": "https://sns.us-east-1.amazonaws.com/..."
}
The Message field is a JSON string containing the CloudWatch alarm details:
| Field | Type | Description |
| --- | --- | --- |
| AlarmName | string | CloudWatch alarm name — used as the incident title |
| AlarmDescription | string | Alarm description for additional context |
| NewStateValue | string | New alarm state: ALARM, OK, or INSUFFICIENT_DATA |
| NewStateReason | string | Human-readable explanation of the state transition |
| StateChangeTime | string | ISO 8601 timestamp of the state change |
| Region | string | AWS region where the alarm is defined |
| OldStateValue | string | Previous alarm state |
| Trigger | object | Metric and dimension details that define the alarm threshold |
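Because Message arrives as a JSON-encoded string inside the SNS envelope, a consumer has to decode the notification twice. A minimal sketch of that double decode:

```python
import json

def extract_alarm(sns_envelope: dict) -> dict:
    """Decode the JSON string in the SNS Message field into alarm details."""
    if sns_envelope.get("Type") != "Notification":
        # SubscriptionConfirmation and UnsubscribeConfirmation messages
        # carry a different payload and are handled separately.
        raise ValueError("not an SNS Notification")
    return json.loads(sns_envelope["Message"])

envelope = {
    "Type": "Notification",
    "Message": json.dumps({
        "AlarmName": "CheckoutServiceErrorRate",
        "NewStateValue": "ALARM",
        "OldStateValue": "OK",
    }),
}
alarm = extract_alarm(envelope)
```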

Sentry

Sentry webhooks are configured via Settings → Integrations → WebHooks in your Sentry organization. Select the issue event type and set the URL to https://api.causeflow.ai/v1/webhooks/{tenantId}/sentry.
{
  "action": "created",
  "data": {
    "issue": {
      "id": "sentry-issue-12345",
      "title": "TypeError: Cannot read properties of undefined (reading 'price')",
      "culprit": "checkout/cart.js in calculateTotal",
      "level": "error",
      "status": "unresolved",
      "project": {
        "id": "1",
        "name": "checkout-service",
        "slug": "checkout-service"
      },
      "platform": "javascript",
      "url": "https://sentry.io/organizations/acme/issues/12345/",
      "count": "234",
      "userCount": 89,
      "firstSeen": "2024-04-01T14:28:00Z",
      "lastSeen": "2024-04-01T14:35:00Z"
    }
  },
  "installation": {
    "uuid": "inst-abc123"
  }
}
| Field | Type | Required | Description |
| --- | --- | --- | --- |
| action | string | Yes | Sentry webhook action: created, resolved, assigned |
| data.issue.id | string | Yes | Sentry issue ID used for deduplication |
| data.issue.title | string | Yes | Issue title — becomes the incident title in CauseFlow |
| data.issue.culprit | string | No | File and function where the error occurred |
| data.issue.level | string | Yes | Sentry level: error, warning, info, fatal |
| data.issue.project | object | Yes | Project details including name and slug |
| data.issue.url | string | No | Direct link to the issue in Sentry |
| data.issue.count | string | No | Total event count for this issue |
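Note that count arrives as a string ("234") while userCount is numeric in the sample above, so consumers usually normalize types when flattening the nested issue. A sketch of that normalization (the output field names are illustrative, not CauseFlow's internal schema):

```python
def normalize_sentry_issue(webhook: dict) -> dict:
    """Flatten the Sentry webhook fields described in the table above."""
    issue = webhook["data"]["issue"]
    return {
        "dedup_key": issue["id"],             # deduplication key
        "title": issue["title"],              # becomes the incident title
        "level": issue["level"],
        "project": issue["project"]["slug"],
        "count": int(issue.get("count", 0)),  # "234" (string) -> 234
        "url": issue.get("url"),              # optional deep link
    }
```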

Custom

The custom provider accepts a flexible schema for alerts from tools not natively supported. Set the webhook URL to https://api.causeflow.ai/v1/webhooks/{tenantId}/custom.
{
  "title": "Payment processor connection pool exhausted",
  "description": "All 50 connections in the payment processor pool are in use. New requests are being queued and timing out after 30s. Affecting: checkout, subscriptions, refunds services.",
  "severity": "high",
  "externalId": "alert-pagerduty-abc123",
  "metadata": {
    "source": "pagerduty",
    "runbookUrl": "https://wiki.example.com/runbooks/payment-pool",
    "affectedServices": ["checkout", "subscriptions", "refunds"]
  }
}
| Field | Type | Required | Description |
| --- | --- | --- | --- |
| title | string | Yes | Short, descriptive title for the incident |
| description | string | Yes | Full description with as much context as possible for AI analysis |
| severity | string | No | One of: critical, high, medium, low. Defaults to medium if not provided |
| externalId | string | No | Your system’s unique ID for this alert — used for deduplication |
| metadata | object | No | Arbitrary key-value pairs for additional context. Passed through to the investigation |