Alerting · REST Endpoint Integration

VictorOps / Splunk On-Call Integration

Route TigerOps alerts through Splunk On-Call with AI summaries, full incident context, and escalation chain metrics. On-call engineers get everything they need to respond faster from the first page.

Setup

How It Works

01

Get Your Splunk On-Call API ID and Key

In Splunk On-Call, go to Integrations > REST Endpoint and note your API ID and routing key. You will use these to authenticate TigerOps outbound alerts.
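Once you have the API key and routing key, you can sanity-check delivery yourself before wiring up TigerOps. This is a minimal Python sketch against Splunk On-Call's generic REST endpoint; the URL shape follows the endpoint's documented pattern, and an `INFO` message type is used so the test does not page anyone.

```python
# Sketch: verify a Splunk On-Call API key + routing key by building a
# test INFO alert for the generic REST endpoint. Sending is optional;
# TigerOps performs its own delivery test once configured.
import json
import urllib.request


def build_test_alert(api_key: str, routing_key: str):
    """Return the request URL and JSON body for a connectivity-test alert."""
    url = (
        "https://alert.victorops.com/integrations/generic/20131114/alert/"
        f"{api_key}/{routing_key}"
    )
    payload = {
        "message_type": "INFO",  # INFO does not trigger paging
        "entity_id": "tigerops-connectivity-test",
        "entity_display_name": "TigerOps connectivity test",
        "state_message": "If you can read this, delivery works.",
    }
    return url, json.dumps(payload).encode()


def send(url: str, body: bytes) -> int:
    """POST the alert and return the HTTP status code."""
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```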

02

Add Splunk On-Call in TigerOps

Enter your Splunk On-Call API ID, REST endpoint URL, and routing keys in the TigerOps integration panel. TigerOps sends a test incident to verify delivery.

03

Map Routing Keys to Alert Policies

Assign Splunk On-Call routing keys to TigerOps alert policies. Critical production alerts route to the primary on-call team; infrastructure alerts route to the ops team.

04

Alerts Arrive with Full Incident Context

On-call engineers receive TigerOps alerts in Splunk On-Call with AI summaries, metric links, trace IDs, and one-click deep links — reducing investigation time from the first page.

Capabilities

What You Get Out of the Box

AI Summary in Every Page

Splunk On-Call notifications include a TigerOps AI-generated summary of what broke and probable cause — so on-call engineers start with context, not just an alert title.

Routing Key Flexibility

Map multiple Splunk On-Call routing keys to different TigerOps alert policies, severities, or service tags for granular on-call team targeting.

Automatic Incident Recovery

When TigerOps auto-resolves an incident, a RECOVERY message is sent to Splunk On-Call, automatically resolving the incident and halting any active escalation policies.
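Splunk On-Call correlates the trigger and the recovery by `entity_id`, so the `RECOVERY` message must reuse the exact id sent with the original alert. A minimal sketch of the two payloads (helper names are illustrative):

```python
# Sketch: matching trigger/recovery payloads. Reusing the same
# entity_id is what lets Splunk On-Call close the open alert and halt
# its escalation policy.

def trigger_payload(incident_id: str, summary: str) -> dict:
    return {
        "message_type": "CRITICAL",
        "entity_id": f"tigerops-{incident_id}",  # stable per incident
        "state_message": summary,
    }


def recovery_payload(incident_id: str) -> dict:
    return {
        "message_type": "RECOVERY",
        "entity_id": f"tigerops-{incident_id}",  # same id closes the alert
        "state_message": "Auto-resolved by TigerOps",
    }
```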

Escalation Chain Metrics

TigerOps tracks escalation chain depth and time-to-acknowledge per Splunk On-Call routing key, surfacing which on-call rotations have the highest MTTA in your analytics.
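MTTA here is the mean time from page to first acknowledgment, grouped by routing key. A sketch of the computation under assumed field names (`paged_at`, `acked_at` as epoch seconds):

```python
# Sketch: per-routing-key MTTA. Unacknowledged incidents are excluded
# rather than counted as infinite. Field names are illustrative.
from statistics import mean


def mtta_by_routing_key(incidents: list[dict]) -> dict[str, float]:
    ack_times: dict[str, list[float]] = {}
    for inc in incidents:
        if inc.get("acked_at") is None:
            continue  # never acknowledged: not part of MTTA
        ack_times.setdefault(inc["routing_key"], []).append(
            inc["acked_at"] - inc["paged_at"]
        )
    return {key: mean(times) for key, times in ack_times.items()}
```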

Acknowledgment Reflection

When a Splunk On-Call engineer acknowledges a page, TigerOps reflects the acknowledgment and assignee in the TigerOps incident timeline, stopping further escalation.
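Acknowledgment reflection is driven by the inbound webhook shown in the config below. A hedged sketch of the handler logic; the event field names (`entity_id`, `type`, `user`) are assumptions about the webhook body, not Splunk On-Call's exact schema:

```python
# Sketch: reflect a Splunk On-Call acknowledge/resolve event onto the
# matching TigerOps incident, keyed by entity_id.

def handle_oncall_event(event: dict, incident_store: dict) -> None:
    incident = incident_store.get(event["entity_id"])
    if incident is None:
        return  # unknown incident: ignore the event
    if event["type"] == "acknowledge":
        incident["assignee"] = event.get("user")
        incident["timeline"].append(f"acknowledged by {event.get('user')}")
        incident["escalating"] = False  # stop further escalation
    elif event["type"] == "resolve":
        incident["timeline"].append("resolved in Splunk On-Call")
```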

Payload Annotations

TigerOps enriches the Splunk On-Call payload with custom annotations: SLO burn rate, last deployment SHA, impacted endpoint, and error budget remaining.

Configuration

Splunk On-Call Routing Config

Configure routing keys and alert payload enrichment for Splunk On-Call integration.

victorops-integration.yaml
# TigerOps — Splunk On-Call (VictorOps) Integration Config

integrations:
  victorops:
    api_id: "${VICTOROPS_API_ID}"
    rest_endpoint_url: "https://alert.victorops.com/integrations/generic/20131114/alert/${ROUTING_KEY}"
    min_severity: warning
    auto_resolve: true

    routing_key_mappings:
      - severity: critical
        tags:
          team: platform
        routing_key: "${VICTOROPS_ROUTING_KEY_PLATFORM}"
      - severity: critical
        tags:
          team: payments
        routing_key: "${VICTOROPS_ROUTING_KEY_PAYMENTS}"
      - severity: warning
        routing_key: "${VICTOROPS_ROUTING_KEY_GENERAL}"

    payload_annotations:
      vo_entity_id: "tigerops-{{incident_id}}"
      vo_entity_display_name: "{{service_name}}: {{incident_title}}"
      vo_message_type: "CRITICAL"    # CRITICAL | WARNING | INFO | ACKNOWLEDGEMENT | RECOVERY
      state_message: |
        {{ai_summary}}

        Metric: {{metric_name}} = {{metric_value}}
        Trace: {{trace_url}}
      slo_burn_rate: "{{slo_burn_rate}}"
      deployment_sha: "{{last_deploy_sha}}"

    inbound_webhook:
      url: "https://ingest.tigerops.io/webhooks/victorops"
      events: [acknowledge, resolve]

FAQ

Common Questions

Does TigerOps use the Splunk On-Call REST endpoint or the Alert Ingestion API?

TigerOps uses the Splunk On-Call REST Endpoint Integration, which supports TRIGGER, ACKNOWLEDGE, and RECOVERY message types. This is the standard and most broadly supported integration method for Splunk On-Call.

Can TigerOps route to different Splunk On-Call teams based on service ownership?

Yes. Each TigerOps alert policy can specify a different Splunk On-Call routing key. You can configure service-based routing so that, for example, payments service alerts go to the payments team routing key and API alerts go to the platform team.

What happens in Splunk On-Call when TigerOps resolves an incident?

TigerOps sends a RECOVERY message type to Splunk On-Call with the same entity_id used when the incident was triggered. Splunk On-Call will auto-resolve the alert and stop any active escalation chains.

Can I see Splunk On-Call acknowledgment data inside TigerOps?

Yes. When you configure the TigerOps webhook URL in Splunk On-Call's outbound webhook settings, acknowledgment and resolution events from Splunk On-Call are received by TigerOps and reflected in the incident timeline.

Does TigerOps support Splunk On-Call's "On-Call Take" feature for manual routing overrides?

TigerOps controls which routing key receives the alert at the time of dispatch. Manual overrides within Splunk On-Call (such as taking on-call) are handled natively by Splunk On-Call and are independent of TigerOps routing configuration.

Get Started

Page On-Call Teams with Full AI Context

No credit card required. Connect Splunk On-Call in minutes. Context-rich pages from the first alert.