Integrating with Splunk On-Call (VictorOps)

FireHydrant's Splunk On-Call (formerly VictorOps) integration enables several capabilities within the platform:

  • Automatically create FireHydrant incidents from Splunk On-Call alerts
  • Notify different channels based on incoming alert content such as routing keys
  • Link Services and Functionalities to routing keys in Splunk On-Call, as well as page out to them
  • Automatically pull in on-call responders from routing keys to incident channels

Installation

Connecting Splunk On-Call to FireHydrant

To get started:

  1. Go to Settings > Integrations list.

  2. Search for the Splunk On-Call integration and click the '+' (this tile is still labeled "VictorOps" in the FireHydrant UI, although the product was acquired and renamed to "Splunk On-Call").

  3. Enter your Splunk On-Call Organization name.

  4. Enter your Splunk On-Call API ID and an API Key. You can find and generate these on the Splunk On-Call Integrations page under the API tab. API ID is at the top, and then the API Key is generated and shown in the list of API keys below.

    • If you don't see this as an option in the UI, reach out to your Splunk On-Call administrator.
  5. Enter your REST Endpoint Token.

    • To find your REST Endpoint Token, go to Splunk On-Call's Integrations page and locate the REST integration. If the REST endpoint integration has not been enabled, click the blue Enable button to generate your endpoint destination URL. For details, see the Splunk On-Call documentation.
    • Under URL to notify, your REST Endpoint Token is the segment of the destination URL between /alert/ and /$routing_key. It should look like a UUID. For example:
    Splunk On-Call routing key
  6. Click Update Configuration. Now Splunk On-Call will show up in your connected and available integrations list.
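If you'd rather not eyeball the destination URL, the token segment can also be extracted programmatically. Here is a minimal, illustrative Python sketch; the URL and UUID below are made-up examples, not real credentials:

```python
# Illustrative sketch: extract the REST Endpoint Token from the
# "URL to notify" shown on Splunk On-Call's REST integration page.
# The example URL and UUID below are made up, not real credentials.

def extract_rest_token(url_to_notify: str) -> str:
    """Return the UUID-like path segment between '/alert/' and '/$routing_key'."""
    marker = "/alert/"
    start = url_to_notify.index(marker) + len(marker)
    # The token is the path segment immediately after '/alert/'.
    return url_to_notify[start:].split("/", 1)[0]

example_url = (
    "https://alert.victorops.com/integrations/generic/20131114"
    "/alert/11111111-2222-3333-4444-555555555555/$routing_key"
)
print(extract_rest_token(example_url))
# 11111111-2222-3333-4444-555555555555
```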

Setting up Outgoing Webhooks

Note: Splunk On-Call's Outgoing Webhooks require an Enterprise plan with Splunk On-Call. In addition, to view or modify them, you must have administrative credentials.

After you configure your integration in FireHydrant, you need to set up an outgoing Splunk On-Call webhook so FireHydrant can receive alerts and hooks.

  1. Return to the configuration page on FireHydrant by clicking the Splunk On-Call tile on the Integrations page. This opens your Splunk On-Call configuration details page, where a webhook URL has been generated for you.
  2. Copy the webhook address provided.
Splunk On Call FireHydrant Webhook
  3. Go to your Splunk On-Call account at https://portal.victorops.com.

  4. Click Integrations > Outgoing Webhooks.

    • Note: This is not to be confused with Integrations > 3rd Party Integrations > Webhooks.
    • If you don't see Outgoing Webhooks as an available tab under Integrations, try visiting this link directly: https://portal.victorops.com/dash/YOUR_ORG/outgoing-webhooks.
    • If that also doesn't work, you may need to reach out to your Splunk On-Call admin and/or Splunk On-Call support.
    Splunk On-Call Outgoing Webhooks
  5. Click Add Webhook and enter the following information:

    • Name: Replace Outgoing Webhook with a more descriptive name, if you'd like.
    • Event: Select Any-Incident.
    • Content Type: Leave as application/json.
    • Custom Headers: No custom headers are needed.
    • To: Paste the webhook URL copied in step #2 of this section.
    • Payload: If you leave this field empty, Splunk On-Call uses its default payload.
    • Description: Provide a detailed description for this webhook, if you'd like.
  6. Click Save to create the new webhook.
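To give a concrete sense of what FireHydrant receives once this webhook fires, here is an illustrative sketch of the payload shape. The field names come from the Parameter Mappings section later in this document; the nesting and all values are invented for illustration only:

```python
import json

# Illustrative shape of an outgoing-webhook payload as FireHydrant
# receives it for an Any-Incident event. Field names are taken from
# the parameter-mapping table in this document; values are made up.
payload = {
    "ALERT": {
        "routing_key": "my-team-routing-key",    # example routing key
        "state_message": "CPU usage above 95%",  # alert title
    },
    "INCIDENT": {
        "INCIDENT_ID": "42",                     # assigned by Splunk On-Call
        "CURRENT_PHASE": "UNACKED",              # UNACKED | ACKED | RESOLVED
        "INCIDENT_TIMESTAMP": "1700000000000",   # millisecond epoch time
        "SERVICE": "CPU usage above 95%",        # usually same as state_message
    },
}
print(json.dumps(payload, indent=2))
```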

Checking the installation

Once you've completed the Installation and Webhooks setup sections above, you should have a default routing rule already in place for Splunk On-Call to notify the configured Slack alert channel.

Splunk On-Call Default Alert Route

In your Slack integration settings, make sure you have a channel set for alerts and that the @FireHydrant bot is invited to it.

Slack alerts channel config

With the above prerequisites, creating an alert in Splunk On-Call should now post an alert to the appropriate Slack channel!

Splunk On-Call Default Route Test
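If you'd rather script the test than click around the Splunk On-Call UI, a sketch like the following could create an alert via the REST endpoint configured earlier. The token, routing key, and field values are placeholders; consult Splunk On-Call's REST endpoint documentation for the authoritative payload format.

```python
import json
from urllib import request

# Sketch: create a test alert via Splunk On-Call's generic REST endpoint.
# REST_TOKEN and ROUTING_KEY are placeholders, not real values.
REST_TOKEN = "11111111-2222-3333-4444-555555555555"  # your REST Endpoint Token
ROUTING_KEY = "my-team-routing-key"                  # your routing key

url = (
    "https://alert.victorops.com/integrations/generic/20131114"
    f"/alert/{REST_TOKEN}/{ROUTING_KEY}"
)
body = {
    "message_type": "CRITICAL",       # CRITICAL triggers an incident
    "entity_id": "test/firehydrant",  # de-duplication key for the alert
    "state_message": "Test alert for the FireHydrant integration",
}
req = request.Request(
    url,
    data=json.dumps(body).encode(),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# Uncomment to actually send the alert:
# with request.urlopen(req) as resp:
#     print(resp.status)
print(req.full_url)
```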

In Splunk On-Call, you submit an alert to a Routing Key. That Routing Key dictates one or more escalation policies to alert. FireHydrant allows you to import Routing Keys and link them to specific Services or Functionalities, unlocking capabilities like paging those teams and automatically pulling their on-call responders into incident channels.

Refer to Splunk On-Call's documentation for details on managing alert Routing Keys.

To configure the link and import process:

  1. In the FireHydrant left nav, go to Service Catalog > Services.
  2. Click Add service > Import from third party.
  3. On this screen, click on Import next to Splunk On-Call.
Importing from 3rd-party sources like Splunk On-Call (VictorOps)
  4. Next, you can choose Import all as new services, which imports all of your routing keys and creates a corresponding service for each in FireHydrant. Alternatively, choose Select services to import to pick which keys you'd like to import.
  5. (If selecting) Choose the routing keys you wish to import, and use the dropdown to either create new services from them or link them to existing services.
  6. To confirm your newly imported or newly linked service, open the service's details page and ensure an entry for your routing key appears under Integration links.
Integration link for Splunk On-Call (VictorOps)

Next Steps

Now that you've configured the Splunk On-Call integration, you can take the next steps below to get the full power of FireHydrant + Splunk On-Call.


Splunk On-Call Alert Routing

Once your Splunk On-Call instance is configured, you can set up Alert Routes to take action on your alerts based on the data they include. You can automatically open new incidents, send alerts to any Slack channel, log an alert in FireHydrant, or simply ignore it.

To learn more, read about Alert Routes.

Parameter Mappings

Here is the table of routable parameters on FireHydrant and the corresponding key/value from the inbound Splunk On-Call webhook(s). The request.body prefix refers to the webhook body content as a JSON object.

An explanation of Splunk On-Call's Outgoing Webhooks can be found in their documentation.

| Parameter Name | Notes |
| --- | --- |
| request.body.ALERT.fh_catalog_ids | An additional parameter you can configure in the Splunk On-Call Outgoing Webhooks. People generally use this to send over UUIDs of catalog items, but it requires explicit configuration rather than the default payload. See Step #5 of the Outgoing Webhooks section above. |
| request.body.INCIDENT.INCIDENT_ID | The incident number/ID as assigned by Splunk On-Call. |
| request.body.INCIDENT.CURRENT_PHASE | The current phase of the incident as specified by Splunk On-Call. Values (as of this writing) are UNACKED, ACKED, and RESOLVED. |
| request.body.INCIDENT.INCIDENT_TIMESTAMP | The time of incident creation in millisecond epoch time. |
| request.body.ALERT.routing_key | The routing key of the alert created in Splunk On-Call. |
| request.body.ALERT.state_message | The title of the alert as it was created in Splunk On-Call. |
| request.body.INCIDENT.SERVICE | Should be the title of the alert, same as state_message above. |
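The dotted parameter names in the table can be read as paths into the webhook body. A minimal sketch of how such a path could be resolved against the parsed JSON (the payload values here are invented):

```python
# Sketch: resolve the dotted parameter names from the table above
# against a webhook body parsed as JSON. Payload values are made up.
def resolve(body: dict, dotted: str) -> object:
    """Look up a 'request.body.X.Y' style name; strips the prefix first."""
    path = dotted.removeprefix("request.body.").split(".")  # Python 3.9+
    value = body
    for key in path:
        value = value[key]
    return value

body = {
    "ALERT": {"routing_key": "my-team", "state_message": "Disk full"},
    "INCIDENT": {"CURRENT_PHASE": "ACKED", "INCIDENT_ID": "17"},
}
print(resolve(body, "request.body.INCIDENT.CURRENT_PHASE"))  # ACKED
print(resolve(body, "request.body.ALERT.routing_key"))       # my-team
```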

The following table maps these to our overall Alert Routing object; these parameters are common across all Alert Routing/Monitoring integrations.

| Parameter Name | Notes |
| --- | --- |
| Alert Summary | The same as INCIDENT.SERVICE above. |
| Alert Description | The same as ALERT.state_message above. |
| Alert Priority | Maps to nothing for Splunk On-Call. |
| Alert Status | Aligned with INCIDENT.CURRENT_PHASE above. If the incident is ACKED in Splunk On-Call, this value is Acknowledged in FireHydrant; if it is RESOLVED in Splunk On-Call, it is Resolved in FireHydrant. |
| Alert Associated Infrastructure | The associated Service/Functionality in FireHydrant. For FireHydrant to detect this, you must have linked a Service or Functionality to a Routing Key in Splunk On-Call. |
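The Alert Status alignment above can be sketched as a simple lookup. Only the two phases the table explicitly maps are translated; UNACKED is passed through unchanged here because this document does not specify a mapping for it:

```python
# Sketch of the Alert Status alignment described above. Only the phases
# this document names are mapped; anything else is passed through as-is.
PHASE_TO_STATUS = {
    "ACKED": "Acknowledged",
    "RESOLVED": "Resolved",
}

def fh_alert_status(current_phase: str) -> str:
    return PHASE_TO_STATUS.get(current_phase, current_phase)

print(fh_alert_status("ACKED"))     # Acknowledged
print(fh_alert_status("RESOLVED"))  # Resolved
```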

Last updated on 2/9/2024