
Configure Google Security Command Center logs for the Splunk Add-on for Google Cloud Platform

Use this procedure to configure Google Security Command Center (SCC) findings for ingestion into Splunk through the HTTP Event Collector (HEC).

Integration workflow

This integration uses Google Cloud Pub/Sub and Dataflow to deliver SCC findings to Splunk:

  1. Security Command Center exports findings to a Pub/Sub topic.
  2. A Pub/Sub subscription attached to that topic serves as the input queue.
  3. A Dataflow job based on the Google‑provided “Pub/Sub to Splunk” template:

    • Pulls messages from the Pub/Sub subscription.
    • Sends them over HTTPS to Splunk HEC.
  4. If Dataflow cannot deliver an event to Splunk (for example, due to HEC connectivity or authentication issues), it sends the event to a dead‑letter Pub/Sub topic.

Resulting events appear in Splunk with the google:gcp:security:alerts sourcetype.

Note (Push-based ingestion)

This integration uses a push-based mechanism to ingest data into Splunk. The Dataflow job running in Google Cloud initiates HTTPS connections to push events directly into the Splunk HTTP Event Collector (HEC).

Prerequisites

Before you begin, ensure that:

  1. You have a Splunk deployment (Splunk Cloud Platform or Splunk Enterprise) with:

    • HTTP Event Collector (HEC) enabled.
    • A dedicated HEC token created for this data flow.
    • Network access that allows inbound HTTPS traffic to the HEC endpoint from Google Cloud.
  2. You have a Google Cloud project where:

    • Security Command Center is enabled and producing findings.
    • The following APIs are enabled (see the example after this list):
      • Cloud Pub/Sub API
      • Cloud Dataflow API
  3. You have the necessary Google Cloud IAM permissions to:

    • Create Pub/Sub topics and subscriptions.
    • Create and run Dataflow jobs.
    • Configure SCC continuous export.
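If the required APIs are not already enabled, you can enable them with the gcloud CLI. This is a minimal sketch; it assumes the gcloud CLI is installed and authenticated against the target project.

    # Enable the Pub/Sub and Dataflow APIs in the current project.
    gcloud services enable pubsub.googleapis.com dataflow.googleapis.com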

Dataflow requirements

To run the Pub/Sub to Splunk pipeline, Google Dataflow requires:

  • Dataflow worker service account permissions

    The service account used by the Dataflow job must have, at minimum:

    • roles/pubsub.subscriber on the input Pub/Sub subscription.
    • roles/pubsub.publisher on the dead‑letter Pub/Sub topic (if used by the template).

    Additional roles may be required as documented in the Google Pub/Sub to Splunk template documentation. For an example of granting these roles, see the sketch after this list.

  • Network connectivity

    Dataflow workers must be able to make outbound HTTPS connections to your Splunk HEC endpoint:

    • For Splunk Cloud: https://http-inputs-<stack>.splunkcloud.com:443/services/collector
    • For Splunk Enterprise: https://<splunk-hec-host>:8088/services/collector (or your configured HEC port).

    Ensure the VPC, firewall rules, and any proxies allow this traffic.

  • HEC configuration

    • HEC is enabled on your Splunk deployment.
    • A HEC token dedicated to this pipeline is created and active.
    • The token is configured to send data to the correct index and use the google:gcp:security:alerts sourcetype, or you have props/transforms to assign this sourcetype.
  • Template

    This procedure assumes you are using the Google‑provided Pub/Sub to Splunk Dataflow template. For the most current template parameters and options, see the Pub/Sub to Splunk template documentation at https://cloud.google.com/dataflow/docs/guides/templates/provided-templates#pubsub_to_splunk.
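As an illustration, the following commands grant the minimum roles listed above to the Dataflow worker service account. This is a sketch only: the topic and subscription names match the examples used later in this procedure, and the service account address is a placeholder for whichever account your Dataflow workers run as.

    # Allow the Dataflow worker service account to pull from the input subscription.
    gcloud pubsub subscriptions add-iam-policy-binding scc-findings-sub \
        --member="serviceAccount:<WORKER_SA>@<PROJECT_ID>.iam.gserviceaccount.com" \
        --role="roles/pubsub.subscriber"

    # Allow it to publish undeliverable events to the dead-letter topic.
    gcloud pubsub topics add-iam-policy-binding scc-findings-dlq \
        --member="serviceAccount:<WORKER_SA>@<PROJECT_ID>.iam.gserviceaccount.com" \
        --role="roles/pubsub.publisher"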

Configuration process

Follow this process to configure Google Security Command Center logs for the Splunk Add-on for Google Cloud Platform:

  1. Configure HEC in Splunk
  2. Create Pub/Sub topics
  3. Create Pub/Sub subscriptions
  4. Export SCC findings to Pub/Sub
  5. Create a Dataflow pipeline (Pub/Sub to Splunk)

Configure HEC in Splunk

  1. In your Splunk deployment, navigate to Settings > Data inputs > HTTP Event Collector.
  2. Create a new HEC token (or identify an existing one) dedicated to SCC findings.
  3. Ensure that events sent with this token use the google:gcp:security:alerts sourcetype. For example, configure one of the following:

    • Set the sourcetype on the token.
    • Configure props.conf / transforms.conf to assign google:gcp:security:alerts.
  4. Note the following information for later:

    • HEC URL (for example: https://http-inputs-<stack>.splunkcloud.com:443/services/collector)
    • HEC token
    • Index (optional, if you want to reference it in your own configuration or searches)
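Before you continue, you can verify that the endpoint and token work by sending a test event from a machine that can reach HEC. This is a sketch; substitute your own HEC URL and token, and remove -k once certificate validation is properly configured. A healthy endpoint returns {"text":"Success","code":0}.

    # Send a test event to the HEC endpoint.
    curl -k "https://<splunk-hec-host>:8088/services/collector" \
        -H "Authorization: Splunk <HEC_TOKEN>" \
        -d '{"event": "HEC connectivity test", "sourcetype": "google:gcp:security:alerts"}'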

Create Pub/Sub topics

Use this procedure to create two Pub/Sub topics in your Google Cloud project. You need two topics for different purposes:

  • The primary topic receives findings exported from Security Command Center.
  • The dead-letter topic stores messages that Dataflow cannot successfully send to Splunk HEC (for example, because of an HEC SSL misconfiguration, a disabled HEC token, or a message processing error in Dataflow). This lets you detect and troubleshoot delivery problems without silently losing data.

  1. In the Google Cloud console, go to Pub/Sub > Topics.
  2. Create a primary topic to hold SCC findings, for example:

    • Name: scc-findings-topic
  3. Create a dead‑letter topic for undeliverable events, for example:

    • Name: scc-findings-dlq
  4. Use default settings unless your organization has specific requirements.
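Equivalently, you can create both topics from the command line. This sketch assumes the example names used in this procedure:

    # Primary topic that receives exported SCC findings.
    gcloud pubsub topics create scc-findings-topic

    # Dead-letter topic for events that cannot be delivered to Splunk HEC.
    gcloud pubsub topics create scc-findings-dlq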

Create Pub/Sub subscriptions

Create subscriptions for both topics:

  1. In the Google Cloud console, go to Pub/Sub > Subscriptions and click Create subscription.
  2. Configure the subscription for the primary topic:

    • Subscription ID: for example, scc-findings-sub
    • Topic: select the primary topic created in the previous step, for example scc-findings-topic
    • Delivery type: leave Pull as the default (recommended for Dataflow)
    • Leave other values at their defaults or adjust to your organization’s requirements (for example, message retention duration, acknowledgement deadline).
  3. Click Create.

  4. Repeat this task to create a subscription for the dead‑letter topic, for example:
    • Subscription ID: scc-findings-dlq-sub
    • Topic: scc-findings-dlq
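As with the topics, you can create both subscriptions from the command line. This sketch uses the example names above and default settings:

    # Pull subscription that the Dataflow job reads from.
    gcloud pubsub subscriptions create scc-findings-sub --topic=scc-findings-topic

    # Subscription for inspecting dead-lettered events.
    gcloud pubsub subscriptions create scc-findings-dlq-sub --topic=scc-findings-dlq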

Export SCC findings to Pub/Sub

Configure Security Command Center to continuously export findings to the primary Pub/Sub topic.

  1. In the Google Cloud console, go to Security > Security Command Center > Findings.
  2. Create or edit a continuous export:

    • Select the scope and any filters required for the findings you want to export.
    • Choose Pub/Sub as the export destination.
    • Select the primary topic created earlier, for example scc-findings-topic.
  3. For detailed instructions, see “Create a continuous export to Pub/Sub” at https://cloud.google.com/security-command-center/docs/how-to-continuous-export in the Google Security Command Center documentation.

After you configure the export, SCC findings that match your filters are published to the primary Pub/Sub topic.
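You can also create the continuous export as a notification config with the gcloud CLI. The following is a sketch, assuming an organization-level export; the notification name and filter are placeholders, and the available flags may vary with your gcloud version:

    # Create a continuous export (notification config) to the primary topic.
    gcloud scc notifications create scc-findings-to-splunk \
        --organization="<ORG_ID>" \
        --pubsub-topic="projects/<PROJECT_ID>/topics/scc-findings-topic" \
        --filter='state = "ACTIVE"'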

Create a Dataflow pipeline (Pub/Sub to Splunk)

Create and run a Dataflow streaming job that pulls from the Pub/Sub subscription and sends data to Splunk HEC.

  1. In the Google Cloud console, go to Dataflow > Jobs and click Create job from template.
  2. Provide a Job name, for example: scc-findings-to-splunk
  3. Select a Region for the Dataflow job, for example, us-central1.
  4. From Dataflow template, select Pub/Sub to Splunk (Streaming), or the exact template name as it appears in the Google Cloud console.

Example configuration

In the Job parameters section, configure the following parameters:

  1. Pub/Sub input subscription, for example projects/<PROJECT_ID>/subscriptions/scc-findings-sub

  2. Splunk HEC URL, for example:

    • Splunk Cloud: https://http-inputs-<stack>.splunkcloud.com:443/services/collector
    • Splunk Enterprise: https://<splunk-hec-host>:8088/services/collector
  3. Splunk HEC authentication token: the HEC token created in Configure HEC in Splunk.

  4. Dead‑letter Pub/Sub topic, for example: projects/<PROJECT_ID>/topics/scc-findings-dlq

  5. (Optional) Additional parameters: Depending on the template version, you may see parameters such as:

    • Batch size
    • Number of workers / max workers
    • Whether to disable certificate validation
    • Retry and backoff settings

Use defaults unless you have specific performance or security requirements. Refer to the Pub/Sub to Splunk template documentation at https://cloud.google.com/dataflow/docs/guides/templates/provided-templates#pubsub_to_splunk for details.
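If you prefer to launch the job from the command line instead of the console, the gcloud invocation looks roughly like the following. This is a sketch: the template path and the parameter names (inputSubscription, url, token, outputDeadletterTopic) reflect the Google‑provided template at the time of writing and can differ between template versions, so confirm them against the template documentation linked above.

    # Launch the Pub/Sub to Splunk streaming template.
    # Note: some template versions expect the base HEC URL without /services/collector.
    gcloud dataflow jobs run scc-findings-to-splunk \
        --gcs-location="gs://dataflow-templates-us-central1/latest/Cloud_PubSub_to_Splunk" \
        --region="us-central1" \
        --parameters=inputSubscription="projects/<PROJECT_ID>/subscriptions/scc-findings-sub",url="https://<splunk-hec-host>:8088",token="<HEC_TOKEN>",outputDeadletterTopic="projects/<PROJECT_ID>/topics/scc-findings-dlq"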

Figure: Create Dataflow job from template for SCC findings, showing the Job name, the Region, and the Template set to Pub/Sub to Splunk (Streaming).

Figure: Dataflow job parameters for SCC to Splunk, showing the Pub/Sub input subscription, the Splunk HEC URL, the HEC token (obscured), and the dead‑letter Pub/Sub topic.

  5. Confirm that the worker service account has the necessary Pub/Sub and network permissions described in Dataflow requirements.
  6. Click Run job.

Dataflow starts a streaming job that pulls SCC findings from Pub/Sub and forwards them to Splunk HEC.

Verify data in Splunk and troubleshoot

  1. In your Splunk deployment, search for events with the SCC sourcetype, for example:

    sourcetype="google:gcp:security:alerts"