Configure Google Workspace audit logs for the Splunk Add-on for Google Cloud Platform¶
Configure the HTTP Event Collector (HEC) to ingest Google Workspace (GWS) audit logs.
To configure Google Workspace (GWS) audit logs for the Splunk Add-on for Google Cloud Platform, perform the following steps.
- Configure, view, and route audit logs for Google Workspace to Google Cloud. See the View and manage audit logs for Google Workspace topic in the Google Cloud documentation.
- Share data from your Google Workspace account with services in your organization’s Google Cloud Platform (GCP) account. See the Share data with Google Cloud Platform services topic in the Google Workspace Admin Help documentation.
- Export your audit logs to your Google Cloud Pub/Sub. See the Configure and manage sinks topic in the Operations Suite manual in the Google Cloud documentation.
- Configure Cloud Pub/Sub inputs for Splunk Add-on for Google Cloud Platform. See the Configure Cloud Pub/Sub inputs for Splunk Add-on for Google Cloud Platform topic in this manual.
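The Dataflow steps later in this topic stream these logs to the HTTP Event Collector, so you may want to confirm first that your HEC endpoint accepts events. A minimal connectivity check, assuming HEC is enabled on its default port 8088 and that <splunk-host> and <hec-token> are placeholders for your environment:

  # Send a test event to HEC; a healthy endpoint returns {"text":"Success","code":0}
  curl -k "https://<splunk-host>:8088/services/collector/event" \
    -H "Authorization: Splunk <hec-token>" \
    -d '{"event": "HEC connectivity test", "sourcetype": "httpevent"}'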
Export your Google Workspace directory user list to the Splunk Add-on for Google Cloud Platform¶
- Download the user list and export it to your Pub/Sub topic. See the Download a list of users topic in the Advanced user management section of the Google Workspace Admin Help documentation.
- Configure Cloud Pub/Sub inputs for Splunk Add-on for Google Cloud Platform. See the Configure Cloud Pub/Sub inputs for Splunk Add-on for Google Cloud Platform topic in this manual.
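The download produces a CSV file; how you move it into Pub/Sub is up to you. One minimal sketch, assuming a hypothetical topic named gws-users and an exported file named users.csv, publishes each row as its own message:

  # Hypothetical topic and file names; adjust to your environment
  gcloud pubsub topics create gws-users        # skip if the topic already exists
  while IFS= read -r row; do
    gcloud pubsub topics publish gws-users --message="$row"
  done < users.csv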
For information on Assets and Identity extractions and configurations, see the Collect and extract asset and identity data in Splunk Enterprise Security topic in the Administer Splunk Enterprise Security chapter in the Splunk Enterprise Security manual.
Configure Pub/Sub topics in Google Cloud¶
Configure Pub/Sub topics in Google Cloud to ingest data into your Splunk platform deployment.
- Navigate to the Google Cloud project that you’ve configured for log aggregation across your organization.
- Create the Pub/Sub topics. Navigate to Pub/Sub in your project and create two topics; a CLI sketch follows this step:
  - A primary topic to hold messages to be delivered.
  - A secondary, dead-letter topic to store undeliverable messages when Dataflow cannot stream to the HTTP Event Collector (HEC), for example because of a misconfigured HEC SSL certificate, a disabled HEC token, or a message processing error in Dataflow.
  - Primary: <topic-name>
  - Secondary: <topic-name>
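A minimal CLI sketch of this step, assuming hypothetical topic names gws-logs and gws-logs-deadletter:

  gcloud pubsub topics create gws-logs              # primary topic
  gcloud pubsub topics create gws-logs-deadletter   # dead-letter topic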
- Create a subscription for each of the two topics created in the previous step; a CLI sketch follows these steps:
  - Enter any name for your subscription.
  - Select the primary Pub/Sub topic created in the previous step.
  - Leave the rest of the values at their defaults, or customize them to your organization’s preference.
  - Repeat the same steps for your dead-letter topic.
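A matching CLI sketch, again using the hypothetical topic names above:

  gcloud pubsub subscriptions create gws-logs-sub --topic=gws-logs                         # primary
  gcloud pubsub subscriptions create gws-logs-deadletter-sub --topic=gws-logs-deadletter   # dead-letter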
- Create an organization-level aggregated log sink. This lets administrators configure a single aggregated sink that captures all logs across the organization and its projects and sends them to the Pub/Sub topic created above.
  You cannot create aggregated sinks through the Google Cloud Console. They must be configured and managed through either the API or the gcloud CLI tool. Only project-level (non-aggregated) sinks show up in the Google Cloud Console at this time.
- Open a Cloud Shell in the active project.
- Enter the following in the Cloud Shell to create the aggregated sink:

  gcloud logging sinks create <sample-sink> \
    pubsub.googleapis.com/projects/<sample-project-sink>/topics/topic-name \
    --include-children \
    --organization=[organization_id] \
    --log-filter='logName:"organizations/<unique organization identifier>/logs/cloudaudit.googleapis.com"'

  (Optional) If you want to export more than Google Workspace events, modify the --log-filter to capture any additional logs you want to export. See the Google Cloud documentation for more information on creating aggregated log sinks.
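For example, a sketch of a broadened filter that also captures Admin Activity and Data Access audit logs from every project under the organization; the log names shown are illustrative and should be adjusted to what you actually want to export:

  gcloud logging sinks create <sample-sink> \
    pubsub.googleapis.com/projects/<sample-project-sink>/topics/topic-name \
    --include-children \
    --organization=[organization_id] \
    --log-filter='logName:"cloudaudit.googleapis.com%2Factivity" OR logName:"cloudaudit.googleapis.com%2Fdata_access"'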
- Update permissions for the sink service account that was created in the previous step. Updating permissions on this service account allows it to publish messages to your previously created Pub/Sub input topics. To update the permissions, copy the entire service account name from the sink creation output and run the following in the Google Cloud Console:
- Open a cloud shell in the active project, or use the existing shell.
- Enter the following into the shell:

  gcloud pubsub topics add-iam-policy-binding my-logs \
    --member serviceAccount:<service account name from previous step> \
    --role roles/pubsub.publisher
- (Optional) Validate the service account and permission association with the following command:

  gcloud logging sinks describe <sample-sink> --organization=[organization_id]
- Referencing the logging configurations that you set up in the previous steps, configure the Dataflow template to output the logs to the Splunk HEC.
- Navigate to Dataflow and select Create New Job From Template
- Populate the following fields:
  - Job name (any)
  - Preferred region
  - Cloud Dataflow template: Cloud Pub/Sub to Splunk
  - Pub/Sub subscription name, created in previous steps
  - HEC token, created in previous steps
  - HEC URL, created in previous steps
  - Dead-letter topic (DLT), created in previous steps
  - Source: the token default source value. The Splunk software assigns this value to data that doesn’t already have a source value set.
  - Sourcetype: the token default sourcetype value. The Splunk software assigns this value to data that doesn’t already have a sourcetype value set.
  - Any bucket name. If you have not created a bucket, navigate to Storage and create a new bucket. The syntax for the bucket name is gs://bucketName.
- Expand Optional Parameters:
  - Set Batch size for sending multiple events to Splunk HEC to 2 (can be adjusted later depending on your volume).
  - Set Maximum Number of Parallel Requests to 8 (can be adjusted later depending on your volume).
  - Set Max workers to 2 (can be adjusted later depending on your volume). The default is 20, which incurs unnecessary Persistent Disk cost if not fully utilized.
- Enter any additional settings pertinent to your organization.
- Run the job.
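If you prefer to launch the job from the CLI instead of the console, the following sketch uses the Google-provided Cloud Pub/Sub to Splunk template. The job name, region, bucket, project, subscription, topic, and HEC values are placeholders, and the parameter names assume the current Cloud_PubSub_to_Splunk template; verify them against the template documentation before running:

  gcloud dataflow jobs run gws-logs-to-splunk \
    --gcs-location=gs://dataflow-templates/latest/Cloud_PubSub_to_Splunk \
    --region=<preferred-region> \
    --max-workers=2 \
    --staging-location=gs://<bucketName>/temp \
    --parameters=inputSubscription=projects/<project-id>/subscriptions/gws-logs-sub,token=<hec-token>,url=https://<splunk-host>:8088,outputDeadletterTopic=projects/<project-id>/topics/gws-logs-deadletter,batchCount=2,parallelism=8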