Logging


Intro

My notes on GCP’s logging capabilities

 


Documentation


Publish Events Extracted From GCP Logs

A topic can receive events from GCP’s logging services. To do this, you create a sink that matches the log entries you care about and specifies the topic as the destination.

gcloud logging sinks create

  • The sink's destination can be a Cloud Storage bucket, a BigQuery dataset, or a Cloud Pub/Sub topic.

  • The destination must already exist, and Cloud Logging must have permission to write to it (the Pub/Sub topic in the examples below is created first).
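
For a Pub/Sub destination, create the topic ahead of time. A minimal sketch, using placeholder names:

# create the topic that the sink will publish to (placeholder topic/project names)
gcloud pubsub topics create <topic name> --project=<project ID>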

The example below publishes an event anytime an instance is created.

gcloud logging sinks create <sink name> \
  "pubsub.googleapis.com/projects/rogercruz/topics/<topic name>" \
  --folder=<folder ID> \
  --log-filter="resource.type=gce_instance AND jsonPayload.event_subtype=compute.instances.insert AND jsonPayload.event_type=GCE_OPERATION_DONE" \
  --include-children
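
To confirm that events actually arrive, you can attach a pull subscription to the topic and pull a message after creating an instance. A quick check, assuming the placeholder names above:

# hypothetical subscription name, attached to the sink's destination topic
gcloud pubsub subscriptions create <subscription name> --topic=<topic name>
# after creating a VM, pull (and acknowledge) the exported log entry
gcloud pubsub subscriptions pull <subscription name> --auto-ack --limit=1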

 

The best way to figure out what to put in the filter string is to create the resource once and inspect it in the log viewer to see which field values to match on.
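
You can also inspect entries from the command line. A sketch, assuming a Compute Engine instance was just created:

# dump recent GCE instance log entries in full so you can copy field values into a filter
gcloud logging read 'resource.type="gce_instance"' --limit=5 --format=json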


Destination permissions

  • When you create a sink, Logging creates a new service account for the sink, called a unique writer identity. This service account is returned as a result of the create operation.

  • You cannot manage this service account directly as it is owned and managed by Cloud Logging.

  • The service account is deleted if the sink gets deleted.

  • You must grant this service account permission to write log entries to the export destination; for a Pub/Sub topic that means granting it roles/pubsub.publisher (see below).
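
If you did not capture the writer identity when the sink was created, you can look it up afterwards and grant it publish rights on the topic. A sketch with placeholder names:

# look up the sink's unique writer identity
gcloud logging sinks describe <sink name> --format='value(writerIdentity)'
# grant that identity publish rights on the destination topic
# (the writerIdentity value already includes the serviceAccount: prefix)
gcloud pubsub topics add-iam-policy-binding <topic name> \
  --member=<writer identity> --role=roles/pubsub.publisher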

Example from: https://cloud.google.com/logging/docs/reference/tools/gcloud-logging

gcloud pubsub topics create syslog-sink-topic

gcloud logging sinks create syslog-sink \
  pubsub.googleapis.com/projects/MY-PROJECT/topics/syslog-sink-topic \
  --log-filter="severity>=WARNING"

gcloud pubsub topics add-iam-policy-binding syslog-sink-topic \
  --member serviceAccount:LOG-SINK-SERVICE-ACCOUNT --role roles/pubsub.publisher