Standard: Pub/Sub subscriber / log sink

GCP Cloud Logging Integration

Route Google Cloud Logging entries via log sinks to TigerOps. Monitor GKE, Cloud Run, and Cloud Functions logs with AI anomaly detection and full GCP resource correlation.

Setup

How It Works

01

Create a Log Sink

Create a Cloud Logging log sink in your GCP project targeting a Pub/Sub topic. Use a log filter to forward only the log entries you need to TigerOps.

02

Deploy TigerOps Pub/Sub Subscriber

Deploy the TigerOps Cloud Run service or Cloud Function that subscribes to your Pub/Sub topic and forwards log entries to the TigerOps ingestion endpoint in real time.
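A minimal Terraform sketch of such a Cloud Run subscriber is shown below. The container image path and the `tigerops_api_key` variable are placeholders, not published TigerOps artifacts; substitute your actual forwarder image and credentials.

```hcl
# Hypothetical Cloud Run service that receives Pub/Sub push deliveries
# and forwards log entries to TigerOps. Image and env values are placeholders.
resource "google_cloud_run_v2_service" "tigerops_subscriber" {
  name     = "tigerops-log-forwarder"
  project  = var.project_id
  location = var.region

  template {
    containers {
      image = "gcr.io/YOUR_PROJECT/tigerops-forwarder:latest" # placeholder

      env {
        name  = "TIGEROPS_API_KEY"
        value = var.tigerops_api_key # assumed variable
      }
    }
  }
}
```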

03

Configure Log Entry Parsing

TigerOps automatically parses the Cloud Logging LogEntry proto fields: resource, labels, httpRequest, operation, sourceLocation, and jsonPayload. Define additional extraction rules for custom payloads.
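For reference, a trimmed GKE container LogEntry as delivered through Pub/Sub carries the fields listed above (all values here are illustrative):

```json
{
  "logName": "projects/my-project/logs/stdout",
  "resource": {
    "type": "k8s_container",
    "labels": {
      "project_id": "my-project",
      "cluster_name": "prod-cluster",
      "namespace_name": "checkout",
      "pod_name": "checkout-7d9f4-abcde",
      "container_name": "app"
    }
  },
  "severity": "ERROR",
  "jsonPayload": { "message": "payment timeout", "orderId": "A-1001" },
  "timestamp": "2024-05-01T12:00:00Z"
}
```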

04

Correlate with Cloud Monitoring

TigerOps joins Cloud Logging entries with Cloud Monitoring metrics via resource labels. Correlate GKE pod restarts in logs with container CPU and memory spikes from the same workload.
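As a sketch of the shared join key, the same monitored-resource labels appear in both a Cloud Logging filter and a Cloud Monitoring time-series filter (workload names are illustrative; label paths follow the `k8s_container` monitored-resource schema):

```text
# Cloud Logging filter: error logs for one pod
resource.type="k8s_container"
AND resource.labels.namespace_name="checkout"
AND resource.labels.pod_name="checkout-7d9f4-abcde"
AND severity>=ERROR

# Cloud Monitoring filter: container CPU usage for the same pod
metric.type="kubernetes.io/container/cpu/core_usage_time"
AND resource.labels.namespace_name="checkout"
AND resource.labels.pod_name="checkout-7d9f4-abcde"
```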

Capabilities

What You Get Out of the Box

Log Sink and Pub/Sub Streaming

Real-time log routing via Cloud Logging log sinks to Pub/Sub. TigerOps subscribers process Pub/Sub messages with automatic retries and dead-letter queue support for guaranteed delivery.
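A dead-letter topic can be attached to the subscription shown in the Configuration section below; the `dead_letter_policy` block goes inside that subscription resource, and the retry bound here is illustrative:

```hcl
# Dead-letter topic for messages that exhaust delivery attempts
resource "google_pubsub_topic" "tigerops_logs_dlq" {
  name    = "tigerops-logs-dlq"
  project = var.project_id
}

# Add inside google_pubsub_subscription.tigerops: after repeated failed
# deliveries, messages are routed here instead of retrying indefinitely.
# dead_letter_policy {
#   dead_letter_topic     = google_pubsub_topic.tigerops_logs_dlq.id
#   max_delivery_attempts = 5
# }
```

Note that the Pub/Sub service agent also needs `roles/pubsub.publisher` on the dead-letter topic for forwarding to work.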

GCP Resource Label Preservation

GCP resource labels (project_id, zone, cluster_name, namespace_name, pod_name) from LogEntry.resource are stored as searchable dimensions. Filter and aggregate logs by any GCP resource attribute.

GKE Workload Log Parsing

Parse GKE container logs with automatic namespace, pod, and container_name enrichment from Cloud Logging resource labels. Track per-workload error rates and log volumes across your GKE clusters.
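For example, a sink filter scoped to one namespace's error logs (cluster and namespace names are illustrative):

```text
resource.type="k8s_container"
AND resource.labels.cluster_name="prod-cluster"
AND resource.labels.namespace_name="checkout"
AND severity>=ERROR
```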

Cloud Run and Cloud Functions Monitoring

Track Cloud Run revision latency, concurrency, and error rates from structured log entries. Monitor Cloud Functions execution duration, memory usage, and cold start frequency per function.
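As an example, Cloud Run request logs with server errors can be selected with a filter like the following (the service name is illustrative):

```text
resource.type="cloud_run_revision"
AND resource.labels.service_name="checkout-api"
AND logName:"run.googleapis.com%2Frequests"
AND httpRequest.status>=500
```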

Organization-Wide Log Aggregation

Create organization-level log sinks to aggregate logs from all GCP projects and folders into a single TigerOps workspace. Manage multi-project observability without per-project setup.
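An organization-level sink follows the same pattern as the project sink in the Configuration section; a sketch, assuming an `org_id` variable and the Pub/Sub topic defined there:

```hcl
# Organization-level sink: aggregates logs from every child project and folder
resource "google_logging_organization_sink" "tigerops" {
  name             = "tigerops-org-sink"
  org_id           = var.org_id
  destination      = "pubsub.googleapis.com/${google_pubsub_topic.tigerops_logs.id}"
  include_children = true
  filter           = "severity >= WARNING"
}
```

As with the project sink, grant the sink's `writer_identity` the `roles/pubsub.publisher` role on the topic.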

Cloud Audit Log Security Monitoring

Ingest Cloud Audit Logs (Admin Activity, Data Access, System Event) into TigerOps. AI detects anomalous IAM changes, unusual API call patterns, and suspicious data access events.
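All three audit log streams can be selected in the sink filter with the `log_id()` function:

```text
log_id("cloudaudit.googleapis.com/activity")
OR log_id("cloudaudit.googleapis.com/data_access")
OR log_id("cloudaudit.googleapis.com/system_event")
```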

Configuration

Terraform: GCP Log Sink to Pub/Sub

Create a Cloud Logging sink and Pub/Sub topic for TigerOps log ingestion using Terraform.

gcp-tigerops-logging.tf
# Pub/Sub topic for TigerOps log delivery
resource "google_pubsub_topic" "tigerops_logs" {
  name    = "tigerops-logs"
  project = var.project_id

  message_retention_duration = "86400s"  # 24h retention
}

# Cloud Logging sink routing to Pub/Sub
resource "google_logging_project_sink" "tigerops" {
  name        = "tigerops-log-sink"
  project     = var.project_id
  destination = "pubsub.googleapis.com/${google_pubsub_topic.tigerops_logs.id}"

  # WARNING and above, plus default-severity GKE container logs
  filter = "severity >= WARNING OR (severity = DEFAULT AND resource.type = \"k8s_container\")"

  unique_writer_identity = true
}

# Grant the sink writer access to Pub/Sub
resource "google_pubsub_topic_iam_member" "sink_writer" {
  topic  = google_pubsub_topic.tigerops_logs.id
  role   = "roles/pubsub.publisher"
  member = google_logging_project_sink.tigerops.writer_identity
}

# TigerOps Pub/Sub subscription
resource "google_pubsub_subscription" "tigerops" {
  name    = "tigerops-logs-sub"
  topic   = google_pubsub_topic.tigerops_logs.id
  project = var.project_id

  push_config {
    push_endpoint = "https://ingest.atatus.net/gcp/pubsub"
    oidc_token {
      service_account_email = var.tigerops_sa_email
    }
    attributes = {
      x-goog-version = "v1"
    }
  }

  ack_deadline_seconds       = 60
  message_retention_duration = "3600s"
}

FAQ

Common Questions

What is the recommended GCP log sink destination for TigerOps?

Use a Pub/Sub topic as the sink destination with a TigerOps Cloud Run subscriber. This architecture absorbs variable log volumes automatically, supports retries, and provides at-least-once delivery via Pub/Sub acknowledgments.

Can I create an organization-level sink to capture logs from all GCP projects?

Yes. Create a log sink at the Organization or Folder level with includeChildren: true. All matching log entries from member projects flow through the Pub/Sub topic to TigerOps, tagged with the source project_id.

How do I filter Cloud Logging entries before they reach TigerOps?

Apply a log filter on the Cloud Logging sink using the advanced filter syntax (resource.type="k8s_container" AND severity>=WARNING). Only matching entries are published to Pub/Sub, reducing cost and noise in TigerOps.

Does TigerOps support Cloud Logging Log-Based Metrics migration?

Yes. TigerOps provides a migration tool that converts your Cloud Logging Log-Based Metric filters and alerting policies to equivalent TigerOps metric extraction rules and AI alert conditions.

What permissions does the TigerOps Pub/Sub subscriber need?

The TigerOps Cloud Run service account needs roles/pubsub.subscriber on the subscription. roles/logging.logWriter is not required, because the log sink writes to Pub/Sub directly. Use Workload Identity for keyless authentication.
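The subscriber grant can be expressed in Terraform against the subscription from the Configuration section, reusing the `tigerops_sa_email` variable:

```hcl
# Grant the TigerOps Cloud Run service account access to the subscription
resource "google_pubsub_subscription_iam_member" "tigerops_subscriber" {
  subscription = google_pubsub_subscription.tigerops.id
  role         = "roles/pubsub.subscriber"
  member       = "serviceAccount:${var.tigerops_sa_email}"
}
```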

Get Started

Get More from GCP Cloud Logging with TigerOps

Organization-wide log aggregation, AI anomaly detection, and audit log security monitoring. Deploy in 15 minutes.