
Google BigQuery Integration

Track query performance, slot utilization, and cost metrics for your BigQuery data warehouse. Detect expensive query anomalies and optimize slot usage with AI-driven insights.

Setup

How It Works

01

Create a GCP Service Account

Create a service account with the BigQuery Resource Viewer and Monitoring Viewer roles. This grants TigerOps read-only access to BigQuery job and reservation metrics.
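The step above can be sketched with the gcloud CLI. The service-account name `tigerops-monitor` and the project ID are placeholders to substitute with your own:

```
# Create the service account (names here are illustrative)
gcloud iam service-accounts create tigerops-monitor \
    --project=your-gcp-project-id \
    --display-name="TigerOps BigQuery monitor"

# Grant the two read-only roles listed above
gcloud projects add-iam-policy-binding your-gcp-project-id \
    --member="serviceAccount:tigerops-monitor@your-gcp-project-id.iam.gserviceaccount.com" \
    --role="roles/bigquery.resourceViewer"

gcloud projects add-iam-policy-binding your-gcp-project-id \
    --member="serviceAccount:tigerops-monitor@your-gcp-project-id.iam.gserviceaccount.com" \
    --role="roles/monitoring.viewer"

# Download a key file for TigerOps to use
gcloud iam service-accounts keys create tigerops-sa-key.json \
    --iam-account=tigerops-monitor@your-gcp-project-id.iam.gserviceaccount.com
```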

02

Enable Required APIs

Enable the Cloud Monitoring API and BigQuery API in your project. If using reservations, also enable the BigQuery Reservation API to expose slot metrics.
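With the gcloud CLI, this step looks like the following (substitute your project ID):

```
gcloud services enable monitoring.googleapis.com bigquery.googleapis.com \
    --project=your-gcp-project-id

# Only needed if you use reservations / flat-rate slots
gcloud services enable bigqueryreservation.googleapis.com \
    --project=your-gcp-project-id
```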

03

Configure TigerOps BigQuery

Add your project credentials to TigerOps and specify which datasets and reservation slots to monitor. TigerOps begins collecting job metrics and slot utilization immediately.

04

Set Cost and Performance Alerts

Define slot utilization thresholds, query duration SLOs, and estimated cost alerts. TigerOps fires alerts when expensive or long-running queries are detected in real time.
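The per-query cost check above amounts to converting a job's billed bytes into dollars and comparing against a threshold. A minimal sketch, assuming the on-demand list price of $6.25 per TiB scanned (verify the rate for your region and pricing tier):

```python
# Assumed on-demand rate -- substitute your region's actual price.
ON_DEMAND_USD_PER_TIB = 6.25

def estimated_query_cost_usd(bytes_billed: int) -> float:
    """Estimate on-demand cost from a job's billed byte count."""
    return bytes_billed / 2**40 * ON_DEMAND_USD_PER_TIB

def exceeds_threshold(bytes_billed: int, threshold_usd: float) -> bool:
    """True if a single query's estimated cost crosses the alert threshold."""
    return estimated_query_cost_usd(bytes_billed) > threshold_usd
```

For example, a query billing 10 TiB estimates to $62.50 and would trip a $50 single-query threshold.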

Capabilities

What You Get Out of the Box

Query Performance Tracking

Monitor query execution times, bytes processed, rows returned, and slot milliseconds consumed per job. TigerOps identifies slow queries and anomalous byte scans before they inflate your costs.
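A simplified stand-in for the anomalous-byte-scan detection described above: flag any job whose scan size is far above the median for the window. The `total_bytes_processed` field name mirrors BigQuery job statistics but is an assumption here, as is the 5x factor (TigerOps' real baseline is AI-driven, not a fixed multiplier):

```python
from statistics import median

def anomalous_scans(jobs: list[dict], factor: float = 5.0) -> list[str]:
    """Return IDs of jobs whose bytes processed exceed `factor` times
    the median scan size across the window."""
    sizes = [j["total_bytes_processed"] for j in jobs]
    if not sizes:
        return []
    baseline = median(sizes)
    return [j["job_id"] for j in jobs
            if j["total_bytes_processed"] > factor * baseline]
```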

Slot Utilization Monitoring

Track slot utilization across flat-rate and reservation-based BigQuery projects. TigerOps alerts when slots approach capacity and identifies which jobs are consuming the most resources.
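Average slot usage for a job follows from two numbers BigQuery already reports: slot-milliseconds consumed divided by wall-clock duration. A sketch of that arithmetic (the function names are illustrative, not TigerOps API):

```python
def average_slots_used(total_slot_ms: int, elapsed_ms: int) -> float:
    """Approximate average slot count for a job:
    slot-milliseconds consumed / wall-clock milliseconds."""
    return total_slot_ms / elapsed_ms if elapsed_ms else 0.0

def utilization_percent(slots_used: float, reserved_slots: int) -> float:
    """Utilization of a reservation as a percentage of its slot capacity."""
    return 100.0 * slots_used / reserved_slots
```

A job that consumed 600,000 slot-ms over a 3-second runtime averaged 200 slots; against a 200-slot reservation, 180 concurrent slots is 90% utilization, the alert threshold in the sample config below.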

Cost Anomaly Detection

TigerOps' AI baselines your query costs and alerts on unexpected spikes. Correlate cost anomalies with specific users, service accounts, or query patterns in one click.
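The simplest form of baseline-and-alert: flag today's spend if it sits more than k standard deviations above the historical mean. This is a toy stand-in, not the model TigerOps actually uses:

```python
from statistics import mean, stdev

def cost_spike(history: list[float], today: float, k: float = 3.0) -> bool:
    """True if today's spend exceeds the historical mean by more
    than k standard deviations. Needs at least 2 history points."""
    if len(history) < 2:
        return False
    mu, sigma = mean(history), stdev(history)
    return today > mu + k * sigma
```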

Job Queue Depth

Monitor queued job counts, job failure rates, and job type distribution (query, load, export, copy). TigerOps surfaces bottlenecks in your data pipeline processing flow.
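The queue-depth view above boils down to aggregating a window of job records into type counts and a failure rate. A sketch with illustrative field names (`type`, `error`):

```python
from collections import Counter

def job_summary(jobs: list[dict]) -> dict:
    """Aggregate a window of job records into a per-type count
    and an overall failure rate."""
    types = Counter(j["type"] for j in jobs)
    failed = sum(1 for j in jobs if j.get("error"))
    rate = failed / len(jobs) if jobs else 0.0
    return {"by_type": dict(types), "failure_rate": rate}
```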

Dataset & Table Metrics

Track table sizes, storage billing models (active vs. long-term), and dataset-level query rates. Monitor streaming insert rates and error counts for real-time data ingestion pipelines.

Reservation & BI Engine

Monitor BigQuery Reservations capacity commitment utilization, assignment distribution, and BI Engine cache hit rates. Optimize your reservation allocations with usage-driven recommendations.

Configuration

BigQuery Integration Setup

Configure your GCP service account and BigQuery monitoring settings for TigerOps.

tigerops-bigquery.yaml
# TigerOps BigQuery Integration
# Required IAM roles:
#   roles/bigquery.resourceViewer
#   roles/monitoring.viewer

integrations:
  gcp_bigquery:
    project_id: "your-gcp-project-id"
    credentials_file: "./tigerops-sa-key.json"

    # Job monitoring settings
    jobs:
      lookback_window: 5m
      include_job_types:
        - QUERY
        - LOAD
        - EXTRACT
        - COPY

    # Reservation monitoring (flat-rate only)
    reservations:
      enabled: true
      admin_project: "your-gcp-project-id"

    # Metric collection
    metrics:
      - bigquery.googleapis.com/storage/table_count
      - bigquery.googleapis.com/storage/stored_bytes
      - bigquery.googleapis.com/job_count

    # Cost and performance alerts
    alerts:
      query_duration_seconds: 300
      bytes_billed_gb: 100
      slot_utilization_percent: 90
      daily_cost_usd: 500
      failed_job_count: 5

FAQ

Common Questions

Does TigerOps access my BigQuery data or only metadata?

TigerOps only accesses job metadata and system metrics via the Cloud Monitoring API and BigQuery Jobs API. It never queries your tables or reads your data. The required service account role (BigQuery Resource Viewer) does not grant data access.

Can TigerOps alert me when a query exceeds a cost threshold?

Yes. TigerOps monitors bytes billed per job and can fire alerts when a single query exceeds a configurable cost threshold based on your BigQuery pricing tier. You can also set alerts on cumulative daily or monthly estimated spend.

How does TigerOps handle on-demand vs. flat-rate BigQuery pricing?

TigerOps supports both pricing models. For on-demand projects it monitors bytes billed per job for cost tracking. For flat-rate and reservation-based projects it monitors slot utilization and reservation capacity to optimize your commitment spend.

Can TigerOps identify which users or service accounts are running expensive queries?

Yes. BigQuery job metadata includes the user email and service account that submitted each job. TigerOps aggregates cost and slot consumption by identity, making it easy to identify top consumers and enforce query cost policies.
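The per-identity rollup described above is a group-by over job metadata. A minimal sketch, where the `user_email` and `total_bytes_billed` field names mirror BigQuery job metadata but are assumptions here, as is the $6.25/TiB on-demand rate:

```python
from collections import defaultdict

def spend_by_identity(jobs: list[dict], usd_per_tib: float = 6.25) -> dict:
    """Estimated on-demand spend per submitting identity,
    sorted with the top consumer first."""
    totals: dict[str, float] = defaultdict(float)
    for j in jobs:
        totals[j["user_email"]] += j["total_bytes_billed"] / 2**40 * usd_per_tib
    return dict(sorted(totals.items(), key=lambda kv: kv[1], reverse=True))
```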

Does TigerOps support BigQuery Omni for multi-cloud analytics?

TigerOps monitors BigQuery Omni jobs that appear in your GCP project job history via the BigQuery Jobs API. Cross-cloud query execution metrics are included in the same dashboard as standard BigQuery jobs.

Get Started

Stop Discovering BigQuery Cost Spikes After the Bill Arrives

Real-time query cost monitoring, slot utilization alerts, and AI anomaly detection for BigQuery. Connect in minutes.