
Analyze BigQuery Usage with Log Analytics

Lab · 45 minutes · 1 Credit · Introductory
Note: This lab may incorporate AI tools to support your learning.

GSP1250

Overview

While BigQuery offers built-in observability capabilities like the INFORMATION_SCHEMA views, detailed logging remains crucial for in-depth usage analysis, auditing, and troubleshooting potential issues. Enabling Log Analytics lets you query and analyze your log data using familiar SQL queries, then view or chart the query results.
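
For comparison, the built-in INFORMATION_SCHEMA views can answer simple usage questions directly in BigQuery, without any logging setup. A minimal sketch, assuming your jobs run in the us region (adjust the region qualifier for your project):

SELECT
  creation_time,
  user_email,
  total_bytes_processed
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE job_type = 'QUERY'
ORDER BY creation_time DESC
LIMIT 10;

Log Analytics complements this view with the full audit-log detail you explore in this lab.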

BigQuery is Google Cloud's fully managed enterprise data warehouse that helps you manage and analyze your data with built-in features like machine learning, geospatial analysis, and business intelligence.

In this lab, you enable Log Analytics on Logs Storage buckets, then view the BigQuery logs inside Cloud Logging. You also use SQL to analyze the logs using Log Analytics.

Objectives

In this lab you learn how to:

  • Use Cloud Logging effectively to get insight about BigQuery usage.
  • Effectively build and run SQL queries using Log Analytics.
  • View and chart the results.

Setup and requirements

Before you click the Start Lab button

Read these instructions. Labs are timed and you cannot pause them. The timer, which starts when you click Start Lab, shows how long Google Cloud resources are made available to you.

This hands-on lab lets you do the lab activities in a real cloud environment, not in a simulation or demo environment. It does so by giving you new, temporary credentials you use to sign in and access Google Cloud for the duration of the lab.

To complete this lab, you need:

  • Access to a standard internet browser (Chrome browser recommended).
Note: Use an Incognito (recommended) or private browser window to run this lab. This prevents conflicts between your personal account and the student account, which could cause extra charges to be incurred on your personal account.
  • Time to complete the lab—remember, once you start, you cannot pause a lab.
Note: Use only the student account for this lab. If you use a different Google Cloud account, you may incur charges to that account.

How to start your lab and sign in to the Google Cloud console

  1. Click the Start Lab button. If you need to pay for the lab, a dialog opens for you to select your payment method. On the left is the Lab Details pane with the following:

    • The Open Google Cloud console button
    • Time remaining
    • The temporary credentials that you must use for this lab
    • Other information, if needed, to step through this lab
  2. Click Open Google Cloud console (or right-click and select Open Link in Incognito Window if you are running the Chrome browser).

    The lab spins up resources, and then opens another tab that shows the Sign in page.

    Tip: Arrange the tabs in separate windows, side-by-side.

    Note: If you see the Choose an account dialog, click Use Another Account.
  3. If necessary, copy the Username below and paste it into the Sign in dialog.

    {{{user_0.username | "Username"}}}

    You can also find the Username in the Lab Details pane.

  4. Click Next.

  5. Copy the Password below and paste it into the Welcome dialog.

    {{{user_0.password | "Password"}}}

    You can also find the Password in the Lab Details pane.

  6. Click Next.

    Important: You must use the credentials the lab provides you. Do not use your Google Cloud account credentials. Note: Using your own Google Cloud account for this lab may incur extra charges.
  7. Click through the subsequent pages:

    • Accept the terms and conditions.
    • Do not add recovery options or two-factor authentication (because this is a temporary account).
    • Do not sign up for free trials.

After a few moments, the Google Cloud console opens in this tab.

Note: To access Google Cloud products and services, click the Navigation menu or type the service or product name in the Search field.

Activate Cloud Shell

Cloud Shell is a virtual machine that is loaded with development tools. It offers a persistent 5GB home directory and runs on Google Cloud. Cloud Shell provides command-line access to your Google Cloud resources.

  1. Click Activate Cloud Shell at the top of the Google Cloud console.

  2. Click through the following windows:

    • Continue through the Cloud Shell information window.
    • Authorize Cloud Shell to use your credentials to make Google Cloud API calls.

When you are connected, you are already authenticated, and the project is set to your PROJECT_ID. The output contains a line that declares the PROJECT_ID for this session:

Your Cloud Platform project in this session is set to {{{project_0.project_id | "PROJECT_ID"}}}

gcloud is the command-line tool for Google Cloud. It comes pre-installed on Cloud Shell and supports tab-completion.

  1. (Optional) You can list the active account name with this command:
gcloud auth list
  2. Click Authorize.

Output:

ACTIVE: *
ACCOUNT: {{{user_0.username | "ACCOUNT"}}}

To set the active account, run:
    $ gcloud config set account `ACCOUNT`
  3. (Optional) You can list the project ID with this command:
gcloud config list project

Output:

[core]
project = {{{project_0.project_id | "PROJECT_ID"}}}

Note: For full documentation of gcloud in Google Cloud, refer to the gcloud CLI overview guide.

Task 1. Configure log buckets

In this task, you configure Cloud Logging by upgrading the existing log buckets to enable Log Analytics.

To upgrade existing buckets to use Log Analytics:

  1. In the Google Cloud console title bar, type Logs storage in the Search field, and then click Logs storage in the search results.

The Logs Storage page opens. Two buckets, _Default and _Required, are listed in the Log buckets list.

  2. For the _Required log bucket, click Upgrade under the Log Analytics available column.
  3. Click Upgrade again to confirm the upgrade to use Log Analytics.
  4. Do the same thing for the _Default log bucket. Click Upgrade under the Log Analytics available column.
  5. Click Upgrade again to confirm the upgrade to use Log Analytics.
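
If you prefer the command line, the same upgrade can be done with gcloud. A minimal sketch, assuming both buckets are in the default global location (verify with the list command first):

gcloud logging buckets list
gcloud logging buckets update _Default --location=global --enable-analytics
gcloud logging buckets update _Required --location=global --enable-analytics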

Click Check my progress to verify the objective. Upgrade the log buckets.

Task 2. Perform BigQuery activities

In this task, you generate BigQuery logs. To do this, you use the BigQuery command line tool bq.

  1. In Cloud Shell, enter the bq mk command to create new datasets named bq_logs and bq_logs_test in your project:
bq mk bq_logs
bq mk bq_logs_test
  2. Enter the bq ls command to list the datasets:
bq ls
  3. Enter the bq rm command to delete the bq_logs_test dataset (enter y when prompted):
bq rm bq_logs_test
  4. Create a new table:
bq mk \
--table \
--expiration 3600 \
--description "This is a test table" \
bq_logs.test_table \
id:STRING,name:STRING,address:STRING

You should see a success message that a new empty table named test_table has been created for your dataset.
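
Optionally, you can verify the new table's schema with the bq show command; the --schema flag prints only the column definitions:

bq show --schema --format=prettyjson bq_logs.test_table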

Now run some queries to generate logs.

  5. Enter the following query in Cloud Shell:
bq query --use_legacy_sql=false 'SELECT current_date'
  6. Enter the following command into Cloud Shell:
bq query --use_legacy_sql=false \
'SELECT
  gsod2021.date,
  stations.usaf,
  stations.wban,
  stations.name,
  stations.country,
  stations.state,
  stations.lat,
  stations.lon,
  stations.elev,
  gsod2021.temp,
  gsod2021.max,
  gsod2021.min,
  gsod2021.mxpsd,
  gsod2021.gust,
  gsod2021.fog,
  gsod2021.hail
FROM `bigquery-public-data.noaa_gsod.gsod2021` gsod2021
INNER JOIN `bigquery-public-data.noaa_gsod.stations` stations
ON gsod2021.stn = stations.usaf AND gsod2021.wban = stations.wban
WHERE stations.country = "US"
  AND gsod2021.date = "2021-12-15"
  AND stations.state IS NOT NULL
  AND gsod2021.max != 9999.9
ORDER BY gsod2021.min;'

This query uses weather data from the National Oceanic and Atmospheric Administration (NOAA).

Click Check my progress to verify the objective. Perform BigQuery Activities.

Task 3. Perform log analysis

In this task, you analyze the log data you generated in the previous task.

Challenge (optional): Before running a query, can you predict the result?

Query for a specific operation

A common task when analyzing BigQuery usage is to search specific operations for your datasets. In this task you query for the create and delete operations.

  1. In the left pane, click Log Analytics and find the Query field in the Log Analytics section.

  2. To find the dataset create and delete operations, type or paste the following query into the Query field and click Run query:

SELECT
  timestamp,
  severity,
  resource.type,
  proto_payload.audit_log.authentication_info.principal_email,
  proto_payload.audit_log.method_name,
  proto_payload.audit_log.resource_name
FROM `{{{ project_0.project_id | "Project ID" }}}.global._Required._AllLogs`
WHERE log_id = 'cloudaudit.googleapis.com/activity'
  AND proto_payload.audit_log.method_name LIKE 'datasetservice%'
LIMIT 100

Your query should output three results: the two dataset creations and one deletion you performed in Task 2.

You can also limit your search with more specifics such as method_name = 'datasetservice.delete'.
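
For example, a narrowed variant of the same query (only the WHERE condition changes) should return just the single deletion from Task 2:

SELECT
  timestamp,
  severity,
  resource.type,
  proto_payload.audit_log.authentication_info.principal_email,
  proto_payload.audit_log.method_name,
  proto_payload.audit_log.resource_name
FROM `{{{ project_0.project_id | "Project ID" }}}.global._Required._AllLogs`
WHERE log_id = 'cloudaudit.googleapis.com/activity'
  AND proto_payload.audit_log.method_name = 'datasetservice.delete'
LIMIT 100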

Query for operations for BigQuery tables

  1. To find the operations for BigQuery tables, change the query condition to match the table service. Run this query to find table create and delete operations:
SELECT
  timestamp,
  severity,
  resource.type,
  proto_payload.audit_log.authentication_info.principal_email,
  proto_payload.audit_log.method_name,
  proto_payload.audit_log.resource_name
FROM `{{{ project_0.project_id | "Project ID" }}}.global._Required._AllLogs`
WHERE log_id = 'cloudaudit.googleapis.com/activity'
  AND proto_payload.audit_log.method_name LIKE '%TableService%'
LIMIT 100

This query outputs one result: the creation of test_table in Task 2.

  2. To view completed BigQuery queries, search the data_access log based on the jobCompletedEvent. For example, run the following SQL query:
SELECT
  timestamp,
  resource.labels.project_id,
  proto_payload.audit_log.authentication_info.principal_email,
  JSON_VALUE(proto_payload.audit_log.service_data.jobCompletedEvent.job.jobConfiguration.query.query) AS query,
  JSON_VALUE(proto_payload.audit_log.service_data.jobCompletedEvent.job.jobConfiguration.query.statementType) AS statementType,
  JSON_VALUE(proto_payload.audit_log.service_data.jobCompletedEvent.job.jobStatus.error.message) AS message,
  JSON_VALUE(proto_payload.audit_log.service_data.jobCompletedEvent.job.jobStatistics.startTime) AS startTime,
  JSON_VALUE(proto_payload.audit_log.service_data.jobCompletedEvent.job.jobStatistics.endTime) AS endTime,
  CAST(TIMESTAMP_DIFF(
    CAST(JSON_VALUE(proto_payload.audit_log.service_data.jobCompletedEvent.job.jobStatistics.endTime) AS TIMESTAMP),
    CAST(JSON_VALUE(proto_payload.audit_log.service_data.jobCompletedEvent.job.jobStatistics.startTime) AS TIMESTAMP),
    MILLISECOND) / 1000 AS INT64) AS run_seconds,
  CAST(JSON_VALUE(proto_payload.audit_log.service_data.jobCompletedEvent.job.jobStatistics.totalProcessedBytes) AS INT64) AS totalProcessedBytes,
  CAST(JSON_VALUE(proto_payload.audit_log.service_data.jobCompletedEvent.job.jobStatistics.totalSlotMs) AS INT64) AS totalSlotMs,
  JSON_VALUE(proto_payload.audit_log.service_data.jobCompletedEvent.job.jobStatistics.referencedTables) AS tables_ref,
  CAST(JSON_VALUE(proto_payload.audit_log.service_data.jobCompletedEvent.job.jobStatistics.totalTablesProcessed) AS INT64) AS totalTablesProcessed,
  CAST(JSON_VALUE(proto_payload.audit_log.service_data.jobCompletedEvent.job.jobStatistics.queryOutputRowCount) AS INT64) AS queryOutputRowCount,
  severity
FROM `{{{ project_0.project_id | "Project ID" }}}.global._Default._AllLogs`
WHERE log_id = "cloudaudit.googleapis.com/data_access"
  AND proto_payload.audit_log.service_data.jobCompletedEvent IS NOT NULL
ORDER BY startTime

Scroll through the two results and look at the query column, which lists the completed BigQuery queries. Because the full query text is included in each result, the output can be large.
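
Beyond listing individual jobs, you can also aggregate them. As a sketch that reuses the same jobCompletedEvent fields as the query above, the following totals job counts and bytes processed per user, a common starting point for usage analysis:

SELECT
  proto_payload.audit_log.authentication_info.principal_email AS user_email,
  COUNT(*) AS job_count,
  SUM(CAST(JSON_VALUE(proto_payload.audit_log.service_data.jobCompletedEvent.job.jobStatistics.totalProcessedBytes) AS INT64)) AS total_bytes_processed
FROM `{{{ project_0.project_id | "Project ID" }}}.global._Default._AllLogs`
WHERE log_id = 'cloudaudit.googleapis.com/data_access'
  AND proto_payload.audit_log.service_data.jobCompletedEvent IS NOT NULL
GROUP BY user_email
ORDER BY total_bytes_processed DESC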

Task 4. Create a chart

Instead of displaying results in a table, Log Analytics also supports charting them for visualization.

  1. Click the Chart button in the results view.
  2. Select Pie chart as the chart type and query as the column.

You should see a chart similar to the following:

Congratulations!

You now have hands-on experience utilizing Cloud Logging and Log Analytics to build queries, chart results, and gain insights into BigQuery utilization.

Next steps / Learn more

  • Log Analytics documentation.
  • Learn more about visualizing your Log Analytics results with the charts documentation.

Google Cloud training and certification

...helps you make the most of Google Cloud technologies. Our classes include technical skills and best practices to help you get up to speed quickly and continue your learning journey. We offer fundamental to advanced level training, with on-demand, live, and virtual options to suit your busy schedule. Certifications help you validate and prove your skill and expertise in Google Cloud technologies.

Manual Last Updated May 2, 2025

Lab Last Tested May 2, 2025

Copyright 2025 Google LLC. All rights reserved. Google and the Google logo are trademarks of Google LLC. All other company and product names may be trademarks of the respective companies with which they are associated.
