Analyze BigQuery Usage with Log Analytics

Lab · 1 hour 30 minutes · 1 credit · Introductory

GSP1250

Overview

While BigQuery offers built-in observability capabilities like the INFORMATION_SCHEMA views, detailed logging remains crucial for in-depth usage analysis, auditing, and troubleshooting potential issues. Enabling Log Analytics lets you query and analyze your log data using familiar SQL queries, then view or chart the query results.

BigQuery is Google Cloud's fully managed enterprise data warehouse that helps you manage and analyze your data with built-in features like machine learning, geospatial analysis, and business intelligence.

In this lab, you enable Log Analytics on existing log buckets, view the BigQuery logs inside Cloud Logging, and then use SQL in Log Analytics to analyze those logs.

Objectives

In this lab you will learn how to:

  • Use Cloud Logging effectively to gain insight into BigQuery usage
  • Effectively build and run SQL queries using Log Analytics
  • View and chart the results

Setup and Requirements

Before you click the Start Lab button

Read these instructions. Labs are timed and you cannot pause them. The timer, which starts when you click Start Lab, shows how long Google Cloud resources will be made available to you.

This hands-on lab lets you do the lab activities yourself in a real cloud environment, not in a simulation or demo environment. It does so by giving you new, temporary credentials that you use to sign in and access Google Cloud for the duration of the lab.

To complete this lab, you need:

  • Access to a standard internet browser (Chrome browser recommended).
Note: Use an Incognito or private browser window to run this lab. This prevents conflicts between your personal account and the Student account, which might otherwise cause extra charges to be incurred on your personal account.
  • Time to complete the lab. Remember, once you start, you cannot pause a lab.
Note: If you already have your own personal Google Cloud account or project, do not use it for this lab to avoid extra charges to your account.

How to start your lab and sign in to the Google Cloud console

  1. Click the Start Lab button. If you need to pay for the lab, a pop-up opens for you to select your payment method. On the left is the Lab Details panel with the following:

    • The Open Google Cloud console button
    • Time remaining
    • The temporary credentials that you must use for this lab
    • Other information, if needed, to step through this lab
  2. Click Open Google Cloud console (or right-click and select Open Link in Incognito Window if you are running the Chrome browser).

    The lab spins up resources, and then opens another tab that shows the Sign in page.

    Tip: Arrange the tabs in separate windows, side-by-side.

    Note: If you see the Choose an account dialog, click Use Another Account.
  3. If necessary, copy the Username below and paste it into the Sign in dialog.

    {{{user_0.username | "Username"}}}

    You can also find the Username in the Lab Details panel.

  4. Click Next.

  5. Copy the Password below and paste it into the Welcome dialog.

    {{{user_0.password | "Password"}}}

    You can also find the Password in the Lab Details panel.

  6. Click Next.

    Important: You must use the credentials the lab provides you. Do not use your Google Cloud account credentials. Note: Using your own Google Cloud account for this lab may incur extra charges.
  7. Click through the subsequent pages:

    • Accept the terms and conditions.
    • Do not add recovery options or two-factor authentication (because this is a temporary account).
    • Do not sign up for free trials.

After a few moments, the Google Cloud console opens in this tab.

Note: To view a menu with a list of Google Cloud products and services, click the Navigation menu at the top-left. Navigation menu icon

Activate Cloud Shell

Cloud Shell is a virtual machine that is loaded with development tools. It offers a persistent 5GB home directory and runs on the Google Cloud. Cloud Shell provides command-line access to your Google Cloud resources.

  1. Click Activate Cloud Shell Activate Cloud Shell icon at the top of the Google Cloud console.

When you are connected, you are already authenticated, and the project is set to your PROJECT_ID. The output contains a line that declares the PROJECT_ID for this session:

Your Cloud Platform project in this session is set to {{{project_0.project_id | "PROJECT_ID"}}}

gcloud is the command-line tool for Google Cloud. It comes pre-installed on Cloud Shell and supports tab-completion.

  2. (Optional) You can list the active account name with this command:
gcloud auth list
  3. Click Authorize.

Output:

ACTIVE: *
ACCOUNT: {{{user_0.username | "ACCOUNT"}}}

To set the active account, run:
    $ gcloud config set account `ACCOUNT`
  4. (Optional) You can list the project ID with this command:
gcloud config list project

Output:

[core]
project = {{{project_0.project_id | "PROJECT_ID"}}}

Note: For full documentation of gcloud in Google Cloud, refer to the gcloud CLI overview guide.

Task 1. Configure log buckets

Upgrade log bucket

You will configure Cloud Logging to upgrade all the existing log buckets so that Log Analytics is enabled on them.

To upgrade existing buckets to use Log Analytics:

  1. From the navigation menu, select Logging, and then select Logs Storage.
  2. Select the bucket _Required.
  3. When the Log Analytics available column displays Upgrade, you can upgrade the log bucket to use Log Analytics. Click Upgrade.
  4. Click Confirm.
  5. Repeat the same steps for the log bucket _Default.
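
If you prefer the command line, the same upgrade can be done from Cloud Shell. The following is a minimal sketch using the gcloud CLI; it assumes both buckets are in the global location, which is the default for _Required and _Default:

gcloud logging buckets update _Required --location=global --enable-analytics
gcloud logging buckets update _Default --location=global --enable-analytics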

Task 2. Perform BigQuery Activities

Complete the following tasks to generate some BigQuery logs. These tasks use the BigQuery command-line tool bq.

  1. Use the bq mk command to create new datasets named bq_logs and bq_logs_test in your project:
bq mk bq_logs
bq mk bq_logs_test
  2. Use the bq ls command to list the datasets:
bq ls
  3. Use the bq rm command to delete the bq_logs_test dataset (enter y when prompted):
bq rm bq_logs_test
  4. Create a new table:
bq mk \
  --table \
  --expiration 3600 \
  --description "This is a test table" \
  bq_logs.test_table \
  id:STRING,name:STRING,address:STRING

You should see a success message that a new empty table named test_table has been created for your dataset.
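
Optionally, you can confirm the schema of the table you just created. A quick check with bq show, printed as JSON:

bq show --format=prettyjson bq_logs.test_table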

Now you'll run some queries to generate logs.

  5. Run the following query in Cloud Shell:
bq query --use_legacy_sql=false 'SELECT current_date'
  6. Copy the following command into Cloud Shell and run it:
bq query --use_legacy_sql=false \
'SELECT
  gsod2021.date,
  stations.usaf,
  stations.wban,
  stations.name,
  stations.country,
  stations.state,
  stations.lat,
  stations.lon,
  stations.elev,
  gsod2021.temp,
  gsod2021.max,
  gsod2021.min,
  gsod2021.mxpsd,
  gsod2021.gust,
  gsod2021.fog,
  gsod2021.hail
FROM
  `bigquery-public-data.noaa_gsod.gsod2021` gsod2021
INNER JOIN
  `bigquery-public-data.noaa_gsod.stations` stations
ON
  gsod2021.stn = stations.usaf
  AND gsod2021.wban = stations.wban
WHERE
  stations.country = "US"
  AND gsod2021.date = "2021-12-15"
  AND stations.state IS NOT NULL
  AND gsod2021.max != 9999.9
ORDER BY
  gsod2021.min;'

(This query uses weather data from the National Oceanic and Atmospheric Administration (NOAA).)

Task 3. Perform log analysis

One common task when analyzing BigQuery usage is to search for specific operations on your datasets. For each query you run, remember to replace [Your Project Id] with the project ID provisioned for this lab.

  1. In the left navigation, under Logging, click Log Analytics to access the feature. You should see something like the following:
Log Analytics results

If your query field is empty or you forget which table you want to use, you can click the Query button to get the sample query back.

  2. To find operations such as dataset creation or deletion, run the following query:
SELECT
  timestamp,
  severity,
  resource.type,
  proto_payload.audit_log.authentication_info.principal_email,
  proto_payload.audit_log.method_name,
  proto_payload.audit_log.resource_name,
FROM
  `[Your Project Id].global._Required._AllLogs`
WHERE
  log_id = 'cloudaudit.googleapis.com/activity'
  AND proto_payload.audit_log.method_name LIKE 'datasetservice%'
LIMIT 100
  3. You should see output like the following. You can also narrow your search with a more specific filter, such as method_name = 'datasetservice.delete' (see the example after the screenshot).

Results for datasetservice query
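
For example, here is a narrower variant of the query above that returns only dataset deletions; it is a sketch that simply replaces the LIKE pattern with the exact method_name match mentioned above:

SELECT
  timestamp,
  proto_payload.audit_log.authentication_info.principal_email,
  proto_payload.audit_log.resource_name
FROM
  `[Your Project Id].global._Required._AllLogs`
WHERE
  log_id = 'cloudaudit.googleapis.com/activity'
  AND proto_payload.audit_log.method_name = 'datasetservice.delete'
LIMIT 100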

  4. To find the operations for BigQuery tables, change the query condition to match table services. Run this query to find operations on tables that are created or deleted:

SELECT
  timestamp,
  severity,
  resource.type,
  proto_payload.audit_log.authentication_info.principal_email,
  proto_payload.audit_log.method_name,
  proto_payload.audit_log.resource_name,
FROM
  `[Your Project Id].global._Required._AllLogs`
WHERE
  log_id = 'cloudaudit.googleapis.com/activity'
  AND proto_payload.audit_log.method_name LIKE '%TableService%'
LIMIT 100

You should see output like the following:

Results for TableService query

  5. To view completed BigQuery queries, search the data_access log based on the jobCompletedEvent. For example, run the following SQL query:
SELECT
  timestamp,
  resource.labels.project_id,
  proto_payload.audit_log.authentication_info.principal_email,
  JSON_VALUE(proto_payload.audit_log.service_data.jobCompletedEvent.job.jobConfiguration.query.query) AS query,
  JSON_VALUE(proto_payload.audit_log.service_data.jobCompletedEvent.job.jobConfiguration.query.statementType) AS statementType,
  JSON_VALUE(proto_payload.audit_log.service_data.jobCompletedEvent.job.jobStatus.error.message) AS message,
  JSON_VALUE(proto_payload.audit_log.service_data.jobCompletedEvent.job.jobStatistics.startTime) AS startTime,
  JSON_VALUE(proto_payload.audit_log.service_data.jobCompletedEvent.job.jobStatistics.endTime) AS endTime,
  CAST(TIMESTAMP_DIFF(
    CAST(JSON_VALUE(proto_payload.audit_log.service_data.jobCompletedEvent.job.jobStatistics.endTime) AS TIMESTAMP),
    CAST(JSON_VALUE(proto_payload.audit_log.service_data.jobCompletedEvent.job.jobStatistics.startTime) AS TIMESTAMP),
    MILLISECOND) / 1000 AS INT64) AS run_seconds,
  CAST(JSON_VALUE(proto_payload.audit_log.service_data.jobCompletedEvent.job.jobStatistics.totalProcessedBytes) AS INT64) AS totalProcessedBytes,
  CAST(JSON_VALUE(proto_payload.audit_log.service_data.jobCompletedEvent.job.jobStatistics.totalSlotMs) AS INT64) AS totalSlotMs,
  JSON_VALUE(proto_payload.audit_log.service_data.jobCompletedEvent.job.jobStatistics.referencedTables) AS tables_ref,
  CAST(JSON_VALUE(proto_payload.audit_log.service_data.jobCompletedEvent.job.jobStatistics.totalTablesProcessed) AS INT64) AS totalTablesProcessed,
  CAST(JSON_VALUE(proto_payload.audit_log.service_data.jobCompletedEvent.job.jobStatistics.queryOutputRowCount) AS INT64) AS queryOutputRowCount,
  severity
FROM
  `[Your Project Id].global._Default._Default`
WHERE
  log_id = "cloudaudit.googleapis.com/data_access"
  AND proto_payload.audit_log.service_data.jobCompletedEvent IS NOT NULL
ORDER BY
  startTime

After the SQL query completes, you should see output like the following:

Results of data_access and jobCompletedEvent query

Because the full query strings are included in the results, the output might be large. Scroll through the results and look for the completed BigQuery queries.
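
Once the per-job details look right, you can also aggregate them. The following is a sketch, not part of the original lab steps, that sums totalProcessedBytes per user as a quick usage summary:

SELECT
  proto_payload.audit_log.authentication_info.principal_email AS user_email,
  COUNT(*) AS job_count,
  SUM(CAST(JSON_VALUE(proto_payload.audit_log.service_data.jobCompletedEvent.job.jobStatistics.totalProcessedBytes) AS INT64)) AS total_bytes_processed
FROM
  `[Your Project Id].global._Default._Default`
WHERE
  log_id = "cloudaudit.googleapis.com/data_access"
  AND proto_payload.audit_log.service_data.jobCompletedEvent IS NOT NULL
GROUP BY
  user_email
ORDER BY
  total_bytes_processed DESC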

Task 4. Create a chart

Instead of displaying the results as a table, Log Analytics also supports creating charts to visualize them.

  1. Click the Chart button in the result view.
  2. Select Pie chart as the chart type and query as the column.
  3. You should see a chart similar to the following.

pie chart of Log Analytics results

Congratulations!

You now have hands-on experience utilizing Cloud Logging and Log Analytics to build queries, chart results, and gain insights into BigQuery utilization.

Next steps / Learn more

  • Log Analytics documentation.
  • Learn more about visualizing your Log Analytics results with the charts documentation.


Manual Last Updated June 7, 2024

Lab Last Tested April 23, 2024

Copyright 2024 Google LLC All rights reserved. Google and the Google logo are trademarks of Google LLC. All other company and product names may be trademarks of the respective companies with which they are associated.