Checkpoints
- Create a Cloud Run function (30 points)
- Create a Cloud Storage bucket and BigQuery dataset (30 points)
- Deploy your function (40 points)
Use Cloud Run Functions to Load BigQuery
Overview
A Cloud Run function is a piece of code that runs in response to an event, such as an HTTP request, a message from a messaging service, or a file upload. Cloud events are things that happen in your cloud environment. These might be things like changes to data in a database, files added to a storage system, or a new virtual machine instance being created.
Since Cloud Run functions are event-driven, they only run when something happens. This makes them a good choice for tasks that need to be done quickly or that don't need to be running all the time.
This hands-on lab shows you how to create, deploy, and test a Cloud Run function which will load a BigQuery table using the Google Cloud SDK.
What you'll do
- Create a Cloud Run function
- Deploy and test the Cloud Run function
- View data in BigQuery and Cloud Run function logs
Setup
Before you click the Start Lab button
This Qwiklabs hands-on lab lets you do the lab activities yourself in a real cloud environment, not in a simulation or demo environment. It does so by giving you new, temporary credentials that you use to sign in and access Google Cloud for the duration of the lab.
What you need
To complete this lab, you need:
- Access to a standard internet browser (Chrome browser recommended).
- Time to complete the lab.
How to start your lab and sign in to the Console
- Click the Start Lab button. If you need to pay for the lab, a pop-up opens for you to select your payment method. On the left is a panel populated with the temporary credentials that you must use for this lab.
- Copy the username, and then click Open Google Console. The lab spins up resources, and then opens another tab that shows the Choose an account page.
  Note: Open the tabs in separate windows, side-by-side.
- On the Choose an account page, click Use Another Account. The Sign in page opens.
- Paste the username that you copied from the Connection Details panel. Then copy and paste the password.
- Click through the subsequent pages:
  - Accept the terms and conditions.
  - Do not add recovery options or two-factor authentication (because this is a temporary account).
  - Do not sign up for free trials.
After a few moments, the Cloud console opens in this tab.
Task 1. Create a function
First, you're going to create a simple function named loadBigQueryFromAvro. This function reads an Avro file that is uploaded to Cloud Storage and then creates and loads a table in BigQuery.
To create a Cloud Run function:
- In the Cloud Console, on the Navigation menu, click Compute Engine. You should see a provisioned Linux instance.
- Click the SSH button. You will be brought to an interactive shell.
- In the SSH terminal, run the following command to set the Cloud Run functions region:
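A minimal sketch of this step; the lab assigns the actual region, so REGION below is only a placeholder:

    # Set the default region used by Cloud Run functions commands (replace REGION with the lab-assigned value)
    gcloud config set functions/region REGION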
- Create and open index.js to edit:
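For example, using nano (any editor available on the instance works):

    nano index.js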
- Copy the following into the index.js file; this is the code for the Cloud Run function.
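Below is a minimal sketch of what such a function could look like, not the exact lab code: it assumes a CloudEvent-triggered function built on @google-cloud/functions-framework, and the dataset name loadavro and table name campaigns are placeholders chosen for illustration.

    // index.js - sketch of a Cloud Run function that loads an uploaded Avro file into BigQuery.
    // The dataset ("loadavro") and table ("campaigns") names are illustrative placeholders.
    const functions = require('@google-cloud/functions-framework');
    const {Storage} = require('@google-cloud/storage');
    const {BigQuery} = require('@google-cloud/bigquery');

    functions.cloudEvent('loadBigQueryFromAvro', async (cloudEvent) => {
      // The CloudEvent payload describes the Cloud Storage object that triggered the function.
      const file = cloudEvent.data;

      const storage = new Storage();
      const bigquery = new BigQuery();

      // Load the Avro file straight from Cloud Storage into a BigQuery table.
      // Avro files embed their own schema, so no explicit schema definition is needed.
      const [job] = await bigquery
        .dataset('loadavro')
        .table('campaigns')
        .load(storage.bucket(file.bucket).file(file.name), {
          sourceFormat: 'AVRO',
          writeDisposition: 'WRITE_TRUNCATE',
        });

      console.log(`Loaded gs://${file.bucket}/${file.name} (job ${job.id}).`);
    });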
- Exit nano (Ctrl+x) and save (Y) the file.
Task 2. Create a Cloud Storage bucket and BigQuery dataset
Next, you're going to set up the background infrastructure: a Cloud Storage bucket to store the asset used to invoke the Cloud Run function, and a BigQuery dataset to store the output when it completes.
- Use the following command to create a new Cloud Storage bucket as a staging location:
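For example (the bucket name and region are placeholders; use the values the lab provides, and remember that bucket names must be globally unique):

    # Create a staging bucket for the Avro file
    gsutil mb -l REGION gs://BUCKET_NAME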
- Create a BigQuery dataset to store the data:
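A sketch, assuming the illustrative dataset name loadavro used in the function sketch above:

    # Create a BigQuery dataset in the current project
    bq mk -d loadavro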
Task 3. Deploy your function
Next, you're going to deploy the new Cloud Run function and trigger it so that the data is loaded into BigQuery.
- To ensure all system settings are in place, disable and then re-enable the Cloud Run functions API. First, disable the Cloud Run functions API:
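For example:

    gcloud services disable cloudfunctions.googleapis.com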
- Re-enable the Cloud Run functions API:
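For example:

    gcloud services enable cloudfunctions.googleapis.com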
- For the Cloud Run function to deploy and run, you must add the artifactregistry.reader permission to your appspot service account:
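A sketch of the IAM binding, with PROJECT_ID standing in for your lab project ID:

    gcloud projects add-iam-policy-binding PROJECT_ID \
      --member="serviceAccount:PROJECT_ID@appspot.gserviceaccount.com" \
      --role="roles/artifactregistry.reader"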
- Install the two JavaScript libraries needed to read from Cloud Storage and write the output to BigQuery:
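For example, run the following in the same directory as index.js (if there is no package.json yet, npm init -y creates one first). The function sketch in Task 1 also assumes @google-cloud/functions-framework, which you would install the same way:

    npm install @google-cloud/storage @google-cloud/bigquery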
- Deploy the function using the command below.
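A sketch of the deploy command, assuming a 2nd gen Node.js function triggered by uploads to the staging bucket; the runtime version, REGION, and BUCKET_NAME are placeholders for the lab-provided values:

    gcloud functions deploy loadBigQueryFromAvro \
      --gen2 \
      --runtime nodejs20 \
      --region REGION \
      --source . \
      --entry-point loadBigQueryFromAvro \
      --trigger-bucket BUCKET_NAME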
- In the terminal output of the function deployment, you will see the status shown as Active. This indicates that the function has been deployed successfully.
- Next, download the Avro file that will be processed by the Cloud Run function for storage in BigQuery.
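For example, with curl; the actual file URL is provided in the lab, so AVRO_FILE_URL and the local filename campaigns.avro below are placeholders:

    curl -L -o campaigns.avro AVRO_FILE_URL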
- Move the Avro file to the staging Cloud Storage bucket you created earlier. This action will trigger the Cloud Run function.
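For example (BUCKET_NAME is the staging bucket from Task 2; gsutil mv works the same way if you prefer to move rather than copy the file):

    gsutil cp campaigns.avro gs://BUCKET_NAME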
Task 4. Confirm that the data was loaded into BigQuery
Now that you have deployed and triggered the Cloud Run function, it is time to examine the results in BigQuery.
- To view the data in the new table in BigQuery, run the following query in the SSH terminal using the bq command.
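A sketch, assuming the illustrative dataset and table names (loadavro.campaigns) from the earlier sketches:

    bq query --use_legacy_sql=false 'SELECT * FROM `loadavro.campaigns` LIMIT 10;'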
- The query should return results similar to the following:
Example output:
Task 5. View logs
Any time a Cloud Run function is executed, detailed logs are maintained. As the final step in this lab, you are going to examine the logs for your Cloud Run function.
- To check the logs and see your messages in the log history, run the following command in the SSH terminal:
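A sketch, assuming the function name from Task 1 and the lab-assigned region:

    gcloud functions logs read loadBigQueryFromAvro --region REGION --limit 20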
- Messages in the log appear similar to the following:
Congratulations!
You used the Google Cloud SDK to create, deploy, and test a Cloud Run function that created and loaded a BigQuery table.
Copyright 2022 Google LLC All rights reserved. Google and the Google logo are trademarks of Google LLC. All other company and product names may be trademarks of the respective companies with which they are associated.