
Dataflow: Qwik Start - Python

Lab · 30 minutes · 1 Credit · Introductory
Note: This lab may incorporate AI tools to support your learning.

GSP207

Overview

The Apache Beam SDK is an open source programming model for data pipelines. In Google Cloud, you can define a pipeline with an Apache Beam program and then use Dataflow to run your pipeline.

In this lab, you set up your Python development environment for Dataflow (using the Apache Beam SDK for Python) and run an example Dataflow pipeline.
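
If you haven't seen Beam code before, the shape of a pipeline is worth a quick look before you start. Below is a minimal sketch, assuming only that the apache-beam package is installed; the data and transform labels are illustrative and are not part of the lab steps:

import apache_beam as beam

# A tiny pipeline on the local DirectRunner: count words held in memory.
with beam.Pipeline() as pipeline:
    (
        pipeline
        | "Create" >> beam.Create(["hello", "world", "hello"])
        | "PairWithOne" >> beam.Map(lambda word: (word, 1))
        | "CountPerWord" >> beam.CombinePerKey(sum)
        | "Print" >> beam.Map(print)
    )

Dataflow runs the same pipeline code; only the runner and its options change, which is what you do in Task 3.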

What you'll do

In this lab, you learn how to:

  • Create a Cloud Storage bucket to store results of a Dataflow pipeline
  • Install the Apache Beam SDK for Python
  • Run a Dataflow pipeline remotely

Setup and requirements

Before you click the Start Lab button

Read these instructions. Labs are timed and you cannot pause them. The timer, which starts when you click Start Lab, shows how long Google Cloud resources will be made available to you.

This hands-on lab lets you do the lab activities yourself in a real cloud environment, not in a simulation or demo environment. It does so by giving you new, temporary credentials that you use to sign in and access Google Cloud for the duration of the lab.

To complete this lab, you need:

  • Access to a standard internet browser (Chrome browser recommended).
Note: Use an Incognito or private browser window to run this lab. This prevents conflicts between your personal account and the Student account, which could cause extra charges to be incurred on your personal account.
  • Time to complete the lab---remember, once you start, you cannot pause a lab.
Note: If you already have your own personal Google Cloud account or project, do not use it for this lab to avoid extra charges to your account.

How to start your lab and sign in to the Google Cloud console

  1. Click the Start Lab button. If you need to pay for the lab, a pop-up opens for you to select your payment method. On the left is the Lab Details panel with the following:

    • The Open Google Cloud console button
    • Time remaining
    • The temporary credentials that you must use for this lab
    • Other information, if needed, to step through this lab
  2. Click Open Google Cloud console (or right-click and select Open Link in Incognito Window if you are running the Chrome browser).

    The lab spins up resources, and then opens another tab that shows the Sign in page.

    Tip: Arrange the tabs in separate windows, side-by-side.

    Note: If you see the Choose an account dialog, click Use Another Account.
  3. If necessary, copy the Username below and paste it into the Sign in dialog.

    {{{user_0.username | "Username"}}}

    You can also find the Username in the Lab Details panel.

  4. Click Next.

  5. Copy the Password below and paste it into the Welcome dialog.

    {{{user_0.password | "Password"}}}

    You can also find the Password in the Lab Details panel.

  6. Click Next.

    Important: You must use the credentials the lab provides you. Do not use your Google Cloud account credentials.

    Note: Using your own Google Cloud account for this lab may incur extra charges.
  7. Click through the subsequent pages:

    • Accept the terms and conditions.
    • Do not add recovery options or two-factor authentication (because this is a temporary account).
    • Do not sign up for free trials.

After a few moments, the Google Cloud console opens in this tab.

Note: To view a menu with a list of Google Cloud products and services, click the Navigation menu at the top-left.

Activate Cloud Shell

Cloud Shell is a virtual machine that is loaded with development tools. It offers a persistent 5GB home directory and runs on Google Cloud. Cloud Shell provides command-line access to your Google Cloud resources.

  1. Click Activate Cloud Shell at the top of the Google Cloud console.

When you are connected, you are already authenticated, and the project is set to your Project_ID. The output contains a line that declares the Project_ID for this session:

Your Cloud Platform project in this session is set to {{{project_0.project_id | "PROJECT_ID"}}}

gcloud is the command-line tool for Google Cloud. It comes pre-installed on Cloud Shell and supports tab-completion.

  2. (Optional) You can list the active account name with this command:
gcloud auth list
  3. Click Authorize.

Output:

ACTIVE: *
ACCOUNT: {{{user_0.username | "ACCOUNT"}}}

To set the active account, run:
    $ gcloud config set account `ACCOUNT`

  4. (Optional) You can list the project ID with this command:
gcloud config list project

Output:

[core]
project = {{{project_0.project_id | "PROJECT_ID"}}}

Note: For full documentation of gcloud in Google Cloud, refer to the gcloud CLI overview guide.

Set the region

  • In Cloud Shell, run the following command to set the project region for this lab:
gcloud config set compute/region {{{project_0.default_region | "REGION"}}}

Ensure that the Dataflow API is successfully enabled

To ensure access to the necessary API, restart the connection to the Dataflow API.

  1. In the Cloud Console, enter "Dataflow API" in the top search bar. Click on the result for Dataflow API.

  2. Click Manage.

  3. Click Disable API.

If asked to confirm, click Disable.

  4. Click Enable.

When the API has been enabled again, the page shows the option to disable it.

Task 1. Create a Cloud Storage bucket

When you run a pipeline using Dataflow, your results are stored in a Cloud Storage bucket. In this task, you create a Cloud Storage bucket for the results of the pipeline that you run in a later task.

  1. On the Navigation menu, click Cloud Storage > Buckets.
  2. Click Create bucket.
  3. In the Create bucket dialog, specify the following attributes:
  • Name: To ensure a unique bucket name, use the following name: {{{project_0.project_id | "PROJECT_ID"}}}-bucket. Do not include sensitive information in the bucket name; the bucket namespace is global and publicly visible.
  • Location type: Multi-region
  • Location: us (the location where bucket data is stored)
  4. Click Create.

  5. If prompted with Public access will be prevented, click Confirm.

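If you prefer to script this step, the same bucket can be created with the google-cloud-storage client library. This is only a sketch, assuming the library is installed and the lab credentials are active; BUCKET_NAME is a placeholder, not a value from the lab:

from google.cloud import storage

BUCKET_NAME = "your-unique-bucket-name"  # placeholder: substitute your own unique name

client = storage.Client()  # authenticates with the active (lab) credentials
bucket = client.create_bucket(BUCKET_NAME, location="us")  # multi-region us, as above
print(f"Created bucket {bucket.name} in {bucket.location}")
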
Test completed task

Click Check my progress to verify your performed task. If you have completed the task successfully, you will be granted an assessment score.

Create a Cloud Storage bucket.

Task 2. Install the Apache Beam SDK for Python

  1. To ensure that you use a supported Python version, begin by running the Python 3.9 Docker image:
docker run -it -e DEVSHELL_PROJECT_ID=$DEVSHELL_PROJECT_ID python:3.9 /bin/bash

This command pulls a Docker image with the latest stable release of Python 3.9 and then opens a shell for you to run the following commands inside the container.

  2. After the container is running, install version 2.42.0 of the Apache Beam SDK for Python by running the following command inside the container:
pip install 'apache-beam[gcp]==2.42.0'

You will see some warnings returned that are related to dependencies. It is safe to ignore them for this lab.
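
As an optional sanity check (not a lab step), you can confirm the installed version from inside the container's Python interpreter:

import apache_beam as beam

print(beam.__version__)  # expect 2.42.0, matching the pinned install above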

  3. Run the wordcount.py example locally by running the following command:
python -m apache_beam.examples.wordcount --output OUTPUT_FILE

You may see a message similar to the following:

INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
INFO:oauth2client.client:Attempting refresh to obtain initial access_token

This message can be ignored.

  4. You can now list the files in your container to get the name of the OUTPUT_FILE:
ls
  5. Copy the name of the OUTPUT_FILE and view it with cat:
cat <file name>

Your results show each word in the file and how many times it appears.
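
The example module hides its pipeline code behind python -m. Its core logic looks roughly like the sketch below; this is a simplification, not the exact source of apache_beam.examples.wordcount, though the input path shown is the example's documented default:

import re

import apache_beam as beam
from apache_beam.io import ReadFromText, WriteToText

with beam.Pipeline() as p:
    (
        p
        | "Read" >> ReadFromText("gs://dataflow-samples/shakespeare/kinglear.txt")
        | "Split" >> beam.FlatMap(lambda line: re.findall(r"[\w']+", line))
        | "PairWithOne" >> beam.Map(lambda word: (word, 1))
        | "Count" >> beam.CombinePerKey(sum)
        | "Format" >> beam.MapTuple(lambda word, count: f"{word}: {count}")
        | "Write" >> WriteToText("OUTPUT_FILE")
    )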

Task 3. Run an example Dataflow pipeline remotely

  1. Set the BUCKET environment variable to the bucket you created earlier:
BUCKET=gs://<bucket name provided earlier>
  2. Now you'll run the wordcount.py example remotely:
python -m apache_beam.examples.wordcount --project $DEVSHELL_PROJECT_ID \
  --runner DataflowRunner \
  --staging_location $BUCKET/staging \
  --temp_location $BUCKET/temp \
  --output $BUCKET/results/output \
  --region {{{project_0.default_region | "filled in at lab start"}}}

In your output, wait until you see the message:

JOB_MESSAGE_DETAILED: Workers have started successfully.

Then continue with the lab.
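
The command-line flags above map onto Beam's PipelineOptions, so a pipeline of your own could set the same values in code. A sketch, with placeholder project, region, and bucket values that you would replace with the lab's:

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# All values below are placeholders for the lab's project ID, region, and bucket.
options = PipelineOptions(
    runner="DataflowRunner",
    project="your-project-id",
    region="your-region",
    staging_location="gs://your-bucket/staging",
    temp_location="gs://your-bucket/temp",
)

with beam.Pipeline(options=options) as p:
    p | beam.Create(["hello"]) | beam.Map(print)  # stand-in for real transforms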

Task 4. Check that your Dataflow job succeeded

  1. Open the Navigation menu and click Dataflow from the list of services.

You should see your wordcount job with a status of Running at first.

  2. Click the job name to watch the process. When all the boxes are checked off, you can continue watching the logs in Cloud Shell.

The process is complete when the status is Succeeded.
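
The console is the lab's way to check on the job, but a pipeline launched from Python also returns a PipelineResult that the launching script can poll. A minimal local sketch; the same calls apply to a Dataflow run:

import apache_beam as beam

p = beam.Pipeline()  # no options: runs locally on the DirectRunner
p | beam.Create(["hello"]) | beam.Map(print)

result = p.run()            # returns a PipelineResult
result.wait_until_finish()  # blocks until the pipeline completes
print(result.state)         # DONE once the run has succeeded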

Test completed task

Click Check my progress to verify your performed task. If you have completed the task successfully, you will be granted an assessment score.

Run an Example Pipeline Remotely.

  1. Click Navigation menu > Cloud Storage in the Cloud Console.

  2. Click on the name of your bucket. In your bucket, you should see the results and staging directories.

  3. Click on the results folder and you should see the output files that your job created.

  4. Click on a file to see the word counts it contains.
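
If you'd rather inspect the output without the console, the same files can be listed and read with the google-cloud-storage client. A sketch; BUCKET_NAME is a placeholder for your bucket's name:

from google.cloud import storage

BUCKET_NAME = "your-unique-bucket-name"  # placeholder

client = storage.Client()
for blob in client.list_blobs(BUCKET_NAME, prefix="results/output"):
    print(blob.name)                      # each output shard the job wrote
    print(blob.download_as_text()[:200])  # preview the first word counts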

Task 5. Test your understanding

Below is a multiple-choice question to reinforce your understanding of this lab's concepts. Answer it to the best of your ability.

Congratulations!

You learned how to set up your Python development environment for Dataflow (using the Apache Beam SDK for Python) and ran an example Dataflow pipeline.

Next steps / Learn more

This lab is part of a series of labs called Qwik Starts. These labs are designed to give you a little taste of the many features available with Google Cloud. Search for "Qwik Starts" in the Google Cloud Skills Boost catalog to find the next lab you'd like to take!

To get your own copy of the book this lab is based on, see Data Science on the Google Cloud Platform (O'Reilly Media, Inc.).

Google Cloud training and certification

...helps you make the most of Google Cloud technologies. Our classes include technical skills and best practices to help you get up to speed quickly and continue your learning journey. We offer fundamental to advanced level training, with on-demand, live, and virtual options to suit your busy schedule. Certifications help you validate and prove your skill and expertise in Google Cloud technologies.

Manual Last Updated: February 04, 2024

Lab Last Tested: May 4, 2023

Copyright 2024 Google LLC All rights reserved. Google and the Google logo are trademarks of Google LLC. All other company and product names may be trademarks of the respective companies with which they are associated.
