Introduction to Cloud Dataproc: Hadoop and Spark on Google Cloud


Lab · 30 minutes · 1 Credit · Introductory

Note: This lab may incorporate AI tools that support your learning.

GSP123

Overview

Cloud Dataproc is a managed Spark and Hadoop service that lets you take advantage of open source data tools for batch processing, querying, streaming, and machine learning. Cloud Dataproc automation helps you create clusters quickly, manage them easily, and save money by turning clusters off when you don't need them. With less time and money spent on administration, you can focus on your jobs and your data.

This lab is adapted from the Create a Dataproc cluster by using the Google Cloud console guide.

What you'll learn

  • How to create a managed Cloud Dataproc cluster (with Apache Spark pre-installed)
  • How to submit a Spark job
  • How to shut down your cluster

Setup and requirements

Before you click the Start Lab button

Read these instructions. Labs are timed and you cannot pause them. The timer, which starts when you click Start Lab, shows how long Google Cloud resources are made available to you.

This hands-on lab lets you do the lab activities in a real cloud environment, not in a simulation or demo environment. It does so by giving you new, temporary credentials you use to sign in and access Google Cloud for the duration of the lab.

To complete this lab, you need:

  • Access to a standard internet browser (Chrome browser recommended).
Note: Use an Incognito (recommended) or private browser window to run this lab. This prevents conflicts between your personal account and the student account, which could cause extra charges to be billed to your personal account.
  • Time to complete the lab—remember, once you start, you cannot pause a lab.
Note: Use only the student account for this lab. If you use a different Google Cloud account, you may incur charges to that account.

How to start your lab and sign in to the Google Cloud console

  1. Click the Start Lab button. If you need to pay for the lab, a dialog opens for you to select your payment method. On the left is the Lab Details pane with the following:

    • The Open Google Cloud console button
    • Time remaining
    • The temporary credentials that you must use for this lab
    • Other information, if needed, to step through this lab
  2. Click Open Google Cloud console (or right-click and select Open Link in Incognito Window if you are running the Chrome browser).

    The lab spins up resources, and then opens another tab that shows the Sign in page.

    Tip: Arrange the tabs in separate windows, side-by-side.

    Note: If you see the Choose an account dialog, click Use Another Account.
  3. If necessary, copy the Username below and paste it into the Sign in dialog.

    {{{user_0.username | "Username"}}}

    You can also find the Username in the Lab Details pane.

  4. Click Next.

  5. Copy the Password below and paste it into the Welcome dialog.

    {{{user_0.password | "Password"}}}

    You can also find the Password in the Lab Details pane.

  6. Click Next.

    Important: You must use the credentials the lab provides you. Do not use your Google Cloud account credentials.

    Note: Using your own Google Cloud account for this lab may incur extra charges.
  7. Click through the subsequent pages:

    • Accept the terms and conditions.
    • Do not add recovery options or two-factor authentication (because this is a temporary account).
    • Do not sign up for free trials.

After a few moments, the Google Cloud console opens in this tab.

Note: To access Google Cloud products and services, click the Navigation menu or type the service or product name in the Search field.

Permission to Service Account

To assign storage permission to the service account, which is required for creating a cluster:

  1. Go to Navigation menu > IAM & Admin > IAM.

  2. Click the pencil icon on the compute@developer.gserviceaccount.com service account.

  3. Click + ADD ANOTHER ROLE and select the Storage Admin role.

  4. Click Save.
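If you prefer the command line, the same role binding can be granted from Cloud Shell with gcloud. This is a minimal sketch, assuming placeholder values for the project ID and project number; substitute your lab's actual values.

    # Grant the Storage Admin role to the Compute Engine default service account.
    # PROJECT_ID and PROJECT_NUMBER are placeholders for your lab project's values.
    gcloud projects add-iam-policy-binding PROJECT_ID \
        --member="serviceAccount:PROJECT_NUMBER-compute@developer.gserviceaccount.com" \
        --role="roles/storage.admin"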

Task 1. Create a Cloud Dataproc cluster

  1. In the console, open the Navigation menu > View All Products. Under the Analytics section, click Dataproc.

  2. To create a new cluster, click Clusters > Create cluster. In the dialog box, click Create for Cluster on Compute Engine.

  3. There are many parameters you can configure when creating a new cluster. Set values for the parameters listed below and leave the default settings for the other parameters:
Parameter                                              Value
Name
Region
Zone
Configure nodes > Manager node - Primary disk type     Standard Persistent Disk
Manager node - Series                                  E2
Manager node - Machine type
Worker node - Primary disk size                        100 GB
Worker node - Primary disk type                        Standard Persistent Disk
Worker node - Series                                   E2
Worker node - Machine type
Customize cluster > Internal IP only                   Uncheck "Configure all instances to have only internal IP addresses"
  4. Click Create to create the new cluster. The cluster's Status changes from Provisioning to Running; move on to the next step once the status shows Running. (An equivalent gcloud command is sketched below for reference.)
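For reference only, a cluster with a comparable configuration could also be created from Cloud Shell with the gcloud CLI. This is a minimal sketch, not the lab's required method; the cluster name, region, zone, and machine types are placeholders for the values your lab specifies.

    # Sketch: create a Dataproc cluster with settings similar to the console steps above.
    # CLUSTER_NAME, REGION, ZONE, and MACHINE_TYPE are placeholders.
    gcloud dataproc clusters create CLUSTER_NAME \
        --region=REGION \
        --zone=ZONE \
        --master-machine-type=MACHINE_TYPE \
        --master-boot-disk-type=pd-standard \
        --worker-machine-type=MACHINE_TYPE \
        --worker-boot-disk-type=pd-standard \
        --worker-boot-disk-size=100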

Test completed task

Click Check my progress to verify your performed task. If you have completed the task successfully, you will be granted an assessment score.

Create a Cloud Dataproc cluster.

Task 2. Submit a Spark job to your cluster

  1. Select Jobs to switch to Dataproc's jobs view.

  2. Click Submit job.

  3. Set values for the parameters listed below and leave the default settings for the other parameters:
Parameter             Value
Region
Cluster
Job type
Main class or jar
Jar files
Arguments
  4. Click Submit. (An equivalent gcloud command is sketched below for reference.)
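A Spark job can also be submitted to the cluster from Cloud Shell with gcloud. The sketch below assumes the commonly used SparkPi example (class org.apache.spark.examples.SparkPi in the Spark examples jar shipped on Dataproc images); your lab specifies the exact class, jar, and arguments, so treat these values as assumptions to be replaced.

    # Sketch: submit a Spark job to an existing cluster (placeholders in caps).
    # The class, jar path, and argument are assumptions based on the standard
    # SparkPi example; substitute the values given in your lab.
    gcloud dataproc jobs submit spark \
        --cluster=CLUSTER_NAME \
        --region=REGION \
        --class=org.apache.spark.examples.SparkPi \
        --jars=file:///usr/lib/spark/examples/jars/spark-examples.jar \
        -- 1000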

Your job should appear in the Jobs list, which shows all of your project's jobs along with their cluster, type, and current status. The new job's status displays as Running; move on once the status shows Succeeded.

Test completed task

Click Check my progress to verify your performed task. If you have completed the task successfully, you will be granted an assessment score.

Submit a Spark job to your cluster.
  5. To see your completed job's output, click the job ID in the Jobs list.

  6. To avoid scrolling, set Line Wrap to ON.

You should see that your job has successfully calculated a rough value for pi!
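If you prefer the command line, a job's driver output can also be retrieved with gcloud. A minimal sketch, assuming JOB_ID and REGION are replaced with your job's ID and region:

    # Sketch: display a job's driver output from the command line.
    gcloud dataproc jobs wait JOB_ID --region=REGION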

Task 3. Shut down your cluster

  1. You can shut down a cluster on the Clusters page.

  2. Select the checkbox next to the qlab cluster and click Delete.

  3. Click CONFIRM to confirm deletion.
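The same cleanup can be done from Cloud Shell. A minimal sketch, assuming CLUSTER_NAME and REGION match the values used when you created the cluster:

    # Sketch: delete the Dataproc cluster from the command line.
    gcloud dataproc clusters delete CLUSTER_NAME --region=REGION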

Task 4. Test your understanding

Below are multiple-choice questions to reinforce your understanding of this lab's concepts. Answer them to the best of your abilities.

Congratulations!

You learned how to create a Dataproc cluster, submit a Spark job, and shut down your cluster.

Next steps / Learn more

Google Cloud training and certification

...helps you make the most of Google Cloud technologies. Our classes include technical skills and best practices to help you get up to speed quickly and continue your learning journey. We offer fundamental to advanced level training, with on-demand, live, and virtual options to suit your busy schedule. Certifications help you validate and prove your skill and expertise in Google Cloud technologies.

Manual Last Updated April 07, 2025

Lab Last Tested April 07, 2025

Copyright 2025 Google LLC. All rights reserved. Google and the Google logo are trademarks of Google LLC. All other company and product names may be trademarks of the respective companies with which they are associated.
