Dataprep: Qwik Start
- GSP105
- Overview
- Setup and requirements
- Task 1. Create a Cloud Storage bucket in your project
- Task 2. Initialize Cloud Dataprep
- Task 3. Create a flow
- Task 4. Import datasets
- Task 5. Prep the candidate file
- Task 6. Wrangle the Contributions file and join it to the Candidates file
- Task 7. Summary of data
- Task 8. Rename columns
- Congratulations!
This lab was developed with our partner, Trifacta. Your personal information may be shared with Trifacta, the lab sponsor, if you have opted-in to receive product updates, announcements, and offers in your Account Profile.
GSP105
Overview
Cloud Dataprep by Trifacta is an intelligent data service for visually exploring, cleaning, and preparing data for analysis. Cloud Dataprep is serverless and works at any scale. There is no infrastructure to deploy or manage. Easy data preparation with clicks and no code!
In this lab, you use Dataprep to manipulate a dataset. You import datasets, correct mismatched data, transform data, and join data. If these concepts are new to you, you'll understand them by the end of this lab.
What you'll do
In this lab, you learn how to use Dataprep to complete the following tasks:
- Import data
- Correct mismatched data
- Transform data
- Join data
Setup and requirements
Before you click the Start Lab button
Read these instructions. Labs are timed and you cannot pause them. The timer, which starts when you click Start Lab, shows how long Google Cloud resources will be made available to you.
This hands-on lab lets you do the lab activities yourself in a real cloud environment, not in a simulation or demo environment. It does so by giving you new, temporary credentials that you use to sign in and access Google Cloud for the duration of the lab.
To complete this lab, you need:
- Access to a standard internet browser (Chrome browser recommended).
- Time to complete the lab---remember, once you start, you cannot pause a lab.
How to start your lab and sign in to the Google Cloud console
- Click the Start Lab button. If you need to pay for the lab, a pop-up opens for you to select your payment method. On the left is the Lab Details panel with the following:
- The Open Google Cloud console button
- Time remaining
- The temporary credentials that you must use for this lab
- Other information, if needed, to step through this lab
- Click Open Google Cloud console (or right-click and select Open Link in Incognito Window if you are running the Chrome browser).
The lab spins up resources, and then opens another tab that shows the Sign in page.
Tip: Arrange the tabs in separate windows, side-by-side.
Note: If you see the Choose an account dialog, click Use Another Account.
- If necessary, copy the Username below and paste it into the Sign in dialog.
{{{user_0.username | "Username"}}} You can also find the Username in the Lab Details panel.
- Click Next.
- Copy the Password below and paste it into the Welcome dialog.
{{{user_0.password | "Password"}}} You can also find the Password in the Lab Details panel.
- Click Next.
Important: You must use the credentials the lab provides you. Do not use your Google Cloud account credentials. Note: Using your own Google Cloud account for this lab may incur extra charges.
- Click through the subsequent pages:
- Accept the terms and conditions.
- Do not add recovery options or two-factor authentication (because this is a temporary account).
- Do not sign up for free trials.
After a few moments, the Google Cloud console opens in this tab.
Activate Cloud Shell
Cloud Shell is a virtual machine loaded with development tools. It offers a persistent 5GB home directory and runs on Google Cloud. Cloud Shell provides command-line access to your Google Cloud resources.
- Click Activate Cloud Shell at the top of the Google Cloud console.
When you are connected, you are already authenticated, and the project is set to your Project_ID. gcloud is the command-line tool for Google Cloud. It comes pre-installed on Cloud Shell and supports tab-completion.
- (Optional) You can list the active account name with this command:
- Click Authorize.
- (Optional) You can list the project ID with this command:
For more information about gcloud, refer to the gcloud CLI overview guide.
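The two optional commands referenced above were dropped from this copy of the lab. They are the standard gcloud invocations for inspecting the active session; their output is omitted here because it varies per lab instance:

```shell
# List the active (authenticated) account:
gcloud auth list

# List the project ID the session is configured to use:
gcloud config list project
```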
Task 1. Create a Cloud Storage bucket in your project
- In the Cloud Console, select Navigation menu > Cloud Storage > Buckets.
- Click Create bucket.
- In the Create a bucket dialog, give the bucket a unique name. Leave the other settings at their default values.
- For Choose how to control access to objects, uncheck Enforce public access prevention on this bucket.
- Click Create.
You created your bucket. Remember the bucket name for later steps.
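As an optional cross-check (not part of the graded console steps), the same bucket can be created from Cloud Shell with the gsutil tool. BUCKET_NAME is a placeholder for the unique name you chose:

```shell
# Create a Cloud Storage bucket with default settings.
# BUCKET_NAME is a placeholder; bucket names must be globally unique.
gsutil mb gs://BUCKET_NAME
```

Note that the console step above also unchecks public access prevention, which this command does not do; the sketch covers bucket creation only.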
Test completed task
Click Check my progress to verify your completed task. If you have successfully created a Cloud Storage bucket, you see an assessment score.
Task 2. Initialize Cloud Dataprep
- Open Cloud Shell and run the following command:
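The command itself is missing from this copy of the lab. The Dataprep Qwik Start conventionally initializes the Dataprep service identity with the following; verify it against the instructions in your lab environment:

```shell
# Create the per-project service identity for the Cloud Dataprep API.
gcloud beta services identity create --service=dataprep.googleapis.com
```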
You should see a message saying the service identity was created.
- Select Navigation menu > Dataprep.
- Check to accept the Google Dataprep Terms of Service, then click Accept.
- Check to authorize sharing your account information with Trifacta, then click Agree and Continue.
- Click Allow to allow Trifacta to access project data.
- Click your student username to sign in to Cloud Dataprep by Trifacta. Your username is the Username in the left panel in your lab.
- Click Allow to grant Cloud Dataprep access to your Google Cloud lab account.
- Check to agree to the Trifacta Terms of Service, and then click Accept.
- Click Continue on the First time setup screen to create the default storage location.
Dataprep opens.
Test completed task
Click Check my progress to verify your completed task. If you have successfully initialized Cloud Dataprep with the default storage location, you see an assessment score.
Task 3. Create a flow
Cloud Dataprep uses a flow workspace to access and manipulate datasets.
- Click the Flows icon, then the Create button, then select Blank Flow:
- Click Untitled Flow, then name and describe the flow. Since this lab uses 2016 data from the United States Federal Elections Commission, name the flow "FEC-2016", and then describe the flow as "United States Federal Elections Commission 2016".
- Click OK.
The FEC-2016 flow page opens.
Task 4. Import datasets
In this section you import and add data to the FEC-2016 flow.
- Click Add Datasets, then select the Import Datasets link.
- In the left menu pane, select Cloud Storage to import datasets from Cloud Storage, then click the pencil to edit the file path.
- Type gs://spls/gsp105 in the Choose a file or folder text box, then click Go.
You may have to widen the browser window to see the Go and Cancel buttons.
- Click us-fec/.
- Click the + icon next to cn-2016.txt to create a dataset shown in the right pane. Click the dataset's title in the right pane and rename it "Candidate Master 2016".
In the same way add the
itcont-2016-orig.txt
dataset, and rename it "Campaign Contributions 2016". -
- After both datasets are listed in the right pane, click Import & Add to Flow.
Both datasets now appear in the flow.
Task 5. Prep the candidate file
- By default, the Candidate Master 2016 dataset is selected. In the right pane, click Edit Recipe.
The Candidate Master 2016 Transformer page opens in the grid view.
The Transformer page is where you build your transformation recipe and see the results applied to the sample. When you are satisfied with what you see, execute the job against your dataset.
- Each column head has a name and a value that specify the data type. To see the data types, click the column icon:
- Notice also that when you click the name of the column, a Details panel opens on the right.
- Click X in the top right of the Details panel to close it.
In the following steps you explore data in the grid view and apply transformation steps to your recipe.
- Column5 provides data from 1990-2064. Widen column5 (like you would on a spreadsheet) to separate each year. Click to select the tallest bin, which represents the year 2016.
This creates a step where these values are selected.
- In the Suggestions panel on the right, in the Keep rows section, click Add to add this step to your recipe.
The Recipe panel on the right now has the following step:
Keep rows where (DATE(2016, 1, 1) <= column5) && (column5 < DATE(2018, 1, 1))
- In Column6 (State), hover over and click the mismatched (red) portion of the header to select the mismatched rows.
Scroll down to the bottom to find the mismatched values (highlighted in red), and notice that most of these records have the value "P" in column7 and "US" in column6. The mismatch occurs because column6 is marked as a "State" column (indicated by the flag icon), but it contains non-state values (such as "US").
- To correct the mismatch, click X in the top of the Suggestions panel to cancel the transformation, then click on the flag icon in Column6 and change it to a "String" column.
There is no longer a mismatch and the column marker is now green.
- Filter on just the presidential candidates, which are those records that have the value "P" in column7. In the histogram for column7, hover over the two bins to see which is "H" and which is "P". Click the "P" bin.
- In the right Suggestions panel, click Add to accept the step to the recipe.
Task 6. Wrangle the Contributions file and join it to the Candidates file
On the Join page, you can add your current dataset to another dataset or recipe based on information that is common to both datasets.
Before you join the Contributions file to the Candidates file, clean up the Contributions file.
- Click on FEC-2016 (the dataset selector) at the top of the grid view page.
- Click to select the grayed out Campaign Contributions 2016.
- In the right pane, click Add > Recipe, then click Edit Recipe.
- Click the recipe icon at the top right of the page, then click Add New Step.
Remove extra delimiters in the dataset.
- Insert the following Wrangle language command in the Search box:
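The Wrangle command itself is missing from this copy of the lab. The published version of this exercise typically uses a replacepatterns transform along these lines to strip the extra quote-and-pipe delimiters from every column; treat it as an assumption and compare it against what your lab shows:

```
replacepatterns col: * with: '' on: `{start}"|"{end}` global: true
```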
The Transformation Builder parses the Wrangle command and populates the Find and Replace transformation fields.
- Click Add to add the transform to the recipe.
- Add another new step to the recipe. Click New Step, then type "Join" in the Search box.
- Click Join datasets to open the Joins page.
- Click "Candidate Master 2016" to join with Campaign Contributions 2016, then click Accept in the bottom right.
- On the right side, hover in the Join keys section, then click on the pencil (Edit icon).
Dataprep infers common keys and suggests several of the common values as join keys.
- In the Add Key panel, in the Suggested join keys section, click column2 = column11.
- Click Save and Continue.
Columns 2 and 11 open for your review.
- Click Next, then check the checkbox to the left of the "Column" label to add all columns of both datasets to the joined dataset.
- Click Review, and then Add to Recipe to return to the grid view.
Task 7. Summary of data
Generate a useful summary by aggregating, averaging, and counting the contributions in Column 16, grouping the candidates by ID, name, and party affiliation (Columns 2, 24, and 8, respectively).
- At the top of the Recipe panel on the right, click on New Step and enter the following formula in the Transformation search box to preview the aggregated data.
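The formula itself is missing from this copy of the lab. The published version typically uses a pivot transform similar to the following (an assumption to verify against your recipe): sum, average, and countif aggregate the contribution amounts in column16, while group collects rows by candidate ID (column2), name (column24), and party affiliation (column8).

```
pivot value:sum(column16),average(column16),countif(column16 > 0) group: column2,column24,column8
```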
An initial sample of the joined and aggregated data is displayed, representing a summary table of US presidential candidates and their 2016 campaign contribution metrics.
- Click Add to open a summary table of major US presidential candidates and their 2016 campaign contribution metrics.
Task 8. Rename columns
You can make the data easier to interpret by renaming the columns.
- Add each of the renaming and rounding steps individually to the recipe by clicking New Step, then enter the step.
- Then click Add.
- Add this last New Step to round the Average Contribution amount.
- Then click Add.
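The renaming and rounding steps themselves are missing from this copy of the lab. The published version uses steps similar to the following hypothetical reconstruction; the exact source column names depend on how your join was built, so check them against your own grid:

```
rename type: manual mapping: [column24,'Candidate_Name'], [column2,'Candidate_ID'], [column8,'Party_Affiliation'], [column16,'Campaign_Total']
rename type: manual mapping: [sum_column16,'Total_Contribution_Sum'], [average_column16,'Average_Contribution_Sum'], [countif,'Number_of_Contributions']
set col: Average_Contribution_Sum value: round(Average_Contribution_Sum)
```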
Your results look something like this:
Congratulations!
You used Dataprep to import datasets and create recipes that wrangle the data into meaningful results.
Next steps / Learn more
This lab is part of a series of labs called Qwik Starts. These labs are designed to give you a little taste of the many features available with Google Cloud. Search for "Qwik Starts" in the lab catalog to find the next lab you'd like to take!
Google Cloud training and certification
...helps you make the most of Google Cloud technologies. Our classes include technical skills and best practices to help you get up to speed quickly and continue your learning journey. We offer fundamental to advanced level training, with on-demand, live, and virtual options to suit your busy schedule. Certifications help you validate and prove your skill and expertise in Google Cloud technologies.
Manual Last Updated June 06, 2024
Lab Last Tested June 06, 2024
Copyright 2024 Google LLC All rights reserved. Google and the Google logo are trademarks of Google LLC. All other company and product names may be trademarks of the respective companies with which they are associated.