
Before you begin
- Labs create a Google Cloud project and resources for a fixed time.
- Labs have a time limit and no pause feature. If you end the lab, you'll have to restart from the beginning.
- On the top left of your screen, click Start lab to begin.
This lab was developed with our partner, MongoDB. Your personal information may be shared with MongoDB, the lab sponsor, if you have opted in to receive product updates, announcements, and offers in your Account Profile.
In this lab, you configure MongoDB Atlas, create an M0 cluster, make the cluster accessible from the outside world, and create an admin user. You use the sample_mflix database and its movies collection on MongoDB Atlas, which contain details about movies such as cast and release date. You then create a Dataflow pipeline to load the data from the MongoDB cluster into Google BigQuery. Finally, you query the data in BigQuery using the Google Cloud console and create an ML model from the data.
In this lab, you learn how to:
- Configure a MongoDB Atlas cluster and load the sample_mflix dataset
- Create a BigQuery dataset
- Create a Dataflow pipeline to load data from MongoDB into BigQuery
- Build a model with BigQuery ML (BQML)
Read these instructions. Labs are timed and you cannot pause them. The timer, which starts when you click Start Lab, shows how long Google Cloud resources will be made available to you.
This Qwiklabs hands-on lab lets you do the lab activities yourself in a real cloud environment, not in a simulation or demo environment. It does so by giving you new, temporary credentials that you use to sign in and access Google Cloud for the duration of the lab.
To complete this lab, you need:
- Access to a standard internet browser (Chrome browser recommended)
- Time to complete the lab
Note: If you already have your own personal Google Cloud account or project, do not use it for this lab.
Note: If you are using a Pixelbook, open an Incognito window to run this lab.
Click the Start Lab button. If you need to pay for the lab, a pop-up opens for you to select your payment method. On the left is a panel populated with the temporary credentials that you must use for this lab.
Copy the username, and then click Open Google Console. The lab spins up resources, and then opens another tab that shows the Sign in page.
Tip: Open the tabs in separate windows, side-by-side.
In the Sign in page, paste the username that you copied from the Connection Details panel. Then copy and paste the password.
Important: You must use the credentials from the Connection Details panel. Do not use your Qwiklabs credentials. If you have your own Google Cloud account, do not use it for this lab, to avoid incurring charges to your account.
Click through the subsequent pages:
- Accept the terms and conditions.
- Do not add recovery options or two-factor authentication (because this is a temporary account).
- Do not sign up for free trials.
After a few moments, the Cloud Console opens in this tab.
Log in to your MongoDB Atlas account.
You can deploy only one free tier cluster per project. If you already have a free cluster, you will need to create a new project to deploy an additional free cluster. To do this, open the dropdown menu located in the top left corner, just below the Atlas logo, and click New Project. Then, enter a project name, click Next and then Create Project.
Click Create in the Create a cluster section.
Name the cluster Sandbox and create the deployment. You will be prompted to complete the security setup of your deployment.
In the next sections, you load the sample_mflix database from MongoDB to BigQuery. Before you run the Dataflow pipeline, you need a BigQuery dataset to store the data. Open the Google Cloud console and sign in with the lab credentials if you haven't already.
Open BigQuery from the Google Cloud console, either by searching for BigQuery in the search bar at the top or from the Navigation menu by navigating to BigQuery under the Analytics section.
In the Explorer panel, select the project where you want to create the dataset.
In the Explorer section, click the three vertical dots (View actions) next to your project ID, and then click Create dataset.
On the Create dataset page:
For Dataset ID, enter sample_mflix
For Location type, select Region, and for the region, select the region specified in your lab instructions.
For Default table expiration, leave the selection empty.
Click Create dataset.
Verify that the dataset is listed under your project name in the BigQuery Explorer.
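If you prefer SQL, you can also create the dataset from the BigQuery query editor. The following is a minimal sketch; the location value is a placeholder, so replace it with the region specified in your lab instructions:

```sql
-- Create the dataset the Dataflow pipeline will write to.
-- Replace 'us-central1' with the region given in your lab instructions (placeholder).
CREATE SCHEMA IF NOT EXISTS sample_mflix
OPTIONS (location = 'us-central1');
```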
Now, create a Dataflow pipeline from the Dataflow UI. In the Google Cloud console, go to Dataflow and open the Create job from template page.
In the Job name field, enter a unique job name: mongodb-to-bigquery-batch.
For Regional endpoint, select the region specified in your lab instructions.
From the Dataflow template drop-down menu, select the MongoDB to BigQuery template under Process Data in Bulk (batch).
Under Required parameters, enter the following:
MongoDB Connection URI: the MongoDB connection string you copied earlier. If you haven't copied it, navigate back to MongoDB Atlas, click the Connect button on your Atlas database, then click Drivers and copy the connection string. Paste the URI into the MongoDB Connection URI field in the Dataflow UI and replace <password> with the password you used when creating the cluster.
MongoDB Database: sample_mflix
You are using the sample_mflix database from the sample dataset for this lab.
MongoDB Collection: movies
BigQuery output table: the table to load the data into, in the form PROJECT_ID:sample_mflix.movies, where PROJECT_ID is your lab project ID.
Leave User option set to NONE.
Expand Optional Parameters and de-select Use default machine type. Choose E2 from the Series dropdown. Then, choose e2-standard-2 from the Machine type dropdown.
Click Run Job.
View the Dataflow job running on the Dataflow page in the Cloud Console.
Once the job execution is complete, all nodes in the job graph will turn green.
For more details on the values passed, refer to the MongoDB Dataflow templates documentation.
The Dataflow job takes approximately 5-10 minutes to run. Check your progress once your job has fully completed. Once the pipeline runs successfully, open the BigQuery Explorer.
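To quickly confirm that documents were loaded, you could run a row count in the BigQuery query editor. This is a sketch that assumes the dataset and table names used in this lab:

```sql
-- Count the documents loaded into BigQuery by the Dataflow pipeline.
SELECT COUNT(*) AS total_movies
FROM `sample_mflix.movies`;
```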
Expand the project, click the sample_mflix dataset, and then double-click the movies table.
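Because the template (with the NONE user option) stores each MongoDB document as a raw JSON string, it can help to check the table's columns before writing queries. Below is a minimal sketch using INFORMATION_SCHEMA; the column names you see (for example, a source_data column) may differ, so treat this as exploratory:

```sql
-- List the columns the Dataflow template created in the movies table.
SELECT column_name, data_type
FROM `sample_mflix.INFORMATION_SCHEMA.COLUMNS`
WHERE table_name = 'movies';
```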
Run the following queries one at a time. Example SQL for each query is sketched after this list.
a. Query the first 1000 rows from the table:
b. Create a movies-unpacked table from parsed imported data:
c. Query and visualize the table movies-unpacked:
d. Create model sample_mflix.rating using the KMEANS clustering algorithm:
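The exact SQL is not reproduced on this page, so the statements below are hedged sketches. The raw JSON column name (source_data), the unpacked column names (title, year, imdb_rating, runtime), and the num_clusters value are assumptions based on the sample_mflix schema and the template's NONE output format; adjust them to match what you actually see in your movies table.

```sql
-- (a) Query the first 1000 rows of the raw imported table.
SELECT *
FROM `sample_mflix.movies`
LIMIT 1000;

-- (b) Create a movies-unpacked table from the parsed imported data.
-- Assumption: the template stored each document as a JSON string in source_data.
CREATE OR REPLACE TABLE `sample_mflix.movies-unpacked` AS
SELECT
  JSON_VALUE(source_data, '$.title')                              AS title,
  SAFE_CAST(JSON_VALUE(source_data, '$.year') AS INT64)           AS year,
  SAFE_CAST(JSON_VALUE(source_data, '$.imdb.rating') AS FLOAT64)  AS imdb_rating,
  SAFE_CAST(JSON_VALUE(source_data, '$.runtime') AS INT64)        AS runtime
FROM `sample_mflix.movies`;

-- (c) Query and visualize movies-unpacked, for example average rating per year.
-- In the console, you can chart the result with Explore data > Explore with Looker Studio.
SELECT
  year,
  AVG(imdb_rating) AS avg_rating,
  COUNT(*)         AS num_movies
FROM `sample_mflix.movies-unpacked`
WHERE year IS NOT NULL AND imdb_rating IS NOT NULL
GROUP BY year
ORDER BY year;

-- (d) Create the k-means clustering model.
-- Assumption: the model lives in the sample_mflix dataset; num_clusters = 5 is a guess.
CREATE OR REPLACE MODEL `sample_mflix.rating`
OPTIONS (
  model_type   = 'kmeans',
  num_clusters = 5
) AS
SELECT
  imdb_rating,
  runtime,
  year
FROM `sample_mflix.movies-unpacked`
WHERE imdb_rating IS NOT NULL
  AND runtime IS NOT NULL
  AND year IS NOT NULL;
```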
Finally, in the BigQuery Explorer, navigate to the model created under the Models section of the sample_mflix dataset.
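You can also inspect the trained model directly from SQL. This is a sketch assuming the model name used above (sample_mflix.rating):

```sql
-- Clustering evaluation metrics (Davies-Bouldin index, mean squared distance).
SELECT * FROM ML.EVALUATE(MODEL `sample_mflix.rating`);

-- Coordinates of each cluster centroid.
SELECT * FROM ML.CENTROIDS(MODEL `sample_mflix.rating`);
```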
In this lab, you provisioned your first MongoDB Atlas cluster, loaded a sample dataset, and set up a BigQuery dataset. You also used the Dataflow UI to build a MongoDB to BigQuery pipeline and performed clustering on the data using the k-means algorithm.
You can deploy this model to Vertex AI to manage and scale your data models, which is out of scope for this lab.
To learn more about MongoDB, refer to MongoDB on the Google Cloud Marketplace.
To learn more about building modern data-driven applications, be sure to check out the MongoDB Atlas documentation.
Get $500 in free credits for MongoDB on the Google Cloud Marketplace (applicable only to new customers).
Google Cloud training and certification helps you make the most of Google Cloud technologies. Our classes include technical skills and best practices to help you get up to speed quickly and continue your learning journey. We offer fundamental to advanced level training, with on-demand, live, and virtual options to suit your busy schedule. Certifications help you validate and prove your skill and expertise in Google Cloud technologies.
Manual Last Updated July 18, 2024
Lab Last Tested July 18, 2024
Copyright 2024 Google LLC All rights reserved. Google and the Google logo are trademarks of Google LLC. All other company and product names may be trademarks of the respective companies with which they are associated.