Loading Data into Cloud SQL

Lab · 1 hour · 5 Credits · Intermediate
Note: This lab may incorporate AI tools to support your learning.

GSP196

Overview

In this lab, you import data from CSV text files into Cloud SQL and then carry out some basic data analysis using simple queries.

The dataset used in this lab comes from the US Bureau of Transport Statistics and contains historical information about internal flights in the United States. This dataset can be used to demonstrate a wide range of data science concepts and techniques.

Objectives

  • Create Cloud SQL instance
  • Create a Cloud SQL database
  • Import text data into Cloud SQL
  • Build an initial data model using queries

Setup and requirements

Before you click the Start Lab button

Read these instructions. Labs are timed and you cannot pause them. The timer, which starts when you click Start Lab, shows how long Google Cloud resources will be made available to you.

This hands-on lab lets you do the lab activities yourself in a real cloud environment, not in a simulation or demo environment. It does so by giving you new, temporary credentials that you use to sign in and access Google Cloud for the duration of the lab.

To complete this lab, you need:

  • Access to a standard internet browser (Chrome browser recommended).
Note: Use an Incognito or private browser window to run this lab. This prevents conflicts between your personal account and the Student account, which could result in extra charges being incurred on your personal account.
  • Time to complete the lab---remember, once you start, you cannot pause a lab.
Note: If you already have your own personal Google Cloud account or project, do not use it for this lab to avoid extra charges to your account.

How to start your lab and sign in to the Google Cloud console

  1. Click the Start Lab button. If you need to pay for the lab, a pop-up opens for you to select your payment method. On the left is the Lab Details panel with the following:

    • The Open Google Cloud console button
    • Time remaining
    • The temporary credentials that you must use for this lab
    • Other information, if needed, to step through this lab
  2. Click Open Google Cloud console (or right-click and select Open Link in Incognito Window if you are running the Chrome browser).

    The lab spins up resources, and then opens another tab that shows the Sign in page.

    Tip: Arrange the tabs in separate windows, side-by-side.

    Note: If you see the Choose an account dialog, click Use Another Account.
  3. If necessary, copy the Username below and paste it into the Sign in dialog.

    {{{user_0.username | "Username"}}}

    You can also find the Username in the Lab Details panel.

  4. Click Next.

  5. Copy the Password below and paste it into the Welcome dialog.

    {{{user_0.password | "Password"}}}

    You can also find the Password in the Lab Details panel.

  6. Click Next.

    Important: You must use the credentials the lab provides you. Do not use your Google Cloud account credentials. Note: Using your own Google Cloud account for this lab may incur extra charges.
  7. Click through the subsequent pages:

    • Accept the terms and conditions.
    • Do not add recovery options or two-factor authentication (because this is a temporary account).
    • Do not sign up for free trials.

After a few moments, the Google Cloud console opens in this tab.

Note: To view a menu with a list of Google Cloud products and services, click the Navigation menu at the top-left.

Activate Cloud Shell

Cloud Shell is a virtual machine that is loaded with development tools. It offers a persistent 5 GB home directory and runs on Google Cloud. Cloud Shell provides command-line access to your Google Cloud resources.

  1. Click Activate Cloud Shell at the top of the Google Cloud console.

When you are connected, you are already authenticated, and the project is set to your PROJECT_ID. The output contains a line that declares the PROJECT_ID for this session:

Your Cloud Platform project in this session is set to {{{project_0.project_id | "PROJECT_ID"}}}

gcloud is the command-line tool for Google Cloud. It comes pre-installed on Cloud Shell and supports tab-completion.

  2. (Optional) You can list the active account name with this command:
gcloud auth list
  3. Click Authorize.

Output:

ACTIVE: *
ACCOUNT: {{{user_0.username | "ACCOUNT"}}}

To set the active account, run:
    $ gcloud config set account `ACCOUNT`
  4. (Optional) You can list the project ID with this command:
gcloud config list project

Output:

[core]
project = {{{project_0.project_id | "PROJECT_ID"}}}

Note: For full documentation of gcloud, in Google Cloud, refer to the gcloud CLI overview guide.

Task 1. Prepare your environment

This lab uses a set of code samples and scripts developed for the Data Science on the Google Cloud Platform, 2nd Edition book from O'Reilly Media, Inc. It covers the Cloud SQL configuration and data import tasks from the first part of Chapter 3, "Creating Compelling Dashboards". You clone the sample repository used in Chapter 2 from GitHub to Cloud Shell and carry out all of the lab tasks from there.

Clone the Data Science on Google Cloud repository

  1. In Cloud Shell, enter the following command to clone the repository:
git clone https://github.com/GoogleCloudPlatform/data-science-on-gcp/
  2. Change to the repository directory:
cd data-science-on-gcp/03_sqlstudio
  3. Create the environment variables used later in the lab for your project ID and the storage bucket that contains your data:
export PROJECT_ID=$(gcloud info --format='value(config.project)')
export BUCKET=${PROJECT_ID}-ml
  4. Enter the following command to stage the file into the Cloud Storage bucket:
gsutil cp create_table.sql gs://$BUCKET/create_table.sql
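To confirm the copy succeeded, you can optionally list the bucket contents (this check is not one of the lab's graded steps):

gsutil ls gs://$BUCKET/    # should show create_table.sql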

Task 2. Create a Cloud SQL instance

  1. Enter the following command to create a Cloud SQL instance:
gcloud sql instances create flights \
    --database-version=POSTGRES_13 --cpu=2 --memory=8GiB \
    --region={{{project_0.default_region | "REGION"}}} --root-password=Passw0rd

This takes a few minutes to complete.
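If you want to check on progress from Cloud Shell, you can optionally poll the instance state; the instance is ready once it reports RUNNABLE:

gcloud sql instances describe flights --format='value(state)'    # prints RUNNABLE when ready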

Test completed task

Click Check my progress to verify the task you performed. If you have successfully created a Cloud SQL instance, you will see an assessment score.

Create a Cloud SQL instance.
  2. Create an environment variable with the Cloud Shell IP address:
export ADDRESS=$(curl -s http://ipecho.net/plain)/32
  3. Allowlist the Cloud Shell instance for management access to your SQL instance:
gcloud sql instances patch flights --authorized-networks $ADDRESS
  4. When prompted, press Y to accept the change.
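To confirm that the Cloud Shell address was added, you can optionally inspect the instance's authorized networks (the exact output formatting may vary):

gcloud sql instances describe flights --format='value(settings.ipConfiguration.authorizedNetworks)'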

Test completed task

Click Check my progress to verify the task you performed. If you have successfully allowlisted Cloud Shell to access the SQL instance, you will see an assessment score.

Allowlist the Cloud Shell instance to access your SQL instance.

Create database and table

To import data into a Postgres table, you first create an empty database and a table with the correct schema.
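For context, create_table.sql in the repository holds the DDL for that schema. The snippet below is only an illustrative sketch of that kind of statement, not the actual file; the real table defines many more columns (only "Origin" is taken from the query used later in this lab, the other column names here are hypothetical):

CREATE TABLE flights (
  "FlightDate" date,   -- hypothetical; see create_table.sql for the real column list
  "Origin" text,       -- column name used by the query in Task 4
  "Dest" text
);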

  1. In the Cloud Console, on the Navigation menu (Navigation menu icon), click SQL.

  2. To open the Overview page of an instance, click the instance name flights.

  3. Select Databases from the SQL navigation menu on the left.

  4. Click Create database.

  5. In the New database dialog, name the database bts.

  6. Click Create.

  7. To open the Overview page of an instance, select Overview from the SQL navigation menu.

  8. Click IMPORT at the top.

  9. In the Cloud Storage file field, click Browse.

  10. In the Buckets section, click the arrow opposite your bucket name.

  11. Select the file create_table.sql.

  12. Click Select.

  13. In the File format section, select SQL.

  14. Specify the Database bts in your Cloud SQL instance.

  15. Click Import to start the import.

A few seconds later, the empty table will be created.
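If you prefer the command line, a roughly equivalent sequence is sketched below (assuming create_table.sql was staged to your bucket in Task 1; unlike the console flow, the CLI import may require the instance's service account to have read access to the bucket):

gcloud sql databases create bts --instance=flights
gcloud sql import sql flights gs://$BUCKET/create_table.sql --database=bts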

Test completed task

Click Check my progress to verify the task you performed. If you have successfully created a bts database and flights table using the create_table.sql file, you will see an assessment score.

Create a bts database and flights table using the create_table.sql file.

Task 3. Add data to a Cloud SQL instance

You created the empty database and table; now load the CSV data into this table. Load the January data by browsing to 201501.csv in your bucket and specifying CSV as the format, bts as the database, and flights as the table. (A command-line equivalent is sketched after the steps below.)

  1. In your Cloud SQL instance page, click IMPORT.

  2. In the Cloud Storage file field, click Browse, and then click the arrow opposite your bucket name, and then click 201501.csv.

  3. Click Select.

  4. Select CSV as File format.

  5. Select the bts database and type in flights as your table.

  6. Click IMPORT.
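As with the SQL import, there is a command-line equivalent; a sketch, assuming the January file is at gs://$BUCKET/201501.csv (the path in your environment may differ):

gcloud sql import csv flights gs://$BUCKET/201501.csv --database=bts --table=flights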

Task 4. Interact with the database

  1. Connect to the Cloud SQL instance from Cloud Shell using:
gcloud sql connect flights --user=postgres
  2. When prompted for a password, enter Passw0rd. You may not see the letters as you type.

  3. In the prompt that comes up, connect to the bts database:

\c bts;
  4. When prompted for a password, enter Passw0rd.

  5. Then, run a query to obtain the 5 busiest airports:

SELECT "Origin", COUNT(*) AS num_flights
FROM flights
GROUP BY "Origin"
ORDER BY num_flights DESC
LIMIT 5;

While this query is performant because the dataset is relatively small (only January!), the database will slow down as you add more months of data.
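If you are curious how Postgres executes the query, you can prefix it with EXPLAIN ANALYZE (an optional diagnostic, not a graded lab step); with no index on "Origin" it typically reports a sequential scan of the whole table, which is what gets expensive as more months are loaded:

EXPLAIN ANALYZE
SELECT "Origin", COUNT(*) AS num_flights
FROM flights
GROUP BY "Origin"
ORDER BY num_flights DESC
LIMIT 5;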

Relational databases are suited to smallish datasets on which you perform ad hoc queries that return a small subset of the data. For larger datasets, you tune the performance of a relational database by indexing the columns of interest. Further, because relational databases typically support transactions and guarantee strong consistency, they are an excellent choice for data that will be updated often.
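For example, indexing the "Origin" column used in the query above is exactly this kind of tuning (a sketch; the index name is arbitrary):

CREATE INDEX idx_flights_origin ON flights ("Origin");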

However, a relational database is a poor choice if:

  • Your data is primarily read-only
  • Your dataset sizes go into the terabyte range
  • You need to scan the full table (such as to compute the maximum value of a column), or your data streams in at high rates

This describes the flight delay use case. For this case, you would switch from a relational database to an analytics data warehouse: BigQuery. The analytics data warehouse still lets you use SQL and is much better at dealing with large datasets and ad hoc queries (i.e., it does not need the columns to be indexed).

Congratulations!

Now you know how to create tables and import text data that has been stored on Cloud Storage into Cloud SQL.

Next steps / learn more

Here are some follow-up steps:

Google Cloud training and certification

...helps you make the most of Google Cloud technologies. Our classes include technical skills and best practices to help you get up to speed quickly and continue your learning journey. We offer fundamental to advanced level training, with on-demand, live, and virtual options to suit your busy schedule. Certifications help you validate and prove your skill and expertise in Google Cloud technologies.

Manual Last Updated July 9, 2024

Lab Last Tested July 16, 2024

Copyright 2024 Google LLC All rights reserved. Google and the Google logo are trademarks of Google LLC. All other company and product names may be trademarks of the respective companies with which they are associated.
