
Before you begin
- Labs create a Google Cloud project and resources for a fixed time
- Labs have a time limit and no pause feature. If you end the lab, you'll have to restart from the beginning.
- On the top left of your screen, click Start lab to begin
Vertex AI brings together the Google Cloud services for building ML under a single, unified UI and API. In Vertex AI, you can easily train and compare models using AutoML or custom code training, and all of your models are stored in one central model repository. These models can then be deployed to the same endpoints on Vertex AI.
AutoML Vision helps anyone with limited Machine Learning (ML) expertise train high-quality image classification models. In this hands-on lab, you learn how to produce a custom ML model that automatically recognizes damaged car parts. Because training the model takes longer than this lab's time limit, you will request predictions from a hosted model in a different project that was trained on the same dataset. You will then tweak the values in the prediction request and examine how the model's predictions change.
In this lab, you learn how to:
- Upload training images to Cloud Storage
- Create a dataset
- Train an AutoML image classification model
- Create a prediction request and get predictions from a hosted model
For each lab, you get a new Google Cloud project and set of resources for a fixed time at no cost.
Sign in to Qwiklabs using an incognito window.
Note the lab's access time (for example, 1:15:00), and make sure you can finish within that time.
There is no pause feature. You can restart if needed, but you have to start at the beginning.
When ready, click Start lab.
Note your lab credentials (Username and Password). You will use them to sign in to the Google Cloud Console.
Click Open Google Console.
Click Use another account and copy/paste credentials for this lab into the prompts.
If you use other credentials, you'll receive errors or incur charges.
Accept the terms and skip the recovery resource page.
Cloud Shell is a virtual machine that contains development tools. It offers a persistent 5-GB home directory and runs on Google Cloud. Cloud Shell provides command-line access to your Google Cloud resources. gcloud is the command-line tool for Google Cloud. It comes pre-installed on Cloud Shell and supports tab completion.
Click the Activate Cloud Shell button at the top right of the console.
Click Continue.
It takes a few moments to provision and connect to the environment. When you are connected, you are also authenticated, and the project is set to your PROJECT_ID.
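If you want to confirm that Cloud Shell is using the expected account and project, you can run the following standard gcloud commands:

  # List the active (authenticated) account
  gcloud auth list
  # Show the project ID that gcloud is configured to use
  gcloud config list project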
In this task, you upload the training images you want to use to Cloud Storage. This will make it easier to import the data into Vertex AI later.
To train a model to classify images of damaged car parts, you need to provide the machine with labeled training data. The model uses this data to develop an understanding of each image and to differentiate between the different damaged car parts.
In this example, your model will learn to classify five different damaged car parts: bumper, engine compartment, hood, lateral, and windshield.
The training images are publicly available in a Cloud Storage bucket. Copy and paste the script template below into Cloud Shell to copy the images into your own bucket.
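If the script template is not available in your environment, the following is a minimal sketch of the two steps it performs. The source bucket name gs://car_damage_lab_images and the region are assumptions, so prefer the exact script provided with your lab if it differs; $DEVSHELL_PROJECT_ID is the environment variable Cloud Shell sets to your project ID.

  # Create a bucket in your project to hold the training images
  # (the bucket name and region are assumptions; use the values from your lab)
  gsutil mb -l us-central1 gs://$DEVSHELL_PROJECT_ID
  # Copy the publicly available training images into your bucket
  # (gs://car_damage_lab_images is an illustrative source bucket name)
  gsutil -m cp -r gs://car_damage_lab_images/* gs://$DEVSHELL_PROJECT_ID/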
In the navigation pane, click Cloud Storage > Buckets.
Click the Refresh button at the top of the Cloud Storage browser.
Click on your bucket name. You should see five folders of photos, one for each of the five damaged car parts to be classified:
Great! Your car images are now organized and ready for training.
Click Check my progress to verify the objective.
In this task, you create a new dataset and connect it to your training images so that Vertex AI can access them.
Normally, you would create a CSV file where each row contains a URL to a training image and the associated label for that image. In this case, the CSV file has been created for you; you just need to update it with your bucket name and upload the CSV file to your Cloud Storage bucket.
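As an illustration of that row format only (the paths below are hypothetical), each line pairs an image's Cloud Storage path with its label; the provided data.csv already contains the real entries:

  gs://your-bucket-name/bumper/bumper1.jpg,bumper
  gs://your-bucket-name/windshield/windshield1.jpg,windshield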
Copy and paste the script template below into Cloud Shell and press Enter to update and upload the CSV file.
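The script performs three steps: download the prepared CSV, point its image paths at your bucket, and upload it. A minimal sketch of those steps is shown below; the source bucket name gs://car_damage_lab_metadata and the placeholder text being replaced are assumptions, so use the exact script provided with your lab if it differs.

  # Download the prepared CSV file (source bucket name is illustrative)
  gsutil cp gs://car_damage_lab_metadata/data.csv .
  # Rewrite the image paths so they point at your own bucket
  sed -i -e "s/car_damage_lab_images/${DEVSHELL_PROJECT_ID}/g" ./data.csv
  # Upload the updated CSV to your bucket
  gsutil cp ./data.csv gs://${DEVSHELL_PROJECT_ID}/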
Once the command completes, click the Refresh button at the top of the Cloud Storage bucket and open your bucket.
Confirm that the data.csv file is listed in your bucket.
In the Google Cloud Console, on the Navigation menu (≡), click Vertex AI > Dashboard.
Click Enable all recommended APIs.
From the Vertex AI navigation menu on the left, click Datasets.
At the top of the console, click + Create.
For Dataset name, type damaged_car_parts.
Select Image classification (Single-label). (Note: in your own projects, you may want to select Multi-label classification if your images can each have more than one label.)
For Region, select the region specified for your lab.
Click Create.
In this section, you will choose the location of your training images that you uploaded in the previous step.
In the Select an import method section, click Select import files from Cloud Storage.
In the Select import files from Cloud Storage section, click Browse.
Follow the prompts to navigate to your storage bucket and click your data.csv file. Click Select.
Once you've properly selected your file, a green checkbox appears to the left of the file path. Click Continue to proceed.
Click Check my progress to verify the objective.
In this task, you examine the images to ensure there are no errors in your dataset.
If your browser page has refreshed, click Datasets, select your dataset name, and then click Browse.
Under Filter labels, click any one of the labels to view the specific training images.
You're ready to start training your model! Vertex AI handles this for you automatically, without requiring you to write any of the model code.
From the right-hand side, click Train New Model.
From the Training method window, leave the default configurations and select AutoML as the training method. Click Continue.
From the Model details window, enter a name for your model: damaged_car_parts_model. Click Continue.
From the Training options window, click Continue.
From the Explainability window, click Continue. From the Compute and pricing window, set your budget to 8 maximum node hours.
Click Start Training.
Click Check my progress to verify the objective.
For the purposes of this lab, a model trained on the exact same dataset is hosted in a different project so that you can request predictions from it while your own model finishes training, since the training time will likely exceed this lab's time limit.
A proxy to the pre-trained model is set up for you so you don't need to run through any extra steps to get it working within your lab environment.
To request predictions from the model, you send prediction requests to an endpoint inside your project, which forwards the request to the hosted model and returns the output. Sending a prediction to the AutoML proxy is very similar to the way you would interact with the model you just created, so you can use this as practice.
In the Google Cloud Console, on the Navigation menu (≡) click Cloud Run.
Click automl-proxy.
Note the service URL (similar to https://automl-proxy-xfpm6c62ta-uc.a.run.app). You will use this endpoint for the prediction request in the next section.
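If you prefer to capture the URL from Cloud Shell instead of the console, the standard gcloud command below prints a Cloud Run service's URL; the service name automl-proxy comes from the step above, while the region value is an assumption, so use the region shown in the console.

  # Print the URL of the automl-proxy Cloud Run service
  # (the --region value is an assumption; use the region shown in the console)
  gcloud run services describe automl-proxy --region=us-central1 --format='value(status.url)'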
Open a new Cloud Shell window.
On the Cloud Shell toolbar, click Open Editor.
Click File > New File.
Click on the link provided, then copy the file content into the new file you just created:
Save the file and name it payload.json.
For reference, the content you supplied is the Base64-encoded string of the following image (a damaged bumper).
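To run the prediction, POST the payload to the proxy endpoint you noted earlier. The sketch below shows the general form of that request; the variable names and the /v1 request path are assumptions, so use the exact command provided with your lab if it differs.

  # Endpoint URL of the automl-proxy service (use the URL you noted earlier)
  AUTOML_PROXY=https://automl-proxy-xfpm6c62ta-uc.a.run.app
  INPUT_DATA_FILE=payload.json
  # Send the JSON payload; the proxy forwards it to the hosted model
  # (the /v1 path is an assumption about how the proxy is configured)
  curl -X POST -H "Content-Type: application/json" "${AUTOML_PROXY}/v1" -d "@${INPUT_DATA_FILE}"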
If you ran a successful prediction, your output should resemble the following:
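As a rough illustration only, a standard Vertex AI image classification response has the following shape; the values shown here are placeholders and your actual numbers and IDs will differ.

  {
    "predictions": [
      {
        "displayNames": ["bumper"],
        "confidences": [0.98],
        "ids": ["1234567890123456789"]
      }
    ]
  }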
For this model, the prediction results are fairly self-explanatory: the displayNames field should correctly predict bumper with a high confidence score. Now, you can change the Base64-encoded image value in the JSON file you created.
Click Check my progress to verify the objective.
Right-click on each image below, then select Save image As….
Follow the prompts to save each image with a unique name. (Hint: Assign a simple name like 'Image1' and 'Image2' to assist with uploading).
Open the Base64 Image Encoder and follow the instructions to upload and encode an image to a Base64 string.
Replace the Base64-encoded string value in the content field of your JSON payload file, and run the prediction again (a Cloud Shell alternative to the web encoder is sketched below). Repeat for the other image(s).
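If you would rather encode an image directly in Cloud Shell instead of using the web encoder, the commands below are one way to rebuild the payload from a saved image; the file name Image1.jpg and the instances/content structure are assumptions, so mirror whatever structure your provided payload.json uses.

  # Base64-encode the saved image without line wrapping
  B64_IMAGE=$(base64 -w 0 Image1.jpg)
  # Rebuild payload.json around the new image string
  # (the structure below is an assumption; keep the structure of your provided payload.json)
  printf '{"instances": [{"content": "%s"}]}' "$B64_IMAGE" > payload.json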
How did your model do? Did it predict all three images correctly? You should see the following outputs, respectively:
In this lab, you learned how to train your own custom machine learning model and generate predictions on a hosted model via an API request. You uploaded training images to Cloud Storage and used a CSV file so that Vertex AI could find these images. You inspected the labeled images for any discrepancies before finally evaluating a trained model. Now you've got what it takes to train a model on your own image dataset!