
Before you begin

- Labs create a Google Cloud project and resources for a fixed time.
- Labs have a time limit and no pause feature. If you end the lab, you'll have to restart from the beginning.
- On the top left of your screen, click Start lab to begin.
In this lab, you will investigate VPC flow logs. VPC flow logs record network flows sent from or received by VM instances. These logs can be used for network monitoring, forensics, real-time security analysis, and even for expense optimization.
During this lab you enable VPC flow logging and use Cloud Logging to view the logs. Next, you perform network monitoring, forensics, and real-time security analysis, before finally, disabling VPC flow logging.
In this lab, you learn how to:

- Enable VPC flow logging for subnets
- Generate network traffic and view flow logs in Cloud Logging
- Filter flow logs for specific subnets, VMs, ports, and protocols
- Export flow logs to BigQuery and Pub/Sub
- Analyze flow logs in BigQuery
- Disable VPC flow logging
For each lab, you get a new Google Cloud project and set of resources for a fixed time at no cost.
Sign in to Qwiklabs using an incognito window.
Note the lab's access time (for example, 1:15:00), and make sure you can finish within that time.
There is no pause feature. You can restart if needed, but you have to start at the beginning.
When ready, click Start lab.
Note your lab credentials (Username and Password). You will use them to sign in to the Google Cloud Console.
Click Open Google Console.
Click Use another account and copy/paste credentials for this lab into the prompts.
If you use other credentials, you'll receive errors or incur charges.
Accept the terms and skip the recovery resource page.
Google Cloud Shell is a virtual machine loaded with development tools. It offers a persistent 5 GB home directory and runs on Google Cloud.
Google Cloud Shell provides command-line access to your Google Cloud resources.
In Cloud console, on the top right toolbar, click the Open Cloud Shell button.
Click Continue.
It takes a few moments to provision and connect to the environment. When you are connected, you are already authenticated, and the project is set to your PROJECT_ID.
gcloud is the command-line tool for Google Cloud. It comes pre-installed on Cloud Shell and supports tab-completion.
In this task, you enable flow logging for two subnets. Flow logging can be enabled in exactly the same manner for subnets in custom user-defined networks, or in a single step when you create a subnet.
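Both approaches can be sketched with gcloud; the subnet, network, region, and IP range below are placeholders for illustration, not the lab's exact values:

```shell
# Sketch: enable VPC flow logs on an existing subnet (names and region are placeholders)
gcloud compute networks subnets update default \
    --region=us-central1 \
    --enable-flow-logs

# Sketch: enable flow logs in one step when creating a subnet
gcloud compute networks subnets create my-subnet \
    --network=my-network \
    --region=us-central1 \
    --range=10.10.0.0/24 \
    --enable-flow-logs
```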
Notice that you enabled VPC flow logging for the subnets in your lab's region.

Note: If you receive a ZONE_RESOURCE_POOL_EXHAUSTED error, update the zone in the gcloud command and try running it again. For example, if us-central1-a is failing, try us-central1-b instead.

Click Check my progress to verify the objective.
In this task, you create network traffic to test the connectivity between the instances.
| Instance | Internal IP | External IP |
|---|---|---|
| default-ap-vm | ***** | ***** |
| default-eu-vm | ***** | ***** |
| default-us-vm | ***** | ***** |
In the Cloud Console, in the row for the default-us-vm instance, click SSH. When connected via SSH, run the commands provided in the lab. These commands did the following:
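The exact commands are not reproduced here, but traffic generation in this lab combines ping, HTTP requests, and DNS lookups with the host utility; the targets below are placeholders:

```shell
# Sketch: generate traffic between instances (IPs and hostnames are placeholders)
ping -c 3 <EXTERNAL_IP_OF_default-eu-vm>     # ICMP traffic to another instance
curl http://<EXTERNAL_IP_OF_default-eu-vm>/  # HTTP request to destination port 80
host www.google.com                          # DNS lookup with the host utility
```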
Next, you run the same commands as above, but instead of pinging the default-eu-vm from the default-us-vm, you ping the default-us-vm from the default-eu-vm.
In the row for the default-eu-vm instance, click SSH.

Return to the Cloud Console, and in the row for the default-ap-vm instance, click SSH.
When connected via SSH, issue the following commands:
In this task, you view the VPC flow logs for all the projects in Cloud Logging.
In the Cloud Console, go to Navigation menu > Logging > Logs Explorer. (If you do not see this menu option, continue the lab at step 3.)
In the Query panel, under All Resources, click Subnetwork and then Apply.
Under All Log names, click compute.googleapis.com/vpc_flows and then Apply.
Note: If you do not see that log name, look for compute.googleapis.com%2Fvpc_flows instead.

Click Run query.
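The builder selections above are equivalent to a query like the following sketch; the project ID is a placeholder you replace with your own:

```
resource.type="gce_subnetwork"
logName="projects/<INSERT_PROJECT_ID>/logs/compute.googleapis.com%2Fvpc_flows"
```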
In the Query results panel, entries from the VPC flow logs appear. If you do not see compute.googleapis.com/vpc_flows, wait a few minutes for this log type to show up.
Expand one of the log entries. Within that log entry, expand jsonPayload, and then expand connection.

Note: Not every entry has jsonPayload fields. You may need to open other entries until you find one that does.

Investigate the information about the connection; specifically, notice that the port and IP address of both the source and destination have been logged.
Find a log that has src_instance in jsonPayload. Investigate the information about this src_instance.

The src_instance may be one of your instances, or an outside IP address if the traffic came from outside your VPC network.
In this task, you perform advanced filtering using Query builder. You also explore access logs for specific source or destination IP, and for specific ports and protocols.
An advanced logs filter is a Boolean expression that specifies a subset of all the log entries in your project. A few things it can be used for include:

- Choosing log entries from specific logs or log services
- Choosing log entries within a specific time range
- Choosing log entries that satisfy conditions on metadata or field values
Click inside the Query box.
Remove the current query and paste in the following, replacing <INSERT_PROJECT_ID> with your Qwiklabs Google Cloud Project ID and Internal_IP_Of_default_us_vm with the internal IP address of your default-us-vm instance (you recorded this earlier).

You now see all the log entries where the source IP address is the internal IP address of the default-us-vm instance. These should be the same entries as for the earlier filter that showed the source instance of default-us-vm.
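The pasted filter itself is not reproduced above; reconstructed as a sketch from the VPC flow log field names, with the same placeholders the text asks you to replace:

```
resource.type="gce_subnetwork"
logName="projects/<INSERT_PROJECT_ID>/logs/compute.googleapis.com%2Fvpc_flows"
jsonPayload.connection.src_ip="Internal_IP_Of_default_us_vm"
```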
Remove the last line of the filter (the line with src_ip) and replace it with a line that shows only entries with a destination port of 22 (SSH).

Click Run query and observe the results. You should see three logs, because you SSH-ed into the VMs three times.
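The replacement line, sketched from the same flow log schema:

```
jsonPayload.connection.dest_port=22
```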
Modify the last line of the filter to only show traffic with a destination port of 80 (HTTP):
Click Run query and observe the results.
Match multiple ports by replacing the last line and adding a second statement with an OR to the end of the port filter:
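A sketch of the combined condition; port 53 is an assumption based on the DNS lookups made with the host utility, so adjust it to the ports you tested:

```
jsonPayload.connection.dest_port=80 OR jsonPayload.connection.dest_port=53
```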
You may see a few entries in the log. These would correspond to the DNS calls you made with the host utility.
Note: You will not see the ICMP traffic that you generated with ping. ICMP is protocol #1. If you create a filter to show protocol #1, you will not see any entries. Try this if you want.

In this task, you create a filter and use it to see whether any RDP traffic is attempting to access the VPC.
Because you are running Linux instances within the VPC network, you should not see any RDP traffic. However, there is a firewall rule within the default settings that allows RDP traffic from anywhere, and this means that someone or something on the internet could attempt to connect to your servers using the RDP protocol.
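Such a filter can be sketched from the same flow log fields used earlier; 3389 is the standard RDP port, and the project ID is a placeholder:

```
resource.type="gce_subnetwork"
logName="projects/<INSERT_PROJECT_ID>/logs/compute.googleapis.com%2Fvpc_flows"
jsonPayload.connection.dest_port=3389
```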
If you did see some RDP traffic, which often happens, that demonstrates how pervasive unwanted traffic can be on the internet. It also shows why ensuring that VPC networks and firewall rules are set up correctly is important.
By using the default VPC in this lab, with all of the default firewall rules, you saw the results of not fine-tuning these rules to help exclude unwanted traffic.
Flow logs can be exported to any destination that Cloud Logging export supports (Pub/Sub, BigQuery, and so on). Creating an export sends future matching logs to the selected destination; existing logs are not exported.
Exporting logs to BigQuery allows for the BigQuery analysis tools to be used on your logs to perform analysis. Exporting logs to Pub/Sub allows your logs to be streamed to other applications, other repositories, or third parties.
When you create a Cloud export, the current filter will be applied to the export. This means that only events that match the filter will be exported. For example, if you have a filter that only displays traffic to the destination port 3389, only traffic that matches that filter will be exported.
This can help reduce the amount of data that is exported (which could lower costs) or allow for different exports depending on what the data contains. For this lab you will clear the filter and export all logs because you do not have much traffic.
Name the dataset flowlogs_dataset, and then click Create dataset.

You should be able to see the sink you created (FlowLogBQExport). If you are unable to see the sink, click Logs Router.
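The same sink can be sketched from the command line; the sink and dataset names match the lab's, while the log filter is an assumption for illustration:

```shell
# Sketch: create a log sink that exports vpc_flows entries to BigQuery
gcloud logging sinks create FlowLogBQExport \
    bigquery.googleapis.com/projects/<INSERT_PROJECT_ID>/datasets/flowlogs_dataset \
    --log-filter='resource.type="gce_subnetwork" AND logName:"vpc_flows"'
```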
Now create an export to Pub/Sub.
The Logs Router Sinks page appears. You should be able to see the sink you created (FlowLogsTopic). If you are unable to see the sink, click Logs Router.
This will show the filter that was present when the export was created.
You can now subscribe to the new Pub/Sub topic and receive notifications when new logs are available. This allows for the logs to be streamed to and integrated with a SIEM (Security Information and Event Management) tool.
SIEM tools are used to gain real-time operational insights and create audit reports using powerful visualization capabilities.
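The Pub/Sub export and a subscription for a downstream consumer can be sketched similarly; the topic and subscription names here are assumptions, while the sink name matches the lab's:

```shell
# Sketch: create a topic, a sink that exports to it, and a pull subscription
gcloud pubsub topics create flowlogs-topic
gcloud logging sinks create FlowLogsTopic \
    pubsub.googleapis.com/projects/<INSERT_PROJECT_ID>/topics/flowlogs-topic \
    --log-filter='resource.type="gce_subnetwork" AND logName:"vpc_flows"'
gcloud pubsub subscriptions create flowlogs-sub --topic=flowlogs-topic
```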
In this task, you analyze flow logs in BigQuery.
Expand the flowlogs_dataset dataset, open the table whose name begins with compute_googleapis_com_vpc_flows_, and review the schemas and details of the tables that are being used.

This query returns information about traffic connecting to port 22.
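The query itself is not reproduced above; a sketch against the exported table, assuming the standard flow log schema and a wildcard table suffix:

```sql
#standardSQL
SELECT
  jsonPayload.connection.src_ip,
  jsonPayload.connection.src_port,
  jsonPayload.connection.dest_ip,
  jsonPayload.connection.dest_port,
  jsonPayload.connection.protocol
FROM `<INSERT_PROJECT_ID>.flowlogs_dataset.compute_googleapis_com_vpc_flows_*`
WHERE jsonPayload.connection.dest_port = 22
LIMIT 100
```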
After a couple of seconds, the results are displayed. You should see one or two entries, which is the activity you generated in this lab. Remember, BigQuery is only showing activity since the export was created.
Click Check my progress to verify the objective.
In this task, you disable VPC flow logging. VPC flow logging can be turned off with the --no-enable-flow-logs option.
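A sketch of the disable command, mirroring the enable step; the subnet name and region are placeholders:

```shell
# Sketch: turn off VPC flow logs on a subnet (names and region are placeholders)
gcloud compute networks subnets update default \
    --region=us-central1 \
    --no-enable-flow-logs
```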
In this lab, you enabled VPC flow logging for a subnet. You accessed logs via Cloud Logging. You also filtered logs for specific subnets, VMs, ports, and protocols. Next, you performed network monitoring, forensics, and real-time security analysis. Finally, you disabled VPC flow logging.
When you have completed your lab, click End Lab. Google Cloud Skills Boost removes the resources you’ve used and cleans the account for you.
You will be given an opportunity to rate the lab experience. Select the applicable number of stars, type a comment, and then click Submit.
The number of stars indicates the following:

- 1 star = Very dissatisfied
- 2 stars = Dissatisfied
- 3 stars = Neutral
- 4 stars = Satisfied
- 5 stars = Very satisfied
You can close the dialog box if you don't want to provide feedback.
For feedback, suggestions, or corrections, please use the Support tab.
Copyright 2025 Google LLC All rights reserved. Google and the Google logo are trademarks of Google LLC. All other company and product names may be trademarks of the respective companies with which they are associated.