Find the Google Cloud training that suits you.
With more than 980 learning activities to choose from, Google Cloud has designed a comprehensive catalog with your needs in mind. The catalog offers a range of activity formats: you can choose from short individual labs or multi-module courses made up of videos, documents, labs, and quizzes. Our labs give you temporary credentials to real cloud resources, so you can learn Google Cloud hands-on, directly on the platform itself. Earn badges for the activities you complete, and set goals, track your progress, and measure your success on Google Cloud.
-
Featured Lab: Route Datadog Monitoring Alerts to Google Cloud with Eventarc
In this lab, you will learn how to route Datadog monitoring alerts to Google Cloud using Eventarc.
-
Featured Lab: Cloud Composer: Qwik Start - Console
In this lab, you create a Cloud Composer environment using the GCP Console. You then use the Airflow web interface to run a workflow that verifies a data file, creates and runs an Apache Hadoop wordcount job on a Dataproc cluster, and deletes the cluster.
-
Featured Lab: Building and Debugging Cloud Functions for Node.js
In this lab, you will create a Cloud Function for Node.js that reports whether a specified temperature is acceptable or too hot. You will create, test, and debug your Cloud Function using Visual Studio Code on your local machine. Lastly, you'll deploy your function to Google Cloud Platform.
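The lab itself builds the function in Node.js, but the core decision logic it implements can be sketched in a few lines. A minimal illustration (the 30-degree threshold and message strings here are hypothetical examples, not values from the lab):

```python
# Sketch of the kind of check the lab's Cloud Function performs:
# report whether a given temperature is acceptable or too hot.
# The threshold and return strings are hypothetical.
TOO_HOT_THRESHOLD_C = 30.0

def check_temperature(temp_celsius: float) -> str:
    """Return a verdict for the supplied temperature."""
    if temp_celsius > TOO_HOT_THRESHOLD_C:
        return "too hot"
    return "acceptable"

print(check_temperature(25.0))  # → acceptable
print(check_temperature(35.0))  # → too hot
```

In the lab, logic like this sits behind an HTTP-triggered function that you run and debug locally in Visual Studio Code before deploying.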
-
Featured Lab: Cloud Run Canary Deployments
Implement a deployment pipeline for Cloud Run that promotes code from developer branches to production, with automated canary testing and percentage-based traffic management.
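Percentage-based traffic management is the core idea behind the canary step: only a small share of requests is routed to the new revision. On Cloud Run this split is configured on the service (for example with `gcloud run services update-traffic`), but the effect can be sketched in plain Python (the 10% canary weight is a hypothetical example):

```python
import random

# Sketch of percentage-based traffic splitting between a stable and a
# canary revision. In Cloud Run the platform does this routing for you;
# this simulation only illustrates the expected distribution.
def route(canary_percent: int, rng: random.Random) -> str:
    """Route one request to 'canary' or 'stable' by weighted chance."""
    return "canary" if rng.randrange(100) < canary_percent else "stable"

rng = random.Random(0)  # fixed seed so the simulation is repeatable
hits = sum(route(10, rng) == "canary" for _ in range(10_000))
# With a 10% weight, roughly a tenth of requests land on the canary.
```

If the canary's error rate stays healthy under this slice of traffic, the pipeline promotes it to receive 100% of requests.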
-
Featured Lab: End to End Baseline Chat Summarization
In this lab, you build an end-to-end baseline chat summarization pipeline.
-
Featured Lab: Serverless Data Processing with Dataflow - Writing an ETL Pipeline using Apache Beam and Dataflow (Python)
In this lab, you: a) build a batch ETL pipeline in Apache Beam that takes raw data from Google Cloud Storage and writes it to BigQuery; b) run the Apache Beam pipeline on Dataflow; and c) parameterize the execution of the pipeline.
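The transform stage of a batch ETL pipeline like this typically parses raw text records (as read from Cloud Storage) into dictionaries shaped like BigQuery rows. A plain-Python sketch of that step, with a hypothetical three-column schema (in the lab, logic like this would run inside a Beam transform such as `beam.Map` on Dataflow):

```python
import csv
import io

# Hypothetical schema for the illustration; the lab defines its own.
FIELDS = ["user_id", "event", "timestamp"]

def parse_line(line: str) -> dict:
    """Turn one raw CSV line into a BigQuery-row-style dictionary."""
    values = next(csv.reader(io.StringIO(line)))
    return dict(zip(FIELDS, values))

raw_lines = ["42,login,2024-01-01", "43,logout,2024-01-02"]
rows = [parse_line(line) for line in raw_lines]
# rows is now a list of dicts ready to be written to a BigQuery table.
```

Parameterizing the pipeline then means passing values such as the input path and output table as pipeline options at launch time instead of hard-coding them.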
-
Featured Lab: Compare data analytics with BigQuery and Dataproc
Create and import data using Parquet files.
-
Featured Lab: Serverless Data Processing with Dataflow - Writing an ETL pipeline using Apache Beam and Dataflow (Java)
In this lab, you: a) build a batch ETL pipeline in Apache Beam that takes raw data from Google Cloud Storage and writes it to BigQuery; b) run the Apache Beam pipeline on Dataflow; and c) parameterize the execution of the pipeline.
-
Featured Lab: Automate GKE Configurations with Config Sync and Policy Controller
Use GitOps-driven Config Sync to automate configuration and Policy Controller to enforce policies on GKE Enterprise resources
-
Featured Lab: Exploring Memorystore for Redis Cluster
In this lab, you will explore the capabilities of Memorystore for Redis Cluster.