On-demand activities

Find the right on-demand learning activities for you. Labs are short, hands-on activities that teach a specific lesson by giving you direct, temporary access to real cloud resources. Courses are longer activities made up of several modules that combine videos, documents, hands-on labs, and quizzes. Finally, quests are similar but usually shorter and contain only labs.

1212 results
  1. Lab · Featured

    Route Datadog Monitoring Alerts to Google Cloud with Eventarc

    In this lab, you learn how to route Datadog monitoring alerts to Google Cloud with Eventarc (a minimal event-handler sketch follows this list).

  2. Lab · Featured

    Cloud Composer: Qwik Start - Console

    In this lab, you create a Cloud Composer environment using the Google Cloud console. You then use the Airflow web interface to run a workflow that verifies a data file, creates and runs an Apache Hadoop wordcount job on a Dataproc cluster, and deletes the cluster (a minimal Airflow DAG sketch follows this list).

  3. Lab · Featured

    Building and Debugging Cloud Functions for Node.js

    In this lab, you will create a Cloud Function for Node.js that reports whether a specified temperature is acceptable or too hot. You will create, test, and debug your Cloud Function using Visual Studio Code on your local machine. Lastly, you'll deploy your function to Google Cloud Platform.

  4. Lab · Featured

    Cloud Run Canary Deployments

    Implement a deployment pipeline for Cloud Run that promotes code from developer branches to production with automated canary testing and percentage-based traffic management.

  5. Lab · Featured

    End to End Baseline Chat Summarization

    In this lab, you build an end-to-end baseline chat summarization pipeline.

  6. Lab · Featured

    Serverless Data Processing with Dataflow - Writing an ETL Pipeline using Apache Beam and Dataflow (Python)

    In this lab, you (a) build a batch ETL pipeline in Apache Beam that reads raw data from Google Cloud Storage and writes it to BigQuery, (b) run the Apache Beam pipeline on Dataflow, and (c) parameterize the execution of the pipeline (a minimal Beam sketch in Python follows this list).

  7. Lab · Featured

    Compare data analytics with BigQuery and Dataproc

    Create and import data using Parquet files.

  8. Lab · Featured

    Serverless Data Processing with Dataflow - Writing an ETL pipeline using Apache Beam and Dataflow (Java)

    In this lab, you (a) build a batch ETL pipeline in Apache Beam that reads raw data from Google Cloud Storage and writes it to BigQuery, (b) run the Apache Beam pipeline on Dataflow, and (c) parameterize the execution of the pipeline.

  9. Lab · Featured

    Automate GKE Configurations with Config Sync and Policy Controller

    Use GitOps-driven Config Sync to automate configuration and Policy Controller to enforce policies on GKE Enterprise resources.

  10. Lab · Featured

    Exploring Memorystore for Redis Cluster

    In this lab, you will explore the capabilities of Memorystore for Redis Cluster (a minimal client-connection sketch follows this list).
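
For the Datadog alert routing lab (item 1), here is a minimal sketch of the receiving side, assuming Eventarc delivers the Datadog alert to a CloudEvent-triggered Cloud Function; the function name and the payload fields printed are illustrative, not the lab's actual code.

```python
# Hypothetical handler for a Datadog alert routed through Eventarc.
# The payload contents are assumptions for illustration; inspect the real
# event in Cloud Logging to see what Datadog actually sends.
import functions_framework


@functions_framework.cloud_event
def handle_datadog_alert(cloud_event):
    """Log the CloudEvent delivered by the Eventarc Datadog channel."""
    print(f"Event type: {cloud_event['type']}")
    print(f"Event source: {cloud_event['source']}")
    print(f"Alert payload: {cloud_event.data}")
```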
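
For the Cloud Composer lab (item 2), here is a minimal Airflow DAG sketch, assuming the workflow uses the Google provider's Dataproc operators; the project, region, bucket, and cluster names are placeholders, and the lab's data-file verification step is omitted.

```python
# Illustrative DAG: create a Dataproc cluster, run a Hadoop wordcount job,
# then delete the cluster. All IDs, paths, and names are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import (
    DataprocCreateClusterOperator,
    DataprocDeleteClusterOperator,
    DataprocSubmitJobOperator,
)

PROJECT_ID = "my-project"         # placeholder
REGION = "us-central1"            # placeholder
CLUSTER_NAME = "composer-hadoop"  # placeholder

WORDCOUNT_JOB = {
    "reference": {"project_id": PROJECT_ID},
    "placement": {"cluster_name": CLUSTER_NAME},
    "hadoop_job": {
        "main_jar_file_uri": "file:///usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar",
        "args": ["wordcount", "gs://my-bucket/input.txt", "gs://my-bucket/wordcount-output/"],
    },
}

with DAG(
    "composer_hadoop_wordcount",
    start_date=datetime(2024, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    create_cluster = DataprocCreateClusterOperator(
        task_id="create_cluster",
        project_id=PROJECT_ID,
        region=REGION,
        cluster_name=CLUSTER_NAME,
        cluster_config={"worker_config": {"num_instances": 2}},
    )
    run_wordcount = DataprocSubmitJobOperator(
        task_id="run_wordcount",
        project_id=PROJECT_ID,
        region=REGION,
        job=WORDCOUNT_JOB,
    )
    delete_cluster = DataprocDeleteClusterOperator(
        task_id="delete_cluster",
        project_id=PROJECT_ID,
        region=REGION,
        cluster_name=CLUSTER_NAME,
        trigger_rule="all_done",  # clean up even if the job fails
    )
    create_cluster >> run_wordcount >> delete_cluster
```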
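
For the Dataflow ETL labs (items 6 and 8), here is a minimal Apache Beam sketch in Python, assuming the raw input is newline-delimited JSON in Cloud Storage; the bucket, table, and schema are placeholders, and the custom options show one way to parameterize execution.

```python
# Illustrative batch ETL pipeline: read raw JSON lines from GCS, parse them,
# and write the records to BigQuery. Paths, table, and schema are placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


class EtlOptions(PipelineOptions):
    """Custom options so input and output can be set on the command line."""

    @classmethod
    def _add_argparse_args(cls, parser):
        parser.add_argument("--input", default="gs://my-bucket/raw/*.json")
        parser.add_argument("--output_table", default="my-project:my_dataset.events")


def run(argv=None):
    options = PipelineOptions(argv)
    etl = options.view_as(EtlOptions)
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromGCS" >> beam.io.ReadFromText(etl.input)
            | "ParseJson" >> beam.Map(json.loads)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                etl.output_table,
                schema="name:STRING,value:INTEGER",  # placeholder schema
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```

Running the same script with `--runner=DataflowRunner` plus the usual `--project`, `--region`, and `--temp_location` options is what moves execution from a local test run to Dataflow.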
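
For the Memorystore for Redis Cluster lab (item 10), here is a minimal client-connection sketch using the open-source redis-py library, assuming the code runs on a VM in the same VPC as the instance and in-transit encryption is disabled; the discovery-endpoint address is a placeholder.

```python
# Illustrative connection to a Redis Cluster through its discovery endpoint.
# The IP address below is a placeholder for the instance's discovery endpoint.
from redis.cluster import RedisCluster

client = RedisCluster(host="10.0.0.3", port=6379)

print(client.ping())           # True if the cluster is reachable
client.set("greeting", "hello from Memorystore")
print(client.get("greeting"))  # b'hello from Memorystore'
```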