On-demand activities

Find the right on-demand learning activities for you. Labs are short learning activities that teach you a specific lesson by giving you direct, temporary, hands-on access to real cloud resources. Courses are longer activities made up of several modules that combine videos, documents, hands-on labs, and quizzes. Finally, quests are similar to courses but are usually shorter and contain only labs.

1209 results
  1. Lab · Featured

    Introduction to Migration Center Servers Assessments

    Learn how to assess a customer's existing environment and import data collected from Azure/AWS infrastructure. Generate inventory, performance, and network-dependency reports, as well as financial reports such as TCO (total cost of ownership) and DPR (detailed pricing) reports.

  2. Lab · Featured

    Migrating Data to and from Spanner with Dataflow

    In this lab, you use Dataflow and Apache Beam to migrate data into Spanner.

  3. Lab · Featured

    Contact Center AI Insights

    CCAI Insights provides contact center interaction data to answer business questions or support decisions that drive efficiency. It uses Google ML and large language models to turn contact center data into insights and action. Based on the outcomes of the insights module, various actions can be taken by contact center ma…

  4. Lab · Featured

    Data Analysis with the FraudFinder Workshop

    FraudFinder is a series of JupyterLab notebooks that show how to implement an end-to-end data-to-AI architecture on Google Cloud, through the toy use case of a real-time fraud detection system.

  5. Lab · Featured

    End to End Baseline Chat Summarization

    In this lab, you build an end-to-end baseline chat summarization pipeline.

  6. Lab · Featured

    Block.one: Creating a Multi Node EOSIO Blockchain

    In this lab, you will extend the single-node EOSIO blockchain to use multiple nodes.

  7. Lab · Featured

    Cloud Composer: Qwik Start - Console

    In this lab, you create a Cloud Composer environment using the GCP Console. You then use the Airflow web interface to run a workflow that verifies a data file, creates and runs an Apache Hadoop wordcount job on a Dataproc cluster, and deletes the cluster. A simplified sketch of such an Airflow DAG appears after this list.

  8. Lab · Featured

    Building and Debugging Cloud Functions for Node.js

    In this lab, you will create a Cloud Function for Node.js that reports whether a specified temperature is acceptable or too hot. You will write, test, and debug your Cloud Function using Visual Studio Code on your local machine. Lastly, you'll deploy your function to Google Cloud Platform.

  9. Lab · Featured

    Serverless Data Processing with Dataflow - Writing an ETL Pipeline using Apache Beam and Dataflow (Python)

    In this lab, you (a) build a batch ETL pipeline in Apache Beam that takes raw data from Google Cloud Storage and writes it to BigQuery, (b) run the Apache Beam pipeline on Dataflow, and (c) parameterize the execution of the pipeline. A minimal sketch of such a pipeline appears after this list.

  10. Lab · Featured

    Exploring Memorystore for Redis Cluster

    In this lab, you will explore the capabilities of Memorystore for Redis Cluster.
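
The Cloud Composer entry above (item 7) describes a concrete workflow, so a short illustration may help. Below is a minimal Airflow DAG sketch of that shape, not the lab's actual DAG: it assumes Airflow 2 with the Google provider package (as bundled in recent Cloud Composer images), and the project ID, region, cluster name, input file, and output path are hypothetical placeholders.

```python
"""Minimal sketch of a Composer-style DAG: verify an input file, create a
Dataproc cluster, run the Hadoop wordcount example on it, delete the cluster.
All names and paths below are hypothetical placeholders."""
from datetime import datetime

from airflow import models
from airflow.providers.google.cloud.operators.dataproc import (
    DataprocCreateClusterOperator,
    DataprocDeleteClusterOperator,
    DataprocSubmitJobOperator,
)
from airflow.providers.google.cloud.sensors.gcs import GCSObjectExistenceSensor
from airflow.utils.trigger_rule import TriggerRule

# Hypothetical values; in practice these come from your own project,
# region, and Cloud Storage bucket.
PROJECT_ID = "my-project"
REGION = "us-central1"
CLUSTER_NAME = "composer-wordcount-cluster"
INPUT_BUCKET = "my-input-bucket"
INPUT_OBJECT = "shakespeare/rose.txt"
OUTPUT_DIR = "gs://my-output-bucket/wordcount-output"

WORDCOUNT_JOB = {
    "reference": {"project_id": PROJECT_ID},
    "placement": {"cluster_name": CLUSTER_NAME},
    "hadoop_job": {
        # The MapReduce examples jar ships on Dataproc images.
        "main_jar_file_uri": "file:///usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar",
        "args": ["wordcount", f"gs://{INPUT_BUCKET}/{INPUT_OBJECT}", OUTPUT_DIR],
    },
}

with models.DAG(
    "composer_hadoop_wordcount",
    start_date=datetime(2024, 1, 1),
    schedule_interval=None,  # trigger manually from the Airflow web interface
    catchup=False,
) as dag:
    # Verify that the input data file exists before spending money on a cluster.
    verify_input = GCSObjectExistenceSensor(
        task_id="verify_input_file",
        bucket=INPUT_BUCKET,
        object=INPUT_OBJECT,
    )

    create_cluster = DataprocCreateClusterOperator(
        task_id="create_dataproc_cluster",
        project_id=PROJECT_ID,
        region=REGION,
        cluster_name=CLUSTER_NAME,
        cluster_config={
            "master_config": {"num_instances": 1, "machine_type_uri": "n1-standard-2"},
            "worker_config": {"num_instances": 2, "machine_type_uri": "n1-standard-2"},
        },
    )

    run_wordcount = DataprocSubmitJobOperator(
        task_id="run_hadoop_wordcount",
        project_id=PROJECT_ID,
        region=REGION,
        job=WORDCOUNT_JOB,
    )

    delete_cluster = DataprocDeleteClusterOperator(
        task_id="delete_dataproc_cluster",
        project_id=PROJECT_ID,
        region=REGION,
        cluster_name=CLUSTER_NAME,
        trigger_rule=TriggerRule.ALL_DONE,  # clean up even if the job fails
    )

    verify_input >> create_cluster >> run_wordcount >> delete_cluster
```

Uploading a file like this to the environment's DAGs bucket makes it appear in the Airflow web interface, where it can be triggered manually.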
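The Dataflow ETL entry above (item 9) can likewise be illustrated. The following is a minimal Apache Beam (Python) sketch of a parameterized GCS-to-BigQuery batch pipeline, not the lab's code: the input path, output table, schema, and parse logic are hypothetical, and Dataflow execution is selected with standard command-line flags such as --runner=DataflowRunner, --project, --region, and --temp_location.

```python
"""Minimal batch ETL sketch: read raw CSV lines from Cloud Storage, transform
them into rows, and write them to BigQuery. Paths, table, and schema below are
hypothetical placeholders."""
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


class EtlOptions(PipelineOptions):
    """Custom pipeline parameters, so the same code can be pointed at other data."""

    @classmethod
    def _add_argparse_args(cls, parser):
        parser.add_argument(
            "--input", default="gs://my-bucket/raw/events-*.csv")      # hypothetical path
        parser.add_argument(
            "--output_table", default="my-project:my_dataset.events")  # hypothetical table


def parse_csv_line(line):
    """Turn one raw CSV line into a BigQuery row dict (toy three-column format)."""
    user_id, event_type, timestamp = line.split(",")[:3]
    return {"user_id": user_id, "event_type": event_type, "timestamp": timestamp}


def run():
    options = EtlOptions()  # also picks up --runner, --project, --region, --temp_location
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromGCS" >> beam.io.ReadFromText(options.input)
            | "ParseLines" >> beam.Map(parse_csv_line)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                options.output_table,
                schema="user_id:STRING,event_type:STRING,timestamp:STRING",
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```

Running the same script without a --runner flag uses the default DirectRunner, which processes the data locally and is convenient for testing before submitting the pipeline to Dataflow.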