Bill Kapsalis
Member since 2017
Silver League
58485 points
Generative AI applications can create new user experiences that were nearly impossible before the invention of large language models (LLMs). As an application developer, how can you use generative AI to build engaging, powerful apps on Google Cloud? In this course, you'll learn about generative AI applications and how you can use prompt design and retrieval augmented generation (RAG) to build powerful applications using LLMs. You'll learn about a production-ready architecture that can be used for generative AI applications and you'll build an LLM and RAG-based chat application.
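To make the RAG pattern concrete, here is a minimal, illustrative sketch using the Vertex AI Python SDK. The project ID is a placeholder and `retrieve_relevant_docs` is a hypothetical stand-in for your own retrieval layer (for example, a vector search over document embeddings); this is a sketch of the idea, not the course's reference implementation.

```python
# Minimal RAG-style prompt assembly with the Vertex AI SDK (illustrative only).
import vertexai
from vertexai.generative_models import GenerativeModel

vertexai.init(project="your-project-id", location="us-central1")  # placeholders
model = GenerativeModel("gemini-1.0-pro")

def retrieve_relevant_docs(question: str) -> list[str]:
    # Hypothetical stand-in for your retrieval layer; returns supporting passages.
    return ["<your retrieved passage 1>", "<your retrieved passage 2>"]

def answer_with_rag(question: str) -> str:
    # Retrieve supporting passages, then ground the model's answer in them.
    context = "\n".join(retrieve_relevant_docs(question))
    prompt = (
        "Answer the question using only the context below.\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
    return model.generate_content(prompt).text
```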
This course is designed to give you the knowledge and tools you need to explore the unique challenges that MLOps teams face when deploying and managing generative AI models, and to understand how Vertex AI helps AI teams streamline their MLOps processes to deliver high-impact generative AI projects.
This course introduces the AI and machine learning (ML) services on Google Cloud that support predictive and generative AI projects. It explores the technologies, products, and tools available across the data-to-AI lifecycle, including AI foundations, development options, and solutions. The goal is to help data scientists, AI developers, and ML engineers grow their skills and knowledge through engaging learning experiences and hands-on exercises.
Complete the introductory Build a Data Mesh with Dataplex skill badge to demonstrate skills in the following: building a data mesh with Dataplex to facilitate data security, governance, and discovery on Google Cloud. You practice and test your skills in tagging assets, assigning IAM roles, and assessing data quality in Dataplex. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services and tests your ability to apply your knowledge in an interactive hands-on environment. Complete this skill badge course, and the final assessment challenge lab, to receive a digital badge that you can share with your network.
In this course, you learn about containers and how to build and package container images. The content in this course includes best practices for creating and securing containers, and provides an introduction to Cloud Run and Google Kubernetes Engine for application developers.
Complete the intermediate Develop Serverless Apps with Firebase skill badge course to demonstrate skills in the following: architecting and building serverless web applications with Firebase, managing databases with Firestore, automating content deployment with Cloud Build, and integrating Google Assistant functionality into your applications. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services and tests your ability to apply your knowledge in an interactive hands-on environment. Complete this course, and the final assessment challenge lab, to receive a skill badge that you can share with your network.
Complete the intermediate Engineer Data for Predictive Modeling with BigQuery ML skill badge course to demonstrate skills in the following: building data transformation pipelines to BigQuery using Dataprep by Trifacta; building extract, transform, and load (ETL) workloads using Cloud Storage, Dataflow, and BigQuery; building machine learning models with BigQuery ML; and copying data across multiple locations using Cloud Composer. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services and tests your ability to apply your knowledge in an interactive hands-on environment. Complete this course, and the final assessment challenge lab, to receive a skill badge that you can share with your network.
This course introduces Gemini, a generative AI-powered collaborator that helps you analyze customer data and predict product sales. You also learn how to use customer data in BigQuery to identify, categorize, and develop new customers. Through hands-on labs, you experience how Gemini improves data analysis and machine learning workflows. Duet AI has been renamed Gemini, our next-generation model.
This course introduces diffusion models, a family of machine learning models that have recently shown great promise in image generation. The idea comes from physics, in particular thermodynamics, and over the past few years diffusion models have become a hot topic in both research and industry. On Google Cloud, they underpin many state-of-the-art image generation models and tools. The course covers the theory behind diffusion models and shows how to train and deploy them on Vertex AI.
As businesses continue to expand their use of artificial intelligence and machine learning, developing these technologies responsibly becomes increasingly important. For many organizations, talking about responsible AI is easy; putting it into practice is the real challenge. If you want to learn how to operationalize responsible AI in your organization, this course can help. You'll learn about the strategies, best practices, and lessons learned that Google Cloud applies today, so your organization can lay a solid foundation for practicing responsible AI.
This short course explains how to integrate your applications with the Gemini 1.0 Pro models on Google Cloud. You learn about the Gemini API and its generative AI models, and how to access the Gemini 1.0 Pro and Gemini 1.0 Pro Vision models from code. You also test the models' capabilities using text, image, and video prompts in an application.
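As a rough illustration of what accessing these models from code can look like, here is a hedged sketch using the Vertex AI Python SDK; the project ID, location, and image URI are placeholders, and the prompts are only examples.

```python
# Illustrative sketch of calling Gemini models through the Vertex AI SDK.
import vertexai
from vertexai.generative_models import GenerativeModel, Part

vertexai.init(project="your-project-id", location="us-central1")  # placeholders

# Text prompt with Gemini 1.0 Pro.
text_model = GenerativeModel("gemini-1.0-pro")
print(text_model.generate_content("Summarize what Cloud Run does.").text)

# Multimodal (image + text) prompt with Gemini 1.0 Pro Vision.
vision_model = GenerativeModel("gemini-1.0-pro-vision")
image = Part.from_uri("gs://your-bucket/photo.jpg", mime_type="image/jpeg")
print(vision_model.generate_content([image, "Describe this image."]).text)
```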
In the last installment of the Dataflow course series, we will introduce the components of the Dataflow operational model. We will examine tools and techniques for troubleshooting and optimizing pipeline performance. We will then review testing, deployment, and reliability best practices for Dataflow pipelines. We will conclude with a review of Templates, which make it easy to scale Dataflow pipelines to organizations with hundreds of users. These lessons will help ensure that your data platform is stable and resilient to unanticipated circumstances.
This course helps learners create a study plan for the PDE (Professional Data Engineer) certification exam. Learners explore the breadth and scope of the domains covered in the exam. Learners assess their exam readiness and create their individual study plan.
In this second installment of the Dataflow course series, we dive deeper into developing pipelines using the Beam SDK. We start with a review of Apache Beam concepts. Next, we discuss processing streaming data using windows, watermarks, and triggers. We then cover options for sources and sinks in your pipelines, schemas to express your structured data, and how to do stateful transformations using the State and Timer APIs. We move on to reviewing best practices that help maximize your pipeline performance. Toward the end of the course, we introduce SQL and DataFrames to represent your business logic in Beam, and show how to iteratively develop pipelines using Beam notebooks.
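For a sense of how windows and triggers are expressed in the Beam Python SDK, here is a small illustrative sketch; the Pub/Sub topic and the trigger and lateness settings are placeholder examples rather than recommended values.

```python
# Illustrative Apache Beam (Python SDK) sketch of event-time windows with a trigger.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.transforms import trigger, window

options = PipelineOptions(streaming=True)  # streaming mode for the Pub/Sub source

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "Read" >> beam.io.ReadFromPubSub(topic="projects/your-project/topics/events")
        | "KeyByWord" >> beam.Map(lambda msg: (msg.decode("utf-8"), 1))
        | "Window" >> beam.WindowInto(
            window.FixedWindows(60),  # 1-minute event-time windows
            trigger=trigger.AfterWatermark(late=trigger.AfterProcessingTime(30)),
            accumulation_mode=trigger.AccumulationMode.ACCUMULATING,
            allowed_lateness=300,  # accept data up to 5 minutes late
        )
        | "CountPerKey" >> beam.CombinePerKey(sum)
    )
```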
Complete the introductory Create and Manage AlloyDB Instances skill badge to demonstrate skills in the following: performing core AlloyDB operations and tasks, migrating to AlloyDB from PostgreSQL, administering an AlloyDB database, and accelerating analytical queries using the AlloyDB Columnar Engine. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services and tests your ability to apply your knowledge in an interactive hands-on environment. Complete this skill badge course, and the final assessment challenge lab, to receive a skill badge that you can share with your network.
Complete the intermediate Create and Manage Bigtable Instances skill badge to demonstrate skills in the following: creating instances, designing schemas, querying data, and performing administrative tasks in Bigtable, including monitoring performance and configuring node autoscaling and replication. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services and tests your ability to apply your knowledge in an interactive hands-on environment. Complete this skill badge course, and the final assessment challenge lab, to receive a digital badge that you can share with your network.
This course introduces participants to MLOps tools and best practices for deploying, evaluating, monitoring and operating production ML systems on Google Cloud. MLOps is a discipline focused on the deployment, testing, monitoring, and automation of ML systems in production. Learners will get hands-on practice using Vertex AI Feature Store's streaming ingestion at the SDK layer.
Complete the introductory Create and Manage Cloud Spanner Instances skill badge to demonstrate skills in the following: creating and interacting with Cloud Spanner instances and databases; loading Cloud Spanner databases using various techniques; backing up Cloud Spanner databases; defining schemas and understanding query plans; and deploying a Modern Web App connected to a Cloud Spanner instance. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services and tests your ability to apply your knowledge in an interactive hands-on environment. Complete this skill badge course, and the final assessment challenge lab, to receive a skill badge that you can share with your network.
Complete the introductory Create and Manage Cloud SQL for PostgreSQL Instances skill badge to demonstrate skills in the following: migrating, configuring, and managing Cloud SQL for PostgreSQL instances and databases. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services and tests your ability to apply your knowledge in an interactive hands-on environment. Complete this skill badge course, and the final assessment challenge lab, to receive a skill badge that you can share with your network.
Complete the introductory Migrate MySQL data to Cloud SQL using Database Migration Services skill badge to demonstrate skills in the following: migrating MySQL data to Cloud SQL using different job types and connectivity options available in Database Migration Service and migrating MySQL user data when running Database Migration Service jobs. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services and tests your ability to apply your knowledge in an interactive hands-on environment. Complete this skill badge quest, and the final assessment challenge lab, to receive a skill badge that you can share with your network.
This course is intended to give architects, engineers, and developers the skills required to help enterprise customers architect, plan, execute, and test database migration projects. Through a combination of presentations, demos, and hands-on labs, participants move databases to Google Cloud while taking advantage of various services. This course covers how to move on-premises enterprise databases like SQL Server to Google Cloud (Compute Engine and Cloud SQL) and Oracle to Google Cloud bare metal.
Gemini for Google Workspace is an add-on that gives customers access to generative AI features in Google Workspace. This mini course introduces Gemini's key features and explains how to use them in Gmail to boost productivity and efficiency.
The Gemini for Google Workspace add-on gives customers access to generative AI features in Google Workspace. This learning path introduces Gemini's key features and explains how to use them across Google Workspace to boost productivity and efficiency.
This course introduces Gemini, a generative AI-powered collaborator that helps developers build applications on Google Cloud. You learn how to prompt Gemini to explain code, recommend Google Cloud services, and generate code for your applications. Through hands-on labs, you also experience how Gemini improves the application development workflow. Duet AI has been renamed Gemini, our next-generation model.
This course introduces Gemini, a generative AI-powered collaborator that helps you develop, test, deploy, and manage applications on Google Cloud using Google products and services. With Gemini's help, you learn how to develop and build a web application, fix errors in the application, develop tests, and query data. Through hands-on labs, you also experience how Gemini improves the software development lifecycle (SDLC). Duet AI has been renamed Gemini, our next-generation model.
This course introduces Gemini, a generative AI-powered collaborator that helps engineers manage infrastructure on Google Cloud. You learn how to prompt Gemini to find and understand application logs, create a GKE cluster, and explore how to set up a build environment. Through hands-on labs, you also see how Gemini improves DevOps workflows. Duet AI has been renamed Gemini, our next-generation model.
This course introduces Gemini, a generative AI-powered collaborator that helps you protect your cloud environment and resources on Google Cloud. You learn how to deploy example workloads into a Google Cloud environment and use Gemini to identify and remediate security misconfigurations. Through hands-on labs, you also experience how Gemini improves your cloud security posture. Duet AI has been renamed Gemini, our next-generation model.
This course introduces Gemini, a generative AI-powered collaborator that helps network engineers create, update, and maintain VPC networks. You learn how to prompt Gemini for guidance on networking tasks, getting answers that are more specific than search results. Through hands-on labs, you also experience how Gemini simplifies working with VPC networks in Google Cloud. Duet AI has been renamed Gemini, our next-generation model.
This course introduces Gemini, a generative AI-powered collaborator that helps you analyze customer data and predict product sales. You also learn how to use customer data in BigQuery to identify, categorize, and develop new customers. Through hands-on labs, you experience how Gemini improves data analysis and machine learning workflows. Duet AI has been renamed Gemini, our next-generation model.
This course introduces Gemini, a generative AI-powered collaborator that helps administrators provision infrastructure on Google Cloud. You learn how to prompt Gemini to explain infrastructure, deploy a GKE cluster, and update existing infrastructure. Through hands-on labs, you also experience how Gemini improves the GKE deployment workflow. Duet AI has been renamed Gemini, our next-generation model.
Complete the Introduction to Generative AI, Introduction to Large Language Models, and Introduction to Responsible AI courses to earn a skill badge. Pass the final quiz to demonstrate your understanding of fundamental generative AI concepts. A skill badge is a digital badge issued by Google Cloud in recognition of your knowledge of Google Cloud products and services. Share your skill badge on your social media profiles to let others know what you've achieved.
This introductory-level microlearning course explains what responsible AI is, why it matters, and how Google implements it in its products. It also introduces Google's seven AI principles.
This introductory-level microlearning course explores what large language models (LLMs) are, their use cases, and how you can use prompt tuning to improve LLM performance. It also covers the Google tools that can help you develop your own generative AI applications.
This introductory-level microlearning course explains what generative AI is, how it is used, and how it differs from traditional machine learning methods. It also covers the Google tools that can help you develop your own generative AI applications.
In this course, application developers learn how to design and develop cloud-native applications that seamlessly integrate components from the Google Cloud ecosystem. Through a combination of presentations, demos, and hands-on labs, participants learn how to create repeatable deployments by treating infrastructure as code, choose the appropriate application execution environment for an application, and monitor application performance. Completing one version of each lab is required. Each lab is available in Node.js. In most cases, the same labs are also provided in Python or Java. You may complete each lab in whichever language you prefer.
In this course, application developers learn how to design and develop cloud-native applications that seamlessly integrate managed services from Google Cloud. Through a combination of presentations, demos, and hands-on labs, participants learn how to develop more secure applications, implement federated identity management, and integrate application components by using messaging, event-driven processing, and API gateways. Completing one version of each lab is required. Each lab is available in Node.js. In most cases, the same labs are also provided in Python or Java. You may complete each lab in whichever language you prefer. This is the second course of the Developing Applications with Google Cloud series. After completing this course, enroll in the App Deployment, Debugging, and Performance course.
Google Cloud Fundamentals: Core Infrastructure introduces the important concepts and terminology you'll encounter when working with Google Cloud. Through videos and hands-on labs, the course introduces and compares many of Google Cloud's computing and storage services, along with important resource and policy management tools.
This course describes different types of computer vision use cases and then highlights different machine learning strategies for solving these use cases. The strategies vary from experimenting with pre-built ML models through pre-built ML APIs and AutoML Vision to building custom image classifiers using linear models, deep neural network (DNN) models, or convolutional neural network (CNN) models. The course shows how to improve a model's accuracy with augmentation, feature extraction, and fine-tuning hyperparameters while trying to avoid overfitting the data. The course also looks at practical issues that arise, for example when there is not enough data, and how to incorporate the latest research findings into different models. Learners get hands-on practice building and optimizing their own image classification models on a variety of public datasets in the labs.
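As a rough illustration of the kind of custom classifier such a course builds, here is a hedged Keras sketch of a small CNN with simple augmentation layers to reduce overfitting; the input size, class count, and datasets are placeholders, not the course's exact architecture.

```python
# Hedged sketch of a small convolutional image classifier with augmentation.
import tensorflow as tf

num_classes = 5  # placeholder for your dataset's label count

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(128, 128, 3)),
    # Augmentation layers help reduce overfitting on small datasets.
    tf.keras.layers.RandomFlip("horizontal"),
    tf.keras.layers.RandomRotation(0.1),
    tf.keras.layers.Rescaling(1.0 / 255),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(num_classes, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=10)  # supply your own datasets
```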
In this course, you will learn from ML engineers and trainers who work on state-of-the-art ML pipeline development here at Google Cloud. The first few modules cover TensorFlow Extended (TFX), Google's production machine learning platform, based on TensorFlow, for managing ML pipelines and metadata. You will learn about pipeline components and pipeline orchestration with TFX. You will also learn how to automate your pipeline through continuous integration and continuous deployment, and how to manage ML metadata. We then shift focus to discuss how to automate and reuse ML pipelines across multiple ML frameworks such as TensorFlow, PyTorch, scikit-learn, and XGBoost. You will also learn how to use another tool on Google Cloud, Cloud Composer, to orchestrate your continuous training pipelines. And finally, we will go over how to use MLflow to manage the complete machine learning lifecycle.
This course introduces participants to MLOps tools and best practices for deploying, evaluating, monitoring and operating production ML systems on Google Cloud. MLOps is a discipline focused on the deployment, testing, monitoring, and automation of ML systems in production. Machine Learning Engineering professionals use tools for continuous improvement and evaluation of deployed models. They work with (or can be) Data Scientists, who develop models, to enable velocity and rigor in deploying the best performing models.
Complete the introductory Prepare Data for ML APIs on Google Cloud skill badge course to demonstrate skills in the following: cleaning data with Dataprep by Trifacta, running data pipelines in Dataflow, creating clusters and running Apache Spark jobs in Dataproc, and calling ML APIs including the Cloud Natural Language API, Google Cloud Speech-to-Text API, and Video Intelligence API. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services and tests your ability to apply your knowledge in an interactive hands-on environment. Complete this skill badge course, and the final assessment challenge lab, to receive a skill badge that you can share with your network.
Complete the Build and Deploy Machine Learning Solutions with Vertex AI course to learn how to use Google Cloud's Vertex AI platform, AutoML, and custom training services to train, evaluate, tune, explain, and deploy machine learning models. This intermediate skill badge course is intended for professional data scientists and machine learning engineers. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services and tests your ability to apply your knowledge in an interactive hands-on environment. Complete this skill badge course, and the final assessment challenge lab, to receive a digital badge that you can share with your network.
In this course, you apply your knowledge of classification models and embeddings to build an ML pipeline that functions as a recommendation engine. This is the fifth and final course of the Advanced Machine Learning on Google Cloud series.
This course introduces the products and solutions to solve NLP problems on Google Cloud. Additionally, it explores the processes, techniques, and tools to develop an NLP project with neural networks by using Vertex AI and TensorFlow.
This course covers how to implement the various flavors of production ML systems— static, dynamic, and continuous training; static and dynamic inference; and batch and online processing. You delve into TensorFlow abstraction levels, the various options for doing distributed training, and how to write distributed training models with custom estimators. This is the second course of the Advanced Machine Learning on Google Cloud series. After completing this course, enroll in the Image Understanding with TensorFlow on Google Cloud course.
This course takes a real-world approach to the ML Workflow through a case study. An ML team faces several ML business requirements and use cases. The team must understand the tools required for data management and governance and consider the best approach for data preprocessing. The team is presented with three options to build ML models for two use cases. The course explains why they would use AutoML, BigQuery ML, or custom training to achieve their objectives.
This course explores the benefits of using Vertex AI Feature Store, how to improve the accuracy of ML models, and how to find which data columns make the most useful features. This course also includes content and labs on feature engineering using BigQuery ML, Keras, and TensorFlow.
This course covers building ML models with TensorFlow and Keras, improving the accuracy of ML models and writing ML models for scaled use.
The course begins with a discussion about data: how to improve data quality and perform exploratory data analysis. We describe Vertex AI AutoML and how to build, train, and deploy an ML model without writing a single line of code. You will understand the benefits of BigQuery ML. We then discuss how to optimize a machine learning (ML) model and how generalization and sampling can help assess the quality of ML models for custom training.
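To illustrate the BigQuery ML side of this workflow, here is a minimal sketch that trains and evaluates a model from Python; the dataset, table, and column names are placeholders, and linear regression is just one example model type.

```python
# Illustrative sketch of creating and evaluating a BigQuery ML model from Python.
from google.cloud import bigquery

client = bigquery.Client()  # uses your default project credentials

create_model_sql = """
CREATE OR REPLACE MODEL `your_dataset.sales_model`
OPTIONS (model_type = 'linear_reg', input_label_cols = ['label']) AS
SELECT feature_1, feature_2, label
FROM `your_dataset.training_table`
"""
client.query(create_model_sql).result()  # trains the model inside BigQuery

# Evaluate the trained model with ML.EVALUATE.
for row in client.query(
        "SELECT * FROM ML.EVALUATE(MODEL `your_dataset.sales_model`)"):
    print(dict(row))
```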
This course explores what ML is and what problems it can solve. The course also discusses best practices for implementing machine learning. You’re introduced to Vertex AI, a unified platform to quickly build, train, and deploy AutoML machine learning models. The course discusses the five phases of converting a candidate use case to be driven by machine learning, and why it’s important not to skip them. The course ends with a look at the biases that ML can amplify and how to recognize them.
This course is part 1 of a 3-course series on Serverless Data Processing with Dataflow. In this first course, we start with a refresher of what Apache Beam is and its relationship with Dataflow. Next, we talk about the Apache Beam vision and the benefits of the Beam Portability framework. The Beam Portability framework achieves the vision that a developer can use their favorite programming language with their preferred execution backend. We then show you how Dataflow allows you to separate compute and storage while saving money, and how identity and access management (IAM) tools interact with your Dataflow pipelines. Lastly, we look at how to implement the right security model for your use case on Dataflow.
Incorporating machine learning into data pipelines increases the ability to extract insights from data. This course covers ways machine learning can be included in data pipelines on Google Cloud. For little to no customization, this course covers AutoML. For more tailored machine learning capabilities, this course introduces Notebooks and BigQuery machine learning (BigQuery ML). Also, this course covers how to productionalize machine learning solutions by using Vertex AI.
Data pipelines typically fall under one of three paradigms: Extract and Load (EL); Extract, Load, and Transform (ELT); or Extract, Transform, and Load (ETL). This course describes which paradigm to use, and when, for batch data. Furthermore, this course covers several technologies on Google Cloud for data transformation, including BigQuery, executing Spark on Dataproc, pipeline graphs in Cloud Data Fusion, and serverless data processing with Dataflow. Learners get hands-on experience building data pipeline components on Google Cloud using Qwiklabs.
The two key components of any data pipeline are data lakes and warehouses. This course highlights use-cases for each type of storage and dives into the available data lake and warehouse solutions on Google Cloud in technical detail. Also, this course describes the role of a data engineer, the benefits of a successful data pipeline to business operations, and examines why data engineering should be done in a cloud environment. This is the first course of the Data Engineering on Google Cloud series. After completing this course, enroll in the Building Batch Data Pipelines on Google Cloud course.
Processing streaming data is becoming increasingly popular as streaming enables businesses to get real-time metrics on business operations. This course covers how to build streaming data pipelines on Google Cloud. Pub/Sub is described for handling incoming streaming data. The course also covers how to apply aggregations and transformations to streaming data using Dataflow, and how to store processed records to BigQuery or Bigtable for analysis. Learners get hands-on experience building streaming data pipeline components on Google Cloud by using QwikLabs.
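As a rough sketch of the kind of streaming pipeline described here, the following Beam Python snippet reads messages from Pub/Sub and writes them to BigQuery; the topic, table, and schema are placeholders, and the Dataflow runner and project flags are omitted for brevity.

```python
# Hedged sketch of a streaming pipeline: Pub/Sub -> transform -> BigQuery.
import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(streaming=True)  # add runner/project flags to run on Dataflow

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "ReadPubSub" >> beam.io.ReadFromPubSub(topic="projects/your-project/topics/events")
        | "ToDict" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "WriteBQ" >> beam.io.WriteToBigQuery(
            "your-project:your_dataset.events",
            schema="user_id:STRING,event_type:STRING,ts:TIMESTAMP",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )
```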
This course helps learners create a study plan for the PDE (Professional Data Engineer) certification exam. Learners explore the breadth and scope of the domains covered in the exam. Learners assess their exam readiness and create their individual study plan.
This course introduces the Google Cloud big data and machine learning products and services that support the data-to-AI lifecycle. It explores the processes, challenges, and benefits of building a big data pipeline and machine learning models with Vertex AI on Google Cloud.
These labs help you get started with the skills you need to develop and train ML models. At the end of each lab, you’ll have hands-on experience with one or more of Google Cloud’s powerful data tools. Complete this game to earn a badge, and you’ll be one step closer to completing the challenge. Race the clock to increase your score and watch your name rise on the leaderboard!
These labs help you get started with the skills you need to analyze sports-related data. At the end of each lab, you’ll have hands-on experience with one or more of Google Cloud’s powerful data tools. Complete this game to earn a badge, and you’ll be one step closer to completing the challenge. Race the clock to increase your score and watch your name rise on the leaderboard!
This is the second of two Quests of hands-on labs derived from the exercises from the book Data Science on Google Cloud Platform, 2nd Edition by Valliappa Lakshmanan, published by O'Reilly Media, Inc. In this second Quest, covering chapter 9 through the end of the book, you extend the skills practiced in the first Quest, and run full-fledged machine learning jobs with state-of-the-art tools and real-world data sets, all using Google Cloud tools and services.
Welcome to the Learn to Earn Cloud Data Challenge! These labs help you get started with data analysis skills. At the end of each lab, you’ll have hands-on experience with one or more of Google Cloud’s powerful data tools. Complete this game to earn a badge, and you’ll be one step closer to completing the challenge. Race the clock to increase your score and watch your name rise on the leaderboard!
Complete the intermediate Build a Data Warehouse with BigQuery skill badge course to demonstrate skills in the following: joining data to create new tables, troubleshooting joins, appending data with unions, creating date-partitioned tables, and working with JSON, arrays, and structs in BigQuery. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services and tests your ability to apply your knowledge in an interactive hands-on environment. Complete this skill badge course, and the final assessment challenge lab, to receive a skill badge that you can share with your network.
Welcome to the Learn To Earn Cloud Challenge data plus track! "The Google Cloud Certified Professional Data Engineer certification is associated with the highest paying salary in IT," according to the most recent Global Knowledge skills and salary report (published August 2021). Complete this game to earn the Data Plus game badge, and be eligible for a +bonus+ prize in the Learn to Earn Cloud Challenge. See "what's next" below for details and requirements; you’ll need to earn at least 2 additional challenge badges to qualify. You'll learn next-level data skills to add to your resume. Race the clock to increase your score and watch your name rise on the leaderboard. Good luck!
In this quest, you will gain hands-on experience on several topics in Google Workspace Administration including security, provisioning users and groups, managing applications, and managing Google Meet.
Welcome to the Learn To Earn Cloud Challenge security track! These eight labs give you the keys to understanding GCP's powerful security suite. At the end of each lab, you'll have hands-on experience with securing your cloud. Complete this game to earn the Security game badge, and you'll be one step closer to collecting all four badges (see "what's next" below for more information). Race the clock to increase your score and watch your name rise on the leaderboard. Good luck!
Welcome to the Learn To Earn Cloud Challenge architecture track! These eight labs give you a blueprint of GCP's building blocks. At the end of each lab, you'll have hands-on experience with another tool or service to add to your resume. Complete this game to earn the Architecture game badge, and you'll be one step closer to collecting all four Learn to Earn Cloud Challenge badges (see "what's next" below for more information). Race the clock to increase your score and watch your name rise on the leaderboard. Good luck!
Welcome to the Learn To Earn Cloud Challenge data track! These eight labs give you a deep dive into GCP's data universe. At the end of each lab, you'll have another in-demand skill to add to your list. Complete this game to earn the Data game badge, and you'll be one step closer to collecting all four Learn to Earn Cloud Challenge badges (see "what's next" below for more information). Race the clock to increase your score and watch your name rise on the leaderboard. Good luck!
Welcome to the Learn To Earn Cloud Challenge! These eight labs give you a quick hands-on introduction to eight different GCP tools and services. At the end of each lab, you'll have another skill to add to your list. Complete this game to earn the Essentials game badge, and you'll be one step closer to collecting all four Learn to Earn Cloud Challenge badges (see "what's next" below for more information). Race the clock to increase your score and watch your name rise on the leaderboard. Good luck!
Earn the advanced skill badge by completing the Use Machine Learning APIs on Google Cloud course, where you learn the basic features of the following machine learning and AI technologies: Cloud Vision API, Cloud Translation API, and Cloud Natural Language API. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services and tests your ability to apply your knowledge in an interactive hands-on environment. Complete this skill badge course, and the final assessment challenge lab, to receive a skill badge that you can share with your network.
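For orientation, here is an illustrative Python sketch that calls each of the three APIs named above; the image file path, sample text, and target language are placeholders, and each API must be enabled in your project for the calls to succeed.

```python
# Illustrative calls to the Cloud Vision, Translation, and Natural Language APIs.
from google.cloud import vision, translate_v2, language_v1

# Cloud Vision API: label an image.
vision_client = vision.ImageAnnotatorClient()
with open("photo.jpg", "rb") as f:  # placeholder image file
    image = vision.Image(content=f.read())
labels = vision_client.label_detection(image=image).label_annotations
print([label.description for label in labels])

# Cloud Translation API: translate a phrase to Spanish.
translate_client = translate_v2.Client()
print(translate_client.translate("Hello, world!", target_language="es")["translatedText"])

# Cloud Natural Language API: analyze sentiment of a sentence.
language_client = language_v1.LanguageServiceClient()
document = language_v1.Document(content="I love this product!",
                                type_=language_v1.Document.Type.PLAIN_TEXT)
print(language_client.analyze_sentiment(document=document).document_sentiment.score)
```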
In honor of the Launch of the Google Cloud Community and the Learning & Certification Hub (bit.ly/3gNtlMB), we’ve devised a fun little game to catapult your skills in BigQuery and give you a chance to win the First. Ever. Community. Learning. Challenge. (Pause for effect) Not only will participating in this game give you hands-on practice to boost key skills in BQ, but simply participating will earn you some sweet online badges. Plus, the winner of the game will be rewarded... handsomely. More details below... Complete every activity in this game and earn a Skill Badge which you can share across your network with pride. Complete every activity AND post a screenshot of your newly minted Skill Badge in the Learning Forum area of the Community (bit.ly/35LoLIB) to earn a special Community Badge which you can then flaunt over other community members shamelessly. The Community member at the top of the scoreboard at the time the game is closed will be rewarded with something objectively …
This advanced-level quest is unique amongst the other catalog offerings. The labs have been curated to give IT professionals hands-on practice with topics and services that appear in the Google Cloud Certified Professional Data Engineer certification. From BigQuery, to Dataprep, to Cloud Composer, this quest is composed of specific labs that will put your Google Cloud data engineering knowledge to the test. Be aware that while practice with these labs will increase your skills and abilities, you will need other preparation, too. The exam is quite challenging, and external studying, experience, and/or background in cloud data engineering is recommended. Looking for a hands-on challenge lab to demonstrate your skills and validate your knowledge? On completing this quest, enroll in and finish the additional challenge lab at the end of the Engineer Data in Google Cloud quest to receive an exclusive Google Cloud digital badge.
This is the first of two Quests of hands-on labs derived from the exercises in the book Data Science on Google Cloud Platform, 2nd Edition by Valliappa Lakshmanan, published by O'Reilly Media, Inc. In this first Quest, covering up through chapter 8, you are given the opportunity to practice all aspects of ingesting, preparing, processing, querying, exploring, and visualizing data sets using Google Cloud tools and services.
In this advanced-level quest, you will learn how to harness serious Google Cloud computing power to run big data and machine learning jobs. The hands-on labs will give you use cases, and you will be tasked with implementing big data and machine learning practices utilized by Google’s very own Solutions Architecture team. From running BigQuery analytics on tens of thousands of basketball games to training TensorFlow image classifiers, you will quickly see why Google Cloud is the go-to platform for running big data and machine learning jobs.