
Henry Rausch

Member since 2024

Gold League

Points: 3405
Transformer Models and BERT Model Earned Jan 1, 2025 EST
Deploy Kubernetes Applications on Google Cloud Earned Jan 1, 2025 EST
Encoder-Decoder Architecture Earned Nov 27, 2024 EST
Attention Mechanism Earned Nov 26, 2024 EST
Introduction to Image Generation Earned Nov 26, 2024 EST

This course introduces you to the Transformer architecture and the Bidirectional Encoder Representations from Transformers (BERT) model. You learn about the main components of the Transformer architecture, such as the self-attention mechanism, and how it is used to build the BERT model. You also learn about the different tasks that BERT can be used for, such as text classification, question answering, and natural language inference. This course is estimated to take approximately 45 minutes to complete.
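The self-attention mechanism mentioned above lets every token weigh every other token in the sequence. As a rough illustration (not the course's own material), here is a minimal NumPy sketch of single-head scaled dot-product self-attention with hypothetical, untrained projection matrices:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a token matrix X (seq_len, d_model)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)     # pairwise token affinities
    weights = softmax(scores, axis=-1)  # each row is a distribution over tokens
    return weights @ V, weights

# toy example: 4 tokens, model dimension 8, random (untrained) weights
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out, weights = self_attention(X, Wq, Wk, Wv)
```

Each row of `weights` sums to 1, so every output token is a convex combination of the value vectors of all input tokens; BERT stacks many such (multi-head) layers.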


Complete the intermediate Deploy Kubernetes Applications on Google Cloud skill badge to demonstrate skills in the following: configuring and building Docker container images, creating and managing Google Kubernetes Engine (GKE) clusters, utilizing kubectl for efficient cluster management, and deploying Kubernetes applications with robust continuous delivery (CD) practices. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services; it tests your ability to apply your knowledge in an interactive hands-on environment. Complete this skill badge course and the final assessment challenge lab to receive a skill badge that you can share with your network.


This course gives you a synopsis of the encoder-decoder architecture, which is a powerful and prevalent machine learning architecture for sequence-to-sequence tasks such as machine translation, text summarization, and question answering. You learn about the main components of the encoder-decoder architecture and how to train and serve these models. In the corresponding lab walkthrough, you'll code a simple implementation of the encoder-decoder architecture in TensorFlow from scratch, applied to poetry generation.
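The core idea of the architecture is that an encoder compresses the input sequence into a fixed-size context vector, and a decoder unrolls output tokens from that context. This is a toy NumPy sketch with hypothetical, randomly initialized weights (untrained, shapes only), not the course's TensorFlow lab code:

```python
import numpy as np

rng = np.random.default_rng(42)

def rnn_step(x, h, Wx, Wh, b):
    # one vanilla RNN step: new hidden state from current input and previous state
    return np.tanh(x @ Wx + h @ Wh + b)

d_in, d_h, d_out = 6, 10, 6
enc_Wx, enc_Wh, enc_b = rng.normal(size=(d_in, d_h)), rng.normal(size=(d_h, d_h)), np.zeros(d_h)
dec_Wx, dec_Wh, dec_b = rng.normal(size=(d_out, d_h)), rng.normal(size=(d_h, d_h)), np.zeros(d_h)
W_out = rng.normal(size=(d_h, d_out))

# --- encoder: fold the whole input sequence into one context vector ---
src = rng.normal(size=(5, d_in))   # 5 input token embeddings
h = np.zeros(d_h)
for x in src:
    h = rnn_step(x, h, enc_Wx, enc_Wh, enc_b)
context = h                        # fixed-size summary of the source sequence

# --- decoder: generate outputs step by step, feeding each one back in ---
y = np.zeros(d_out)                # stand-in for a start-of-sequence token
h = context
outputs = []
for _ in range(3):                 # generate 3 output tokens
    h = rnn_step(y, h, dec_Wx, dec_Wh, dec_b)
    y = h @ W_out
    outputs.append(y)
outputs = np.stack(outputs)
```

In a trained model the decoder's outputs would be projected to vocabulary logits; here the loop only illustrates the encode-then-decode data flow.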


This course will introduce you to the attention mechanism, a powerful technique that allows neural networks to focus on specific parts of an input sequence. You will learn how attention works, and how it can be used to improve the performance of a variety of machine learning tasks, including machine translation, text summarization, and question answering. This course is estimated to take approximately 45 minutes to complete.
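"Focusing on specific parts of an input sequence" can be made concrete with dot-product attention: a decoder query scores every encoder position and takes a weighted average. A minimal NumPy sketch with made-up dimensions (assumed for illustration):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(query, keys, values):
    """Dot-product attention: weight each input position by its match to the query."""
    scores = keys @ query / np.sqrt(len(query))
    weights = softmax(scores)      # distribution over input positions
    return weights @ values, weights

rng = np.random.default_rng(1)
enc_states = rng.normal(size=(7, 16))  # 7 encoder positions, 16-dim states
query = rng.normal(size=16)            # current decoder state
ctx, w = attend(query, enc_states, enc_states)
```

`w` tells you which input positions the model "looked at" for this step, and `ctx` is the resulting context vector; in translation, `w` typically concentrates on the source words being translated at that step.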


This course introduces diffusion models, a family of machine learning models that have recently shown promise in the image generation space. Diffusion models draw inspiration from physics, specifically thermodynamics. Over the last few years, diffusion models have become popular in both research and industry. They underpin many state-of-the-art image generation models and tools on Google Cloud. This course introduces you to the theory behind diffusion models and how to train and deploy them on Vertex AI.
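The thermodynamics-inspired part is the forward process: data is gradually destroyed by Gaussian noise under a fixed schedule, and the model is trained to reverse it. A small NumPy sketch of the closed-form forward noising step (the schedule values are assumed for illustration):

```python
import numpy as np

# assumed linear beta (noise) schedule over T steps
T = 1000
betas = np.linspace(1e-4, 0.02, T)
alphas = 1.0 - betas
alpha_bar = np.cumprod(alphas)     # cumulative signal-retention factor

def q_sample(x0, t, rng):
    """Sample x_t ~ q(x_t | x_0) in closed form:
    x_t = sqrt(alpha_bar_t) * x0 + sqrt(1 - alpha_bar_t) * eps."""
    eps = rng.normal(size=x0.shape)
    return np.sqrt(alpha_bar[t]) * x0 + np.sqrt(1.0 - alpha_bar[t]) * eps

rng = np.random.default_rng(0)
x0 = rng.normal(size=(8, 8))       # toy "image"
x_early = q_sample(x0, 10, rng)    # still close to the original
x_late = q_sample(x0, T - 1, rng)  # almost pure noise
```

By the last step nearly all of the signal is gone (`alpha_bar[-1]` is tiny), which is why the reverse, learned process can start generation from pure Gaussian noise.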
