This is an introductory-level microlearning course aimed at explaining what generative AI is, how it is used, and how it differs from traditional machine learning methods. It also covers the Google tools that can help you develop your own gen AI apps.
In the last installment of the Dataflow course series, we will introduce the components of the Dataflow operational model. We will examine tools and techniques for troubleshooting and optimizing pipeline performance. We will then review testing, deployment, and reliability best practices for Dataflow pipelines. We will conclude with a review of Templates, which make it easy to scale Dataflow pipelines to organizations with hundreds of users. These lessons will help ensure that your data platform is stable and resilient to unanticipated circumstances.
In this second installment of the Dataflow course series, we are going to be diving deeper into developing pipelines using the Beam SDK. We start with a review of Apache Beam concepts. Next, we discuss processing streaming data using windows, watermarks, and triggers. We then cover options for sources and sinks in your pipelines, schemas to express your structured data, and how to do stateful transformations using the State and Timer APIs. We move on to reviewing best practices that help maximize your pipeline performance. Towards the end of the course, we introduce SQL and DataFrames as ways to represent your business logic in Beam, and show how to iteratively develop pipelines using Beam notebooks.
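For a rough sense of the windowing and triggering topics mentioned above, here is a minimal sketch using the Apache Beam Python SDK; it is not taken from the course materials, and the Pub/Sub topic name, window size, and early-firing delay are assumptions chosen for illustration:

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.transforms import window, trigger

# Streaming mode is required for unbounded sources such as Pub/Sub.
options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    (
        p
        # Hypothetical topic name used only for this sketch.
        | "ReadEvents" >> beam.io.ReadFromPubSub(topic="projects/my-project/topics/events")
        | "KeyByPayload" >> beam.Map(lambda msg: (msg.decode("utf-8"), 1))
        | "FixedWindows" >> beam.WindowInto(
            window.FixedWindows(60),  # 60-second event-time windows
            # Emit early results every 30s of processing time, then a final
            # result when the watermark passes the end of the window.
            trigger=trigger.AfterWatermark(early=trigger.AfterProcessingTime(30)),
            accumulation_mode=trigger.AccumulationMode.DISCARDING,
        )
        | "CountPerKey" >> beam.combiners.Count.PerKey()
        | "Print" >> beam.Map(print)
    )
```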
Integrating machine learning into data pipelines increases your ability to extract insights from data. This course covers the ways machine learning can be included in data pipelines on Google Cloud. For little to no customization, the course covers AutoML. For more tailored machine learning capabilities, the course introduces Notebooks and BigQuery Machine Learning (BigQuery ML). The course also explains how to productionize machine learning solutions using Vertex AI.
Processing streaming data is becoming increasingly popular, as streaming enables businesses to get real-time metrics on business operations. This course covers how to build streaming data pipelines on Google Cloud. Pub/Sub is introduced as the tool for handling incoming streaming data. The course also explains how to apply aggregations and transformations to streaming data using Dataflow, and how to store processed records in BigQuery or Bigtable for analysis. Learners get hands-on experience building streaming data pipeline components on Google Cloud using Qwiklabs.
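As an illustrative sketch of the Pub/Sub-to-Dataflow-to-BigQuery pattern this course teaches (the subscription, table, schema, and field names below are hypothetical, not from the course):

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.transforms import window

options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    (
        p
        # Hypothetical subscription name.
        | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
            subscription="projects/my-project/subscriptions/orders-sub")
        | "Parse" >> beam.Map(json.loads)
        | "KeyByProduct" >> beam.Map(lambda e: (e["product_id"], e["amount"]))
        | "Window" >> beam.WindowInto(window.FixedWindows(60))  # per-minute aggregates
        | "SumPerProduct" >> beam.CombinePerKey(sum)
        | "ToRow" >> beam.Map(lambda kv: {"product_id": kv[0], "total": kv[1]})
        # Hypothetical BigQuery table and schema.
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "my-project:sales.per_minute_totals",
            schema="product_id:STRING,total:FLOAT",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```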
Data pipelines typically fall into one of the EL (extract and load), ELT (extract, load, and transform), or ETL (extract, transform, and load) paradigms. This course describes which paradigm should be used, and when, for batch data. It also covers several technologies on Google Cloud for data transformation, including BigQuery, running Spark on Dataproc, pipeline graphs in Cloud Data Fusion, and serverless data processing with Dataflow. Learners get hands-on experience building data pipeline components on Google Cloud using Qwiklabs.
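A minimal batch ETL sketch with the Beam Python SDK, assuming a hypothetical CSV export bucket and BigQuery table, gives a sense of the extract-transform-load paradigm the course contrasts with EL and ELT:

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_csv(line):
    """Turn a 'user_id,country,amount' line into a BigQuery-ready dict."""
    user_id, country, amount = line.split(",")
    return {"user_id": user_id, "country": country, "amount": float(amount)}


with beam.Pipeline(options=PipelineOptions()) as p:
    (
        p
        # Extract: hypothetical CSV exports in Cloud Storage.
        | "Extract" >> beam.io.ReadFromText("gs://my-bucket/exports/*.csv",
                                            skip_header_lines=1)
        # Transform: parse and type the records in the pipeline.
        | "Transform" >> beam.Map(parse_csv)
        # Load: hypothetical BigQuery destination table.
        | "Load" >> beam.io.WriteToBigQuery(
            "my-project:analytics.purchases",
            schema="user_id:STRING,country:STRING,amount:FLOAT",
            write_disposition=beam.io.BigQueryDisposition.WRITE_TRUNCATE,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )
```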
The two key components of any data pipeline are data lakes and data warehouses. This course highlights the use cases for each type of storage and dives into the technical details of the data lake and data warehouse solutions available on Google Cloud. It also describes the role of a data engineer, the benefits of a successful data pipeline for business operations, and why data engineering should be done in a cloud environment. This is the first course in the Data Engineering on Google Cloud series. After completing this course, enroll in the Building Batch Data Pipelines on Google Cloud course.
This course introduces the Google Cloud big data and machine learning products and services that support the data-to-AI lifecycle. It explores the processes, challenges, and benefits of building a big data pipeline and machine learning models with Vertex AI on Google Cloud.
This course is part 1 of a 3-course series on Serverless Data Processing with Dataflow. In this first course, we start with a refresher of what Apache Beam is and its relationship with Dataflow. Next, we talk about the Apache Beam vision and the benefits of the Beam Portability framework. The Beam Portability framework achieves the vision that a developer can use their favorite programming language with their preferred execution backend. We then show you how Dataflow allows you to separate compute and storage while saving money, and how identity and access management tools interact with your Dataflow pipelines. Lastly, we look at how to implement the right security model for your use case on Dataflow.
This course helps learners create a study plan for the PDE (Professional Data Engineer) certification exam. Learners explore the breadth and scope of the domains covered in the exam. Learners assess their exam readiness and create their individual study plan.
The Google Cloud Computing Foundations course gives learners with little to no background in cloud computing a detailed overview of concepts covering cloud basics, big data, and machine learning, and where and how Google Cloud fits in. By the end of the course, participants will be able to describe cloud computing, big data, and machine learning concepts and demonstrate hands-on skills. This course is part of the Google Cloud Computing Foundations series. The courses should be completed in the following order: Google Cloud Computing Foundations: Cloud Computing Fundamentals - Locales; Google Cloud Computing Foundations: Infrastructure in Google Cloud - Locales; Google Cloud Computing Foundations: Networking and Security in Google Cloud - Locales; Google Cloud Computing Foundations: Data, ML, and AI in Google Cloud - Locales. This second course examines the implementation of models …
Earn a skill badge by completing the Set Up an App Dev Environment on Google Cloud course, where you learn how to build and connect storage-centric cloud infrastructure using the basic capabilities of the following technologies: Cloud Storage, Identity and Access Management, Cloud Functions, and Pub/Sub. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services after you have tested your ability to apply your knowledge in an interactive hands-on environment. Complete this course and the final challenge lab to receive a skill badge that you can share with your network.
Organizations of all sizes are embracing the power and flexibility of the cloud to transform how they operate. However, managing and scaling cloud resources effectively can be a complex task. Scaling with Google Cloud Operations explores the fundamental concepts of modern operations, reliability, and resilience in the cloud, and how Google Cloud can help support these efforts. Part of the Cloud Digital Leader learning path, this course aims to help individuals grow in their role and build the future of their business.
Many traditional enterprises use legacy systems and applications that often struggle to achieve the scale and speed needed to meet modern customer expectations. Business leaders and IT decision makers constantly have to choose between maintenance of legacy systems and investing in innovative new products and services. This course explores the challenges of an outdated IT infrastructure and how businesses can modernize it using cloud technology. It begins by exploring the different compute options available in the cloud and the benefits of each, before turning to application modernization and Application Programming Interfaces (APIs). The course also considers a range of Google Cloud solutions that can help businesses to better develop and manage their systems, such as Compute Engine, App Engine, and Apigee. This is the third course in the Cloud Digital Leader series. At the end of this course, enroll in the Understanding Google Cloud Security and Operations course.
In this course you will learn how to harness serious Google Cloud power and infrastructure. The hands-on labs will give you use cases, and you will be tasked with implementing scaling practices utilized by Google’s very own Solutions Architecture team. From developing enterprise-grade load balancing and autoscaling to building continuous delivery pipelines, Google Cloud Solutions I: Scaling your Infrastructure will teach you best practices for taking your Google Cloud projects to the next level.
Complete the intermediate Develop Serverless Applications on Cloud Run skill badge to demonstrate skills in the following: integrating Cloud Run with Cloud Storage for data management, architecting resilient asynchronous systems using Cloud Run and Pub/Sub, constructing REST API gateways powered by Cloud Run, and building and deploying services on Cloud Run. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services and tests your ability to apply your knowledge in an interactive hands-on environment. Complete this skill badge course and the final assessment challenge lab to receive a skill badge that you can share with your network.
Complete the intermediate Engineer Data for Predictive Modeling with BigQuery ML skill badge to demonstrate skills in the following: building data transformation pipelines to BigQuery using Dataprep by Trifacta; using Cloud Storage, Dataflow, and BigQuery to build extract, transform, and load (ETL) workflows; and building machine learning models using BigQuery ML. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services and tests your ability to apply your knowledge in an interactive hands-on environment. Complete the skill badge course, and final assessment challenge lab, to receive a digital badge that you can share with your network.
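For a sense of what the BigQuery ML portion might involve, here is a minimal sketch using the google-cloud-bigquery Python client; the project, dataset, model, and column names are hypothetical and not part of the skill badge labs:

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project ID

# Train a logistic regression model directly in BigQuery with BigQuery ML.
create_model_sql = """
CREATE OR REPLACE MODEL `my-project.ecommerce.purchase_model`
OPTIONS (model_type = 'logistic_reg', input_label_cols = ['purchased']) AS
SELECT
  pageviews,
  time_on_site,
  purchased
FROM `my-project.ecommerce.training_data`
"""
client.query(create_model_sql).result()

# Evaluate the trained model with ML.EVALUATE.
evaluation = client.query(
    "SELECT * FROM ML.EVALUATE(MODEL `my-project.ecommerce.purchase_model`)"
).result()
for row in evaluation:
    print(dict(row))
```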
Complete the intermediate Build Infrastructure with Terraform on Google Cloud skill badge to demonstrate skills in the following: Infrastructure as Code (IaC) principles using Terraform, provisioning and managing Google Cloud resources with Terraform configurations, effective state management (local and remote), and modularizing Terraform code for reusability and organization. Skill badges validate your practical knowledge on specific products through hands-on labs and challenge assessments. Earn a badge by completing a course or jump straight into the challenge lab to get your badge today. Badges prove your proficiency, enhance your professional profile, and ultimately lead to increased career opportunities. Visit your profile to track badges you’ve earned.
Complete the intermediate Build a Data Warehouse with BigQuery skill badge to demonstrate skills in the following: joining data to create new tables, troubleshooting joins, appending data with unions, creating date-partitioned tables, and working with JSON, arrays, and structs in BigQuery. A skill badge is an exclusive digital badge issued by Google Cloud in recognition of your proficiency with Google Cloud products and services and tests your ability to apply your knowledge in an interactive hands-on environment. Complete the skill badge course, and final assessment challenge lab, to receive a digital badge that you can share with your network. For practice with BigQuery fundamentals (including working with the console and command line), complete the course titled BigQuery Basics for Data Analysts.
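As a small illustration of two of the listed skills, the following sketch (with hypothetical project, dataset, and column names) creates a date-partitioned table and flattens an array of structs with UNNEST, using the BigQuery Python client:

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project ID

# Create a table partitioned by the order_date DATE column.
client.query("""
CREATE TABLE IF NOT EXISTS `my-project.retail.orders_partitioned`
PARTITION BY order_date AS
SELECT * FROM `my-project.retail.orders`
""").result()

# Flatten an ARRAY of STRUCTs (line items) with UNNEST.
rows = client.query("""
SELECT o.order_id, item.sku, item.quantity
FROM `my-project.retail.orders_partitioned` AS o, UNNEST(o.line_items) AS item
WHERE o.order_date = '2024-01-01'
""").result()
for row in rows:
    print(row.order_id, row.sku, row.quantity)
```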