Data Lake Modernization on Google Cloud: Migrate Workflows
Welcome to Migrate Workflows, where we discuss how to migrate Spark and Hadoop tasks and workflows to Google Cloud.
Course Information
Objectives
- Describe options for migrating Spark to Google Cloud
- Migrate Spark jobs directly to Dataproc ("Lift and Shift")
- Optimize Spark jobs to run on Google Cloud
- Refactor Spark jobs to use native Google Cloud services
- Convert a Spark job to a serverless implementation using Cloud Functions
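As an illustration of the "Lift and Shift" path listed above, the sketch below submits an existing Spark job unchanged to a Dataproc cluster using the gcloud CLI. The cluster name, region, bucket, jar path, and main class are placeholder assumptions for illustration, not values from the course:

```shell
# Create a small Dataproc cluster (name, region, and size are
# placeholder assumptions; adjust for your project).
gcloud dataproc clusters create example-cluster \
    --region=us-central1 \
    --num-workers=2

# Submit an existing Spark job as-is ("Lift and Shift").
# The jar location and main class are hypothetical stand-ins
# for your own packaged Spark application.
gcloud dataproc jobs submit spark \
    --cluster=example-cluster \
    --region=us-central1 \
    --class=org.example.MySparkJob \
    --jars=gs://example-bucket/my-spark-job.jar \
    -- arg1 arg2
```

Because the job code itself is untouched, this path is typically the fastest way onto Google Cloud; the later objectives (optimizing and refactoring) build on it.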
Prerequisites
Required:
- Have completed the Data Engineering on Google Cloud training
- Be a Google Cloud Certified Professional Data Engineer or have equivalent expertise in data engineering
- Have access to Cloud Connect – Partners
Recommended:
- Experience building data processing pipelines
- Experience with Apache Beam and Apache Hadoop
- Java or Python programming expertise
Organizational requirements: The Cloud Partner organization must have previously implemented at least one data warehouse solution on any data warehouse platform.
Audience
Data Warehouse Deployment Engineers, Data Warehouse Consultants, Data Warehouse Architects, Technical Project Leads, Technical Project Managers, and Data/Business Analysts.
Available Languages
English