
Master Data Engineering using GCP Data Analytics 

  • Offered by UDEMY

Overview

Learn GCS for Data Lake, BigQuery for Data Warehouse, GCP Dataproc and Databricks for Big Data Pipelines

  • Duration: 20 hours
  • Total fee: 449
  • Mode of learning: Online
  • Official website: Udemy (external link)
  • Credential: Certificate

Table of contents
  • Overview
  • Highlights
  • Course Details
  • Curriculum
  • Faculty

Highlights

  • Earn a certificate after completion of the course
  • 30-day money-back guarantee
  • Full lifetime access
  • Access on mobile and TV

Course details

Who should do this course?

  • Beginner or Intermediate Data Engineers who want to learn GCP Analytics Services for Data Engineering
  • Intermediate Application Engineers who want to explore Data Engineering using GCP Analytics Services
  • Data and Analytics Engineers who want to learn Data Engineering using GCP Analytics Services
  • Testers who want to learn key skills to test Data Engineering applications built using GCP Analytics Services

What are the course deliverables?
  • Data Engineering leveraging Services under GCP Data Analytics
  • Setup Development Environment using Visual Studio Code on Windows
  • Building Data Lake using GCS
  • Process Data in the Data Lake using Python and Pandas
  • Build Data Warehouse using Google BigQuery
  • Loading Data into Google BigQuery tables using Python and Pandas (see the sketch after this list)
  • Setup Development Environment using Visual Studio Code on Google Dataproc with Remote Connection
  • Big Data Processing or Data Engineering using Google Dataproc
  • Run Spark SQL based applications as Dataproc Jobs using Commands
  • Build Spark SQL based ELT Data Pipelines using Google Dataproc Workflow Templates
  • Run or Instantiate ELT Data Pipelines or Dataproc Workflow Template using gcloud dataproc commands
  • Big Data Processing or Data Engineering using Databricks on GCP
  • Integration of GCS and Databricks on GCP
  • Build and Run Spark based ELT Data Pipelines using Databricks Workflows on GCP
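To make the deliverables concrete, here is a minimal sketch of the "Loading Data into Google BigQuery tables using Python and Pandas" step. The project, dataset, and table names are placeholders, and it assumes the google-cloud-bigquery and pyarrow packages are installed with default GCP credentials configured; the course builds this out in far more detail.

    # Minimal sketch: load a Pandas DataFrame into a BigQuery table.
    # "my-project.retail.orders" is a placeholder table id.
    import pandas as pd
    from google.cloud import bigquery

    client = bigquery.Client()  # uses application default credentials/project

    df = pd.DataFrame({"order_id": [1, 2], "amount": [25.0, 40.5]})

    job = client.load_table_from_dataframe(df, "my-project.retail.orders")
    job.result()  # block until the load job completes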
More about this course

This course is all about building Data Pipelines that move data from multiple sources into Data Lakes or Data Warehouses, and then from those Data Lakes or Data Warehouses to downstream systems.

As part of this course, you will walk through how to build Data Engineering Pipelines using the GCP Data Analytics stack.

It includes services such as Google Cloud Storage, Google BigQuery, GCP Dataproc, Databricks on GCP, and many more.

Curriculum

Introduction to Data Engineering using GCP Data Analytics

  • Introduction to Data Engineering using GCP Data Analytics
  • Prerequisite for Data Engineering using GCP Data Analytics

 

Setup Environment for Data Engineering using GCP Data Analytics

  • Introduction to Setup Environment for Data Engineering using GCP Data Analytics
  • Setup VS Code Workspace for Data Engineering on GCP

 

Getting Started with GCP for Data Engineering using GCP Data Analytics

  • Introduction to Getting Started with GCP
  • Pre-requisite Skills to Sign up for the course on GCP Data Analytics

 

Setting up Data Lake using Google Cloud Storage

  • Getting Started with Google Cloud Storage or GCS
  • Overview of Google Cloud Storage or GCS Web UI
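As a taste of what building a data lake on GCS looks like in code, the sketch below uploads one file with the google-cloud-storage Python client; the bucket and object names are placeholders, not the course's exact values.

    # Minimal sketch: upload a local file into a GCS data lake bucket.
    # Bucket name and object path are placeholders.
    from google.cloud import storage

    client = storage.Client()
    bucket = client.bucket("my-data-lake-bucket")
    blob = bucket.blob("landing/orders/part-00000.csv")
    blob.upload_from_filename("data/retail_db/orders/part-00000.csv")
    print(f"Uploaded to gs://{bucket.name}/{blob.name}")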

 

Setup Postgres Database using Cloud SQL

  • Overview of GCP Cloud SQL
  • Setup Postgres Database Server using GCP Cloud SQL
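Once the Postgres instance is running on Cloud SQL, querying it from Python is ordinary Postgres work. The sketch below uses psycopg2 against the instance's public IP with placeholder credentials; in practice the Cloud SQL Auth Proxy or connector is often preferred, and the course may take a different route.

    # Minimal sketch: query a Cloud SQL Postgres database with psycopg2.
    # Host IP, database, user, and password are placeholders.
    import psycopg2

    conn = psycopg2.connect(
        host="34.123.45.67",     # Cloud SQL instance public IP (placeholder)
        dbname="retail_db",
        user="retail_user",
        password="change-me",
    )
    with conn, conn.cursor() as cur:
        cur.execute("SELECT count(*) FROM orders")
        print(cur.fetchone()[0])
    conn.close()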

 

ELT Data Pipelines using Databricks on GCP

  • Overview of Databricks Workflows
  • Pass Arguments to Databricks Python Notebooks
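Passing arguments to Databricks Python notebooks is typically done through notebook widgets, roughly as sketched below; the parameter name is a placeholder, and dbutils is only available inside a Databricks notebook.

    # Minimal sketch: read a workflow parameter inside a Databricks notebook.
    # "run_date" is a placeholder parameter name supplied by the Databricks job.
    dbutils.widgets.text("run_date", "2024-01-01")   # declare widget + default
    run_date = dbutils.widgets.get("run_date")       # value passed at run time
    print(f"Processing data for {run_date}")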

 

Integration of Spark on Google Dataproc and BigQuery

  • Review Development Environment with VS Code using Dataproc Cluster
  • Validate Google BigQuery Integration with Python on Dataproc
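The Spark-to-BigQuery integration covered here generally relies on the spark-bigquery connector available on Dataproc clusters; a minimal sketch with a placeholder table id:

    # Minimal sketch: read a BigQuery table from PySpark on Dataproc using the
    # spark-bigquery connector (pre-installed on Dataproc or added via --jars).
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("bq-read-demo").getOrCreate()

    orders = (
        spark.read.format("bigquery")
        .option("table", "my-project.retail.orders")  # placeholder table id
        .load()
    )
    orders.printSchema()
    print(orders.count())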

 

Data Pipeline Orchestration using Google Cloud Composer

  • Create Airflow or Cloud Composer Environment
  • Review Google Cloud Composer Environment
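Cloud Composer is managed Apache Airflow, so orchestration boils down to authoring DAG files. The sketch below is a minimal daily DAG with a placeholder task, assuming Airflow 2.x as provided by current Composer environments.

    # Minimal sketch: a daily Airflow DAG that Cloud Composer can schedule.
    # The DAG id and bash command are placeholders for real pipeline steps.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="daily_orders_pipeline",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        load_orders = BashOperator(
            task_id="load_orders",
            bash_command="echo 'run the ELT step here'",
        )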

 

Data Pipelines using DBT, Airflow and Google BigQuery

  • Overview of Data Landscape of Large Enterprise
  • DBT High Level Architecture

 


Faculty details

  • Durga Viswanatha Raju Gadiraju: 20+ years of experience in executing complex projects using a vast array of technologies, including Big Data and the Cloud.
  • Pratik Kumar
