Azure Data Engineering - Build Data Ingestion Engine Project

BY
Udemy

Get a clear understanding of the methods and tools involved in building data ingestion pipelines using Azure Data Factory.

Mode

Online

Fees

₹ 649  ₹3,299

Quick Facts

Medium of instruction: English
Mode of learning: Self-study
Mode of delivery: Video and text based

Course overview

Azure Data Factory is a Microsoft cloud service on the Azure platform that enables data integration from a variety of sources. It is well suited to building extract-transform-load (ETL), extract-load-transform (ELT), and hybrid data integration pipelines. The Azure Data Factory for Beginners - Build Data Ingestion online certification was developed by David Charles Academy, a Big Data Engineer & Consultant at ABN AMRO, and is provided by Udemy.

The Azure Data Factory for Beginners - Build Data Ingestion online course comprises more than 12.5 hours of lectures supported by 6 downloadable resources. It is intended for those who want to learn data engineering methods for creating metadata-driven frameworks using Azure data engineering tools such as Azure Data Factory, Azure SQL, and Azure DevOps. The classes also cover the concepts and strategies involved in email notifications, event-driven ingestion, version management, data lakes, and more.

The highlights

  • Certificate of completion
  • Self-paced course
  • 12.5 hours of pre-recorded video content
  • 6 downloadable resources

Program offerings

  • Online course
  • Learning resources
  • 30-day money-back guarantee
  • Unlimited access
  • Accessible on mobile devices and TV

Course and certificate fees

Fees information
₹ 649  ₹3,299
Certificate availability

Yes

Certificate providing authority

Udemy

What you will learn

Data science knowledge

After completing the Azure Data Factory for Beginners - Build Data Ingestion certification course, participants will gain insight into the Azure platform for data engineering activities and a solid understanding of how Azure Data Factory and Azure Blob Storage are used for data ingestion. Participants will also explore the fundamentals of Azure Data Lake Storage and the concepts involved in metadata-driven and event-driven ingestion, along with strategies for version management, logic apps, email notifications, Azure DevOps, and data lakes.

The syllabus

Introduction - Build your first Azure Data Pipeline

  • Introduction to ADF
  • Requirements Discussion and Technical Architecture
  • Register a Free Azure Account
  • Create A Data Factory Resource
  • Create A Storage Account and Upload Data
  • Create Data Lake Gen 2 Storage Account
  • Download Storage Explorer
  • Create Your First Azure Pipeline
  • Closing Remarks
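The first pipeline in this module copies data from a Blob Storage account into a Data Lake Gen 2 account. A Data Factory pipeline ultimately compiles to a JSON document containing a Copy activity that references a source and a sink dataset. As a rough sketch of that shape (pipeline, dataset, and activity names here are hypothetical placeholders, not the course's actual artifacts):

```python
import json

# A minimal sketch of the JSON an ADF Copy pipeline compiles to.
# All names below are hypothetical illustrations.
pipeline = {
    "name": "PL_CopyBlobToDataLake",
    "properties": {
        "activities": [
            {
                "name": "CopyCustomerData",
                "type": "Copy",
                # Source dataset: a delimited file in Blob Storage
                "inputs": [{"referenceName": "DS_BlobSource",
                            "type": "DatasetReference"}],
                # Sink dataset: the raw zone of the Data Lake Gen 2 account
                "outputs": [{"referenceName": "DS_DataLakeRaw",
                             "type": "DatasetReference"}],
                "typeProperties": {
                    "source": {"type": "DelimitedTextSource"},
                    "sink": {"type": "DelimitedTextSink"},
                },
            }
        ]
    },
}

print(json.dumps(pipeline, indent=2))
```

In the Data Factory Studio UI you build this visually; the JSON view of the pipeline shows the equivalent structure.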

Metadata Driven Ingestion

  • Introduction - Metadata Driven Ingestion
  • High Level Plan
  • Create Active Directory User
  • Assign the Contributor Role to the User
  • Disable Security Defaults
  • Creating the Metadata Database
  • Install Azure Data Studio
  • Create Metadata Tables and Stored Procedures
  • Reconfigure Existing Data Factory Artifacts
  • Set Up Logic App for Handling Email Notifications
  • Modify the Data Factory Pipeline to Send an Email Notification
  • Create Linked Service for Metadata Database and Email Dataset
  • Create Utility Pipeline to Send Email Notifications
  • Explaining the Email Recipients Table
  • Explaining the Get Email Addresses Stored Procedure
  • Modify Ingestion Pipeline to use the Email Utility Pipeline
  • Tracking the Triggered Pipeline
  • Making the Email Notifications Dynamic
  • Making Logging of Pipeline Information Dynamic
  • Add a new way to log the main ingestion pipeline
  • Change the Logging of Pipelines to send Fail message only
  • Creating Dynamic Datasets
  • Reading from Source To Target Part 1
  • Reading from Source To Target Part 2
  • Explaining the Source To Target Stored Procedure
  • Add Orchestration Pipeline Part 1
  • Add Orchestration Pipeline Part 2
  • Fixing the Duplicating Batch Ingestions
  • Understanding the Pipeline Log and Related Tables
  • Understanding the GetBatch Stored Procedure
  • Understanding the Set Batch Status and GetRunID
  • Setting Up an Azure DevOps Git Repository
  • Publishing the Data Factory to Azure DevOps
  • Closing Remarks
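The core idea of this module is that a metadata database, rather than hard-coded pipelines, decides what gets ingested: the orchestration pipeline reads source-to-target rows, runs a copy per row, logs each outcome, and sends an email notification only on failure. A minimal plain-Python sketch of that control flow (the table and function names are illustrative assumptions, not the course's stored procedures):

```python
from dataclasses import dataclass


@dataclass
class SourceToTarget:
    # One row of a hypothetical source-to-target metadata table
    source_path: str
    target_path: str
    enabled: bool


def run_batch(rows, copy_fn, notify_fn):
    """Iterate the metadata rows, copy each enabled source,
    log the status of every row, and notify only on failure."""
    log = []
    for row in rows:
        if not row.enabled:
            log.append((row.source_path, "SKIPPED"))
            continue
        try:
            copy_fn(row.source_path, row.target_path)
            log.append((row.source_path, "SUCCEEDED"))
        except Exception as exc:
            log.append((row.source_path, "FAILED"))
            notify_fn(f"Ingestion failed for {row.source_path}: {exc}")
    return log


# Demo: one enabled row that succeeds, one disabled row that is skipped.
messages = []
batch_log = run_batch(
    [SourceToTarget("landing/customers.csv", "raw/customers.csv", True),
     SourceToTarget("landing/orders.csv", "raw/orders.csv", False)],
    copy_fn=lambda src, dst: None,   # stand-in for the Copy activity
    notify_fn=messages.append,        # stand-in for the Logic App email
)
```

In the course itself, the loop is a ForEach activity, the copy is a Copy activity with dynamic datasets, the log lands in the metadata database via stored procedures, and the notification goes through the email utility pipeline backed by a Logic App.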

Event Driven Ingestion

  • Introduction
  • Read From Azure Storage Plan
  • Create Finance Container and Upload Files
  • Create Source Dataset
  • Write To Data Lake - Raw Plan
  • Create Finance Container and Directories
  • Create Sink Dataset
  • Data Factory Pipeline Plan
  • Create Data Factory and Read Metadata
  • Add Filter By CSV
  • Add Dataset to Read Files
  • Add the For Each CSV File Activity and Test Ingestion
  • Adding the Event Based Trigger Plan
  • Enable the Event Grid Provider
  • Delete File and Add Event Based Trigger
  • Create Event Based Trigger
  • Publish Code to Main Branch and Start Trigger
  • Trigger Event Based Ingestion
  • Closing Remarks
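The "Add Filter By CSV" and "For Each CSV File" steps follow a common ADF pattern: a Get Metadata activity lists a container's child items, a Filter activity keeps only the CSV files, and a ForEach activity copies each one. The filtering logic can be sketched in plain Python (the listing below is invented sample data, not the course's finance files):

```python
def filter_csv_files(child_items):
    """Keep only the .csv files from a storage container listing,
    mimicking the Get Metadata -> Filter activity pattern."""
    return [
        item["name"]
        for item in child_items
        if item["type"] == "File" and item["name"].lower().endswith(".csv")
    ]


# Hypothetical child-items output of a Get Metadata activity
listing = [
    {"name": "balance_sheet.csv", "type": "File"},
    {"name": "notes.txt", "type": "File"},
    {"name": "archive", "type": "Folder"},
]
csv_files = filter_csv_files(listing)
```

In ADF the equivalent Filter expression would test each item's name and type; the event-based trigger added later in the module then fires this pipeline whenever a new blob lands in the container, instead of running it on a schedule.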
