Azure Data Factory for Beginners - Build Data Ingestion
Learn Azure Data Factory by building a Metadata-Driven Ingestion Framework, an industry-standard approach
IT and Software, Other IT and Software, Microsoft Azure
Lectures: 64
Resources: 3
Duration: 6 hours
Lifetime Access
30-day Money-Back Guarantee
Course Description
The main objective of this course is to help you learn data engineering techniques for building metadata-driven frameworks with Azure data engineering tools such as Azure Data Factory and Azure SQL.
Building frameworks is now an industry norm, and knowing how to visualize, design, plan, and implement data frameworks has become an important skill.
The framework that we are going to build together is referred to as the Metadata-Driven Ingestion Framework.
Data ingestion into the data lake from disparate source systems is a key requirement for a company that aspires to be data-driven, and a common, repeatable way to ingest that data is therefore a necessity.
A metadata-driven framework lets a company develop the system just once; it can then be adopted and reused by various business clusters without additional development, saving the business time and cost. Think of it as a plug-and-play system.
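To make the idea concrete, here is a minimal sketch of the kind of control table such a framework revolves around, assuming a SQL-based metadata store; every object and column name below is illustrative, not the exact schema built in the course.

```sql
-- Minimal sketch of a control (metadata) table for ingestion.
-- All names are hypothetical, for illustration only.
CREATE TABLE dbo.SourceMetadata
(
    SourceId          INT IDENTITY(1,1) PRIMARY KEY,
    SourceSystem      NVARCHAR(100) NOT NULL,  -- e.g. 'Finance' or 'CRM'
    SourceContainer   NVARCHAR(200) NOT NULL,  -- blob container holding the raw files
    SourceFilePattern NVARCHAR(200) NOT NULL,  -- e.g. 'finance_*.csv'
    TargetContainer   NVARCHAR(200) NOT NULL,  -- data lake zone to land the data in
    TargetFolder      NVARCHAR(200) NOT NULL,
    IsEnabled         BIT NOT NULL DEFAULT 1   -- switch a source on or off without redeploying
);

-- Onboarding a new source then becomes a single INSERT rather than a new pipeline:
INSERT INTO dbo.SourceMetadata
    (SourceSystem, SourceContainer, SourceFilePattern, TargetContainer, TargetFolder)
VALUES
    ('Finance', 'landing', 'finance_*.csv', 'raw', 'finance');
```

A single parameterised Data Factory pipeline can then loop over the enabled rows, which is what makes the framework plug-and-play.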
The first objective of the course is to onboard you onto the Azure Data Factory platform and help you assemble your first Azure Data Factory pipeline. Once you have a good grip on the Azure Data Factory development pattern, it becomes easier to apply the same pattern to onboard other sources and data sinks.
Once you are comfortable building a basic Azure Data Factory pipeline, the second objective is to build a fully fledged, working metadata-driven framework that makes the ingestion dynamic. Furthermore, we will build the framework in such a way that you can audit every batch orchestration and every individual pipeline run for business intelligence and operational monitoring.
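As a rough sketch of what auditing batch orchestrations and individual pipeline runs can look like, again assuming a SQL metadata database with illustrative (not course-exact) names:

```sql
-- Illustrative audit tables: one row per batch orchestration, one row per pipeline run.
CREATE TABLE dbo.BatchRunLog
(
    BatchRunId   INT IDENTITY(1,1) PRIMARY KEY,
    BatchName    NVARCHAR(100) NOT NULL,
    StartedAtUtc DATETIME2 NOT NULL DEFAULT SYSUTCDATETIME(),
    EndedAtUtc   DATETIME2 NULL,
    Status       NVARCHAR(20) NOT NULL DEFAULT 'Running'  -- Running / Succeeded / Failed
);

CREATE TABLE dbo.PipelineRunLog
(
    PipelineRunLogId INT IDENTITY(1,1) PRIMARY KEY,
    BatchRunId       INT NOT NULL REFERENCES dbo.BatchRunLog (BatchRunId),
    PipelineName     NVARCHAR(200) NOT NULL,
    AdfRunId         NVARCHAR(100) NULL,  -- Data Factory's own run id, if captured
    RowsCopied       BIGINT NULL,
    StartedAtUtc     DATETIME2 NOT NULL DEFAULT SYSUTCDATETIME(),
    EndedAtUtc       DATETIME2 NULL,
    Status           NVARCHAR(20) NOT NULL DEFAULT 'Running'
);
```

Stored procedures called at the start and end of each orchestration and pipeline can insert and update these rows, giving a queryable run history for reporting and monitoring.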
Goals
- Introduction to Azure Data Factory
- Unpack the requirements and technical architecture
- Create an Azure Data Factory Resource
- Create an Azure Blob Storage account
- Create an Azure Data Lake Gen 2 Storage account
- Learn how to use the Storage Explorer
- Create Your First Azure Pipeline
- Metadata Driven Ingestion
- Unpack the theory on Metadata Driven Ingestion
- Describe the high-level plan for building the user
- Create a dedicated Active Directory user and assign the appropriate permissions
- Use Azure Data Studio
- Create the metadata-driven database (tables and a T-SQL stored procedure); a sketch of such a procedure appears after this list
- Event-Driven Ingestion
- Enable the Event Grid resource provider
- Use the GetMetadata Activity
- Use the Filter Activity
- Create Event-Based Triggers
- Create and Merge new DevOps Branches
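
As referenced in the metadata-driven ingestion goals above, the kind of T-SQL stored procedure a pipeline's Lookup activity can call might look like the sketch below. It assumes the illustrative SourceMetadata table from earlier; the names are assumptions, not the course's exact code.

```sql
-- Hypothetical stored procedure returning the enabled sources for a given system.
-- A Data Factory Lookup activity can call this and feed the result set into a ForEach loop.
CREATE OR ALTER PROCEDURE dbo.GetEnabledSources
    @SourceSystem NVARCHAR(100)
AS
BEGIN
    SET NOCOUNT ON;

    -- One row per enabled source; each row's columns become parameters
    -- for a single, reusable copy pipeline.
    SELECT  SourceId,
            SourceContainer,
            SourceFilePattern,
            TargetContainer,
            TargetFolder
    FROM    dbo.SourceMetadata
    WHERE   SourceSystem = @SourceSystem
      AND   IsEnabled = 1;
END;
```

This is the piece that makes the ingestion dynamic: adding or disabling a source changes what the procedure returns, with no change to the pipeline itself.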
Prerequisites
- Basic PC / Laptop

Curriculum
Check out the detailed breakdown of what’s inside the course
First Azure Pipeline
9 Lectures
- Introduction 05:05
- Register Free Azure Account 04:14
- Introduction to Azure Data Factory 03:43
- Discuss Requirement and Technical Architecture 02:09
- Create A Data Factory Resource 08:39
- Create Storage Account and Upload Data 07:32
- Create Data Lake Gen 2 Storage Account 05:51
- Download Storage Explorer 04:28
- Create Your First Azure Pipeline 16:28
Metadata Driven Ingestion
35 Lectures

Event Driven Ingestion
19 Lectures

Resources
1 Lecture

Instructor Details

Thulani Mngadi
Course Certificate
Use your certificate to make a career change or to advance in your current career.
