Data Pipelines with Snowflake and Streamlit
Use Snowflake to engineer Kaggle and Google Trends data with Python procedures and tasks
Programming, Database Development and Design, SQL
Lectures: 40
Resources: 9
Duration: 5 hours
Lifetime Access

30-day Money-Back Guarantee
Course Description
Course Overview
In this course, we build a data engineering pipeline that gathers data from multiple sources: Kaggle datasets and Google Trends results fetched through SerpAPI. The pipeline aggregates and combines data on Netflix actors with the Google search trends about them in the weeks after a new show is released.
You will use Kaggle as a data source for datasets regarding Netflix shows and actors, and Google Trends through SerpAPI to fetch live search data for the actors. All data will be stored and processed within the Snowflake database, using its cloud-native architecture to maximize scalability and performance.
Technical Stack Overview
Snowflake Database
Central repository for storing and querying data.
Streamlit within Snowflake
A web application framework where the data can be rendered directly within Snowflake.
AWS S3
Storage location where some intermediate datasets are stored or fetched.
Snowflake Python Procedures
Code used to automate parts of the data manipulation and processing pipeline.
Snowflake External Access & Storage Integrations
Used to manage secure access to external services and storage.
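The live Google Trends data is fetched from SerpAPI over HTTPS (which is why an External Access Integration is needed). As an illustrative sketch only: the endpoint and the `engine`, `q`, and `api_key` parameters follow SerpAPI's public Google Trends API, but the function name and structure below are ours, not taken from the course materials.

```python
import urllib.parse

# Real SerpAPI endpoint; parameter names follow SerpAPI's
# google_trends engine documentation.
SERPAPI_ENDPOINT = "https://serpapi.com/search.json"

def build_trends_query(actor_name: str, api_key: str) -> str:
    """Assemble a SerpAPI Google Trends request URL for one actor."""
    params = {
        "engine": "google_trends",  # selects the Google Trends engine
        "q": actor_name,            # the search term to get interest data for
        "api_key": api_key,         # your SerpAPI key
    }
    return f"{SERPAPI_ENDPOINT}?{urllib.parse.urlencode(params)}"
```

Inside Snowflake, a Python procedure would issue this request through the external access integration and land the JSON response in a staging table.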
Outcome
By the end of this course, you will have a fully functional data pipeline that processes and merges streaming data, cloud storage, and APIs to enable trend analysis, visualized in an interactive Streamlit app within Snowflake.
Goals
- Set up Snowflake and AWS Accounts.
- Work with Kaggle and SerpAPI.
- Download and manipulate data with Jupyter Notebooks on VS Code.
- Work with External Access Integration and Storage Integration on Snowflake.
- Create Snowflake Python-based procedures.
- Create Snowflake tasks.
- Create Streamlit apps inside Snowflake.
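To give a flavor of the procedure logic you will write, here is a hypothetical, simplified Python version of the merge step. In the course this runs as a Snowflake Python procedure over real tables; the function and field names below are illustrative assumptions, not from the course materials.

```python
# Illustrative only: a pure-Python sketch of joining Kaggle actor rows
# with their Google Trends interest scores, as a Snowflake Python
# procedure might do over staged tables.
def merge_actor_trends(actors, trends):
    """Attach each actor's Google Trends interest score to their row.

    actors: list of dicts like {"name": ..., "show": ...}
    trends: list of dicts like {"actor": ..., "interest": ...}
    """
    trends_by_actor = {t["actor"]: t["interest"] for t in trends}
    # Left join: actors with no trend data default to an interest of 0.
    return [{**a, "interest": trends_by_actor.get(a["name"], 0)} for a in actors]
```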
Prerequisites
- Proficient knowledge of SQL and basic knowledge of Snowflake database.
- Basic knowledge of data modeling and engineering.
- Proficient in Python.
Curriculum
Check out the detailed breakdown of what’s inside the course
Introduction
1 Lecture
Introduction 00:43
Setup - Part 1
7 Lectures
Sample download code
4 Lectures
Setup - Part 2
1 Lecture
Database preparation
2 Lectures
Kaggle Python procedure
3 Lectures
SerpAPI Python procedure
4 Lectures
Task design and DWH layer
4 Lectures
Streamlit app
2 Lectures
Pipeline enhancements
10 Lectures
Conclusion
2 Lectures
Instructor Details
Marcos Oliveira
Marcos Oliveira is a dedicated instructor who is passionate about sharing practical knowledge and helping learners build real-world skills. With experience in his field, he focuses on explaining concepts in a clear, simple, and easy-to-understand way so that students of all levels can follow along confidently.
His teaching style combines theory with practical examples, making complex topics easier to grasp and apply in real situations. Marcos aims to create engaging learning experiences that help students grow their knowledge, improve their professional skills, and gain the confidence needed to succeed in their careers.
Course Certificate
Use your certificate to make a career change or to advance in your current career.