Data Pipelines with Snowflake and Streamlit

Marcos Oliveira

4.3

Using Snowflake to data engineer Kaggle and Google Trends data with Python procedures and tasks

Updated on Jun 2025

Language - English

Category - Development, Database Design and Development, SQL

Lectures - 40

Resources - 9

Duration - 5 hours

Lifetime Access


Course Description

In this course, I build a data engineering pipeline that gathers information from multiple sources: Kaggle datasets and Google Trends data fetched through SerpAPI. The pipeline aggregates and combines data on Netflix actors with the Google search trends about them in the weeks after a new show is released.
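
To make the Google Trends side concrete, here is a minimal sketch of pulling interest-over-time data for one actor through SerpAPI's Google Trends engine with plain `requests`. The API key, query, and date range are placeholders, and the response parsing follows SerpAPI's documented timeseries shape rather than anything specific to this course:

```python
# Minimal sketch: fetch Google Trends interest-over-time for an actor via SerpAPI.
# SERPAPI_KEY, the query, and the date range are placeholders.
import requests

SERPAPI_KEY = "your-serpapi-key"

params = {
    "engine": "google_trends",   # SerpAPI's Google Trends engine
    "q": "Millie Bobby Brown",   # actor to look up
    "data_type": "TIMESERIES",   # interest over time
    "date": "today 3-m",         # roughly the weeks around a release
    "api_key": SERPAPI_KEY,
}

resp = requests.get("https://serpapi.com/search.json", params=params, timeout=30)
resp.raise_for_status()

timeline = resp.json().get("interest_over_time", {}).get("timeline_data", [])
for point in timeline[:5]:
    print(point["date"], point["values"][0]["extracted_value"])
```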

You will use Kaggle as the source of your dataset on Netflix shows and actors, and Google Trends, accessed through SerpAPI, to fetch live search data about those actors. All of this is stored and processed in the Snowflake database, using its cloud-native architecture to maximize scalability and performance.
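
As an illustration of that flow, here is a minimal sketch that downloads a Kaggle dataset locally and pushes it into a raw Snowflake table with Snowpark. The dataset slug, file name, table name, and connection parameters are placeholders, not the exact ones used in the course:

```python
# Minimal sketch: download a Kaggle dataset and load it into a Snowflake table.
# Assumes the `kaggle` package (with ~/.kaggle/kaggle.json configured) and
# `snowflake-snowpark-python` with pandas support are installed.
import pandas as pd
from kaggle.api.kaggle_api_extended import KaggleApi
from snowflake.snowpark import Session

# 1. Download and unzip a Netflix dataset from Kaggle (placeholder slug)
api = KaggleApi()
api.authenticate()
api.dataset_download_files("owner/netflix-shows", path="data/", unzip=True)
shows = pd.read_csv("data/netflix_titles.csv")   # placeholder file name

# 2. Open a Snowpark session and write the DataFrame to a raw table
session = Session.builder.configs({
    "account": "<account>", "user": "<user>", "password": "<password>",
    "warehouse": "<warehouse>", "database": "<database>", "schema": "<schema>",
}).create()
session.write_pandas(shows, "RAW_NETFLIX_SHOWS", auto_create_table=True)
```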

Technical Stack Overview:

Snowflake Database: the central repository for storing and querying the data.

Streamlit in Snowflake: a web application framework used to render the data directly inside Snowflake.

AWS S3: where intermediate datasets are stored and fetched.

Snowflake Python Procedures: the code that automates the data manipulation and processing steps of the pipeline (a minimal sketch follows this list).

Snowflake External Access & Storage Integrations: manage secure access to external services and storage.
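
To give a flavor of the procedure piece, here is a minimal sketch of a Python stored procedure created through SQL (run via Snowpark's `session.sql`) and attached to an external access integration so it can call SerpAPI. The integration, secret, and object names are placeholders, and the runtime and package pins may differ from what the course uses:

```python
# Minimal sketch: register a Python stored procedure that can reach SerpAPI.
# Integration, secret, and object names are placeholders; `session` is an
# existing Snowpark session.
create_proc_sql = """
CREATE OR REPLACE PROCEDURE FETCH_ACTOR_TRENDS(actor STRING)
RETURNS STRING
LANGUAGE PYTHON
RUNTIME_VERSION = '3.10'
PACKAGES = ('snowflake-snowpark-python', 'requests')
HANDLER = 'run'
EXTERNAL_ACCESS_INTEGRATIONS = (SERPAPI_ACCESS_INTEGRATION)
SECRETS = ('serpapi_key' = SERPAPI_SECRET)
AS
$$
import _snowflake
import requests

def run(session, actor: str) -> str:
    api_key = _snowflake.get_generic_secret_string('serpapi_key')
    resp = requests.get(
        'https://serpapi.com/search.json',
        params={'engine': 'google_trends', 'q': actor,
                'data_type': 'TIMESERIES', 'api_key': api_key},
        timeout=30,
    )
    resp.raise_for_status()
    # In the real pipeline this JSON would be parsed and written to a table.
    return f'fetched {len(resp.text)} bytes of trends data for {actor}'
$$;
"""
session.sql(create_proc_sql).collect()
print(session.call("FETCH_ACTOR_TRENDS", "Millie Bobby Brown"))
```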

By the end of this course, you will have a fully functional data pipeline that processes and merges streaming data, cloud storage, and APIs to enable trend analysis, visualized in an interactive Streamlit app within Snowflake.
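
As a taste of that final step, here is a minimal Streamlit-in-Snowflake sketch that reads a hypothetical merged actor/trends table and plots it; the table and column names are placeholders for whatever the course's warehouse layer produces:

```python
# Minimal sketch of a Streamlit app running inside Snowflake.
# Table and column names are placeholders for the pipeline's DWH layer.
import streamlit as st
from snowflake.snowpark.context import get_active_session

session = get_active_session()   # available when running inside Streamlit in Snowflake

st.title("Netflix actors vs. Google Trends")

# Pull the merged actor/trends table built by the pipeline (placeholder name)
df = session.table("DWH.ACTOR_TRENDS").to_pandas()

actor = st.selectbox("Actor", sorted(df["ACTOR_NAME"].unique()))
subset = df[df["ACTOR_NAME"] == actor]

st.line_chart(subset, x="WEEK", y="SEARCH_INTEREST")
```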

Goals

  • Set up Snowflake and AWS accounts
  • Work with Kaggle and SerpAPI
  • Download and manipulate data with Jupyter Notebooks in VS Code
  • Work with External Access Integrations and Storage Integrations in Snowflake
  • Create Snowflake Python-based procedures
  • Create Snowflake tasks (a scheduling sketch follows this list)
  • Create Streamlit apps inside Snowflake
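
For the task goal, one way to schedule the kind of procedure sketched above is a Snowflake task created via Snowpark. The warehouse name, cron schedule, and procedure call are placeholders, not the course's exact design:

```python
# Minimal sketch: schedule a trends-refresh procedure with a Snowflake task.
# Warehouse, schedule, task, and procedure names are placeholders;
# `session` is an existing Snowpark session.
session.sql("""
    CREATE OR REPLACE TASK REFRESH_ACTOR_TRENDS
      WAREHOUSE = COMPUTE_WH
      SCHEDULE = 'USING CRON 0 6 * * * UTC'   -- every day at 06:00 UTC
    AS
      CALL FETCH_ACTOR_TRENDS('Millie Bobby Brown')
""").collect()

# Tasks are created suspended; resume the task so the schedule takes effect
session.sql("ALTER TASK REFRESH_ACTOR_TRENDS RESUME").collect()
```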

Prerequisites

  • Proficient knowledge of SQL and basic knowledge of the Snowflake database
  • Basic knowledge of data modeling and data engineering
  • Proficient Python knowledge

Curriculum

Check out the detailed breakdown of what’s inside the course

Introduction

1 Lecture
  • Introduction 00:43

Setup - Part 1

7 Lectures

Sample download code

4 Lectures

Setup - Part 2

1 Lecture

Database preparation

2 Lectures

Kaggle Python procedure

3 Lectures

SerpAPI Python procedure

4 Lectures

Task design and DWH layer

4 Lectures

Streamlit app

2 Lectures

Pipeline enhancements

10 Lectures

Conclusion

2 Lectures

Instructor Details

Marcos Oliveira

Course Certificate

Use your certificate to make a career change or to advance in your current career.
