Welcome to the Big Data specialization!
This collection of modules is designed to help you learn how to work with Big Data using Apache Spark and PySpark. The modules cover a range of topics, from the basics of Spark to more advanced concepts. Each module includes hands-on exercises to help you practice and understand the material, giving you the skills you need to manage and analyze large datasets effectively.
-
Spark Introduction
Covers the basics of Apache Spark, including its architecture, key components, and the fundamentals of working with Spark to handle Big Data. Also introduces the Databricks environment and PySpark RDDs.
-
PySpark DataFrames
Explores the creation and manipulation of DataFrames in PySpark for data processing and analysis.
-
PySpark Advanced
Introduces advanced PySpark topics such as User-Defined Functions (UDFs), window functions, and working with complex data structures like arrays and structs.
-
Final Project
Brings together the concepts covered in the previous modules. You will work on a real-world dataset, applying your Spark knowledge to analyze it and derive insights from the data.
The notebooks are designed to run on Databricks Community Edition. Detailed steps for setting up and configuring your environment are provided within the notebooks; follow them to ensure everything is in place before you start.
Week 01 (~3 hours): Spark Introduction
Week 02 (~3 hours): PySpark DataFrames
Week 03 (~3 hours): PySpark DataFrames
Week 04 (~3 hours): PySpark Advanced
Week 05 (~3 hours): PySpark Advanced
Week 06 (~3 hours): Final Project
Week 07 (~3 hours): Final Project