Course Outline
Introduction:
- Apache Spark within the Hadoop Ecosystem
- Brief overview of Python and Scala
Foundational Concepts (Theory):
- Architecture
- RDDs (Resilient Distributed Datasets)
- Transformations and Actions
- Stages, Tasks, and Dependencies
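A quick illustration of the transformation/action distinction covered here: transformations are lazy and only describe a computation, while an action triggers the actual job. A minimal PySpark sketch (our illustration, not course material; the session setup assumes a local machine rather than a cluster):

```python
from pyspark.sql import SparkSession

# Local session for experimentation; the workshop itself runs on Databricks.
spark = SparkSession.builder.master("local[*]").appName("lazy-demo").getOrCreate()
sc = spark.sparkContext

numbers = sc.parallelize(range(10))         # create an RDD
squares = numbers.map(lambda x: x * x)      # transformation: lazy, nothing executes yet
total = squares.reduce(lambda a, b: a + b)  # action: triggers the job, stages, and tasks
print(total)  # 285
```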
Mastering the Basics via Databricks (Hands-on Workshop):
- Exercises using the RDD API
  - Basic action and transformation functions
  - PairRDD
  - Join operations
  - Caching strategies
- Exercises using the DataFrame API
  - SparkSQL
  - DataFrame operations: select, filter, group, sort
  - UDFs (User-Defined Functions)
- Exploring the Dataset API
- Streaming
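Minimal PySpark sketches of the workshop topics above (our illustrations with made-up data, not the actual course exercises; the typed Dataset API is Scala/Java-only, so it has no Python sketch here):

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StringType

spark = SparkSession.builder.master("local[*]").appName("basics-sketch").getOrCreate()
sc = spark.sparkContext

# RDD API: pair RDDs, a join, and caching.
users = sc.parallelize([(1, "alice"), (2, "bob")])           # (user_id, name)
orders = sc.parallelize([(1, 9.99), (1, 4.50), (2, 12.00)])  # (user_id, amount)
joined = users.join(orders)   # pair-RDD join on the key -> (id, (name, amount))
joined.cache()                # keep the join result in memory for reuse
print(joined.collect())

# DataFrame API: select, filter, group, sort, plus SparkSQL and a UDF.
df = spark.createDataFrame([("alice", 34), ("bob", 28), ("carol", 41)],
                           ["name", "age"])
(df.select("name", "age")
   .filter(F.col("age") > 30)
   .groupBy("name").agg(F.max("age").alias("max_age"))
   .orderBy(F.desc("max_age"))
   .show())

df.createOrReplaceTempView("people")  # the same data through SparkSQL
spark.sql("SELECT name FROM people WHERE age > 30").show()

shout = F.udf(lambda s: s.upper(), StringType())  # UDFs run opaque Python code,
df.withColumn("loud_name", shout("name")).show()  # so prefer built-ins when possible

# Structured Streaming: the built-in "rate" source, written to the console.
stream = spark.readStream.format("rate").option("rowsPerSecond", 1).load()
query = stream.writeStream.format("console").outputMode("append").start()
query.awaitTermination(10)  # let the demo run for ~10 seconds
query.stop()
```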
Understanding Deployment via AWS (Hands-on Workshop):
- Core concepts of AWS Glue
- Differences between AWS EMR and AWS Glue
- Example jobs run in both environments
- Advantages and disadvantages of each
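For orientation before the workshop, a skeleton of a PySpark script for an AWS Glue job (this mirrors the boilerplate Glue generates; it only runs inside a Glue job environment, where the awsglue libraries are provided, and the S3 paths are placeholders):

```python
import sys
from awsglue.utils import getResolvedOptions
from awsglue.context import GlueContext
from awsglue.job import Job
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
sc = SparkContext()
glue_context = GlueContext(sc)
spark = glue_context.spark_session  # a regular SparkSession is available too
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# DynamicFrames are Glue's schema-flexible wrapper around DataFrames.
dyf = glue_context.create_dynamic_frame.from_options(
    connection_type="s3",
    connection_options={"paths": ["s3://example-bucket/input/"]},  # placeholder
    format="json",
)
glue_context.write_dynamic_frame.from_options(
    frame=dyf,
    connection_type="s3",
    connection_options={"path": "s3://example-bucket/output/"},  # placeholder
    format="parquet",
)
job.commit()
```

On EMR, by contrast, the same logic would typically be a plain spark-submit script without the Glue-specific wrappers, which is one of the trade-offs compared in this section.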
Additional Topics:
- Introduction to Apache Airflow orchestration
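As a taste of the orchestration topic, a minimal Airflow 2.x DAG that submits a Spark job on a schedule (a sketch, assuming the apache-airflow-providers-apache-spark package is installed; the DAG name, script path, and connection ID are placeholders):

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

with DAG(
    dag_id="daily_spark_etl",        # placeholder DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",      # run once per day
    catchup=False,                   # don't backfill past runs
) as dag:
    run_etl = SparkSubmitOperator(
        task_id="run_spark_etl",
        application="/opt/jobs/etl_job.py",  # placeholder path to a PySpark script
        conn_id="spark_default",             # Airflow connection to the cluster
    )
```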
Requirements
Programming skills (preferably in Python or Scala)
Basic knowledge of SQL
21 Hours
Testimonials (3)
Having hands-on sessions / assignments
Poornima Chenthamarakshan - Intelligent Medical Objects
Course - Apache Spark in the Cloud
1. The right balance between high-level concepts and technical details. 2. Andras is very knowledgeable about his teaching. 3. Exercises
Steven Wu - Intelligent Medical Objects
Course - Apache Spark in the Cloud
Getting to learn Spark Streaming, Databricks and AWS Redshift