As part of this topic, we will review the setup steps for Spark, understand its different modules and its architecture, and see how that architecture maps to different execution modes such as YARN and Mesos.
Spark is a distributed computing framework. To leverage it, we need to learn its APIs, which are categorized into different modules, and build applications using one of the supported programming languages (such as Scala, Python, or Java).
- Setup Spark Environment
- Using ITVersity labs
- Spark Official Documentation
- Quick Review Of APIs
- Spark Modules
- Spark Data Structures
- Simple Application (see the sketch after this list)
- Spark Framework and Execution Modes
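
Since a simple application is one of the topics above, here is a minimal sketch of one in Python (PySpark): a word count over a text file. It assumes Spark 2.x or later is available; the input path is a placeholder to be replaced with a real file in your environment (for example, on the ITVersity labs). The execution mode (local, YARN, Mesos, etc.) is typically chosen when the application is submitted, which is how the same code maps to the different execution modes covered later.

```python
# word_count.py - a minimal PySpark application sketch (assumes Spark 2.x or later).
# The input path below is a placeholder; replace it with a real file in your environment.
from operator import add

from pyspark.sql import SparkSession

if __name__ == "__main__":
    # The execution mode (local, yarn, mesos, ...) is normally supplied at submit time,
    # e.g. spark-submit --master yarn word_count.py
    spark = SparkSession.builder.appName("Simple Word Count").getOrCreate()

    # Read the file into an RDD of lines.
    lines = spark.sparkContext.textFile("hdfs:///path/to/input.txt")

    # Classic word count: split lines into words, pair each word with 1, sum per word.
    counts = (
        lines.flatMap(lambda line: line.split())
             .map(lambda word: (word, 1))
             .reduceByKey(add)
    )

    # Bring a small sample back to the driver and print it.
    for word, count in counts.take(10):
        print(word, count)

    spark.stop()
```

The same logic can be written in Scala or Java against the same APIs. You can run the script with `spark-submit word_count.py`, or paste the body into a `pyspark` shell for interactive exploration.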