Apache Spark

Apache Spark is the industry-standard tool for distributed big data processing, and it keeps improving every day. If you want to become a data engineer, Spark is a must. You will learn how data is processed at scale, how distribution works, how to structure your data-crunching jobs, and how to optimize them for best performance. Because Spark is foundational to data engineering, the concepts you learn here transfer easily to other tools.

  • Apache Spark Essentials with Scala

    Become an Apache Spark developer by mastering the essentials of Spark with Scala and big data in our comprehensive, hands-on course

  • Apache Spark Streaming with Scala

    Master Apache Spark Streaming with Scala: process massive data as it arrives, integrate with Kafka, JDBC, Cassandra, and more – handle live data streams effortlessly

  • Apache Spark Optimization with Scala

    Write performant code: master the tools and techniques of Apache Spark with Scala to make your applications run blazing fast, and learn the strategies used by top developers

  • Apache Spark Performance Tuning with Scala

    Optimize Apache Spark with Scala for peak performance: master Spark internals and configurations to achieve maximum speed and memory efficiency for your cluster