Become an Apache Spark and big data expert from scratch with our all-inclusive course bundle: master everything you need using Scala in one complete package, at a discount
Apache Spark Streaming with Scala
Unlock the power of Apache Spark Streaming with Scala. Learn to process massive data in real time, integrate with Kafka, JDBC, Cassandra, and other tools, and manage live data streams with ease. Ideal for mastering real-time data processing and streaming applications.
Goal
Nothing static, all in motion.
You probably know by now: Spark is the most popular computing engine for big data, the best maintained, and the one with a proven track record of performance. It’s up to 100 times faster than the old MapReduce paradigm, and it can easily be extended with machine learning, streaming, and many other capabilities.
In this Spark Streaming course, we’ll take the natural step forward: process big data as it arrives.
What’s in it for you:
- You’ll learn how Spark Structured Streaming and “normal” Spark batch operations are similar and different
- You’ll work with new streaming abstractions (DStreams) for low-level, high-control processing
- You’ll integrate Kafka, JDBC, Cassandra and Akka Streams (!) so that you can later integrate anything you like
- You’ll work with powerful stateful APIs that only a few know how to properly use
And some extra perks:
- You’ll have access to the entire code I write on camera (2200+ LOC)
- You’ll be invited to our private Slack room where I’ll share the latest updates, discounts, talks, conferences, and recruitment opportunities
- (soon) You’ll have access to the takeaway slides
- (soon) You’ll be able to download the videos for offline viewing
Skills You'll Learn
Skills you’ll get:
- Same comfort with Spark Structured Streaming APIs as with “normal” Spark batch:
- projections
- joins
- aggregations
- sums
- groups
- High control over how data is processed with DStreams:
- map, flatMap, filter
- transform
- by-key operations
- process each RDD individually
- Ability to work with time columns and window functions, both on structured and low-level streams
- sliding windows
- tumbling windows
- reduce by window
- reduce by window and key
- Integration between Spark and other data sources, including
- Kafka (structured and low-level)
- JDBC
- NoSQL
- and something that’s not “natural” to Spark, like Akka
- Ability to manually manage stateful data processing in ways SQL is incapable of
- mapGroupsWithState
- flatMapGroupsWithState
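To give a taste of what these stateful APIs unlock: the heart of a `mapGroupsWithState` call is a state-transition function, and that function is plain Scala you can reason about (and test) without a cluster. The sketch below is mine, not the course’s exact code, and it simplifies Spark’s real callback shape (which receives a key, an `Iterator` of values, and a `GroupState[S]`); here it tracks a running average per key:

```scala
// Hedged sketch: the pure state-update logic behind a mapGroupsWithState call.
// In real Spark code this would be adapted to the
// (key, values: Iterator[V], state: GroupState[S]) => U callback signature.

case class SessionState(count: Long, totalMs: Long) {
  def avgMs: Double = if (count == 0) 0.0 else totalMs.toDouble / count
}

// Fold each new batch of response times into the running state for one key.
def updateState(current: SessionState, newResponseTimesMs: Iterator[Long]): SessionState =
  newResponseTimesMs.foldLeft(current) { (state, ms) =>
    SessionState(state.count + 1, state.totalMs + ms)
  }

val updated = updateState(SessionState(0, 0), Iterator(100L, 200L, 300L))
println(updated.avgMs) // running average of the three response times
```

Because the per-key state can be an arbitrary Scala type with arbitrary update logic, this goes beyond what a SQL `GROUP BY` can express.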
This course is for Scala and Spark programmers who need to process streaming data rather than one-off or batch jobs. If you’ve never done Scala or Spark, this course is not for you.
Project 1: Twitter
In this project we will integrate live data from Twitter. We will create a custom data source that we will plug into Spark, and we will run various analyses: tweet lengths, the most-used hashtags in real time, and more. You will be able to use this project as a blueprint for any data source that you might want to integrate. At the very end, we will use an NLP library from Stanford to do sentiment analysis on tweets and find the general state of social media.
You will learn:
- How to set up your own data receiver that you manage yourself to “pull” new data
- How to create a DStream from your custom code
- How to pull data from Twitter
- How to aggregate tweets
- How to use Stanford’s coreNLP library for sentiment analysis
- How to apply sentiment analysis on tweets in real time
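For orientation, a custom receiver built on Spark’s `Receiver` API looks roughly like the sketch below. The class and thread names are illustrative (the course builds its own Twitter-specific version); the essential pattern is overriding `onStart`/`onStop` and handing records to Spark via `store`:

```scala
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.receiver.Receiver

// Hedged sketch of a custom receiver; a real one would open a connection
// to the external source (e.g. the Twitter API) instead of looping on a stub.
class CustomPullReceiver(host: String, port: Int)
    extends Receiver[String](StorageLevel.MEMORY_ONLY) {

  override def onStart(): Unit =
    // Spawn a thread that pulls data so onStart returns immediately.
    new Thread("data-puller") {
      override def run(): Unit = pullData()
    }.start()

  override def onStop(): Unit = () // close connections/resources here

  private def pullData(): Unit =
    while (!isStopped()) {
      // Fetch a record from your source, then hand it to Spark:
      store("some record")
    }
}

// Usage (given a StreamingContext ssc):
//   val lines = ssc.receiverStream(new CustomPullReceiver("localhost", 9999))
```

Running this requires a Spark Streaming application on the classpath; it is shown here only to illustrate the receiver lifecycle.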
Project 2: A Science Project
In this project we will write a full-stack web application that supports multiple users who are test subjects in a scientific experiment. We will investigate the effects of alcohol/substances/insert_your_addictive_drug_like_Scala on reflexes and response times. We will send the data through a web UI connected to a REST endpoint, then the data will flow through a Kafka broker and finally to a Spark Streaming backend which will do the data crunching. You can use this application as a blueprint for any full-stack application that aggregates and processes data with Spark Streaming in real time, from any number of concurrent users.
You will learn:
- How to set up an HTTP server in minutes with Akka HTTP
- How to manually send data through Kafka
- How to aggregate data in a way that’s almost impossible in SQL
- How to write a full-stack application with a web UI, Akka HTTP, Kafka and Spark Streaming
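As a taste of the Spark side of that pipeline, a minimal Structured Streaming read from Kafka looks roughly like this. The broker address and topic name are placeholders, not the project’s exact configuration, and the real application also needs the `spark-sql-kafka` connector on the classpath:

```scala
import org.apache.spark.sql.SparkSession

// Hedged sketch: consume the experiment events from a Kafka topic
// and print them to the console as they arrive.
object KafkaReadSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("science-project")
      .master("local[2]")
      .getOrCreate()

    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "localhost:9092") // placeholder broker
      .option("subscribe", "science")                      // placeholder topic
      .load()
      .selectExpr("CAST(value AS STRING) AS payload")

    events.writeStream
      .format("console")
      .outputMode("append")
      .start()
      .awaitTermination()
  }
}
```

In the full project, the `console` sink would be replaced by the actual aggregation logic crunching the users’ response times.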
Meet Rock the JVM
Daniel Ciocîrlan
I'm a software engineer and the founder of Rock the JVM. I started the Rock the JVM project out of love for Scala and the technologies it powers - they are all amazing tools and I want to share as much of my experience with them as I can.
As of February 2024, I've taught Java, Scala, Kotlin and related tech (e.g. Cats, ZIO, Spark) to 100,000+ students at various levels, and I've held live training sessions for some of the best companies in the industry, including Adobe and Apple. I've also taught university students who now work at Google and Facebook (among others), held Hour of Code sessions for 7-year-olds, and taught more than 35,000 kids to code.
I have a Master's Degree in Computer Science and I wrote my Bachelor and Master theses on Quantum Computation. Before starting to learn programming, I won medals at international Physics competitions.
What's Included
Take this course now!
Apache Spark Streaming with Scala - Lifetime License
Just the course with a one-time payment
- 11 hours of 4K content
- 2200 lines of code written
- All PDF slides
- Access to the private Rock the JVM community
- Free updates
- Lifetime access
All-Access Membership
All of the Rock the JVM courses
- 320 hours of 4K content
- 60660 lines of code written
- All Scala courses
- All Kotlin courses
- All ZIO courses
- All Typelevel courses
- All Apache Flink courses
- All Apache Spark courses
- All Akka/Pekko courses
If you're not happy with this course, I want you to have your money back. Just contact me with a copy of your welcome email and I will refund the course.
Less than 0.05% of students refunded a course on the entire site, and every payment was returned in less than 72 hours.