Software Engineering Daily

Spark and Streaming with Matei Zaharia

Apache Spark is a system for processing large data sets in parallel. The core abstraction of Spark is the resilient distributed dataset (RDD), a working set of data that sits in memory for fast, iterative processing. Matei Zaharia created Spark with two goals: to provide a composable, high-level set of APIs for performing distributed processing;
