

Spark and Map-Reduce


Find yourself working with massive data sets regularly? Learn how to use Apache Spark and the map-reduce technique to clean and analyze “big data” in this Apache Spark and PySpark course.
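To make the map-reduce idea concrete before scaling it up with Spark, here is a minimal pure-Python sketch of the pattern using the canonical word-count example (the input lines are made up for illustration):

```python
from collections import Counter

# Hypothetical input: two lines of text standing in for a large corpus.
lines = ["big data is big", "spark handles big data"]

# Map phase: turn each line into (word, 1) pairs.
mapped = [(word, 1) for line in lines for word in line.split()]

# Reduce phase: group by key (the word) and sum the counts.
counts = Counter()
for word, n in mapped:
    counts[word] += n

print(counts["big"])   # 3
print(counts["data"])  # 2
```

Spark follows the same map-then-reduce shape, but distributes the mapped pairs across a cluster and shuffles them by key before reducing, so the same logic scales to data that does not fit on one machine.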

Big data is all around us, and Spark is quickly becoming an in-demand tool that employers expect of applicants who will work with large data sets. If you want to build cutting-edge, in-demand skills that employers value, this introductory Spark course is a strong place to start.

You’ll learn concepts such as Resilient Distributed Datasets (RDDs), Spark SQL, Spark DataFrames, and the difference between pandas and Spark DataFrames.

Career Relevance by Data Role

The techniques and tools covered in Spark and Map-Reduce are most similar to the requirements found in Data Engineer job advertisements.

Similarity Scores (Out of 100)