Spark on Local Machine

Configuring and running Spark on a local machine.

The task described below runs locally. Follow these steps to make sure your local Spark installation is available:

Configuring Local Spark

Set spark_engine to local_spark
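A minimal sketch of what this setting might look like, assuming it lives in a YAML configuration file (the filename and any surrounding keys are hypothetical; only the spark_engine value comes from this page):

```yaml
# hypothetical configuration file; only the spark_engine key/value
# is taken from the instructions above
spark_engine: local_spark
```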

Define connection:

$ airflow connections --delete --conn_id spark_default
$ airflow connections --add \
    --conn_id spark_default \
    --conn_type docker \
    --conn_host local \
    --conn_extra "{\"master\":\"local\",\"spark-home\":\"$(find_spark_home.py)\"}"
