= Deployment Environments
Spark Deployment Environments (Run Modes):

* local/spark-local.md[local]
* clustered:
** spark-standalone.md[Spark Standalone]
** spark-mesos/spark-mesos.md[Spark on Apache Mesos]
** yarn/README.md[Spark on Hadoop YARN]
A Spark application is composed of the driver and executors, which can run locally (in a single JVM) or using cluster resources (CPU, RAM, and disk) managed by a cluster manager.
NOTE: You can specify where the driver runs using the spark-deploy-mode.md[deploy mode] (the `--deploy-mode` option of `spark-submit` or the `spark.submit.deployMode` Spark property).
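A minimal sketch that names the property (the application name is a placeholder; in practice the deploy mode is usually given to `spark-submit` rather than set in code):

[source,scala]
----
import org.apache.spark.SparkConf

// spark.submit.deployMode accepts "client" (the default) or "cluster".
// It is typically supplied to spark-submit (--deploy-mode cluster or
// --conf spark.submit.deployMode=cluster); setting it on a SparkConf
// here only illustrates the key.
val conf = new SparkConf()
  .setAppName("DeployModeDemo") // placeholder name
  .set("spark.submit.deployMode", "client")

println(conf.get("spark.submit.deployMode")) // prints: client
----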
== [[master-urls]] Master URLs
Spark supports the following master URLs (see https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/SparkContext.scala#L2583-L2592[private object SparkMasterRegex]):
* `local`, `local[N]` and `local[{asterisk}]` for local/spark-local.md#masterURL[Spark local]
* `local[N, maxRetries]` for local/spark-local.md#masterURL[Spark local-with-retries]
* `local-cluster[N, cores, memory]` for simulating a Spark cluster of `N` executors (threads), `cores` CPUs and `memory` locally (aka Spark local-cluster)
* `spark://host:port,host1:port1,...` for connecting to spark-standalone.md[Spark Standalone cluster(s)]
* `mesos://` for spark-mesos/spark-mesos.md[Spark on Mesos cluster]
* `yarn` for yarn/README.md[Spark on YARN]
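For illustration, a minimal sketch that starts a Spark application with one of the master URLs above (the application name is a placeholder):

[source,scala]
----
import org.apache.spark.sql.SparkSession

// "local[2]" runs the driver and two executor threads in one JVM.
// Swap in "local[*]" to use all cores, or a placeholder standalone
// master URL such as "spark://localhost:7077".
val spark = SparkSession.builder()
  .appName("MasterUrlDemo") // placeholder name
  .master("local[2]")
  .getOrCreate()

println(spark.sparkContext.master) // prints: local[2]

spark.stop()
----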
You can specify the master URL of a Spark application as follows:
* spark-submit.md[spark-submit's `--master` command-line option]
* SparkConf.md#spark.master[`spark.master` Spark property]
* When creating a SparkContext.md#getOrCreate[`SparkContext` (using the `setMaster` method)]
* When creating a spark-sql-sparksession-builder.md[`SparkSession` (using the `master` method of the builder interface)]
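A minimal sketch of the two programmatic options (application names are placeholders):

[source,scala]
----
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SparkSession

// Option 1: SparkConf.setMaster, then create a SparkContext.
val conf = new SparkConf()
  .setAppName("ConfMasterDemo") // placeholder name
  .setMaster("local[*]")
val sc = new SparkContext(conf)
sc.stop()

// Option 2: the master method of SparkSession's builder interface.
val spark = SparkSession.builder()
  .appName("BuilderMasterDemo") // placeholder name
  .master("local[*]")
  .getOrCreate()
spark.stop()
----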