In this post, I summarize how to get or set a Databricks Spark configuration/property.
To get all configurations in Python:
from pyspark.sql import SparkSession
spark = SparkSession.builder.getOrCreate()
all_conf = spark.sparkContext.getConf().getAll()
This will list all available configurations. To get the value of a specific conf, e.g. 'spark.databricks.clusterUsageTags.region', use the following code instead:
spark.conf.get("spark.databricks.clusterUsageTags.region")
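Since getAll() returns a plain Python list of (key, value) tuples, you can filter it with ordinary list operations, which is handy when you only remember part of a key's name. Below is a minimal sketch; the sample values stand in for what a real cluster would return, and find_conf is a hypothetical helper, not a PySpark API.

```python
# getAll() returns a list of (key, value) tuples; this small sample
# stands in for a real cluster's output (values are illustrative).
all_conf = [
    ("spark.databricks.clusterUsageTags.region", "westus"),
    ("spark.sql.session.timeZone", "Etc/UTC"),
    ("spark.app.name", "Databricks Shell"),
]

def find_conf(conf_pairs, substring):
    """Return all (key, value) pairs whose key contains substring."""
    return [(k, v) for k, v in conf_pairs if substring in k]

print(find_conf(all_conf, "clusterUsageTags"))
```

The same pattern works directly on the real output of spark.sparkContext.getConf().getAll().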
To check the Databricks runtime version, use the following code:
from pyspark.sql import SparkSession
spark = SparkSession.builder.getOrCreate()
spark.conf.get("spark.databricks.clusterUsageTags.sparkVersion")
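The sparkVersion tag comes back as a string such as "11.3.x-scala2.12" (the exact value depends on your cluster; the one below is illustrative). If you need the numeric runtime version for a comparison, a little string parsing does the job:

```python
# What spark.conf.get("spark.databricks.clusterUsageTags.sparkVersion")
# might return on a cluster (illustrative value, not a live lookup).
runtime = "11.3.x-scala2.12"

dbr_version = runtime.split("-")[0]     # drop the Scala suffix -> "11.3.x"
major = int(dbr_version.split(".")[0])  # leading digits give the DBR major version
print(dbr_version, major)
```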
To set a specific configuration for Spark, we can use:
spark.conf.set("spark.sql.session.timeZone", "Asia/Shanghai")
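To see what this setting actually changes: spark.sql.session.timeZone controls the time zone Spark uses when rendering session timestamps. The sketch below uses only the standard library to show the effect of the "Asia/Shanghai" zone, independent of any Spark cluster:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# The same instant rendered in UTC versus Asia/Shanghai (UTC+8, no DST),
# which is what the spark.sql.session.timeZone setting above selects.
utc = datetime(2024, 1, 1, 0, 0, tzinfo=ZoneInfo("UTC"))
shanghai = utc.astimezone(ZoneInfo("Asia/Shanghai"))
print(shanghai)  # 8 hours ahead of midnight UTC
```

Note that spark.conf.set applies to the current session only; cluster-wide defaults are configured in the cluster's Spark config instead.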