To check the Databricks Runtime version, you can use several methods:
In a Python notebook, you can use the following code:
```python
from pyspark.sql import SparkSession

# In a Databricks notebook, getOrCreate() returns the existing session.
spark = SparkSession.builder.getOrCreate()
print(spark.conf.get("spark.databricks.clusterUsageTags.sparkVersion"))
```
This returns the Databricks Runtime version together with the Scala version, for example: `15.4.x-scala2.12`.
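If you need the runtime and Scala versions separately, the tag can be split on its `-scala` separator. A minimal sketch, assuming the value follows the `MAJOR.MINOR.x-scalaVERSION` pattern shown above (the `parse_runtime_tag` helper is hypothetical, not a Databricks API):

```python
def parse_runtime_tag(tag: str) -> tuple[str, str]:
    """Split e.g. '15.4.x-scala2.12' into ('15.4.x', '2.12')."""
    runtime, _, scala = tag.partition("-scala")
    return runtime, scala

print(parse_runtime_tag("15.4.x-scala2.12"))  # ('15.4.x', '2.12')
```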
In a Scala notebook, you can use this command:
```scala
// The notebook context tags include the cluster's runtime version string.
dbutils.notebook.getContext.tags("sparkVersion")
```
This also returns the Databricks Runtime and Scala version.
You can use the `version()` function in SQL to get the Apache Spark version:

```sql
SELECT version();
```
This returns a string containing the release version and git revision.
To get the Databricks Runtime version specifically, use:
```sql
SELECT current_version().dbr_version;
```
This returns the current Databricks Runtime version.
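Both SQL functions can also be called from a Python cell through the Spark session. A minimal sketch, assuming it runs on a Databricks cluster where the Databricks-specific `current_version()` function is available:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# version() is standard Spark SQL; current_version() is Databricks-specific.
print(spark.sql("SELECT version()").first()[0])
print(spark.sql("SELECT current_version().dbr_version AS dbr").first()["dbr"])
```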
You can also check the runtime version through the Databricks user interface:
- Go to the Databricks workspace
- Navigate to the Compute section
- Select the cluster you’re interested in
- Look for the “Databricks Runtime Version” in the cluster’s configuration details
Remember that the Databricks Runtime version includes Apache Spark and other components optimized for performance and security in the Databricks environment.