eqNullSafe

Equality test that is safe for null values.

Added in Databricks Runtime 11.0

Changed in Databricks Runtime 13.0: Supports Spark Connect.

Syntax

eqNullSafe(other)

Parameters

Parameter   Type              Description
other       Column or value   A value or Column to compare with.

Returns

Column

Notes

Unlike Pandas, PySpark doesn't consider NaN values to be NULL. See the NaN Semantics for details.
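As a rough illustration of these semantics, here is a minimal pure-Python sketch of the null-safe comparison (the helper name `null_safe_eq` is invented for illustration; `None` stands in for SQL NULL):

```python
import math

def null_safe_eq(a, b):
    """Sketch of Spark SQL's null-safe equality (`<=>`) semantics."""
    # Two NULLs compare equal; one NULL compares unequal to anything.
    if a is None and b is None:
        return True
    if a is None or b is None:
        return False
    # Per Spark's NaN semantics, NaN is an ordinary value that equals NaN
    # (unlike IEEE 754 / plain Python, where float('nan') != float('nan')).
    if isinstance(a, float) and isinstance(b, float) \
            and math.isnan(a) and math.isnan(b):
        return True
    return a == b
```

This is only a model of the comparison rule; the real operator runs inside Spark SQL's expression engine.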

Examples

from pyspark.sql import Row
df1 = spark.createDataFrame([
    Row(id=1, value='foo'),
    Row(id=2, value=None)
])
df1.select(
    df1['value'] == 'foo',
    df1['value'].eqNullSafe('foo'),
    df1['value'].eqNullSafe(None)
).show()
# +-------------+---------------+----------------+
# |(value = foo)|(value <=> foo)|(value <=> NULL)|
# +-------------+---------------+----------------+
# |         true|           true|           false|
# |         NULL|          false|            true|
# +-------------+---------------+----------------+