Filter Pyspark dataframe column with None value
--
Music by Eric Matyas
https://www.soundimage.org
Track title: Unforgiving Himalayas Looping
--
Chapters
00:00 Question
00:46 Accepted answer (Score 306)
01:35 Answer 2 (Score 46)
01:49 Answer 3 (Score 22)
02:08 Answer 4 (Score 8)
03:07 Thank you
--
Full question
https://stackoverflow.com/questions/3726...
Answer 1 links:
[isNotNull]: https://spark.apache.org/docs/1.5.2/api/...
Answer 3 links:
[blog]: https://medium.com/expedia-group-tech/st...
--
Content licensed under CC BY-SA
https://meta.stackexchange.com/help/lice...
--
Tags
#python #apachespark #dataframe #pyspark #apachesparksql
#avk47
ACCEPTED ANSWER
Score 318
You can use Column.isNull / Column.isNotNull:
from pyspark.sql.functions import col

df.where(col("dt_mvmt").isNull())
df.where(col("dt_mvmt").isNotNull())
If you simply want to drop NULL values, you can use na.drop with the subset argument:
df.na.drop(subset=["dt_mvmt"])
Equality-based comparisons with NULL won't work, because in SQL NULL is undefined, so any attempt to compare it with another value returns NULL:
sqlContext.sql("SELECT NULL = NULL").show()
## +-------------+
## |(NULL = NULL)|
## +-------------+
## | null|
## +-------------+
sqlContext.sql("SELECT NULL != NULL").show()
## +-------------------+
## |(NOT (NULL = NULL))|
## +-------------------+
## | null|
## +-------------------+
The only valid way to compare a value with NULL is IS / IS NOT, which are equivalent to the isNull / isNotNull method calls.
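This NULL behavior is standard SQL three-valued logic rather than anything Spark-specific, so the same effect can be reproduced without a Spark cluster. A minimal sketch using Python's built-in sqlite3 module (an illustrative substitute for sqlContext.sql; the semantics match the tables above):

```python
import sqlite3

# In-memory SQLite database; SQL NULL semantics match the Spark example:
# any comparison with NULL evaluates to NULL (surfaced as None in Python).
conn = sqlite3.connect(":memory:")
eq_result = conn.execute("SELECT NULL = NULL").fetchone()[0]
neq_result = conn.execute("SELECT NULL != NULL").fetchone()[0]
print(eq_result)   # None, not True
print(neq_result)  # None, not False

# IS / IS NOT are the only comparisons that yield a definite answer:
is_result = conn.execute("SELECT NULL IS NULL").fetchone()[0]
print(is_result)   # 1 (true)
conn.close()
```

Because NULL = NULL is itself NULL, a filter like df.where(col("dt_mvmt") == None) matches no rows, which is exactly why isNull / isNotNull are required.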
ANSWER 2
Score 48
Just use the isNotNull function:
df.filter(df.dt_mvmt.isNotNull()).count()
ANSWER 3
Score 23
To obtain entries whose values in the dt_mvmt column are not null, use
df.filter("dt_mvmt is not NULL")
and for entries which are null, use
df.filter("dt_mvmt is NULL")
ANSWER 4
Score 9
There are multiple ways to remove/filter the null values from a column in a DataFrame.
Let's create a simple DataFrame with the code below:
from pyspark.sql.types import StringType

date = ['2016-03-27','2016-03-28','2016-03-29', None, '2016-03-30','2016-03-31']
df = spark.createDataFrame(date, StringType())
Now you can try one of the approaches below to filter out the null values.
# Approach - 1
df.filter("value is not null").show()
# Approach - 2
from pyspark.sql.functions import col
df.filter(col("value").isNotNull()).show()
# Approach - 3
df.filter(df["value"].isNotNull()).show()
# Approach - 4
df.filter(df.value.isNotNull()).show()
# Approach - 5
df.na.drop(subset=["value"]).show()
# Approach - 6
df.dropna(subset=["value"]).show()
# Note: You can also use the where function instead of filter.
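As a cross-check of the SQL-string approaches above, the same IS NOT NULL filter can be sketched against the sample dates with Python's stdlib sqlite3 (an illustrative assumption; it is not Spark, but the NULL-filtering behavior is identical):

```python
import sqlite3

# Same sample data as the Spark DataFrame above, with one None entry.
dates = ['2016-03-27', '2016-03-28', '2016-03-29', None, '2016-03-30', '2016-03-31']

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (value TEXT)")
conn.executemany("INSERT INTO t VALUES (?)", [(d,) for d in dates])

# Equivalent of df.filter("value is not null"): the None row is dropped.
non_null = [row[0] for row in conn.execute("SELECT value FROM t WHERE value IS NOT NULL")]
print(non_null)  # the five non-null dates
conn.close()
```

All six pyspark approaches above reduce to this same predicate, so they return the same five rows.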
You can also check the section "Working with NULL Values" on my blog for more information.
I hope this helps.