Filtering a spark dataframe based on date
Don't use the following, as suggested in other answers:
.filter(f.col("dateColumn") < f.lit('2017-11-01'))
Use this instead:
.filter(f.col("dateColumn") < f.unix_timestamp(f.lit('2017-11-01 00:00:00')).cast('timestamp'))
This compares against a TimestampType rather than a StringType, which is more performant in some cases. For example, Parquet predicate pushdown only works with the timestamp comparison.
Edit: Both snippets assume this import:
from pyspark.sql import functions as f
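As a minimal end-to-end sketch (the SparkSession, the sample rows and the dateColumn name are illustrative assumptions, not part of the original answer):
from pyspark.sql import SparkSession, functions as f

spark = SparkSession.builder.getOrCreate()

# hypothetical sample data with a proper TimestampType column
df = spark.createDataFrame(
    [("2017-10-15 12:00:00",), ("2017-11-02 08:30:00",)], ["raw"]
).withColumn("dateColumn", f.col("raw").cast("timestamp"))

# compare against a timestamp literal rather than a string
df.filter(
    f.col("dateColumn") < f.unix_timestamp(f.lit("2017-11-01 00:00:00")).cast("timestamp")
).show()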
In PySpark (Python), one option is to convert the string date column with unix_timestamp, specifying the source format, and then cast it to a date as shown below. Note that we need to import the unix_timestamp, lit and to_date functions.
from pyspark.sql.functions import unix_timestamp, lit, to_date
df_cast = df.withColumn("tx_date", to_date(unix_timestamp(df["date"], "MM/dd/yyyy").cast("timestamp")))
Now we can apply the filters
df_cast.filter(df_cast["tx_date"] >= lit('2017-01-01')) \
.filter(df_cast["tx_date"] <= lit('2017-01-31')).show()
The following solutions are applicable since Spark 1.5:
For lower than:
// filter data where the date is less than 2015-03-14
data.filter(data("date").lt(lit("2015-03-14")))
For greater than:
// filter data where the date is greater than 2015-03-14
data.filter(data("date").gt(lit("2015-03-14")))
For equality, you can use either equalTo or ===:
data.filter(data("date") === lit("2015-03-14"))
If your DataFrame date column is of type StringType, you can convert it using the to_date function:
// filter data where the date is greater than 2015-03-14
data.filter(to_date(data("date")).gt(lit("2015-03-14")))
You can also filter according to a year using the year function:
// filter data where the year is greater than or equal to 2016
data.filter(year($"date").geq(lit(2016)))
I find the most readable way to express this is using a SQL expression:
df.filter("my_date < date'2015-01-01'")
We can verify that this works correctly by looking at the physical plan from .explain():
+- *(1) Filter (isnotnull(my_date#22) && (my_date#22 < 16436))
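A short sketch of how to reproduce that check (the data, the my_date column and the Parquet path are hypothetical here):
from datetime import date
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# hypothetical data with a DateType column named my_date, written to a throwaway Parquet path
df = spark.createDataFrame([(date(2014, 12, 30),), (date(2015, 1, 2),)], ["my_date"])
df.write.mode("overwrite").parquet("/tmp/my_dates")

# filter with a SQL expression containing a date literal, then inspect the physical plan
spark.read.parquet("/tmp/my_dates").filter("my_date < date'2015-01-01'").explain()
The 16436 in the plan is simply 2015-01-01 expressed as days since the Unix epoch, i.e. the comparison happens on the DateType values themselves rather than on strings.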