How to assign a value when a particular column matches a condition in PySpark SQL (code example)
Example: Create a DataFrame with a single pyspark.sql.types.LongType column named id, containing elements in a range
# Create a DataFrame with single column named id, containing elements in a range
spark.range(1, 7, 2).collect()
# [Row(id=1), Row(id=3), Row(id=5)]
spark.range(3).collect()
# [Row(id=0), Row(id=1), Row(id=2)]