iterate over pyspark dataframe columns
You can try this one:
from pyspark.sql.functions import col, count, when

nullDf = df.select([count(when(col(c).isNull(), c)).alias(c) for c in df.columns])
nullDf.show()
It will give you a single-row DataFrame with the number of null values in each column.
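As a quick sanity check, here is a self-contained sketch with a tiny made-up DataFrame; the SparkSession setup and the column names x and y are assumptions for illustration:

from pyspark.sql import SparkSession
from pyspark.sql.functions import col, count, when

spark = SparkSession.builder.getOrCreate()

# hypothetical example data with one null in each column
df = spark.createDataFrame([(1, None), (2, "a"), (None, "b")], ["x", "y"])

nullDf = df.select([count(when(col(c).isNull(), c)).alias(c) for c in df.columns])
nullDf.show()  # one row; each cell holds that column's null count (here 1 and 1)

This works because when(col(c).isNull(), c) is null for non-null rows, and count() only counts non-null values, so the aggregation effectively counts the nulls in each column in a single pass.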
Have you tried something like this:
names = df.schema.names
for name in names:
    # str() is needed because count() returns an int, which can't be concatenated to a string
    print(name + ': ' + str(df.where(df[name].isNull()).count()))
You can see how this could be modified to put the information into a dictionary or some other more useful format.
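For instance, a minimal sketch of the dictionary variant (null_counts is just an illustrative name):

null_counts = {name: df.where(df[name].isNull()).count() for name in df.schema.names}
print(null_counts)  # e.g. {'x': 1, 'y': 1}

Note that each .count() here is an action that triggers a separate Spark job, one per column, so the single select aggregation in the first answer will generally be faster on wide DataFrames.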