fetch more than 20 rows and display full value of column in spark-shell
One option is to fetch the rows with take, but you won't get a nice tabular output; the result comes back as a plain Scala object (an Array[Row]) instead of a formatted table:
maxDF.take(50)
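As a minimal sketch of what that looks like in the spark-shell (assuming maxDF is the DataFrame from the question):
// take returns Array[org.apache.spark.sql.Row] rather than printing a table
val rows = maxDF.take(50)
rows.foreach(println)  // prints each Row's toString on its own line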
If you want to print the whole value of a column in Scala, you just need to set the truncate argument of the show method to false:
maxDf.show(false)
and if you wish to show more than 20 rows:
// example showing 30 rows of
// maxDf untruncated
maxDf.show(30, false)
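If you'd rather widen the columns than disable truncation entirely, recent Spark versions also have a show overload that takes an integer truncate value giving the maximum number of characters per cell, for example:
// 30 rows, each cell truncated at 100 characters
maxDf.show(30, 100)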
For PySpark, you'll need to name the argument, since the first positional parameter of show is the row count:
maxDF.show(truncate=False)
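The row count and truncation can be combined in PySpark as well, for example:
# 30 rows, full column values
maxDF.show(n=30, truncate=False)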