Spark - extracting single value from DataFrame
This could solve your problem.
df.map { row => row.getInt(0) }.first()
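As a minimal sketch in spark-shell (the column name "value" below is made up for illustration): in Spark 2.x and later, a DataFrame is a Dataset[Row], so the map call needs an Encoder for the result type, which import spark.implicits._ provides:

import spark.implicits._                        // supplies the Encoder[Int] that map needs (Spark 2.x+)

val df = Seq(42).toDF("value")                  // hypothetical one-row, one-column DataFrame
val result: Int = df.map(_.getInt(0)).first()   // yields an Int, not a Row
// result: Int = 42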
In PySpark, if the DataFrame has a single column, you can get the first value directly: head() with no argument returns a Row, and indexing it once gives the column value. If you ask for multiple rows with head(n), you get a list of Rows back instead, so you need a two-dimensional index, row then column:
df.head()[0]       # head() returns a Row; [0] is the first column
df.head(1)[0][0]   # head(1) returns a list of Rows; index the row, then the column
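The same two-level indexing applies to collect(): df.collect()[0][0] reads the first column of the first row, since collect() also returns a list of Rows.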
You can use head
df.head().getInt(0)
or first
df.first().getInt(0)
Check the DataFrame Scala API docs for more details.
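As a quick sketch in spark-shell (the column name "value" is hypothetical), both accessors return the same first Row; if the column type isn't Int, Row.getAs[T] works by name or position:

import spark.implicits._                            // auto-imported in spark-shell; needed for toDF

val df = Seq(7).toDF("value")                       // hypothetical one-row, one-column DataFrame
val byPosition: Int = df.head().getInt(0)           // head() returns the first Row
val byName: Int = df.first().getAs[Int]("value")    // first() is an alias for head(); getAs[T] also accepts an index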