Convert Row to Map in Spark Scala

You can use getValuesMap:

// in spark-shell; in a standalone app, import spark.implicits._ first
val df = Seq((1, 2.0, "a")).toDF("A", "B", "C")
val row = df.first

To get Map[String, Any]:

row.getValuesMap[Any](row.schema.fieldNames)
// res19: Map[String,Any] = Map(A -> 1, B -> 2.0, C -> a)
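Since the result is a `Map[String, Any]`, you have to recover concrete types yourself before doing anything type-specific with the values. A minimal sketch (plain Scala, with the map hardcoded to match the row above) using a pattern match:

```scala
// Values as returned by row.getValuesMap[Any] for the example row
val values: Map[String, Any] = Map("A" -> 1, "B" -> 2.0, "C" -> "a")

// Sum the numeric columns; the partial function simply skips the String "a"
val total: Double = values.values.collect {
  case i: Int    => i.toDouble
  case d: Double => d
}.sum
// total == 3.0
```

Alternatively, `row.getAs[Int]("A")` avoids the intermediate map when you already know each column's type.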

Or you can get a Map[String, AnyVal] for this simple case, since the values are not complex objects (note that this even compiles and runs although column C holds a String; see the caveat below):

row.getValuesMap[AnyVal](row.schema.fieldNames)
// res20: Map[String,AnyVal] = Map(A -> 1, B -> 2.0, C -> a)

Note: getValuesMap does not check the requested value type at runtime (the type parameter is erased), so you can label the returned values as any type. You cannot rely on that type to figure out what data you actually have; you need to keep track of the real column types from the schema instead.
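To see why the type parameter is not trustworthy: `getValuesMap` is implemented with an unchecked cast per field, so the request is erased at runtime. A plain-Scala sketch of the same effect (no Spark needed):

```scala
// A Map[String, AnyVal] can silently hold a String, because the
// element type of a Map is erased and the cast below is unchecked.
val raw: Map[String, Any] = Map("A" -> 1, "B" -> 2.0, "C" -> "a")
val m: Map[String, AnyVal] = raw.asInstanceOf[Map[String, AnyVal]]

println(m("C"))  // prints "a" -- no error, despite the AnyVal label
// A ClassCastException would only surface later, if you used the
// value *as* an AnyVal, e.g. m("C").asInstanceOf[Int]
```

This is why `res20` above shows `C -> a` without complaint even though a `String` is not an `AnyVal`.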