Replicate Spark Row N-times
You can add a column holding a literal array of size 100, then use explode
to turn each of its elements into its own row, and finally drop this "dummy" column by selecting only the original columns:
import org.apache.spark.sql.functions._

val result = singleRowDF
  .withColumn("dummy", explode(array((1 to 100).map(lit): _*))) // one row per array element
  .selectExpr(singleRowDF.columns: _*)                          // keep only the original columns
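
For context, here is a minimal self-contained sketch of the same idea; the object name, the local SparkSession setup, and the sample singleRowDF are illustrative, not part of the original answer:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object ReplicateRowExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("ReplicateRowExample")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // Hypothetical single-row DataFrame standing in for your input
    val singleRowDF = Seq(("a", 1)).toDF("key", "value")

    val n = 100 // number of copies to produce

    val result = singleRowDF
      .withColumn("dummy", explode(array((1 to n).map(lit): _*))) // explode into n rows
      .selectExpr(singleRowDF.columns: _*)                        // drop the dummy column

    println(result.count()) // prints 100
    spark.stop()
  }
}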