Unsupported literal type class in Apache Spark in scala
It's because `when` and `otherwise` expect values (or Columns), not a DataFrame — and `std_dev` is a DataFrame. You can extract the scalar value first:
val stdDevValue = std_dev.head().getDouble(0)
val final_add_count_attack = Dataframe_addcount.withColumn("attack", when($"count" > lit(stdDevValue), lit(0)).otherwise(lit(1)))