How to execute a Column expression in Spark without a DataFrame

To evaluate a literal Column you can convert it to an Expression and call eval without providing an input row:

scala> sha1(lit("1").cast("binary")).expr.eval()
res1: Any = 356a192b7913b04c54574d18c28d46e6395428ab
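One detail worth knowing: eval() returns values in Catalyst's internal representation, so string results come back as UTF8String rather than a plain String. A minimal sketch of wrapping this up (the helper name is mine, not part of any Spark API):

import org.apache.spark.sql.Column
import org.apache.spark.sql.functions.{lit, sha1}

// Hypothetical helper: evaluate a Column built only from literals.
// eval() yields Catalyst's internal representation (UTF8String for
// string results), so toString gives back a plain Scala String.
def evalToString(c: Column): String = c.expr.eval().toString

evalToString(sha1(lit("1").cast("binary")))  // "356a192b7913b04c54574d18c28d46e6395428ab"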

The same approach works as long as the function is a UserDefinedFunction:

scala> val f = udf((x: Int) => x)
f: org.apache.spark.sql.expressions.UserDefinedFunction = UserDefinedFunction(<function1>,IntegerType,Some(List(IntegerType)))

scala> f(lit(3) * lit(5)).expr.eval()
res3: Any = 15
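This works because every leaf of the expression above is a literal, so the whole thing can be folded without an input row. A Column that references an actual column is unevaluable on its own and still needs a DataFrame; a small sketch of the failure mode (my own example, not from the original transcript):

import org.apache.spark.sql.functions.col

// col("x") is an unresolved attribute: with no input row to supply a
// value, eval() throws instead of returning a result.
try {
  (col("x") + 1).expr.eval()
} catch {
  case e: UnsupportedOperationException =>
    println(e.getMessage)  // "Cannot evaluate expression: ..."
}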