How can I change SparkContext.sparkUser() setting (in pyspark)?
There is an environment variable for this: HADOOP_USER_NAME.
So simply run export HADOOP_USER_NAME=anyuser before launching pyspark,
or in pyspark set os.environ["HADOOP_USER_NAME"] = "anyuser" — but it must be set before the SparkContext is created, since changing it afterwards has no effect on an already-running session.
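A minimal sketch of the pyspark variant (the SparkSession part is shown as comments since it assumes pyspark is installed; the app name is arbitrary):

```python
import os

# Must be set BEFORE the SparkContext/JVM starts; an already-initialized
# session keeps the user it was started with.
os.environ["HADOOP_USER_NAME"] = "anyuser"

# Hypothetical usage once pyspark is available:
# from pyspark.sql import SparkSession
# spark = SparkSession.builder.master("local[*]").appName("UserDemo").getOrCreate()
# print(spark.sparkContext.sparkUser())  # typically "anyuser" on a non-Kerberos setup

print(os.environ["HADOOP_USER_NAME"])
```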
In Scala this can be done with System.setProperty (Hadoop's UserGroupInformation falls back to the HADOOP_USER_NAME system property when the environment variable is not set), again before the SparkSession is created:
import org.apache.spark.sql.SparkSession

// Set the user before the session (and its SparkContext) is built
System.setProperty("HADOOP_USER_NAME", "newUserName")

val spark = SparkSession
  .builder()
  .appName("SparkSessionApp")
  .master("local[*]")
  .getOrCreate()

println(spark.sparkContext.sparkUser)