Read a file on a remote machine in Apache Spark using FTP
I was able to find a workaround, via the code snippet below:
import org.apache.spark.SparkFiles

// FTP URLs take the form ftp://user:password@host/path
val dataSource = "ftp://user:pwd@192.168.1.5/brecht-d-m/map/input.nt"
sc.addFile(dataSource)                                    // download the remote file
val fileName = SparkFiles.get(dataSource.split("/").last) // path of the local copy
val file = sc.textFile(fileName)                          // load the local copy into an RDD
With this, I am able to download a file over FTP (with the same URL as in the first code snippet). The workaround first downloads the file via addFile, then retrieves the local path the file was downloaded to via SparkFiles.get, and finally loads that path into an RDD with textFile.
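For reuse, the same three steps can be wrapped in a small helper. This is only a minimal sketch, assuming an active SparkContext named sc; the helper name ftpTextFile is my own, not a Spark API:

import org.apache.spark.{SparkContext, SparkFiles}
import org.apache.spark.rdd.RDD

// Hypothetical helper: fetch a file over FTP and expose it as an RDD of lines.
def ftpTextFile(sc: SparkContext, url: String): RDD[String] = {
  sc.addFile(url)                                     // download the remote file
  val localPath = SparkFiles.get(url.split("/").last) // resolve the downloaded copy
  sc.textFile(localPath)                              // read the local copy as an RDD
}

// Usage, with the same (assumed) credentials and host as above:
// val triples = ftpTextFile(sc, "ftp://user:pwd@192.168.1.5/brecht-d-m/map/input.nt")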