collectAsMap PySpark code example
Example: Return the key-value pairs in this RDD to the master as a dictionary.
from pyspark import SparkContext

sc = SparkContext("local", "collectAsMap-example")  # not needed in the PySpark shell, where sc already exists
# Collect the key-value pairs of the pair RDD to the driver as a plain Python dict
m = sc.parallelize([(1, 2), (3, 4)]).collectAsMap()
m[1]
# 2
m[3]
# 4
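Keep in mind that collectAsMap brings the entire result into the driver's memory, so it is only meant for small RDDs, and if a key has multiple values only one of them ends up in the dict. A minimal sketch of the duplicate-key case (illustrative; which of the duplicate values is kept is not guaranteed):

# Key 1 appears twice; the resulting dict keeps only one value for it
d = sc.parallelize([(1, 2), (1, 3), (4, 5)]).collectAsMap()
d[4]
# 5
len(d)
# 2   (one entry per distinct key)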