PySpark 2+: how to call PySpark (code examples)
Example 1: Create a DataFrame with a single pyspark.sql.types.LongType column named id, containing elements in a range
spark.range(1, 7, 2).collect()   # [Row(id=1), Row(id=3), Row(id=5)]
spark.range(3).collect()         # [Row(id=0), Row(id=1), Row(id=2)]
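spark.range(start, end, step) follows the same start/end/step semantics as Python's built-in range, with each value wrapped in a Row under the id column. A minimal plain-Python sketch of the id values (no Spark cluster assumed here):

```python
# The ids produced by spark.range(1, 7, 2) and spark.range(3) match
# Python's built-in range with the same arguments.
ids_1 = list(range(1, 7, 2))   # ids of spark.range(1, 7, 2)
ids_2 = list(range(3))         # ids of spark.range(3)
print(ids_1)  # [1, 3, 5]
print(ids_2)  # [0, 1, 2]
```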
Example 2: A DataFrame is a distributed collection of data grouped into named columns
# In Spark 2+, read through the SparkSession (spark) rather than the old sqlContext
people = spark.read.parquet("...")
ageCol = people.age  # a Column expression referring to the age column

people = spark.read.parquet("...")
department = spark.read.parquet("...")
# Keep people over 30, join on department id, then aggregate per (name, gender)
people.filter(people.age > 30).join(
    department, people.deptId == department.id).groupBy(
    department.name, "gender").agg({"salary": "avg", "age": "max"})
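The filter/join/groupBy/agg chain above computes, for each (department name, gender) pair, the average salary and maximum age of people over 30. A plain-Python sketch of the same computation on hypothetical in-memory rows (the field names mirror the example; the sample data is invented for illustration):

```python
# Hypothetical rows standing in for the two parquet files.
people = [
    {"name": "Ann", "age": 35, "gender": "F", "salary": 100, "deptId": 1},
    {"name": "Bob", "age": 40, "gender": "M", "salary": 80,  "deptId": 1},
    {"name": "Cy",  "age": 25, "gender": "M", "salary": 60,  "deptId": 2},
    {"name": "Dee", "age": 50, "gender": "F", "salary": 120, "deptId": 1},
]
department = [{"id": 1, "name": "Eng"}, {"id": 2, "name": "Ops"}]

dept_name = {d["id"]: d["name"] for d in department}

groups = {}
for p in people:
    if p["age"] > 30:                                 # filter(people.age > 30)
        key = (dept_name[p["deptId"]], p["gender"])   # join + groupBy
        groups.setdefault(key, []).append(p)

# agg({"salary": "avg", "age": "max"})
result = {
    key: {"avg(salary)": sum(r["salary"] for r in rows) / len(rows),
          "max(age)": max(r["age"] for r in rows)}
    for key, rows in groups.items()
}
print(result)
```

Cy is dropped by the age filter, so only the Eng department groups remain in this sample.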