Spark SQL: TypeError("StructType can not accept object in type %s" % type(obj))
Here is the reason for the error message:
>>> rowstr
['1127', '', '8196660', '', '', '0', '', '', 'None' ... ]
#rowstr is a list of str
>>> myrdd = sc.parallelize(rowstr)
#myrdd is an RDD of str
>>> schema = StructType(fields)
#schema is StructType([StructField(..., StringType(), ...), ...]) — every column a string field
>>> schemaPeople = sqlContext.createDataFrame(myrdd, schema)
#fails: createDataFrame expects each RDD element to be a whole row
#(a list/tuple with one value per schema field), but each element of myrdd is a single str
To fix this, wrap the list so that each RDD element is one complete row:
>>> myrdd = sc.parallelize([rowstr])
#myrdd is now an RDD with a single element: the entire row
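The shape difference can be seen without Spark at all, since `parallelize` just distributes the elements of the Python sequence you pass it. A minimal sketch (the short `rowstr` here is a stand-in for the full list above):

```python
# Stand-in for the row of strings from the question.
rowstr = ['1127', '', '8196660']

# sc.parallelize(rowstr) distributes each string as its own RDD element:
wrong_elements = list(rowstr)
# Every element is a bare str — this is what StructType rejects.
assert all(isinstance(e, str) for e in wrong_elements)

# sc.parallelize([rowstr]) yields ONE element: the whole row.
right_elements = [rowstr]
# Each element is now a list with one value per schema field.
assert all(isinstance(e, list) for e in right_elements)
assert len(right_elements[0]) == len(rowstr)
```

With the wrapped RDD, `sqlContext.createDataFrame(myrdd, schema)` can match each value in the row against the corresponding `StructField` of the schema; to load many rows, pass a list of row lists, e.g. `sc.parallelize([row1, row2, ...])`.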