Spark SQL HiveContext - saveAsTable creates wrong schema

It's not that the schema is wrong. Hive simply cannot read tables created by Spark correctly, because it doesn't yet have the right Parquet SerDe. If you run sqlCtx.sql('desc peopleHive').show(), it should display the correct schema. Alternatively, query the table with the spark-sql client instead of Hive. You can also use the CREATE TABLE syntax to create external tables, which works just like it does in Hive, and Spark has much better Parquet support.
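For example, an external table created with explicit DDL is readable from both Hive and Spark. A minimal sketch (the column names and location path here are illustrative, not from your setup):

```sql
-- Hypothetical columns and location; adjust to match your data.
CREATE EXTERNAL TABLE peopleHive (
  name STRING,
  age INT
)
STORED AS PARQUET
LOCATION '/path/to/parquet/files';
```

You can run this through sqlCtx.sql(...) in Spark, and since the table definition lives in the Hive metastore with a proper Parquet storage format, Hive can read it too.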