java.lang.OutOfMemoryError: Java heap space with Hive
Well, in my case, I also needed to set the JVM heap size via java.opts:
set mapreduce.map.memory.mb=4096;
set mapreduce.map.java.opts=-Xmx3686m;
set mapreduce.reduce.memory.mb=4096;
set mapreduce.reduce.java.opts=-Xmx3686m;
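The -Xmx values above are deliberately smaller than the container sizes: the container must hold the JVM heap plus non-heap overhead, so the heap is commonly sized to roughly 80-90% of mapreduce.*.memory.mb (3686 MB is about 90% of 4096 MB). A minimal sketch of that arithmetic, assuming the 80% rule of thumb:

```shell
# Size the JVM heap as ~80% of the YARN container, leaving headroom
# for JVM metaspace, thread stacks, and other off-heap overhead.
CONTAINER_MB=4096
HEAP_MB=$(( CONTAINER_MB * 8 / 10 ))
echo "-Xmx${HEAP_MB}m"   # prints -Xmx3276m
```

If the heap is set equal to (or larger than) the container size, YARN may kill the container for exceeding its memory limit even though the JVM itself never runs out of heap.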
You can set the container heap size in Hive to resolve this error.
Most tools that run on top of the Hadoop MapReduce framework provide ways to tune these Hadoop-level settings for their jobs. There are multiple ways to do this in Hive; three of them are shown here:
1) Pass it directly via the Hive command line:
hive -hiveconf mapreduce.map.memory.mb=4096 -hiveconf mapreduce.reduce.memory.mb=5120 -e "select count(*) from test_table;"
2) Set the ENV variable before invoking Hive:
export HIVE_OPTS="-hiveconf mapreduce.map.memory.mb=4096 -hiveconf mapreduce.reduce.memory.mb=5120"
3) Use the "set" command within the Hive CLI:
hive> set mapreduce.map.memory.mb=4096;
hive> set mapreduce.reduce.memory.mb=5120;
hive> select count(*) from test_table;
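All three approaches set the same properties, so they can be combined with the java.opts settings from the first answer. A minimal sketch that builds the flags once and reuses them, assuming (as in the answers above) 4096/5120 MB containers, ~90% heaps, and a placeholder table named test_table:

```shell
# Collect the memory-related -hiveconf flags in one variable so the
# map/reduce container sizes and their JVM heaps stay consistent.
HIVE_MEM_OPTS="-hiveconf mapreduce.map.memory.mb=4096"
HIVE_MEM_OPTS="$HIVE_MEM_OPTS -hiveconf mapreduce.map.java.opts=-Xmx3686m"
HIVE_MEM_OPTS="$HIVE_MEM_OPTS -hiveconf mapreduce.reduce.memory.mb=5120"
HIVE_MEM_OPTS="$HIVE_MEM_OPTS -hiveconf mapreduce.reduce.java.opts=-Xmx4608m"
# Print the command instead of running it (hive may not be on this machine):
echo "hive $HIVE_MEM_OPTS -e \"select count(*) from test_table;\""
```

The same string also works as the value of HIVE_OPTS (option 2), since Hive reads that variable as extra command-line arguments.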
For me, the solution below worked.
Before starting the Hive CLI, run
export HADOOP_CLIENT_OPTS="-Xmx8192m"
and then launch the CLI.
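Note that HADOOP_CLIENT_OPTS sizes the client-side JVM (the CLI process itself), not the YARN containers, so this helps when the OutOfMemoryError happens on the client, for example while fetching a large result set. A minimal sketch of the workflow, with the launch step commented out since hive may not be installed where this runs:

```shell
# Raise the heap of the local client JVM that the hive command starts.
# This is independent of mapreduce.*.memory.mb, which sizes the
# map/reduce task containers on the cluster.
export HADOOP_CLIENT_OPTS="-Xmx8192m"
echo "$HADOOP_CLIENT_OPTS"   # prints -Xmx8192m

# hive   # launch the CLI from the same shell so it inherits the setting
```

If the error comes from the tasks themselves rather than the client, use the container settings from the answers above instead.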